Jamie Whittingham
Active Member
So we recently decided to migrate our entire database to a new MariaDB Galera Cluster. We were excited by the dev results we had seen, but we faced one big issue ... how to import a mysqldump file that was over 100 GB in size.
We tried several times to use the native mysql command-line tool but kept running into issues, so we decided to break our database.sql file into multiple parts. Now, if you still have your production database server running then this is easy enough, and here is a little script we found online to help with this task.
Code:
#!/bin/bash
# dump-tables-mysql.sh
# Descr: Dump MySQL table data into separate SQL files for a specified database.
# Usage: Run without args for usage info.
# Author: @Trutane
# Ref: http://stackoverflow.com/q/3669121/138325
# Notes:
# * Script will prompt for password for db access.
# * Output files are compressed and saved in the current working dir, unless DIR is
# specified on command-line.
[ $# -lt 3 ] && echo "Usage: $(basename "$0") <DB_HOST> <DB_USER> <DB_NAME> [<DIR>]" && exit 1
DB_host=$1
DB_user=$2
DB=$3
DIR=$4
[ -n "$DIR" ] || DIR=.
test -d "$DIR" || mkdir -p "$DIR"
echo -n "DB password: "
read -s DB_pass
echo
echo "Dumping tables into separate SQL command files for database '$DB' into dir=$DIR"
tbl_count=0
for t in $(mysql -NBA -h "$DB_host" -u "$DB_user" -p"$DB_pass" -D "$DB" -e 'show tables')
do
  echo "DUMPING TABLE: $DB.$t"
  # quote all variables so hosts, names and paths with special characters survive
  mysqldump -h "$DB_host" -u "$DB_user" -p"$DB_pass" "$DB" "$t" | gzip > "$DIR/$DB.$t.sql.gz"
  tbl_count=$(( tbl_count + 1 ))
done
echo "$tbl_count tables dumped from database '$DB' into dir=$DIR"
Save this as a standard .sh file and off you go.
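For example, a run and a single-table restore might look like this (dump-tables-mysql.sh, db_user, my_database and the mw_campaign table are placeholder names, not anything from the script itself):

```shell
# make the script executable and run it (it prompts for the DB password)
# chmod +x dump-tables-mysql.sh
# ./dump-tables-mysql.sh localhost db_user my_database ./dumps

# each table ends up as a gzipped file, e.g. ./dumps/my_database.mw_campaign.sql.gz;
# simulate one here so the restore pipeline below is self-contained
mkdir -p ./dumps
echo "CREATE TABLE \`mw_campaign\` (id INT);" | gzip > ./dumps/my_database.mw_campaign.sql.gz

# restoring a single table is then just a pipe back into mysql:
# zcat ./dumps/my_database.mw_campaign.sql.gz | mysql -u db_user -p my_database
zcat ./dumps/my_database.mw_campaign.sql.gz
```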
However, if all you have is the database.sql file generated by the MailWizz Backup Manager, then you will still run into issues with large files ... so how does one account for this scenario?
Well ... the answer is to split your database.sql file into tables, and that can be done by following these simple steps:
1. Install nodejs
2. Install mysqldumpsplit
Code:
npm install -g mysqldumpsplit
3. Run mysqldumpsplit
Code:
mysqldumpsplit database.sql
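If installing nodejs is not an option on the server, a small awk script can do a similar per-table split. This is just a sketch: it keys off the `-- Table structure for table` comment lines that mysqldump writes before each table, and the output file naming will differ from mysqldumpsplit's.

```shell
# sample two-table dump standing in for the real database.sql
cat > database.sql <<'EOF'
-- Table structure for table `mw_customer`
CREATE TABLE `mw_customer` (id INT);
-- Table structure for table `mw_list`
CREATE TABLE `mw_list` (id INT);
EOF

# start a new output file each time a table-structure comment appears,
# then send every following line to that file
awk '/^-- Table structure for table/ {
    name = $NF
    gsub(/`/, "", name)
    out = name ".sql"
}
out { print > out }' database.sql

ls mw_customer.sql mw_list.sql
```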
This will break the main database.sql file into smaller table_name.sql files, which are easier to work with. Now this is a great step forward, but no one wants to sit there and import them in the correct order etc., so we have written a small PHP script to handle this part.
PHP:
<?php
$database_user = 'mailwizz_db_user';
$database_pass = 'mailwizz_db_pass'; // needed so mysql does not prompt for every file
$database_name = 'mailwizz_installation';
$folder = '/root/sql_playground/';

if ($handle = opendir($folder)) {
    while (false !== ($entry = readdir($handle))) {
        // skip the original dump, this script and the dot entries
        if ($entry != "database.sql" && $entry != "script.php" && $entry != "." && $entry != "..") {
            echo "Working on " . $folder . $entry . "\n";
            $file = escapeshellarg($folder . $entry);
            // disable foreign key checks at the top of the file
            shell_exec("sed -i '1i SET FOREIGN_KEY_CHECKS=0;' " . $file);
            // re-enable foreign key checks at the end of the file
            shell_exec('echo "SET FOREIGN_KEY_CHECKS=1;" >> ' . $file);
            // import the file (password passed inline; -p with a space would prompt)
            shell_exec("mysql -u" . escapeshellarg($database_user) . " -p" . escapeshellarg($database_pass) . " " . escapeshellarg($database_name) . " < " . $file);
        }
    }
    closedir($handle);
}
Edit the vars at the top, save the file as "script.php", and place it next to database.sql.
The script will then loop over each table .sql file in the folder, skipping the main database.sql file and itself, and import them into your database one by one.
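To preview what the script does to each file before pointing it at a real database, the same sed/echo wrapping can be run on a sample file in isolation (mw_list.sql here is just a stand-in name for one of the split files):

```shell
# sample split file standing in for one table's dump
printf 'CREATE TABLE `mw_list` (id INT);\n' > mw_list.sql

# prepend the disable line (the same sed call the script shells out to)
sed -i '1i SET FOREIGN_KEY_CHECKS=0;' mw_list.sql
# append the re-enable line
echo "SET FOREIGN_KEY_CHECKS=1;" >> mw_list.sql

cat mw_list.sql
```

Each file ends up wrapped so foreign key checks are off for the duration of that one import, which is why the table files do not have to be fed in dependency order.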
Hope this helps