Recently I was working on a large web application and had to port it to a new server. The main challenge was how to take a backup of the huge database and restore it on the new server. phpMyAdmin was not able to take the backup of the database because of a timeout issue with Apache, so I decided to use MySQL's mysqldump utility for the database backup. I also wanted to transfer the data to the new server, so I had to compress the database dump. It was quite easy.
Backup MySQL Database
$mysqldump -u {username} -p{password} {database} | gzip -9 > {database.sql.gz}
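For example, with a hypothetical user dbuser, password secret and database myapp (note that there is no space between -p and the password), the command would look like this:
$mysqldump -u dbuser -psecret myapp | gzip -9 > myapp.sql.gz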
We can use the scp utility to transfer this database dump to the new server. If you are not aware of scp, please search the web and you shall find lots of information about its usage; a minimal example follows.
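A quick sketch, assuming the new server is reachable as user@newserver.example.com and the dump should land in /var/backups (both hypothetical):
$scp database.sql.gz user@newserver.example.com:/var/backups/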
We can easily extract the data on the new server by using gzip
$gzip -d database.sql.gz
Restore MySQL Database
This part gets tricky as the MySQL dump size grows. A simple command for restoring the database is
$mysql -u {username} -p{password} {database} < database.sql
or use this in case you have a compressed file
$gunzip < {database.sql.gz} | mysql -u {username} -p{password} {database}
This should do the magic of restoring the database, but not for a larger database. For restoring a large database, you must either change the configuration file my.cnf or specify --max_allowed_packet=1G with the restore command.
In the [mysqld] section of my.cnf:
[mysqld]
max_allowed_packet = 1G
Please note that MySQL allows a maximum of 1G for max_allowed_packet.
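If you go the command-line route instead of editing my.cnf, the restore command sketched earlier would take the option directly (same placeholders as before; the 1G value mirrors the server-side limit mentioned above):
$mysql -u {username} -p{password} --max_allowed_packet=1G {database} < database.sql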
If your packet size (not database size) is more than 1G, then consider turning off extended inserts with --skip-extended-insert during the database dump. The restore will take more time, but it will work, as shown in the sketch below.
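A sketch of the dump command with extended inserts disabled, using the same placeholders as above:
$mysqldump -u {username} -p{password} --skip-extended-insert {database} | gzip -9 > {database.sql.gz}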
I would like to see your comments on this.