We have a production database that needs to be backed up every 2 hours. There are actually two destinations: one is a staging server (on the same DB host) that we use for testing, and the other is a copy of the database that simply needs to be kept in pristine, backed-up condition.
We want to do this efficiently. However, we are finding that mysqldump locks the database such that the entire webapp hangs for ~15 minutes during a dump, due to the size of the database.
How can we get around this and do so in the most efficient manner possible?
As Serge said, Percona XtraBackup is the appropriate backup utility for your scenario; it can take backups without locking the tables. You can get more information here.
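For a first test, the basic workflow looks roughly like this (a minimal sketch; the user, password, target directory, and timestamp below are placeholders, not values from your setup):

    # Take a full backup without locking InnoDB tables.
    # innobackupex writes the backup into a timestamped subdirectory of the target.
    innobackupex --user=backupuser --password=backup_password /path/to/backups

    # A backup must be "prepared" (the InnoDB log applied) before it can be restored.
    innobackupex --apply-log /path/to/backups/2013-01-01_12-00-00

Note that non-InnoDB tables (e.g. MyISAM) are still briefly locked while they are copied, but InnoDB data is backed up without blocking the webapp.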
So this does indeed seem to be the right solution; however, I’m running into a problem simply trying to take my first backup. I created a folder called backup in my ~ folder, then crafted the following command:

    innobackupex --user=username --password=user_password ~/backup
Unfortunately, after running this I get the following error:
    xtrabackup: Error: Please set parameter 'datadir'
    innobackupex: Error: ibbackup child process has died at /usr/bin/innobackupex line 389.
Is the datadir option set in my.cnf under the [mysqld] section? If not, please set the datadir parameter there. If it is already set but my.cnf lives in a non-standard location, you can pass the --defaults-file=/path/to/my.cnf option to the innobackupex command.
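For example, assuming a typical Linux layout (both paths below are assumptions; use the datadir and my.cnf location from your own install):

    # /etc/mysql/my.cnf -- xtrabackup reads datadir from the [mysqld] section
    [mysqld]
    datadir = /var/lib/mysql

If my.cnf is in a non-standard location, point innobackupex at it explicitly; --defaults-file has to be the first option on the command line:

    innobackupex --defaults-file=/path/to/my.cnf --user=username --password=user_password ~/backup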