I am anal with backups. Personally, I probably overdo it with the amount of redundancy, but I think the overkill is justified. I spend hours and hours each week building up World War II Database, for example, and I do not want to risk losing that work to a hard drive failure. Backup is important. Period.
With that said, it is actually very easy to back up a MySQL database, which is what powers World War II Database on the backend. The following commands illustrate how to create a MySQL database dump file containing your latest data.
mysqldump --user=username --password=password --host=localhost --opt db1 > ~/backups/db1dump.sql
mysqldump --user=username --password=password --host=localhost --opt --databases db1 db4 db9 > ~/backups/dbsdump.sql
mysqldump --user=username --password=password --host=localhost --opt --all-databases > ~/backups/dballdump.sql
The first example above dumps a single database, “db1”, to the file “db1dump.sql” in the specified directory. The second example shows how to back up several databases at once with the --databases option. If we just wish to dump all data across all databases, the third example shows how to do so.
Once we have chosen the method we want and have the command line we wish to use, we should schedule it to run regularly. For Windows, the built-in “Scheduled Tasks” utility works well for this. For Unix/Linux, “cron” can run the command on a schedule. Do not forget that it is wise to transfer a copy of your database dump file to a different machine immediately, so that a single hard drive failure cannot wipe out both your database and your backups at the same time.
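As a sketch of how the Unix/Linux scheduling might look, the script below is only an illustration: the paths, credentials, hostname, and schedule are all placeholders, not anything from this site's actual setup. Stamping the filename with the date keeps several days of dumps instead of overwriting a single file each night.

```shell
#!/bin/sh
# Hypothetical nightly backup script (all paths and credentials are placeholders).
# A crontab entry such as the following could run it at 3:30 AM every day:
#   30 3 * * * /home/user/bin/backup-db.sh

# Build a date-stamped filename so each night's dump is kept separately
STAMP=$(date +%Y%m%d)
DUMPFILE=~/backups/db1dump-$STAMP.sql

echo "Dumping to: $DUMPFILE"
# mysqldump --user=username --password=password --host=localhost --opt db1 > "$DUMPFILE"
# scp "$DUMPFILE" otherhost:backups/   # copy the dump off-machine right away
```

Old dumps can then be pruned with something like find ~/backups -name 'db1dump-*.sql' -mtime +30 -delete once you are comfortable with the rotation.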
In the dreadful event that our MySQL database goes bad and we are forced to set up a new database machine, we just have to retrieve our most recent database dump file and run the following command on the new install.
mysql --user=username --password=password --host=localhost db1 < db1dump.sql
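One caveat worth noting: a dump made without --databases or --all-databases (the first example above) does not include a CREATE DATABASE statement, so on a fresh install the target database must exist before the restore. A sketch, with the same placeholder credentials as above:

```shell
# Create the empty target database first (placeholder credentials)
mysqladmin --user=username --password=password --host=localhost create db1
# Then load the dump into it
mysql --user=username --password=password --host=localhost db1 < db1dump.sql
```

Dumps made with --databases or --all-databases include the CREATE DATABASE statements themselves, so for those the mysqladmin step is unnecessary.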