
A simple cPanel/MySQL backup solution.

Ash_Hawkridge Member
edited December 2012 in General

I was just wondering if anybody had a quick and simple method of backing up MySQL databases for cPanel accounts, possibly by cron or something?

We want to back up our WHMCS/WordPress databases every 15 minutes or so, since WHMCS only has a once-daily option available (as does the new cPanel/WHM backup system).

Opinions and solutions appreciated :)

Comments

  • 15 minutes looks a little over-necessary.

  • Ash_Hawkridge Member
    edited December 2012

    @gubbyte said: 15 minutes looks a little over-necessary.

    I gather you have never had to trace missing transactions back to WHMCS when a couple of hours of data are missing from a backup; it's not nice :P

    The other reason is that we have a second webserver on "hot standby", if you will, with an exact copy of our website on it, so if our main server ever goes down, all we need to do is update the DB and switch the A record for our domain.

  • Here's the script I use on a 15 minute cronjob.

    #!/bin/sh
    NOW=$(date +"%d-%m-%Y")
    MUSER="LLAMA"
    MPASS="VERYLONGPASSWORD"
    MDB="DATABASE"
    MYSQLDUMP="$(which mysqldump)"

    # Dump to a timestamped file, push the queue to S3, then clear it
    FILE=/backup/mysql/store/mysql-$MDB.$NOW-$(date +"%T").sql
    $MYSQLDUMP -u $MUSER -p$MPASS $MDB > $FILE
    s3cmd put --recursive /backup/mysql/store/ s3://gatsbybackup/mysql/
    rm -f /backup/mysql/store/*.sql
    

    Dumps the database from MySQL to a directory, then s3cmd pushes all files in the queue up to S3.
    Once they're safe and sound, all database backups are deleted from the queue folder.

    You could very, very easily change this to replace s3cmd with rsync.
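
    Not from the thread, but for illustration, the same dump-then-push flow with rsync swapped in for s3cmd might look like this (the standby hostname, user and paths are placeholders):

```shell
#!/bin/sh
# Sketch only: same queue-style flow as above, but shipping the
# dumps to a remote box over SSH with rsync instead of s3cmd.
# Host, user and paths are placeholders, not from the thread.
NOW=$(date +"%d-%m-%Y-%H%M%S")
MUSER="LLAMA"
MPASS="VERYLONGPASSWORD"
MDB="DATABASE"
STORE=/backup/mysql/store

mkdir -p "$STORE"
FILE="$STORE/mysql-$MDB.$NOW.sql"
mysqldump -u "$MUSER" -p"$MPASS" "$MDB" > "$FILE"

# Push everything queued locally; only clear the queue if the
# transfer succeeded, so failed runs are retried next time
rsync -az "$STORE/" backup@standby.example.com:/backups/mysql/ \
  && rm -f "$STORE"/*.sql
```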

  • Awmusic12635 Member, Host Rep

    If you don't mind paying, I usually use Bacula4Hosts.

  • @GetKVM_Ash

    http://www.hostliketoast.com/developer-channel/

    Scroll down to -- cPanel and Webhosting Tools

    I've been using the full backup for over a week, pushing to a BlueVM box; it works fine in a cron job. They also offer just the MySQL portion as well.

  • R1Soft <3

  • jh Member

    Have to love R1Soft. But if you don't want to buy a licence:

    #!/bin/sh
    #
    # This script will backup one or more MySQL databases
    # and then optionally email them and/or FTP them.
    #
    # It creates a different backup file for each database by day of the week,
    # i.e. 1-dbname1.sql.gz for database=dbname1 on Monday (day=1).
    # This is a trick so that you never have more than 7 days' worth of backups
    # on your FTP server: as the weeks rotate, the files from the same day of
    # the previous week are overwritten.
    #
    # Run it from cron like:
    # /bin/sh /home/user/directory/scriptname.sh > /dev/null
    #
    # ===> site-specific variables - customize for your site

    # List all of the MySQL databases that you want to backup in here,
    # each separated by a space.
    # If not run by root, only one db per script instance.
    databases="mydbname"

    # Directory where you want the backup files to be placed
    backupdir=/home/mydomain/backups

    # MySQL dump command, use the full path name here
    mysqldumpcmd=/usr/bin/mysqldump

    # MySQL username and password
    userpassword=" --user=myuser --password=mypasswd"

    # MySQL dump options
    dumpoptions=" --quick --add-drop-table --add-locks --extended-insert --lock-tables"

    # Unix commands
    gzip=/bin/gzip
    uuencode=/usr/bin/uuencode
    mail=/bin/mail

    # Send backup? Would you like the backup emailed to you? Set to "y" if you do
    sendbackup="n"
    subject="mySQL Backup"
    mailto="[email protected]"

    # ===> site-specific variables for FTP
    ftpbackup="y"
    ftpserver="myftpserver.com"
    ftpuser="myftpuser"
    ftppasswd="myftppasswd"

    # If you are keeping the backups in a subdir of your FTP root
    ftpdir="forums"

    # ===> END site-specific variables - customize for your site

    # Get the day of the week (0-6).
    # This lets us save one backup for each day of the week;
    # just alter the date command if you want to use a timestamp instead.
    DOW=$(date +%w)

    # Create our backup directory if not already there
    mkdir -p ${backupdir}
    if [ ! -d ${backupdir} ]
    then
    echo "Not a directory: ${backupdir}"
    exit 1
    fi

    # Dump all of our databases
    echo "Dumping MySQL Databases"
    for database in $databases
    do
    $mysqldumpcmd $userpassword $dumpoptions $database > ${backupdir}/${DOW}-${database}.sql
    done

    # Compress all of our backup files
    echo "Compressing Dump Files"
    for database in $databases
    do
    rm -f ${backupdir}/${DOW}-${database}.sql.gz
    $gzip ${backupdir}/${DOW}-${database}.sql
    done

    # Send the backups via email
    if [ $sendbackup = "y" ]
    then
    for database in $databases
    do
    $uuencode ${backupdir}/${DOW}-${database}.sql.gz > ${backupdir}/${DOW}-${database}.sql.gz.uu
    $mail -s "$subject : $database" $mailto < ${backupdir}/${DOW}-${database}.sql.gz.uu
    done
    fi

    # FTP it to the off-site server
    echo "FTP file to $ftpserver FTP server"
    if [ $ftpbackup = "y" ]
    then
    for database in $databases
    do
    echo "==> ${backupdir}/${DOW}-${database}.sql.gz"
    ftp -n $ftpserver <<EOF
    user $ftpuser $ftppasswd
    bin
    prompt
    cd $ftpdir
    lcd ${backupdir}
    put ${DOW}-${database}.sql.gz
    quit
    EOF
    done
    fi

    # And we're done
    ls -l ${backupdir}
    echo "Dump Complete!"
    exit
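
    Since the script only covers taking backups, here is a minimal sketch of restoring one of its day-of-week dumps (the database name, credentials and paths mirror the placeholders above and are not from the thread):

```shell
#!/bin/sh
# Sketch: restore Monday's dump of "mydbname" produced by the
# day-of-week script above. Credentials and paths are placeholders.
backupdir=/home/mydomain/backups
database=mydbname
DOW=1  # Monday's file is 1-mydbname.sql.gz

# Decompress to stdout and feed straight into mysql
gunzip -c ${backupdir}/${DOW}-${database}.sql.gz | \
  mysql --user=myuser --password=mypasswd ${database}
```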
  • Thanks all, going to give them a try and see which works best; will update with results!

  • sleddog Member
    edited December 2012

    mysqldump'ing every 15 minutes could be a disaster, if tables are locked when someone is midway through a purchase.

    Look carefully at the mysqldump options, like --lock-tables and --single-transaction (for InnoDB). Or replication....
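
    A non-blocking InnoDB dump along those lines might look like this (a sketch only; credentials, database name and paths are placeholders):

```shell
#!/bin/sh
# Sketch: a frequent-cron-friendly dump for InnoDB tables.
# --single-transaction takes a consistent snapshot via MVCC,
# and --skip-lock-tables avoids the default LOCK TABLES, so a
# 15-minute schedule won't block purchases mid-transaction.
# Credentials, database name and paths are placeholders.
mysqldump --single-transaction --skip-lock-tables \
  --user=myuser --password=mypasswd mydbname \
  > /backup/mysql/store/mydbname-$(date +"%F-%H%M").sql
```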

  • Why are you hosting your website on cPanel?!
    We use https://github.com/meskyanichi/backup to back up ours, but it's not cPanel.

  • Here's a backup script I posted a while back on my blog:
    http://www.georgetasioulis.com/cpanel-backup-script-to-amazon-s3/

    You can modify it to not backup the account's home directory by adding the --skiphomedir parameter on line 32.

    If you want to go the R1Soft way, you can set up a VPS in a different location than your cPanel server and install CDP Enterprise Server there. Then install the CDP Agent on your cPanel server, and 15-minute backups won't be a problem :)
    For a year's worth of backups you need about 2x the storage space of your source server's capacity.
    PM me for cheap licenses :)

  • @Spencer said: Why are you hosting your website on cpanel!

    Why not? We manage multiple domains from one dedicated cPanel server.

  • @George_Fusioned

    Thank you George, will check that out too.
