Server Backups

decay Member

Hey guys,

I'm quite new here, and a server noob. I've been reading these forums/this website on and off for a while, and have purchased a couple of VPS deals from here :)

For starters, I'm a Linux noob, so on every VPS I install something like Virtualmin and manage everything through that.

I'm at the point where I need to set up a decent production server. It doesn't have to be amazing, so a stable VPS should do. I haven't decided which one to use yet, but that's on my to-do list (I currently have two VPSes, from ChicagoVPS and, I think, URPad, for my dev work hosting/testing).

So the question is: when I set up the new production VPS, I'll need a decent backup system. I know Virtualmin can easily plug in and do automatic backups etc. So, do you guys have any suggestions on where to store backups? Any cheap services? What frequency of backups?

The main things hosted on the server will be WordPress websites and Git/SVN repos, so I doubt they'll be very big. Currently I have a couple of hundred MB, but I'm sure this will grow. I also wish to host my own Dropbox equivalent (ownCloud) to manage my projects etc. If/when I do, the data will start getting decently big.

So what are your recommendations? Can I just buy one of those cheap website hosts (the ones that claim unlimited bandwidth/storage etc.) and use its FTP as a backup server (though that might be against their T&Cs)? Or is there a better option?

Thanks in advance...

Comments

  • wdq Member

    Most of those "unlimited" hosts have strict TOS rules saying you can't use the storage for massive amounts of data, especially backups.

    Backupsy has really cheap backup VPSes ($5 for 250GB and $7 for 500GB) in several locations. If you install something like FreeNAS or OpenMediaVault, it's pretty easy to back up to that VPS.

    I'd say backup as often as you can, and keep old copies of backups as well.

  • nixcom Member
    edited August 2013

    You should consider using rsync instead of FTP for backups. rsync runs over SSH, so it's more secure if you have important files to move. I don't have a specific URL to give you, but a quick Google search should help you out.
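A minimal cron-driven sketch of what that can look like (the host, user, and paths below are placeholders, not anything from this thread; --delete mirrors deletions to the backup side, so drop it if you want removed files to survive there):

```shell
# Crontab fragment: push /var/www to a backup box over SSH at 02:00 nightly.
# Only changed files are transferred. backup.example.com and backupuser are made up.
0 2 * * * rsync -az --delete -e ssh /var/www/ backupuser@backup.example.com:/backups/www/
```

Test it by hand first; once a manual run with SSH keys in place works, the cron entry is a one-liner.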

  • Note: thinking about backups should happen while the server is being prepared, not some time afterwards.

    Don't ever mess with 'unlimited' hosts. First, they usually don't allow storing backups. Second, I wouldn't rely on their facilities anyway.

    Use VPSes or other hosting plans designed for backup storage, such as Backupsy, BQBackup, or XSBackup. Also consider Amazon S3 (both 'normal' and Glacier) as extremely reliable storage.

    I'm sure you'll be offered other backup service providers' URLs as well.

    As for software, you can start with BackupPC or a similar free tool to set up and run regular backups.

    Don't forget, though, that database backups should be created first, before being copied to the backup facility.
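To make the dump-first step concrete, here is a hedged sketch for MySQL/MariaDB (which WordPress uses); the database name, user, password, and paths are placeholders:

```shell
# Dump the database to a dated, compressed file *before* the file-level backup runs.
# --single-transaction gives a consistent InnoDB snapshot without locking tables.
mysqldump --single-transaction -u backupuser -p'secret' wordpress_db \
  | gzip > "/backups/db/wordpress_db-$(date +%F).sql.gz"
```

The file-level backup job (rsync, BackupPC, whatever you pick) then ships the dump along with everything else.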

  • Droidzone Member
    edited August 2013

    I'll tell you what I use.

    I have 8 production sites hosted on a cheap and increasingly unreliable VPS. Every 4 hours, a Perl script dumps the WordPress SQL into files on the server. Every 8 hours these are rsynced to a backup VPS at Backupsy. Every 24 hours, a tar.bz2 archive of all hosted sites is created, which is again rsynced.

    Every month, my script running on the backup server builds a list of files older than 30 days, samples them on a per-week basis, and queues the rest for deletion, so one backup per week is retained for those archives. Every 90 days, the oldest archives are purged from the backup server, along with database backups older than 60 days, retaining one database backup per 14 days indefinitely.

    An additional monthly backup is retained on another FTP server.

    Everything is automated and runs via cron and a series of configuration files processed by my script. I've never used a third-party solution; I don't believe in them and can't afford them.
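A schedule like that boils down to a handful of cron entries. This is a rough sketch only; the script names and paths are invented, not Droidzone's actual setup (note that % must be escaped as \% inside a crontab):

```shell
# m  h   dom mon dow  command
0  */4  *   *   *    /root/bin/dump-wp-sql.pl                        # every 4h: dump WordPress SQL
0  */8  *   *   *    rsync -az /backups/sql/ backupsy:/backups/sql/  # every 8h: ship dumps offsite
30 3    *   *   *    tar -cjf /backups/sites-$(date +\%F).tar.bz2 /var/www && rsync -az /backups/*.tar.bz2 backupsy:/backups/sites/
15 4    *   *   0    find /backups -name '*.tar.bz2' -mtime +30 -delete  # weekly: prune old local archives
```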

  • @nixcom said:
    You should consider using rsync instead of FTP for backups. rsync runs over SSH, so it's more secure if you have important files to move. I don't have a specific URL to give you, but a quick Google search should help you out.

    Additionally, rsync only transfers changed or added files (and can propagate deletions with --delete) instead of transferring everything every time, which greatly speeds up the backup process.

    You might want full remote snapshot backups every now and then (once a month? once a quarter?) if you're paranoid, but I usually plan on using rsync to sync the www/code folders and databases from my live servers to a backup server nightly, then I use full local backups from the hosting provider if the server is important enough to warrant it.

    Here's a pretty simple/good overview of using rsync for backups: http://www.mikerubel.org/computers/rsync_snapshots/

  • Awesome, thanks for the advice, guys. I'll browse through the providers/plans you suggested and have a look. Any more suggestions/deals are always welcome :)

    I'm curious: do you guys use any control panels? Other than cPanel, I mean the free/cheap ones like Virtualmin. What about the backup options built into those panels? One would think they use a pretty secure system underneath (I could be wrong), and they do seem to back up the entire virtual host (which includes all the domain-specific settings, DBs, domain-specific user data/accounts, mail aliases, and the files). Thoughts on this?

  • Pats Member

    @joelgm said:
    I'll tell you what I use.

    I love the automation :)

  • Also, always consider the speed of the network in case you need to restore something :) You don't want to find out that the speed from your backup to the server is 200 KB/s when you need to restore "only" 300GB.
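Worked out with shell arithmetic (treating 1 GB as 1024*1024 KB), that restore would take roughly two and a half weeks:

```shell
# 300 GB at 200 KB/s: how many days does the restore take?
SECONDS_NEEDED=$(( 300 * 1024 * 1024 / 200 ))   # total KB divided by KB/s gives seconds
DAYS=$(( SECONDS_NEEDED / 86400 ))
echo "$DAYS days"   # prints "18 days"
```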

  • @qhoster said:
    Also, always consider the speed of the network in case you need to restore something :) You don't want to find out that the speed from your backup to the server is 200 KB/s when you need to restore "only" 300GB.

    Haha, yeah, true. I grabbed one of the Backupsy 250GB ones. It's going to take care of all the backups plus my Git repos and ownCloud stuff (since the Git/ownCloud data will be backed up locally on my end anyway, I don't think it really needs a separate backup).

    I'm also thinking of putting the backups in an ownCloud folder and syncing them automatically to my local computer for offsite backup. (Thoughts?)

    Now I need a good new production server.

  • I use Virtualmin on all my boxes. I use its scheduled backups over SSH, which auto-delete after X days and email me on any backup failures. It does back up all the features of the virtual host, and I also use the CSF firewall module as it has an easy interface.

  • I use Unison to back up things to all of my other servers. It runs hourly via cron as a regular, non-root user. I keep all my local server configurations in ~/etc when possible (like lighttpd includes, monit includes, etc.), then run Unison every hour. It's better than rsync directly because it can do a two-way sync, which means I can modify the files on any server and the newest version ends up on all of them eventually. Each pair of servers also uses its own configuration file, so there are no crazy command-line options to remember. Here's a good link to try it out: http://www.howtoforge.com/setting-up-unison-file-synchronization-between-two-servers-on-debian-squeeze
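A per-pair profile like that is just a small text file in ~/.unison. A hedged sketch only; the profile name, host, and paths are invented for illustration:

```shell
# Write a hypothetical profile ~/.unison/web01.prf for one server pair.
mkdir -p ~/.unison
cat > ~/.unison/web01.prf <<'EOF'
# Two-way sync of ~/etc with the peer server over ssh
root = /home/backup/etc
root = ssh://backup@web01.example.com//home/backup/etc
batch = true      # no interactive prompts, so it can run from cron
prefer = newer    # on a conflict, keep the most recently modified copy
EOF

# Hourly cron entry for the non-root user (crontab -e):
# 0 * * * * unison web01
```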
