New on LowEndTalk? Please Register and read our Community Rules.
How to transfer a very big site?
chiccorosso
Member
in Help
hello,
I need to move a 250GB site to a new server.
I can't make a backup, even on an external server, because there isn't enough space to do that.
Plesk Migrator doesn't work.
I tried downloading over FTP, but it's very slow and doesn't download all the files.
How can I move it?
Comments
rsync is your friend.
===============
Also, I personally prefer B2 because of the price.
Moving (download / upload / store) 250GB of data will cost around $3-4 with Backblaze, which is pretty nice.
https://www.backblaze.com/b2/cloud-storage-pricing.html
What would I do in your case?
He will definitely fail at your second step.
Does that work with Plesk?
Assuming that "site" means some directory/ies and everything in them, you can do this (on the "old" server):
Notes:
I had prepared a full example but Cloudflare is f*cking up everything. I've had enough. Sorry.
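Since the full example got eaten, here is a minimal sketch of the kind of tar-over-ssh pipeline being described. The hostname and paths are placeholders, and gzip is used here for portability (the commenter says they'd actually use zstd):

```shell
# Stream the site directly to the new server: nothing is written to
# disk on the source, so no extra space is needed.
# "new-server" and /var/www/site are placeholders; adjust to your setup.
tar -C /var/www -cf - site \
  | gzip -6 \
  | ssh root@new-server 'gzip -d | tar -C /var/www -xf -'
```

Because everything is a pipe, the transfer is compressed in transit and encrypted by ssh, and the source server never needs room for an archive.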
I would just rsync it...
The problem is that he doesn't have enough space, and most of the comments above suggest creating bzip/tar files and moving them...
You don't have enough space to create a backup, so you can't move the whole configuration to the new server.
You will have to create the per-site settings on the new server, then rsync the content and move the DB over.
Another option is to ask whether your current provider offers renting and mounting an additional disk. Then mount the drive, do the backup there, and move it to the new server.
Use rsync to create an identical copy of the site/server elsewhere.
If it contains only static files, there will be no issue. If there are MySQL databases as well, there are many routes you can follow.
For example, instead of creating a backup of the entire site, create only MySQL dumps. Then copy the static files and restore the MySQL databases on the new server.
You haven't told us what kind of infrastructure you have. Is it a website like WordPress or Joomla? If yes, there is another trick: if you want to create a backup via an app (e.g. Akeeba Backup) but don't have enough space for it on the initial server, first move the folders that usually take up all the space (e.g. images and videos) to the backup server via rsync. This way you free a lot of space on the initial node. Then do the backup via the app, move the site, and finally restore the moved static files (images and videos) to their original place on the new node.
Which type of website consumes 250GB?
It is a video website.
No. Some solutions (like mine) need (almost) no space because they transmit the output directly to the new server.
The reason I prefer my solution over some others (e.g. rsync) is that it's simple, secure, and COMPRESSES - which you definitely want when transferring 250 GB.
In fact, I myself don't use tar's built-in compression but (properly parametrized) zstd, because compression is probably the single most important factor in this kind of use case; but I wanted to keep it simple for OP, and e.g. bzip2 isn't too bad.
rsync over ssh (rsync -avze ssh /bla root@destination:/bla) is also encrypted and compressed. And it won't transfer things that didn't change (unlike tar).
Space for what? Rsync doesn't need additional space on the source server and can transfer files compressed and encrypted.
Re-read my whole sentence please?
oh boy.. this is going to be complex.. and I am not talking about moving the site.
You can directly pipe tar over ssh. It won't use any space on the source.
That is actually interesting. If you could share some specifics, @chiccorosso
There can be different backup/migrate strategies depending on your case.
No need for compression; you can just copy and transfer the files with rsync.
On low-end hardware, compression may actually make the copy slower.
You could just use Pastebin for those
https://www.digitalocean.com/community/tutorials/migrate-your-current-vps-linode-rackspace-aws-ec2-to-digitalocean
or
http://www.webhostingtalk.com/showthread.php?t=1141280
The thing here is that rsync alone won't transfer MySQL or other databases (at least, not correctly). He needs to dump them. Also, we don't know if he just wants to copy a website or the whole infrastructure (e.g. web panel, settings, special configs, etc.).
OP has disappeared. Either he found out what to do, or he doesn't need it anymore (I guess).
You could rsync, then stop all the services and rsync everything again. This way the DBs and other open files are transferred correctly.
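That two-phase approach can be sketched as follows. The hostname, paths, and service names are placeholders for whatever actually runs on the box:

```shell
# Phase 1: bulk copy while the site is still live (may take hours,
# but the site stays up).
rsync -avz -e ssh /var/www/ root@new-server:/var/www/

# Phase 2: stop the writers, then a quick delta pass picks up only
# what changed since phase 1. Downtime is just this short second pass.
systemctl stop nginx mysql
rsync -avz --delete -e ssh /var/www/ root@new-server:/var/www/
```

The second pass is fast because rsync only transfers files that changed between the two runs.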
Depending on what makes up that 250GB of data, you could do it in stages
If some of the data is made up of individually large files, i.e. zip, tar.gz, video or other megabyte/gigabyte-sized files, then you can start moving the largest of those files to another server via rsync, as folks have mentioned. For example, 50x 1GB files = 50GB of data can be rsync'd over to another server and then removed from the existing server. This frees up 50GB of disk space for you.
If freeing up the largest files via an rsync move gives you enough free disk space, you can then transfer the rest of your data by whatever means you prefer.
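A sketch of that staged approach, assuming a video-heavy site. The path, size threshold, and file extension are illustrative:

```shell
# Step 1: see which files are the big ones (threshold is illustrative).
find /var/www/site -type f -size +500M -exec ls -lh {} +

# Step 2: transfer only the large video files, deleting each local copy
# once it has been transferred, to free disk space on the old server.
rsync -avz --remove-source-files -e ssh \
  --include='*/' --include='*.mp4' --exclude='*' \
  /var/www/site/ root@new-server:/var/www/site/
```

The filter rules transfer only `*.mp4` files (the `--include='*/'` keeps rsync descending into subdirectories), and `--remove-source-files` deletes each file from the source only after a successful transfer.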
There are ways/tools for any halfway decent SQL server to dump tables and/or database(s) to stdout, which can then be piped into compression and then into ssh.
The real problem is integrity, including the question of what happens to data entered after the transfer. One solution is to simply put the DB in read-only mode (the web site works, except no new entries) and dump it.
If one prefers rsync and the DB isn't too big, a multi-phase process like the one suggested by @eva2000 might work, but it puts the web site out of order somewhat longer. Like: first transfer a lot of non-DB stuff to make room for the DB dump.
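The dump-to-stdout pipeline described here might look like the following, assuming MySQL/MariaDB; the database name and hostname are placeholders:

```shell
# Dump straight to the new server: no dump file is ever written on the
# source disk. --single-transaction gives a consistent InnoDB snapshot.
mysqldump --single-transaction mydb \
  | gzip -6 \
  | ssh root@new-server 'gzip -d | mysql mydb'
```

For MyISAM tables (which `--single-transaction` does not protect), putting the database in read-only mode first, as suggested above, is the safer route.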
Sounds like OP is using Windows and Plesk. Does rsync and ssh work in Windows? Genuine question as I haven't touched Windows in a decade or more.
Yeah, one method (once you have freed up some disk space first) would be to use mydumper for a parallel, multi-threaded database backup of individual tables to SQL files https://github.com/maxbube/mydumper, and then, instead of normal compression methods like gzip, you could use Facebook's zstd https://github.com/facebook/zstd with data dictionary training, which can improve zstd compression ratios by up to 3.5x. You end up with much smaller individual compressed table SQL files, which you can rsync across to another server.
Even without zstd data dictionary training, there are other compression tools with much better compression ratios, which result in smaller compressed file sizes. I benchmarked various compression tools including zstd vs gzip vs pigz vs brotli etc. at https://community.centminmod.com/threads/round-2-compression-comparison-benchmarks-zstd-vs-brotli-vs-pigz-vs-bzip2-vs-xz-etc.15126/. This all assumes you first free up some disk space by transferring the largest individual files via rsync and removing their local copies on the existing server. Example uncompressed-to-compressed ratios: gzip level 9 = 3.0767, lbzip2 level 9 = 3.8784, pxz level 6 = 4.2768, zstd level 19 = 3.9695, brotli level 6 = 3.6224.
Also, log files are good candidates for compression to free up disk space if you still need to keep them. If you do not need the log files, just delete them.
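For example, zstd can compress a log in place and remove the original in one step (the filename is illustrative):

```shell
# Compress the log at a high level and delete the original once the
# .zst file has been written successfully.
zstd -19 --rm /var/log/nginx/access.log

# Later, read it without decompressing to disk:
zstdcat /var/log/nginx/access.log.zst | tail
```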
scp -r -P 22 root@IP:/home/old/ /home/new
Quote: http://www.zxsdw.com/index.php/archives/1138/
Bad idea. For one, without the -C flag there is no (gzip) compression. More importantly, the -r flag follows symbolic links.
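A safer equivalent with the same paths as the scp example: rsync preserves symlinks as symlinks (rather than following them) and compresses in transit:

```shell
# -a preserves symlinks, permissions, and timestamps; -z compresses.
rsync -avz -e ssh root@IP:/home/old/ /home/new/
```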