Extract 8GB file on 6.6GB storage left
Suggest a workaround... I have an 8GB tar backup to be extracted on a small Aruba 1GB VPS with 20GB storage. With all packages and the OS installed (cleaned and cache purged), this is what I have left:
$ df -h
Filesystem      Size  Used Avail Use% Mounted on
/dev/dm-0        18G   10G  6.6G  61% /
udev             10M     0   10M   0% /dev
tmpfs           201M  4.6M  196M   3% /run
tmpfs           501M     0  501M   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
tmpfs           501M     0  501M   0% /sys/fs/cgroup
/dev/sda1       461M   34M  405M   8% /boot
I don't need the tarball; I can discard it after it is extracted. I can't expand my storage drive or mount any other... Is there a way around this?
Comments
Try turning off the reserved space.
It should get you a couple of GB, probably.
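For an ext filesystem that's just tune2fs; a rough sketch, assuming the root filesystem on /dev/dm-0 is ext3/ext4 (check with tune2fs -l first):

# show the current reserved block count (ext2/3/4 only)
tune2fs -l /dev/dm-0 | grep -i 'reserved block count'
# drop the root reserve from the default 5% to 0%
tune2fs -m 0 /dev/dm-0
# put it back once the extraction is done
tune2fs -m 5 /dev/dm-0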
Francisco
If it's compressible, compress it first to free up disk space, then untar it so that the decompressed data only streams through memory. A fast algorithm like gzip should do.
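A sketch of that idea, assuming an uncompressed backup.tar; note that gzip only removes the original after the compressed copy is written, so the compressed file still has to fit in the 6.6GB that's free:

gzip backup.tar        # writes backup.tar.gz, then removes backup.tar
tar xzf backup.tar.gz  # decompression streams through memory; only the extracted files hit disk
rm backup.tar.gz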
Have you tried streaming the tarball through SSH?
Transfer the tarball to another server, create an "extracted" directory in the root folder of the target server, and run something like
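Something along these lines (hostname and paths here are just placeholders):

mkdir /extracted
ssh user@other-server "cat /path/to/backup.tar" | tar xvf - -C /extracted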
Side note: I don't know what options you need to extract the tar with, so you should probably verify the flags before you start extracting with my command.
Won't an extracted 8GB tarball be larger than 6.6GB?
The tarball is on the server already; 10GB used - 8GB = 2GB used once the tarball is off the server.
So about 16GB free.
Once the tarball extracts, he will delete it, thus providing enough space. He wants to simultaneously extract and delete the tarball, so he'll have enough room to hold the contents.
Put it on a different server, mount the file system remotely.
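With sshfs, for example, that could look roughly like this (assuming sshfs/FUSE is available on the Aruba box; names and paths are placeholders):

mkdir /mnt/remote
sshfs user@other-server:/backups /mnt/remote
tar xvf /mnt/remote/backup.tar -C /   # extract straight onto the local disk
fusermount -u /mnt/remote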
So, you want magic?
This.
dislike, harming a little bunny :c
Nobody seems to care about the box
Tar on its own isn't compressing anything, so the archive should be roughly the same size as its contents, from what I understand anyway. Bundled with gzip or bzip2 you get compression of the tarball output. So you end up decompressing the gzipped tarball and then restoring from the tarball.
^this looks best.
You could stream it from http:// and untar it?
Assuming you don't have another server (or an http:// accessible place to store 8GB) you can do something like this:
Get yourself a free Pcloud account. It's a cloud storage service with a free plan, no file size limit and (importantly here) WebDAV support.
curl RemoteTarball | tar xvz
You can see examples of how to use cURL with webdav among replies here:
https://www.lowendtalk.com/discussion/67181/transip-stack-free-1tb-owncloud
Check this for Pcloud WebDAV settings info:
https://freevps.us/thread-17961.html
It should work.
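For pCloud specifically that ends up being something like this (the WebDAV endpoint and credentials are placeholders here, check the links above; add a z to the tar flags if the archive is gzipped):

curl -s -u you@example.com:yourpassword "https://webdav.pcloud.com/backup.tar" | tar xvf -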
Just thinking outside the box here:
wouldn't Midnight Commander be able to browse the .tar, from where you could move things out?
Or does mc actually decompress the whole archive first?
I like this approach; at least as long as it isn't a compressed archive, mc should be able to browse it without tying up much space or memory. That said, I did run into problems trying to move files out of a tar with mc some time ago. I haven't looked much further into it, but it may be worth a try to see whether that was just a one-off or an easy-to-fix issue...
As an alternative, I'd guess this backup tar had to be lying around somewhere else beforehand, so just use the method @ALinuxNinja suggested from there.
Or instead just mount that place (or the Aruba box, vice versa) and untar from there...
http://superuser.com/questions/660788/unarchive-file-while-reducing-size-of-archive
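If I read that thread right, the trick is GNU tar's --delete, extracting and then dropping one member at a time so the archive shrinks on disk as you go; a rough sketch, for an uncompressed archive only, and slow because the archive gets rewritten after every member:

tar tf backup.tar | while read -r member; do
    tar xf backup.tar "$member"           # extract one entry
    tar --delete -f backup.tar "$member"  # then remove it, shrinking the archive
done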
cower
Do you have FUSE enabled?
Just mount Google Drive, push the tar there, and then extract it locally.
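With rclone that's a couple of commands; a sketch, assuming a remote called gdrive has already been set up with rclone config:

mkdir -p /mnt/gdrive
rclone mount gdrive: /mnt/gdrive --daemon
mv backup.tar /mnt/gdrive/            # push the tarball up to Drive, freeing local space
tar xvf /mnt/gdrive/backup.tar -C /   # stream it back down and extract locally
fusermount -u /mnt/gdrive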
Well, thanks for all your suggestions.
I had thought of streaming it over SSH, but the source VPS is a crawler in terms of network speed.
The other FUSE suggestions are doable, but the number of extra packages to be installed is a bit invasive for a 1GB VPS... I'd like to keep it mean and clean.
I extracted the tar on one of my Hetzner servers and then made 3 zip archives of it... then copied them over one by one, extracting them and deleting them right away.
Now I need a cloud sync solution for this 7-8 GB file set... so that the primary server and this new one are always in sync, plus a third cloud copy for safety. I've always craved a decent Linux client for Google Drive or other services. Which one do you prefer and suggest?
RClone with Google Drive or Amazon Cloud Drive ?
Yandex.Disk?
Any others to consider? Is there a good Linux GUI client? (I run a basic LXDE shell.)
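If I end up going the rclone route, I suppose the sync itself would just be something like this (assuming a gdrive remote configured with rclone config; the paths are placeholders):

rclone sync /srv/data gdrive:backup/data
# plus a cron entry to keep it regular, e.g.
# 0 * * * *  rclone sync /srv/data gdrive:backup/data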
This: https://www.lowendtalk.com/discussion/comment/1771259/#Comment_1771259 is 100% tested and working. You don't need a remote server.
http://pastebin.com/jwfSVzbV