Extract 8GB file on 6.6GB storage left

mehargagsmehargags Member
edited July 2016 in Help

Can anyone suggest a workaround? I have an 8GB tar backup to extract on a small Aruba 1GB VPS with 20GB of storage. With the OS and all packages installed (cleaned and cache purged), this is what I have left:

$ df -h
    Filesystem      Size  Used Avail Use% Mounted on
    /dev/dm-0        18G   10G  6.6G  61% /
    udev             10M     0   10M   0% /dev
    tmpfs           201M  4.6M  196M   3% /run
    tmpfs           501M     0  501M   0% /dev/shm
    tmpfs           5.0M     0  5.0M   0% /run/lock
    tmpfs           501M     0  501M   0% /sys/fs/cgroup
    /dev/sda1       461M   34M  405M   8% /boot

I don't need the tarball and can discard it after it is extracted. I can't expand my storage drive or mount any other... Is there a way around this?

Comments

  • FranciscoFrancisco Top Host, Host Rep, Veteran

    Try turning off the reserved space:

    tune2fs -m 0 /dev/dm-0

    It should probably get you a couple of GB back.
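
    To see what you would reclaim first, and to put the reserve back once the extraction is done, something along these lines should work (the 5% figure is the usual ext4 default, an assumption here):

    # show the current reserved block count
    tune2fs -l /dev/dm-0 | grep -i 'reserved block count'
    # drop the reserve for the extraction
    tune2fs -m 0 /dev/dm-0
    # restore the usual 5% default afterwards
    tune2fs -m 5 /dev/dm-0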

    Francisco

    Thanked by 1netomx
  • If it's compressible, compress it first to free up disk space, then untar it so that the decompressed data only streams through memory rather than landing on disk. A fast algorithm like gzip should do.
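
    A minimal sketch of that, assuming the file is named backup.tar and compresses well enough for the .gz copy to fit alongside it while gzip runs:

    # compress in place; gzip removes backup.tar once backup.tar.gz is written
    gzip backup.tar
    # stream-decompress straight into tar, so the full tar never touches disk again
    gzip -dc backup.tar.gz | tar xvf -
    rm backup.tar.gz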

    Thanked by 1mehargags
  • ALinuxNinjaALinuxNinja Member
    edited July 2016

    Have you tried streaming the tarball through SSH?

    Transfer the tarball to another server, create an "extracted" directory in the root folder of the target server, and run something like

    cat backup.tar.gz | ssh root@aruba-cloud-ip "cd extracted; tar zxvf -"
    

    Side note: I don't know what options you need to extract the tar with, so you should verify the flags before you begin extracting with my command.
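
    If you want to watch the transfer while it runs, a variant worth considering (assuming pv is installed on the source box):

    pv backup.tar.gz | ssh root@aruba-cloud-ip "cd extracted; tar zxvf -"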

    Thanked by 2Shot2 ehab
  • NixtrenNixtren Member
    edited July 2016

    Won't an extracted 8GB tarball be larger than 6.6GB?

    Thanked by 1Abdussamad
  • edited July 2016

    @Nixtren said:
    Won't an extracted 8GB tarball be larger than 6.6GB?

    The tarball is on the server already: 10GB used - 8GB tarball = 2GB used once the tarball is off the server.

    So about 16GB of the 18GB total free.

    Thanked by 1Nixtren
  • @Nixtren said:
    Won't an extracted 8GB tarball be larger than 6.6GB?

    Once the tarball is extracted he will delete it, freeing that space again. What he wants is a way to extract and discard the tarball in one pass, so there's enough room to hold the contents.

    Thanked by 1Nixtren
  • Put it on a different server, mount the file system remotely.
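    One way to do that, assuming sshfs is installed and the tarball has been moved to a hypothetical other-server first:

    mkdir -p /mnt/remote
    sshfs user@other-server:/backups /mnt/remote
    tar xvf /mnt/remote/backup.tar
    fusermount -u /mnt/remote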

  • pbgbenpbgben Member, Host Rep

    So, you want magic?

    Thanked by 2Abdussamad vedran
  • Shot2Shot2 Member

    @ALinuxNinja said:
    Have you tried streaming the tarball through SSH?

    Transfer the tarball to another server, create an "extracted" directory in the root folder of the target server, and run

    cat backup.tar.gz | ssh root@aruba-cloud-ip "cd extracted; tar zxvf -"
    

    This.

  • netomxnetomx Moderator, Veteran

    @pbgben said:
    So, you want magic?

    dislike, harming a little bunny :c

  • NomadNomad Member

    @netomx said:

    @pbgben said:
    So, you want magic?

    dislike, harming a little bunny :c

    Nobody seems to care about the box :(

    Thanked by 2Abdussamad webcraft
  • Tar on its own isn't compressing anything; the archive should be roughly the same size as the input, from what I understand anyway. Bundled with gzip or bzip2 you get compression of the tar output. So you end up decompressing the gzipped tarball and then restoring from the tar inside.
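
    For illustration (the /data path is hypothetical):

    # plain tar: the archive is roughly the sum of the input files, no compression
    tar cf backup.tar /data
    # tar piped through gzip: the same archive, compressed on the way out
    tar czf backup.tar.gz /data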

  • ehabehab Member

    @ALinuxNinja said:
    Have you tried streaming the tarball through SSH?

    Transfer the tarball to another server, create an "extracted" directory in the root folder of the target server, and run

    cat backup.tar.gz | ssh root@aruba-cloud-ip "cd extracted; tar zxvf -"
    

    ^this looks best.

  • You could stream it from http:// and untar it?

  • farsighterfarsighter Member
    edited July 2016

    Assuming you don't have another server (or an http:// accessible place to store 8GB) you can do something like this:

    Get yourself a free Pcloud account. It's a cloud storage service with a free plan, no file size limit and (importantly here) WebDAV support.

    1. Use a cURL command to upload your tarball to Pcloud over WebDAV
    2. Delete the tarball from the VPS
    3. Use cURL again to download the tarball from Pcloud over WebDAV - and simply pipe the output to the 'tar xvz' command (which will extract it), like this:

    curl RemoteTarball | tar xvz
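
    Spelled out a bit more (the endpoint and credentials are assumptions; pCloud's WebDAV endpoint is usually webdav.pcloud.com):

    # 1. upload over WebDAV
    curl -u you@example.com:password -T backup.tar.gz https://webdav.pcloud.com/backup.tar.gz
    # 2. free the local space
    rm backup.tar.gz
    # 3. stream it back down, straight into tar
    curl -u you@example.com:password https://webdav.pcloud.com/backup.tar.gz | tar xvz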

    You can see examples of how to use cURL with webdav among replies here:
    https://www.lowendtalk.com/discussion/67181/transip-stack-free-1tb-owncloud

    Check this for Pcloud WebDAV settings info:
    https://freevps.us/thread-17961.html

    It should work.

    Thanked by 1PandaRain
  • reddwarfreddwarf Member

    Just thinking outside the box here:
    wouldn't Midnight Commander be able to browse the .tar, from where you could move things out?
    Or does mc actually decompress the whole archive first?
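
    For reference, plain tar can list and pull out single entries without unpacking everything, which is roughly what mc's VFS does (the entry path is hypothetical):

    # list the contents without extracting
    tar tvf backup.tar
    # extract just one entry
    tar xvf backup.tar some/dir/file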

    Thanked by 1Falzo
  • FalzoFalzo Member
    edited July 2016

    @reddwarf said:
    wouldn't Midnight Commander be able to browse the .tar, from where you could move things out?

    I like this approach. At least as long as it isn't a compressed archive, mc should be able to browse it without using much space or memory. That said, I did run into problems while trying to move files out of a tar with mc some time ago. I haven't looked much further into it, but it may be worth a try to see whether that was only a one-off or an easy-to-fix issue...

    As an alternative: I'd guess this backup tar had to be lying around somewhere else before, so just use the method @ALinuxNinja suggested from there. Or instead just mount that place (or the Aruba box, vice versa) and untar from there...

  • @pbgben said:
    So, you want magic?

    cower

    Thanked by 1netomx
  • noamannoaman Member

    @mehargags said:
    Can anyone suggest a workaround? I have an 8GB tar backup to extract on a small Aruba 1GB VPS with 20GB of storage. With the OS and all packages installed (cleaned and cache purged), this is what I have left:

    $ df -h
        Filesystem      Size  Used Avail Use% Mounted on
        /dev/dm-0        18G   10G  6.6G  61% /
        udev             10M     0   10M   0% /dev
        tmpfs           201M  4.6M  196M   3% /run
        tmpfs           501M     0  501M   0% /dev/shm
        tmpfs           5.0M     0  5.0M   0% /run/lock
        tmpfs           501M     0  501M   0% /sys/fs/cgroup
        /dev/sda1       461M   34M  405M   8% /boot
    

    I don't need the tarball and can discard it after it is extracted. I can't expand my storage drive or mount any other... Is there a way around this?

    Do you have FUSE enabled?

    Just mount Google Drive, push the tar there, and then extract it locally.
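
    A rough sketch of that with rclone (the remote name "gdrive" and the paths are assumptions; other FUSE clients like google-drive-ocamlfuse work along the same lines):

    # assumes FUSE is available and an rclone remote named "gdrive" is already configured
    mkdir -p /mnt/gdrive
    rclone mount gdrive: /mnt/gdrive --daemon
    # push the tarball off the root filesystem, then extract from the mount
    mv backup.tar /mnt/gdrive/
    tar xvf /mnt/gdrive/backup.tar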

  • mehargagsmehargags Member
    edited July 2016

    Well, thanks for all your suggestions.
    I had thought of streaming it over SSH, but the source VPS crawls at network speed.

    The other FUSE suggestions are doable; however, the number of extra packages to be installed is a bit invasive for a 1GB VPS... I'd like to keep it lean and clean.

    In the end I extracted the tar on one of my Hetzner servers and then made 3 zip archives of it... then copied them over one by one, extracting each and deleting it right away.
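
    For anyone repeating that dance, roughly (the host name and directory names are hypothetical):

    # on the roomy Hetzner box: unpack, then repack into a few smaller zips
    mkdir /tmp/restore && tar xf backup.tar -C /tmp/restore
    cd /tmp/restore && zip -r part1.zip dir1    # repeat for part2, part3
    # on the small VPS: fetch, extract, delete, one piece at a time
    scp hetzner:/tmp/restore/part1.zip . && unzip part1.zip && rm part1.zip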

    Now I need a cloud sync solution for this 7-8GB file set, so that the primary server and this new one are always in sync, plus a third cloud copy is kept safe too. I've always craved a decent Linux client for Google Drive or other services. Which one do you prefer and suggest?

    RClone with Google Drive or Amazon Cloud Drive ?
    Yandex.Disk?

    Any others to consider? Is there a good Linux GUI client? (I run a basic LXDE shell.)
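
    On the sync question, a minimal rclone sketch (the remote name and paths are placeholders; a remote must be configured first with rclone config):

    # one-way sync of the local file set up to the cloud remote
    rclone sync /srv/fileset gdrive:backups/fileset
    # and on the second server, pull the same set back down
    rclone sync gdrive:backups/fileset /srv/fileset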

    Thanked by 1ehab