Can I create a virtual machine with >32GB RAM coming from the host's SSD?

chrisp Member

Hi LET,

I am running a computation program that needs a lot of RAM, and before looking into renting a cluster this would be my first idea for doing it locally. I know an SSD is slower than RAM, but I don't care about that. Please note that my computer runs Windows and I don't want to get into complicated virtualisation techniques.

Can you imagine how such a scenario could work?

Comments

  • Not in an "uncomplicated" way.

    I find it hard to believe that you have a "computations" program that requires more than 32GB of RAM yet isn't already capable of making efficient use of files...

  • forthcloud Member
    edited March 2014

    Typical 1600MHz DDR3 RAM sticks have about 15-20 GB/s of bandwidth. SSDs are nowhere near that; even in RAID 10 you'll get around 1 GB/s. SSDs are also ~100 times slower than RAM in terms of latency.

    Using SSDs as RAM will be significantly less efficient than using real RAM, and SSDs have a finite number of write cycles, after which they die. If you use them as RAM, they will die within a few months instead of years.
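
    The gap is easy to put in numbers. A quick sanity check in R, using the rough bandwidth figures quoted above (quoted figures, not benchmarks):

    ```r
    # Back-of-envelope: seconds for one full pass over 32 GB of data.
    # Bandwidth figures are the rough ones quoted above, not measurements.
    gb <- 32
    ram_bw <- 17   # GB/s, middle of the 15-20 GB/s quoted for DDR3-1600
    ssd_bw <- 1    # GB/s, the quoted RAID 10 SSD figure
    cat(sprintf("RAM: %.1f s/pass, SSD: %.1f s/pass\n",
                gb / ram_bw, gb / ssd_bw))
    ```

    And a statistical job rarely touches its data only once, so that ~17x per-pass gap multiplies.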

  • @sycotic: It's basically an R script that loads a lot of files and does statistical calculations on them, so there's no dedicated large-file software involved. Normally that runs fine, but the dataset I have now is really large (a chunked-read sketch follows below).

    @forthcloud: For this purpose the computation only has to run once. I know this isn't a good solution; it's just that I don't have the ability to transfer large files to a server quickly, so I'd prefer the hacky local way if possible.
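
    For reference, "making efficient use of files" would mean streaming the data in chunks instead of loading everything, roughly like this minimal base-R sketch (file and column names are made up, and it only helps when the statistic can be accumulated chunk by chunk):

    ```r
    # Stream a large CSV in fixed-size chunks so memory use stays bounded.
    # "big_input.csv" and the 'value' column are placeholder names.
    con <- file("big_input.csv", "r")
    cols <- strsplit(readLines(con, n = 1), ",")[[1]]  # consume the header row
    total <- 0; n <- 0
    repeat {
      chunk <- tryCatch(
        read.csv(con, nrows = 1e5, header = FALSE, col.names = cols),
        error = function(e) NULL)  # read.csv errors once the file is drained
      if (is.null(chunk)) break
      total <- total + sum(chunk$value)
      n <- n + nrow(chunk)
    }
    close(con)
    cat("mean of value:", total / n, "\n")
    ```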

  • @chrisp

    Is it possible for you to break the files down into parts and have them processed on multiple computers (i.e. servers/VPSes)?

  • Unfortunately not. All the lines are related and can't be split. A large temporary table is generated during the process, which is then further processed against another large file. In the worst case I would have to optimise the script, of course, but I thought emulating RAM in a VM wouldn't be that hard with tools like VMware and so on.

    I only have 8GB of RAM, but what happens beyond that point? The VM software lets me select more, so somehow it must work, I guess. Memory swapping can't really work with 4x the real memory, can it?
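
    For what it's worth, I can at least watch how big things get from inside R; object.size() and gc() are base functions:

    ```r
    # Rough footprint check from inside R (base functions only).
    x <- matrix(rnorm(1e6), ncol = 100)  # stand-in for one loaded table
    print(object.size(x), units = "MB")  # ~7.6 MB for a million doubles
    gc()                                 # current heap usage, see "(Mb)" columns
    ```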

  • And where does it swap to? The hard drive in the end?

  • @chrisp said:
    And where does it swap to? The hard drive in the end?

    Yes.

  • Oh but that doesn't work in reality :(

    Ideas?

  • How long does your script run? A c3.4xlarge EC2 instance with 30GB RAM costs just $1.20/hour.

  • @chrisp said:
    It is just that I don't have the ability to transfer large files to a server quickly, so I'd prefer the hacky local way if possible.

    @gsrdgrdghd

  • @gsrdgrdghd said:
    How long does your script run? A c3.4xlarge EC2 instance with 30GB RAM costs just $1.20/hour.

    This. If you have the capacity to do this, go for it. Amazon is a very popular platform for computations and researchers who don't need a dedicated cluster.

  • dnom Member

    Can't you just add swap from the SSD?
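
    On Windows the pagefile already is that swap, and R will use it if you raise R's own ceiling; memory.limit() is base R, Windows-only. A sketch, assuming 64-bit R:

    ```r
    # Windows-only, 64-bit R: raise R's memory cap above physical RAM.
    # Anything past the real 8 GB then spills to the Windows pagefile, so
    # enlarge the pagefile on the SSD first (System > Advanced system
    # settings > Performance > Virtual memory).
    memory.limit()              # current cap, in MB
    memory.limit(size = 49152)  # allow up to 48 GB of virtual memory
    ```

    Expect it to crawl once it actually starts paging, for the reasons @forthcloud gave above.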

  • Mun Member

    Please don't use your SSD for this; it is literally going to kill it.

    Buy a dedicated server for a month, or hourly from Incero, instead.

    http://incero.com/

    Mun

  • +1 for Amazon. I was also going to suggest considering a GPU, depending on the calculation.

  • @ricardo said:
    +1 for Amazon. I was also going to suggest considering a GPU, depending on the calculation.

    I doubt you can fit the data on a GPU, and the CPU-to-GPU transfer is slow.

  • If it's only the RAM you need, you can actually buy an L5520 server with tons of RAM off eBay (used ones go for about $300) and colocate it for $40, imho. Then you can use that RAM for your other projects.

  • @chrisp said:
    sycotic: It's basically an R script that loads a lot of files and does statistical calculations on them, so there's no dedicated large-file software involved. Normally that runs fine, but the dataset I have now is really large.

    You should probably rewrite your script to handle the data more efficiently. If your data set is truly that large, your computations are typically going to be bound by disk I/O anyway, so you should do your processing as the data comes off the disk.

    If you really don't want to do that, then a virtual instance on EC2 is probably your best bet. You can get spot instances at the RAM size you need for dirt cheap at the right time.
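
    If rewriting the script is too much work, a middle ground is a file-backed object: CRAN packages such as bigmemory (or ff) memory-map a matrix on disk so it looks like an ordinary R matrix while only the touched pages occupy RAM. A minimal sketch, with placeholder file names:

    ```r
    # install.packages("bigmemory")
    library(bigmemory)

    # One-time parse of the CSV into a binary backing file on the SSD;
    # afterwards X behaves like a matrix, paged in from disk on demand.
    X <- read.big.matrix("big_input.csv", header = TRUE, type = "double",
                         backingfile = "big_input.bin",
                         descriptorfile = "big_input.desc")
    dim(X)
    mean(X[, 1])  # column access streams data in from the backing file
    ```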
