
4 x 4090 GPUs in a 4U Rackmount (A100 alternative solution)

CloudNinjas Member, Patron Provider

Looking for a good GPU solution for AI? We have designed a new box with 4 x 4090 GPUs in a 4U rackmount chassis. It's a good alternative to the A100 at a much lower price point. Contact [email protected] if you're interested in a quote. Take care.

Comments

  • bdl Member
    edited October 2023

    Take care with wooden floors.

  • woteti Member

    Any chance we can ogle the build before talking price?

    Trying to understand the advantage of this build vs., say, separate Ryzen 7000 systems with a single 4090 each as an end user (not a hosting provider), since the GPUs can't share VRAM anyway (no NVLink, correct me if I'm wrong). What kind of CPU and disk backplane?

    I guess the Ryzen systems are not rack-mountable if you want to use a GPU, but with rental that would be the provider's problem.

    What kind of power should I request from my colo provider if I want to deploy your box? Any special temperature requirements?

    I've never managed the lifecycle of GPU systems before, mostly moved in and out of clouds. Trying to work out how much I could save with ownership / committed use (rough back-of-envelope sketch below). Feel free to change my mind :D
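    A rough sketch of the math; every figure below (hardware price, colo fee, cloud rate, utilisation) is a placeholder assumption, not an actual quote from CloudNinjas or any cloud:

    ```python
    # Rough break-even sketch for owning vs. renting GPU capacity.
    # Every number below is a placeholder assumption -- plug in real quotes.
    hardware_cost = 12000.0   # assumed one-time price of the 4x4090 box (USD)
    colo_monthly  = 350.0     # assumed colo fee incl. power for ~2.5 kW (USD/month)
    cloud_hourly  = 2.80      # assumed on-demand rate for a comparable 4-GPU instance (USD/hour)
    utilisation   = 0.60      # fraction of the month the GPUs are actually busy

    hours_per_month = 730
    cloud_monthly = cloud_hourly * hours_per_month * utilisation
    owned_monthly = colo_monthly  # ignoring depreciation spread for simplicity

    monthly_saving = cloud_monthly - owned_monthly
    breakeven_months = hardware_cost / monthly_saving if monthly_saving > 0 else float("inf")

    print(f"Cloud: ${cloud_monthly:,.0f}/mo, owned: ${owned_monthly:,.0f}/mo")
    print(f"Break-even after ~{breakeven_months:.1f} months")
    ```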

  • @woteti said:
    Trying to understand the advantage of this build vs., say, separate Ryzen 7000 systems with a single 4090 each as an end user (not a hosting provider), since the GPUs can't share VRAM anyway (no NVLink, correct me if I'm wrong). What kind of CPU and disk backplane?

    NVLink just gives you a faster interconnect; it doesn't pool the VRAM so that you could run models that don't fit on a single GPU without any parallelization strategy. You'd still need to use something like DeepSpeed or model sharding (quick sketch below). https://huggingface.co/docs/transformers/perf_train_gpu_many

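    For the inference case, a minimal sketch of one way to split a single model across the four 4090s with Hugging Face Transformers + Accelerate (`device_map="auto"`); the checkpoint name is just a placeholder, and this is inference-side layer sharding rather than DeepSpeed training:

    ```python
    # Minimal sketch: shard one model's layers across all visible GPUs,
    # since with or without NVLink the limit is still 24 GB per card.
    # Assumes torch, transformers and accelerate are installed;
    # the checkpoint name is a placeholder, not a recommendation.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_name = "meta-llama/Llama-2-13b-hf"  # placeholder checkpoint

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(
        model_name,
        device_map="auto",          # accelerate spreads layers over the 4x 4090s
        torch_dtype=torch.float16,  # halve memory per layer vs fp32
    )

    inputs = tokenizer("Hello from a 4x4090 box", return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
    ```

    For training across the cards you'd reach for DeepSpeed ZeRO or FSDP instead, per the Hugging Face doc linked above.
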
  • yusra Member

    @CloudNinjas said: box with 4 x 4090 GPUs

    Appreciate the offer and effort, but maybe more suitable for HET (HighEndTalk) ;)

  • @yusra said:
    Appreciate the offer and effort, but maybe more suitable for HET (HighEndTalk) ;)

    For GPU systems, 4x4090 is sorta low-end hardware :D. HET would be 8xA100 or 8xH100.
