Dedicated servers with high end GPUs for machine learning (Europe, Belgium or close location)
We have everything in Google Cloud at work, and in our region (Belgium) the highest-end GPUs are not available. Even if they were, they cost a shit load of money, so it wouldn't be cost-effective for us. With the GPUs that are available, I couldn't even get GPT-J to run because the video memory is not enough.

Can anyone recommend a European provider of dedicated servers with powerful GPUs?

Thanks in advance!

Comments

  • dataforest Member, Patron Provider

    We can offer several GPUs, depending on the budget. For one customer we deploy 2x AMD EPYC 9654 with 2x RTX A5500, 60 TB NVMe and 100 GBit/s.

  • @PHP_Friends said:
    We can offer several GPUs, depending on the budget. For one customer we deploy 2x AMD EPYC 9654 with 2x RTX A5500, 60 TB NVMe and 100 GBit/s.

    Need to read about that GPU to get an idea as I have only tried the ones available in GCP. What kind of pricing could you make for these?

  • dataforest Member, Patron Provider

    That depends on the exact hardware. Can you clarify your requirements so we can prepare a suitable offer for you?

  • @PHP_Friends said:
    We can offer several GPUs, depending on the budget. For one customer we deploy 2x AMD EPYC 9654 with 2x RTX A5500, 60 TB NVMe and 100 GBit/s.

    Just checked: these cards seem to have "only" 24 GB of VRAM. That's not enough for fine-tuning the GPT-J model with our data. Do you have anything with more memory? 40 GB should be fine.
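    A rough back-of-envelope estimate (a sketch only; assuming GPT-J at roughly 6B parameters, naive mixed-precision Adam, and ignoring activation memory) shows why 24 GB falls short for full fine-tuning even though fp16 inference fits comfortably:

```python
# Back-of-envelope VRAM estimate for GPT-J fine-tuning.
# Assumptions (mine, not from this thread): ~6B parameters,
# naive mixed-precision Adam, activation memory ignored.

PARAMS = 6_000_000_000  # GPT-J is roughly 6B parameters

def vram_gb(params: int, bytes_per_param: float) -> float:
    """GiB needed to hold one tensor with `params` elements."""
    return params * bytes_per_param / 1024**3

inference = vram_gb(PARAMS, 2)             # fp16 weights only
finetune = (vram_gb(PARAMS, 2)             # fp16 weights
            + vram_gb(PARAMS, 2)           # fp16 gradients
            + vram_gb(PARAMS, 8)           # fp32 Adam m and v moments
            + vram_gb(PARAMS, 4))          # fp32 master copy of weights

print(f"fp16 inference:             ~{inference:.0f} GiB")
print(f"full fine-tune, plain Adam: ~{finetune:.0f} GiB")
```

    So plain full fine-tuning would blow well past 40 GB too; in practice people bring it down with gradient checkpointing, 8-bit optimizers, or parameter-efficient methods like LoRA, which is presumably why 40 GB "should be fine".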

  • @PHP_Friends said:
    That depends on the exact hardware. Can you clarify your requirements so we can prepare a suitable offer for you?

    I've been testing with cloud instances with 16-32 cores, 128 GB of RAM and a few different GPUs. The bottleneck was always the GPU. I can't say exactly which CPUs GCP uses.

  • dataforest Member, Patron Provider

    @vitobotta said:

    @PHP_Friends said:
    We can offer several GPUs, depending on the budget. For one customer we deploy 2x AMD EPYC 9654 with 2x RTX A5500, 60 TB NVMe and 100 GBit/s.

    Just checked: these cards seem to have "only" 24 GB of VRAM. That's not enough for fine-tuning the GPT-J model with our data. Do you have anything with more memory? 40 GB should be fine.

    Maybe RTX A6000, let's talk via PM :)

  • @PHP_Friends said:

    @vitobotta said:

    @PHP_Friends said:
    We can offer several GPUs, depending on the budget. For one customer we deploy 2x AMD EPYC 9654 with 2x RTX A5500, 60 TB NVMe and 100 GBit/s.

    Just checked: these cards seem to have "only" 24 GB of VRAM. That's not enough for fine-tuning the GPT-J model with our data. Do you have anything with more memory? 40 GB should be fine.

    Maybe RTX A6000, let's talk via PM :)

    Sure

  • srch07 Member

    @lentro this is your cue for Tensordock :smile:

  • DataCrunch, their servers are hosted in Finland: https://datacrunch.io/

  • @shelfchair said:
    DataCrunch, their servers are hosted in Finland: https://datacrunch.io/

    This also looks interesting, thanks!

  • lentro Member, Host Rep

    @srch07 said:
    @lentro this is your cue for Tensordock :smile:

    Thanks for the mention!

    @vitobotta at TensorDock, we're creating a cloud computing marketplace where hosting providers offload excess stock to us, and then you get to access that at industry-leading prices.

    We have A6000s with 48 GB of VRAM from $0.47/hr and A100 80GBs from $1.27/hr.
    https://marketplace.tensordock.com/order_list

    (We'll be launching interruptible instances publicly very very soon, so if that's still too expensive, then you can save even more :) )

  • @lentro said:

    @srch07 said:
    @lentro this is your cue for Tensordock :smile:

    Thanks for the mention!

    @vitobotta at TensorDock, we're creating a cloud computing marketplace where hosting providers offload excess stock to us, and then you get to access that at industry-leading prices.

    We have A6000s with 48 GB of VRAM from $0.47/hr and A100 80GBs from $1.27/hr.
    https://marketplace.tensordock.com/order_list

    (We'll be launching interruptible instances publicly very very soon, so if that's still too expensive, then you can save even more :) )

    Thanks! For now I am collecting info and will discuss with the team hopefully this coming week as I am keen to start this project :)

  • I'll vouch for PHP-Friends; they're production-ready.
