Cheapest enterprise / high core servers? - Page 2

Comments

  • ts001 Member

    @Alienighted said:
    This all depends on the provider quality you need;
    you can get the same specs paying 1/3 of this.

    Where would I find that please?

  • randvegeta Member, Host Rep

    @ts001 said:

    @randvegeta said:

    ts001 said: Unless anyone knows of anything better than this, this is probably what I'll be going with.

    What are your other requirements? Like disk and RAM? If you have minimum disk/ram requirements, I can do better in Lithuania.

    We're in the USA, so I'm not sure this would work.

    Does it matter where the server is for a computational workload?

  • @ts001 said:

    @Alienighted said:
    This all depends on the provider quality you need;
    you can get the same specs paying 1/3 of this.

    Where would I find that please?

    I sent you a PM.

  • willie Member

    ts001 said: We're using vanilla Python at the moment and I'm not sure what it would require to put our calculations on the GPU as well, other than OpenCL. We have a mathematical algorithm that pounds the crap out of whatever thread it's running on and we have an instance of that algorithm for each concurrent user. Thus 150 concurrent users = 150 threads = 75 cores.

    What is the application, if you don't mind my asking? The usual DL libs like PyTorch and TensorFlow have GPU bindings. There are also Intel's optimizations for numpy/scipy:

    https://software.intel.com/en-us/blogs/python-optimized

    If you're running a separate algorithm instance for each user it sounds like you can do fine with small servers, or maybe hourly VM instances from providers with lots of gear.

    If you're really just running pure Python code, then look into at least using Cython or NumPy, or spinning off the heavy numerics to an external program.
    Python is great, but it's more for gluing components together than running heavy computation directly.

    Thanked by: ts001
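A toy comparison of the idea above (illustrative only; assumes NumPy is installed, and the function names are made up): the same per-row computation written as a pure-Python loop and as a vectorized NumPy expression.

```python
import numpy as np

def sq_distances_loop(rows, ref):
    # pure Python: the interpreter executes every subtract/multiply/add
    return [sum((x - y) ** 2 for x, y in zip(row, ref)) for row in rows]

def sq_distances_numpy(rows, ref):
    # same math pushed down into NumPy's compiled loops
    return ((rows - ref) ** 2).sum(axis=1)

rows = np.arange(12, dtype=float).reshape(4, 3)
ref = np.array([1.0, 2.0, 3.0])
# both give identical results; the vectorized form is typically
# orders of magnitude faster on large arrays
assert np.allclose(sq_distances_loop(rows, ref), sq_distances_numpy(rows, ref))
```

The per-user algorithm in this thread would keep the same structure; only the inner numeric loops move from Python into NumPy (or Cython, or an external program).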
  • ts001 Member

    @willie said:

    ts001 said: We're using vanilla Python at the moment and I'm not sure what it would require to put our calculations on the GPU as well, other than OpenCL. We have a mathematical algorithm that pounds the crap out of whatever thread it's running on and we have an instance of that algorithm for each concurrent user. Thus 150 concurrent users = 150 threads = 75 cores.

    What is the application, if you don't mind my asking? The usual DL libs like PyTorch and TensorFlow have GPU bindings. There are also Intel's optimizations for numpy/scipy:

    https://software.intel.com/en-us/blogs/python-optimized

    If you're running a separate algorithm instance for each user it sounds like you can do fine with small servers, or maybe hourly VM instances from providers with lots of gear.

    If you're really just running pure Python code, then look into at least using Cython or NumPy, or spinning off the heavy numerics to an external program.
    Python is great, but it's more for gluing components together than running heavy computation directly.

    We're doing facial recognition stuff. This looks very interesting; going to look into it. We are indeed using NumPy, and also dlib.

  • ts001 Member

    @randvegeta said:

    @ts001 said:

    @randvegeta said:

    ts001 said: Unless anyone knows of anything better than this, this is probably what I'll be going with.

    What are your other requirements? Like disk and RAM? If you have minimum disk/ram requirements, I can do better in Lithuania.

    We're in the USA, so I'm not sure this would work.

    Does it matter where the server is for a computational workload?

    The same server is also being used for a web API and static assets that need to be served to clients in the USA (for simplicity's sake), so I would assume it does matter.

  • willie Member

    ts001 said:
    We're doing facial recognition stuff.

    You should probably use PyTorch, which has good CUDA bindings.

    Thanked by: ts001
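As a rough sketch of what "good CUDA bindings" buys you (assumes PyTorch is installed; the tensor shapes and the idea of one embedding per user are made up for illustration), the same code runs on CPU or GPU by changing only the device string:

```python
import torch  # assumes PyTorch is installed

# fall back to CPU when no CUDA device is present
device = "cuda" if torch.cuda.is_available() else "cpu"

# e.g. one 128-dim face embedding per concurrent user (shapes illustrative)
embeddings = torch.randn(150, 128, device=device)
reference = torch.randn(1, 128, device=device)

# squared distances computed on whichever device the tensors live on
dists = ((embeddings - reference) ** 2).sum(dim=1)
assert dists.shape == (150,)
```

No OpenCL kernels to write: the numeric work is dispatched to CUDA automatically when the tensors live on a GPU.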
  • DeftNerd Member

    @ts001 said:
    We're not using a library that can utilize GPU cores unfortunately :(. I wish though, because it would solve a lot of problems for us. We're using vanilla Python at the moment and I'm not sure what it would require to put our calculations on the GPU as well, other than OpenCL. We have a mathematical algorithm that pounds the crap out of whatever thread it's running on and we have an instance of that algorithm for each concurrent user. Thus 150 concurrent users = 150 threads = 75 cores.

    That is too bad. The drop in crypto prices recently has created a surplus of GPUs on the market (especially used ones). I just bought a lot of used GeForce 1080 Ti video cards for under $350 per card and used them to build a half-rack cluster of data-processing power.

    Anyway, if you want to build the ultimate system, buy an HP c3000 Blade Chassis and 8 HP ProLiant BL680c G7 blade servers with quad CPUs. Each blade will have a total of 32 cores, for a total chassis core count of 256 cores (512 threads with hyperthreading) at over 2 GHz per core. Total cost would probably be around $5k or $6k, depending on how much memory you toss into each blade.

    If you want a more traditional setup and don't need that many cores, try this 4U Dell PowerEdge R910. It has 4 8-core processors for a total of 32 cores (64 threads with hyperthreading) for about $600. You'll just need to toss in a hard drive.

    Thanked by: ts001
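The sizing arithmetic in this thread is simple enough to sanity-check (a throwaway sketch; it assumes one worker thread per user and two hardware threads per physical core with hyperthreading, as the posts do):

```python
import math

def cores_needed(concurrent_users, threads_per_core=2):
    # one thread per user; hyperthreading exposes 2 threads per physical core
    return math.ceil(concurrent_users / threads_per_core)

def chassis_threads(blades, cores_per_blade, threads_per_core=2):
    # total hardware threads across a fully populated blade chassis
    return blades * cores_per_blade * threads_per_core

assert cores_needed(150) == 75        # matches "150 threads = 75 cores"
assert chassis_threads(8, 32) == 512  # 8 blades x 32 cores, with HT
```

Note the caveat baked into the assumption: a compute-bound thread does not necessarily share a core gracefully, so the 2-threads-per-core figure is an upper bound, not a guarantee.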
  • willie Member

    DeftNerd said: I just bought a lot of used GeForce 1080 Ti video cards for under $350 per card

    WHAT? Where? I don't see them anywhere near that cheap around here, despite the 1180 supposedly being on its way. If they've really gotten that affordable, maybe I should buy one instead of using Paperspace.

  • DeftNerd Member

    @willie said:

    DeftNerd said: I just bought a lot of used GeForce 1080 Ti video cards for under $350 per card

    WHAT? Where? I don't see them anywhere near that cheap around here, despite the 1180 supposedly being on its way. If they've really gotten that affordable, maybe I should buy one instead of using Paperspace.

    I'm in a few cryptocurrency mining Slack and Discord rooms, and every now and then a miner gets out of the business.

  • willie Member

    DeftNerd said:

    I'm in a few cryptocurrency mining Slack and Discord rooms, and every now and then a miner gets out of the business.

    Well, OK, but that seems way below market price for those cards. They're $650+ on Craigslist around here; haven't checked eBay. Flipping them could be more profitable than mining with them.
