Looking for the best budget-friendly server for running Llama2-70b

We are planning to do some development work on Llama2-70b, which requires GPUs. For 70B models, the recommendation we have is "GPU [2xlarge] - 2x Nvidia A100" with bitsandbytes quantization enabled, or "GPU [4xlarge] - 4x Nvidia A100".
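
For context, here is a minimal sketch of how we plan to load the model, using Hugging Face transformers with 4-bit bitsandbytes quantization. The model ID and generation settings are illustrative only; the point is just how much VRAM the quantized setup needs.

```python
# Minimal sketch: loading Llama-2-70b with 4-bit bitsandbytes quantization.
# Assumes transformers, accelerate and bitsandbytes are installed and that we
# have been granted access to the meta-llama/Llama-2-70b-hf weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "meta-llama/Llama-2-70b-hf"  # placeholder; gated repo, access required

# 4-bit NF4 quantization shrinks the ~140 GB fp16 footprint to roughly 35-40 GB,
# which is why 2x A100 is suggested for the quantized model (4x A100 without it).
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",  # shard layers across the available GPUs automatically
)

prompt = "Explain the trade-offs of 4-bit quantized inference."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```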

Can someone here suggest some providers, and the best plan with each, for running this?
We are currently in the development stage. Once we achieve our goal, we will be looking for a production instance as well.
