
Object storage without ridiculous bandwidth costs?

OVH, Online.net, and Hetzner offer free unlimited bandwidth on dedicated servers, yet HTTP-based object storage from the same providers costs €0.02/GB for outgoing bandwidth. For my project I would need to send out 20-50 TB/month of small files, depending on how often clients update.

Of course I could do this with a simple dedicated server and an nginx web server over HTTP, but why are there no ready-made solutions with unlimited bandwidth?

For example, something like this: https://www.scaleway.com/en/object-storage/
Why can't I pay, say, €30/month for unlimited bandwidth, or something close to 30 TB?

What makes this more expensive than an unlimited-bandwidth dedicated server? It's essentially the same bandwidth from the same provider. In fact, it should be cheaper than a full dedi with an unmetered port, since you only get a UI/SFTP to upload files, not the whole server.

What am I missing? Why is object storage so expensive when the same provider offers unmetered dedicated servers?
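To put the OP's numbers in perspective, a quick back-of-the-envelope using the rates quoted in the post (a sketch; assumes decimal billing, 1 TB = 1000 GB):

```python
# Egress cost at the €0.02/GB object-storage rate quoted above,
# for the 20-50 TB/month the OP expects to push.
RATE_PER_GB_EUR = 0.02

def monthly_egress_cost(tb_per_month: float) -> float:
    """Cost in EUR for a month of outgoing traffic at the quoted rate."""
    return tb_per_month * 1000 * RATE_PER_GB_EUR  # 1 TB billed as 1000 GB

print(monthly_egress_cost(20))  # 400.0  -> €400/month at the low end
print(monthly_egress_cost(50))  # 1000.0 -> €1000/month at the high end
```

So the metered rate works out to roughly 10-30x the flat ~€30/month the OP is hoping for.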


Comments

  • What am I missing?

    Minio.

  • FAT32 Administrator, Deal Compiler Extraordinaire

    I believe it is mainly the infrastructure cost. For example, their code is responsible for distributing data among different physical servers, caching frequently used objects on SSD instead of spinning disks, maintaining object integrity by keeping multiple copies, providing high availability via failover, etc.

  • And, I'm guessing they might like to make some money by segmenting out a more "bespoke service" tier and charging whatever that market might bear. They hear you say "too rich for my blood" and reply: "no problem sir, of course you also have the very affordable option to rent this fine server and roll your own instead." :smiley:

  • Wasabi has unlimited bandwidth with a fair-use policy (you can use egress up to the amount of storage you use).

    That could work, but there are also request fees.

    IMO the best option is an unlimited BW VPS + Minio
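The fair-use rule described above can be sketched as a one-line check (a simplified model for illustration, not Wasabi's actual billing logic):

```python
# Simplified model of the fair-use egress policy described above:
# monthly egress is within policy as long as it stays at or below
# the amount of data stored. Not Wasabi's actual billing code.
def egress_within_fup(stored_gb: float, egress_gb: float) -> bool:
    """True if monthly egress does not exceed the amount stored."""
    return egress_gb <= stored_gb

print(egress_within_fup(stored_gb=1000, egress_gb=800))    # True
print(egress_within_fup(stored_gb=1000, egress_gb=30000))  # False: 30 TB out, 1 TB stored
```

For the OP's 20-50 TB/month of egress on a small working set, that rule is the catch: they would have to store roughly as much as they send each month.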

  • mrclown Member
    edited October 2019

    Minio works well for simple cases and is potentially the cheapest option for unusual request/traffic patterns.

    You can also try S3 (choose the proper storage tier) + CloudFront, or B2 + Cloudflare.

  • raindog308 Administrator, Veteran

    I think the answer to your question is SLAs. Going from “it works 99% of the time” to “it works 99.999%” of the time is a massive jump in required infrastructure and cost.

    But I’m just guessing.

  • Francisco Top Host, Host Rep, Veteran

    @vimalware said:

    What am I missing?

    Minio.

    And a slab. And a slice.

    Bam!

    Francisco

  • @Francisco said:

    @vimalware said:

    What am I missing?

    Minio.

    And a slab. And a slice.

    Bam!

    Francisco

    And slabs. And slices.

    Bum!

    Distributed MinIO cluster.

  • Clouvider Member, Patron Provider

    Ridiculous bandwidth costs of single-figure pence per GB, oh my. The cost is definitely not ridiculous.

  • cirrus_cloud Member
    edited October 2019

    How much storage do you need?

    1 TB of bandwidth from Hetzner for €1 is as cheap as it gets. Amazon or Azure charge around $80 per TB.

  • hzr Member

    I am interested too. I have a use case for 100+ TB (always growing, slowly, month over month) of data to be stored as cheaply as possible, downloaded periodically, and separated into cold/hot tiers (hot means immediate HTTPS download; cold can take several hours).

  • jsg Member, Resident Benchmarker

    @hzr said:
    I am interested too. I have a use case for 100+ TB (always growing, slowly, month over month) of data to be stored as cheaply as possible, downloaded periodically, and separated into cold/hot tiers (hot means immediate HTTPS download; cold can take several hours).

    C14 might be a solution.

  • Hetzner doesn't have S3 but it has Nextcloud at €40 for 10TB. Maybe you can use that for this purpose. 10TB is the max size plan but I guess you can get multiple ones. Each one comes with 20TB of bandwidth. @Hetzner_OL is it ok to use Hetzner's Nextcloud that way?

  • jar Patron Provider, Top Host, Veteran

    Wasabi is great. Object storage the way it should be. They even have a desktop client, if you feel like using it as a Dropbox alternative.

    https://wasabi.com

  • @jar said:
    Wasabi is great. Object storage the way it should be. They even have a desktop client, if you feel like using it as a Dropbox alternative.

    https://wasabi.com

    Free egress up to the size of your storage is a pretty neat pricing model.

  • I went through a lot of the object stores lately with similar requirements, and Wasabi is the cheapest if you're not going the custom route. You do need to be familiar with S3 config syntax to set up more than basic permission structures.

    Another option was BunnyCDN. They'll sell you storage and distribute it on their CDN, which might be something to think about.

    Backblaze B2 is a good option for applications to store data. They have some operation limits which might be a problem, and their small-file transfers aren't the speediest.

  • willie Member
    edited October 2019

    At these 30-100 TB levels, if you want non-owned equipment at all, maybe you should get Hetzner SX servers and deal with operating them yourself. Those are cheaper than any non-promotional storage plans I know of that don't have a bunch of extra charges for bandwidth, etc.

    Are there many of you who might want a big managed dedi or cluster running an object store? That might be an opportunity here.

  • Buy B2 storage.
    B2 to Cloudflare is free with unlimited bandwidth (thanks to direct peering).
    On Cloudflare, create a page rule to cache images/everything for 1 year on their edge.
    Alternatively, you can use a Worker script to map B2 storage to your domain and then cache it.

    Hurray! You don't pay anything for bandwidth from now on!
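The Worker approach above boils down to a URL rewrite plus long-lived cache headers. Here is a Python sketch of that mapping (a real Worker would be JavaScript running on Cloudflare's edge; the bucket name and B2 download host below are hypothetical examples):

```python
# Sketch of what the Worker/page-rule setup above does: rewrite a request
# on your own domain to the B2 download URL, and attach a Cache-Control
# header so Cloudflare's edge keeps the object cached for ~1 year.
# The bucket name and B2 download host are hypothetical.
B2_BASE = "https://f000.backblazeb2.com/file/my-bucket"

def to_b2_url(path: str) -> str:
    """Map a path on your domain (e.g. /images/cat.jpg) to its B2 URL."""
    return f"{B2_BASE}/{path.lstrip('/')}"

def cache_headers(max_age_seconds: int = 31536000) -> dict:
    """Response headers asking the edge to cache for max_age_seconds (~1 year)."""
    return {"Cache-Control": f"public, max-age={max_age_seconds}"}

print(to_b2_url("/images/cat.jpg"))
print(cache_headers())
```

With long-lived objects cached at the edge, most requests never reach B2 at all, which is what makes the "free egress" part work in practice.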

  • Outbound unlimited Cloudflare bandwidth is free? I doubt that ;).

  • jsg said: C14 might be a solution.

    C14 is archival, and the cheapest version is best if you want to retain data forever but expect never to access it, i.e. you keep it only for just-in-case reasons. They even charge you to delete it, so if you will eventually expire data out, you are better off with OVH Cloud Archive, where deletion is free.

  • Jord Moderator, Host Rep

    We use minio on a couple of storage dedis at Hetzner. Any files that don't need to be cached or quickly available worldwide are just downloaded from Hetzner. Anything we need to access quickly worldwide goes through BunnyCDN on top of minio.

    It works great for us, the best of both worlds. We mostly use it for digital downloads for our clients. If they just want to upload 100 MB, for example, and don't want to pay a fee, it gets downloaded straight from Hetzner. If they pay for larger files, it's served via BunnyCDN.

    We were using OVH, but we wanted better control over our content. So we found minio and haven't looked back.

  • For object storage, providers have to provision a minimum of 3 TB for every TB they can sell to customers, because of redundancy; it can go even higher than that. They have to set up multiple servers and then the software, and of course pay someone to take care of the infrastructure. Purchasing HDDs or SSDs is a one-time cost, so they can recover the storage cost over the course of years, but bandwidth is a recurring cost: they have to pay their ISPs for upstream bandwidth. The margins are thus razor-thin, so the billing is structured in a way that some profit can be made.
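The replication overhead described above can be made concrete with a rough worked example (the €/TB disk price is a hypothetical round number for illustration, not a real quote):

```python
# With 3 copies of every object, each sellable TB needs 3 TB of raw disk.
# The disk price below is a hypothetical round number for illustration.
REPLICAS = 3
RAW_DISK_EUR_PER_TB = 25  # hypothetical one-time HDD cost per raw TB

def raw_tb_needed(sellable_tb: float) -> float:
    """Raw disk required to offer sellable_tb with REPLICAS copies."""
    return sellable_tb * REPLICAS

def one_time_disk_cost(sellable_tb: float) -> float:
    """Up-front disk spend to back sellable_tb of customer storage."""
    return raw_tb_needed(sellable_tb) * RAW_DISK_EUR_PER_TB

print(raw_tb_needed(100))       # 300.0 raw TB to sell 100 TB
print(one_time_disk_cost(100))  # 7500.0 one-time, vs. recurring bandwidth bills
```

The disk outlay is amortized over years, while every GB of egress is billed to the provider again each month, which is why the two are priced so differently.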

  • That seems the same as regular hosting, except with 3x redundancy instead of 2x.

  • Except in regular unmanaged hosting, the provider is not responsible for what you do with the server. Whether you run RAID 1 or RAID 0 is up to you. They are not responsible for data loss if you don't maintain backups, and they are not responsible if you suddenly get a traffic spike your server cannot handle and it goes down.

    But if they're offering you object storage and tens of thousands of users access it at the same time, they will scale their infrastructure and serve that content.

  • Hetzner_OL Member, Top Host

    willie said: Hetzner doesn't have S3 but it has Nextcloud at €40 for 10TB. Maybe you can use that for this purpose. 10TB is the max size plan but I guess you can get multiple ones. Each one comes with 20TB of bandwidth. @Hetzner_OL is it ok to use Hetzner's Nextcloud that way?

    I asked one of my coworkers about this, and he said that neither the Nextcloud (what we call "Storage Shares") nor the Storage Boxes are optimized for this kind of use. Instead, he recommended that @stefeman get a dedicated server with NVMes and run the object storage on the servers himself. --Katie

  • MPG Member

    @Hetzner_OL said:

    willie said: Hetzner doesn't have S3 but it has Nextcloud at €40 for 10TB. Maybe you can use that for this purpose. 10TB is the max size plan but I guess you can get multiple ones. Each one comes with 20TB of bandwidth. @Hetzner_OL is it ok to use Hetzner's Nextcloud that way?

    I asked one of my coworkers about this, and he said that neither the Nextcloud (what we call "Storage Shares") nor the Storage Boxes are optimized for this kind of use. Instead, he recommended that @stefeman get a dedicated server with NVMes and run the object storage on the servers himself. --Katie

    I agree, and I like their honesty. I use Hetzner NC; it's slow. If you are pushing volume, you will cry. But for personal use, dollar per gig, it's an awesome deal, like their hardware configs. :-)

  • hzr Member

    Is it bad that I find the mental image hilarious of a bunch of very serious Germans at a conference room table discussing a LET forum thread?

  • raindog308 Administrator, Veteran

    hzr said: Is it bad that I find the mental image hilarious of a bunch of very serious Germans at a conference room table discussing a LET forum thread?

    There's a Hitler Downfall meme in that somewhere.

  • @MPG said:

    @Hetzner_OL said:

    willie said: Hetzner doesn't have S3 but it has Nextcloud at €40 for 10TB. Maybe you can use that for this purpose. 10TB is the max size plan but I guess you can get multiple ones. Each one comes with 20TB of bandwidth. @Hetzner_OL is it ok to use Hetzner's Nextcloud that way?

    I asked one of my coworkers about this, and he said that neither the Nextcloud (what we call "Storage Shares") nor the Storage Boxes are optimized for this kind of use. Instead, he recommended that @stefeman get a dedicated server with NVMes and run the object storage on the servers himself. --Katie

    I agree, and I like their honesty. I use Hetzner NC; it's slow. If you are pushing volume, you will cry. But for personal use, dollar per gig, it's an awesome deal, like their hardware configs. :-)

    I generally find Nextcloud slow; maybe I'm just configuring it wrong. It's fine with a handful of files, but throw a folder with a few thousand files at it and it falls to pieces.
