Dynamic Storage?


Comments

  • OK, so the Bunny Stream API seems to be able to fetch videos from any public-facing URL, so maybe I don't have to move them through my server and clog up disk space and RAM in the process.

    So, any provider that allows some signed access key would do the trick?

    client -> s3 -> bunny.

    Thank you guys for all your help as well. I'm very grateful!
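
    A minimal sketch of that flow, assuming Bunny Stream exposes a fetch-from-URL endpoint roughly like the one below (the library ID, API key, and exact path/payload are placeholders to verify against the current Bunny docs):

    ```python
    import requests

    # Placeholder values -- swap in your own Stream library ID and API key.
    LIBRARY_ID = "12345"
    BUNNY_API_KEY = "your-stream-api-key"

    def fetch_into_bunny(public_url: str, title: str) -> dict:
        """Ask Bunny Stream to pull a video from a public (or pre-signed) URL."""
        resp = requests.post(
            f"https://video.bunnycdn.com/library/{LIBRARY_ID}/videos/fetch",
            headers={"AccessKey": BUNNY_API_KEY, "accept": "application/json"},
            json={"url": public_url, "title": title},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()

    # The URL can be a pre-signed S3/B2 link, so the bucket itself can stay private:
    # fetch_into_bunny("https://bucket.s3.example.com/video.mp4?X-Amz-Signature=...", "demo")
    ```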

  • @maxwell_ said:

    @sanvit said:

    @maxwell_ said: who's a good S3 provider?

    Well, S3 is part of AWS, so I would say AWS is best.

    However, for your specific use case, I would recommend using something like Linode, DO, Scaleway, etc., as intra-datacenter transfers won't use up your traffic allotment. Scaleway's base pricing might be a bit expensive at 7.2EUR/m, but their object storage lineup is free up to 75GB, and egress is also free (but capped at 100Mbps max), so that might fit your needs.

    That's interesting, I will have to read up on that. But are you saying I could possibly upload straight from the customer's browser to Scaleway/DO and then possibly move it to Bunny? (if they support it)

    If I had to come up with such a system, this is probably what I would have done.

    1. Let the user upload files to an S3-compatible storage.
    2. Log the user's upload, and when the upload is done, queue it in some kind of DB, etc.
    3. Let a VM work through the queue, downloading files one by one and uploading them to Bunny Stream.

    If you already have a script that can upload files directly to Bunny Stream, the other parts shouldn't be that hard (see the sketch below).
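
    A rough sketch of steps 2 and 3 under those assumptions (SQLite as the queue, boto3 against a placeholder S3-compatible endpoint, and the existing Bunny upload script stubbed out):

    ```python
    import sqlite3
    import tempfile

    import boto3  # works against any S3-compatible endpoint; values below are placeholders

    s3 = boto3.client(
        "s3",
        endpoint_url="https://s3.example-provider.com",
        aws_access_key_id="KEY_ID",
        aws_secret_access_key="SECRET",
    )

    def upload_to_bunny(path: str) -> None:
        """Stand-in for the Bunny Stream upload script you already have."""
        raise NotImplementedError("plug in your existing upload code here")

    def process_queue(db_path: str = "uploads.db", bucket: str = "incoming-videos") -> None:
        """Step 3: drain the queue, one file at a time."""
        db = sqlite3.connect(db_path)
        rows = db.execute(
            "SELECT id, object_key FROM upload_queue WHERE status = 'pending'"
        ).fetchall()
        for row_id, key in rows:
            with tempfile.NamedTemporaryFile(suffix=".mp4") as tmp:
                s3.download_file(bucket, key, tmp.name)   # pull from object storage
                upload_to_bunny(tmp.name)                 # push to Bunny Stream
            s3.delete_object(Bucket=bucket, Key=key)      # free the temporary storage
            db.execute("UPDATE upload_queue SET status = 'done' WHERE id = ?", (row_id,))
            db.commit()
    ```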

    Thanked by 1 maxwell_
  • maxwell_ Member
    edited February 2022

    @sanvit said:

    If I had to come up with such a system, this is probably what I would have done.

    1. Let the user upload files to an S3-compatible storage.
    2. Log the user's upload, and when the upload is done, queue it in some kind of DB, etc.
    3. Let a VM work through the queue, downloading files one by one and uploading them to Bunny Stream.

    If you already have a script that can upload files directly to Bunny Stream, the other parts shouldn't be that hard.

    This is probably the best solution, as yes, I do have the upload fully working (client->my server->bunny).

  • @maxwell_ said:
    OK, so the Bunny Stream API seems to be able to fetch videos from any public-facing URL

    Oh, didn't know that :)
    If this is the case, letting Bunny fetch directly from the object storage indeed seems to be the best option!
    Would recommend using B2 + Cloudflare for your case then!

    Just remember that most per-GB pricing is usually calculated hourly, so removing the file after it has been uploaded would save you some money.
    AND DON'T USE WASABI FOR YOUR USE CASE. THEY CHARGE FOR FILES FOR 90 DAYS, EVEN DELETED ONES.
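
    For the hourly-billing point, a minimal sketch of deleting the temporary object from B2 once Bunny has pulled it (the endpoint region, credentials, and bucket name are placeholders):

    ```python
    import boto3

    # B2's S3-compatible API; use the endpoint shown for your bucket's region.
    b2 = boto3.client(
        "s3",
        endpoint_url="https://s3.us-west-004.backblazeb2.com",
        aws_access_key_id="B2_KEY_ID",
        aws_secret_access_key="B2_APP_KEY",
    )

    def delete_after_ingest(bucket: str, key: str) -> None:
        """Remove the temporary copy so the hourly per-GB storage meter stops."""
        b2.delete_object(Bucket=bucket, Key=key)
    ```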

    Thanked by 1 maxwell_
  • @maxwell_ said:

    @sanvit said:

    If I had to come up with such a system, this is probably what I would have done.

    1. Let the user upload files to an S3-compatible storage.
    2. Log the user's upload, and when the upload is done, queue it in some kind of DB, etc.
    3. Let a VM work through the queue, downloading files one by one and uploading them to Bunny Stream.

    If you already have a script that can upload files directly to Bunny Stream, the other parts shouldn't be that hard.

    This is probably the best solution, as yes, I do have the upload fully working (client->my server->bunny).

    If Bunny's API allows direct fetching from public URLs, IMO that would be the optimal route for you :)

    Thanked by 1 maxwell_
  • @sanvit said:

    @maxwell_ said:
    OK, so the Bunny Stream API seems to be able to fetch videos from any public-facing URL

    Oh, didn't know that :)
    If this is the case, letting Bunny fetch directly from the object storage indeed seems to be the best option!
    Would recommend using B2 + Cloudflare for your case then!

    Just remember that most per-GB pricing is usually calculated hourly, so removing the file after it has been uploaded would save you some money.
    AND DON'T USE WASABI FOR YOUR USE CASE. THEY CHARGE FOR FILES FOR 90 DAYS, EVEN DELETED ONES.

    Thank you for the Wasabi warning :)
    Why did you say B2 + Cloudflare, do I need both? Is one not enough?

  • @maxwell_ said: Why did you say B2 + Cloudflare, do I need both? Is one not enough?

    Backblaze B2 is known to be one of the cheapest S3-compatible storage providers, and if you use B2 with Cloudflare (pointing Cloudflare at your B2 endpoint), you get free egress as well (you don't have to pay the $10/TB egress fee).

    So it would be something like:

    CUSTOMER -> B2 -> Cloudflare (Proxy) -> BunnyStream

    Using only B2 is still fine if your traffic is rather low, and you can bear the egress fee.
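
    A minimal sketch of the two URLs that flow needs: a pre-signed PUT URL so the customer's browser can upload straight to B2, and the Cloudflare-proxied URL you later hand to Bunny (domain, bucket, keys, and the path mapping are all placeholders that depend on how you wire Cloudflare to the bucket):

    ```python
    import boto3

    B2_BUCKET = "incoming-videos"
    CF_DOMAIN = "https://files.example.com"  # domain proxied by Cloudflare, pointed at the B2 bucket

    b2 = boto3.client(
        "s3",
        endpoint_url="https://s3.us-west-004.backblazeb2.com",  # your bucket's region endpoint
        aws_access_key_id="B2_KEY_ID",
        aws_secret_access_key="B2_APP_KEY",
    )

    def presign_upload(key: str, expires: int = 3600) -> str:
        """URL the customer's browser can PUT the file to directly (CUSTOMER -> B2)."""
        return b2.generate_presigned_url(
            "put_object",
            Params={"Bucket": B2_BUCKET, "Key": key},
            ExpiresIn=expires,
        )

    def public_url(key: str) -> str:
        """URL to hand to Bunny Stream (B2 -> Cloudflare -> BunnyStream)."""
        return f"{CF_DOMAIN}/{key}"
    ```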

    Thanked by 1 maxwell_
  • @sanvit said:

    @maxwell_ said: Why did you say B2 + Cloudflare, do I need both? Is one not enough?

    Backblaze B2 is known to be one of the cheapest S3-compatible storage providers, and if you use B2 with Cloudflare (pointing Cloudflare at your B2 endpoint), you get free egress as well (you don't have to pay the $10/TB egress fee).

    So it would be something like:

    CUSTOMER -> B2 -> Cloudflare (Proxy) -> BunnyStream

    Using only B2 is still fine if your traffic is rather low, and you can bear the egress fee.

    Amazing. So I would pay B2 for the (temp) storage, Cloudflare for the transfer, and then my usual Bunny Stream bill.

    This is exactly what I was trying to ask for in my initial confusing post.

  • Wait, is Cloudflare actually free in this setup? No cost even for moving from Cloudflare to Bunny?

  • @maxwell_ said:
    Wait, is Cloudflare actually free in this setup? No cost even for moving from Cloudflare to Bunny?

    yup.

    Thanked by 1 maxwell_
  • @sanvit said:

    yup.

    Thank you!

  • @maxwell_ said:

    @sanvit said:

    yup.

    Thank you!

    Let me know if you face any problems setting up :)

    Thanked by 1 maxwell_
  • Come to think of it, B2 is free for 10 GB-months of storage (around 7.2 TB-hours), so if you delete your files in a timely manner, your setup could be free (plus Bunny Stream and the server that handles the API).
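
    To keep usage inside that free tier even if something slips through, a small cleanup sketch that deletes anything older than a day (bucket name, region endpoint, and credentials are placeholders):

    ```python
    from datetime import datetime, timedelta, timezone

    import boto3

    b2 = boto3.client(
        "s3",
        endpoint_url="https://s3.us-west-004.backblazeb2.com",
        aws_access_key_id="B2_KEY_ID",
        aws_secret_access_key="B2_APP_KEY",
    )

    def cleanup_old_objects(bucket: str = "incoming-videos", max_age_hours: int = 24) -> None:
        """Delete leftover uploads the normal flow forgot to remove."""
        cutoff = datetime.now(timezone.utc) - timedelta(hours=max_age_hours)
        paginator = b2.get_paginator("list_objects_v2")
        for page in paginator.paginate(Bucket=bucket):
            for obj in page.get("Contents", []):
                if obj["LastModified"] < cutoff:
                    b2.delete_object(Bucket=bucket, Key=obj["Key"])
    ```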

    Thanked by 1 maxwell_
  • @sanvit said:

    @maxwell_ said:

    @sanvit said:

    yup.

    Thank you!

    Let me know if you face any problems setting up :)

    Thanks, I might need that :smile:

  • @sanvit said:
    Come to think of it, B2 is free for 10 GB-months of storage (around 7.2 TB-hours), so if you delete your files in a timely manner, your setup could be free (plus Bunny Stream and the server that handles the API).

    Well, I would delete the file ASAP anyway: as soon as the client finishes the upload and informs my server about it, I would call Bunny Stream to download it, and then I'll delete it when that's done.

    I'm sure it won't be easy to set up, but it will be fun :smiley:
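
    Putting the pieces together, a hypothetical "upload finished" endpoint for that flow (Flask is just an example framework; the Bunny fetch endpoint shape, domain, bucket, and keys are assumptions to verify):

    ```python
    import boto3
    import requests
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    # Placeholder configuration.
    LIBRARY_ID = "12345"
    BUNNY_API_KEY = "your-stream-api-key"
    B2_BUCKET = "incoming-videos"
    CF_DOMAIN = "https://files.example.com"

    b2 = boto3.client(
        "s3",
        endpoint_url="https://s3.us-west-004.backblazeb2.com",
        aws_access_key_id="B2_KEY_ID",
        aws_secret_access_key="B2_APP_KEY",
    )

    @app.post("/upload-complete")
    def upload_complete():
        """Client calls this once its direct upload to B2 has finished."""
        key = request.json["key"]

        # Ask Bunny Stream to pull the file through the Cloudflare-proxied URL.
        resp = requests.post(
            f"https://video.bunnycdn.com/library/{LIBRARY_ID}/videos/fetch",
            headers={"AccessKey": BUNNY_API_KEY},
            json={"url": f"{CF_DOMAIN}/{key}"},
            timeout=30,
        )
        resp.raise_for_status()

        # In practice, wait until Bunny reports the fetch has finished (poll the
        # video status or use a webhook) before deleting; this eager delete only
        # keeps the sketch short.
        b2.delete_object(Bucket=B2_BUCKET, Key=key)
        return jsonify({"ok": True})
    ```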
