
Dynamic Storage?

maxwell_ Member
edited February 2022 in Requests

I'm building a cloud-based storage service, and my clients will be uploading quite large files.
I need these files to first hit my server and then be moved off to an object storage CDN.

I can't cut my server out as a "middle man", but this also means my cache will/could be clogged up before my code manages to move the files off my server and free up space.

I'm wondering: is there a service where I can run PHP and the disk grows and shrinks as needed, so I never risk temporarily running out of space? I'd use it just for uploads, and my current server could handle serving static files, database calls, etc.

Grateful for any replies!


Comments

  • I don't know about block storage, but almost all object storage providers do this.

  • To edit my post: I meant moving to a CDN, not object storage. And I guess my problem would be not only disk but also RAM for said file uploads.

    I would prefer some kind of managed service; my server admin skills are zero.

bulbasaur Member
    edited February 2022

    Look into Amazon's EFS or Google Cloud's Filestore, although getting decent performance out of these quickly becomes expensive.

    Or you could just use object storage; S3 and compatible object stores provide a POST Object API that lets the client upload directly to the storage service, though you must fill out the form parameters yourself and render the form to the client.
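
    A minimal sketch of that, using the AWS SDK for PHP's PostObjectV4 helper; the bucket name, key prefix, and size cap below are placeholder assumptions:

        // Sketch: generate a presigned POST form so the browser uploads
        // straight to S3. Bucket, key prefix, and limits are hypothetical.
        require 'vendor/autoload.php';

        use Aws\S3\S3Client;
        use Aws\S3\PostObjectV4;

        $client = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

        $bucket     = 'example-upload-bucket';            // hypothetical bucket
        $formInputs = ['key' => 'uploads/${filename}'];   // S3 substitutes the filename
        $options    = [
            ['bucket' => $bucket],
            ['starts-with', '$key', 'uploads/'],
            ['content-length-range', 1, 5 * 1024 * 1024 * 1024], // up to 5GB per POST
        ];

        $postObject = new PostObjectV4($client, $bucket, $formInputs, $options, '+1 hour');

        // Render these into the <form> served to the client.
        $attributes = $postObject->getFormAttributes(); // action URL, method, enctype
        $inputs     = $postObject->getFormInputs();     // hidden fields incl. policy and signature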

  • @maxwell_ said:
    To edit my post: I meant moving to a CDN, not object storage. And I guess my problem would be not only disk but also RAM for said file uploads.

    I would prefer some kind of managed service; my server admin skills are zero.

    Usually, you put your files on object storage and let the CDN dynamically cache the required content.

    TBH I don't know why you're thinking of a CDN in the first place. Unless your users downloading the files are spread across regions, it would be better to put your object storage wherever is closest to your customers.

  • @sanvit said:

    I'm moving the files to BunnyCDN's Stream service. They currently don't support uploading files straight from the browser to their servers; it needs to go through mine.

  • @maxwell_ said:

    @sanvit said:

    I'm moving the files to BunnyCDN's Stream service. They currently don't support uploading files straight from the browser to their servers; it needs to go through mine.

    Ah, makes sense.

    The only service I can think of is Cloudjiffy, which charges you based on actual disk usage.

    Or you could upload the original video files to object storage and use a script to upload them to Bunny one by one?

  • @sanvit said:
    Ah, makes sense.

    The only service I can think of is Cloudjiffy, which charges you based on actual disk usage.

    Or you could upload the original video files to object storage and use a script to upload them to Bunny one by one?

    Thank you for your reply. With object storage, I wouldn't even know where to begin; I've never worked with it (except for local Windows machine backups).

    What would be great is if there were some managed VPS provider here on LET that fits the bill.

  • @maxwell_ said:

    @sanvit said:
    Ah, makes sense.

    The only service I can think of is Cloudjiffy, which charges you based on actual disk usage.

    Or you could upload the original video files to object storage and use a script to upload them to Bunny one by one?

    Thank you for your reply. With object storage, I wouldn't even know where to begin; I've never worked with it (except for local Windows machine backups).

    What would be great is if there were some managed VPS provider here on LET that fits the bill.

    TBH I would really recommend you have a look at Cloudjiffy then. @leapswitch
    They are the only provider I can think of that has usage-based pricing on storage.

  • @sanvit said:

    Thank you for your help. Yeah, I've got Cloudjiffy open in another tab after your suggestion; I will contact them later this evening.

  • @stevewatson301 said:
    Look into Amazon's EFS or Google Cloud's Filestore, although getting decent performance out of these quickly becomes expensive.

    Or you could just use object storage; S3 and compatible object stores provide a POST Object API that lets the client upload directly to the storage service, though you must fill out the form parameters yourself and render the form to the client.

    Thank you for your suggestion.

  • AWS EFS would be your best bet in this case. I've asked their support to confirm that this use case is definitely supported and that the cost incurred should be minimal.

    Having said that, you should note that EFS has some I/O limitations for smaller file systems, so you may have to either pre-warm the EFS or spin up a new one to refresh the I/O burst credits.

  • @cnbeining said:
    AWS EFS would be your best bet in this case. I've asked their support to confirm that this use case is definitely supported and that the cost incurred should be minimal.

    Having said that, you should note that EFS has some I/O limitations for smaller file systems, so you may have to either pre-warm the EFS or spin up a new one to refresh the I/O burst credits.

    Thank you for your reply. I have never worked with EFS, or any AWS service for that matter.
    So would I be able to upload files via the browser (JavaScript) straight to EFS, then, via my server, read said file and move it to my CDN provider? Wouldn't I need to run PHP on the upload target?

  • You mentioned BunnyCDN, so I suggest you take a look at their Edge Storage: https://bunny.net/edge-storage/
    Basically, you can upload directly to their edge storage via the API.
    However, it is common to use Cloudflare's CDN for end users, because Cloudflare offers free CDN traffic.
    You can configure Cloudflare to pull data from S3/Backblaze/Wasabi or Bunny Edge Storage.
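
    As a rough illustration, such a direct upload is a plain HTTP PUT with an AccessKey header; the storage zone, paths, and key below are placeholder assumptions, so check the endpoint against Bunny's current docs:

        // Sketch: PUT a local file to Bunny Edge Storage with cURL.
        // Zone name, paths, and access key are hypothetical placeholders.
        $storageZone = 'example-zone';
        $accessKey   = 'YOUR-STORAGE-ZONE-PASSWORD';
        $localFile   = '/tmp/upload.mp4';
        $remotePath  = 'videos/upload.mp4';

        $fh = fopen($localFile, 'r');
        $ch = curl_init("https://storage.bunnycdn.com/{$storageZone}/{$remotePath}");
        curl_setopt_array($ch, [
            CURLOPT_UPLOAD         => true,                 // issues an HTTP PUT
            CURLOPT_INFILE         => $fh,
            CURLOPT_INFILESIZE     => filesize($localFile),
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_HTTPHEADER     => [
                "AccessKey: {$accessKey}",
                'Content-Type: application/octet-stream',
            ],
        ]);
        $response = curl_exec($ch);
        $status   = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        curl_close($ch);
        fclose($fh);

        if ($status !== 201) { // Bunny's docs report 201 Created on success
            error_log("Bunny upload failed: HTTP {$status} {$response}");
        }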

  • @quanhua92 said:
    You mentioned BunnyCDN, so I suggest you take a look at their Edge Storage: https://bunny.net/edge-storage/
    Basically, you can upload directly to their edge storage via the API.
    However, it is common to use Cloudflare's CDN for end users, because Cloudflare offers free CDN traffic.
    You can configure Cloudflare to pull data from S3/Backblaze/Wasabi or Bunny Edge Storage.

    Thank you for your reply. I talked to Bunny, and they said one currently can't upload straight to their Stream service. That's the one I need the files to end up at.

quanhua92 Member
    edited February 2022

    @maxwell_ said:
    Thank you for your reply. I talked to Bunny, and they said one currently can't upload straight to their Stream service. That's the one I need the files to end up at.

    If you want to avoid the middleman on your server, you can use serverless functions to handle that at edge servers, something like Cloudflare Workers or AWS Lambda. The normal server workload can still stay on your normal VPS.
    However, I am not sure it can handle large file uploads. I think Cloudflare Workers limits request sizes to 100-500MB:
    https://walshy.dev/blog/21_09_10-handling-file-uploads-with-cloudflare-workers

    For large files, I would upload to an S3 bucket directly.

    https://blog.rocketinsights.com/uploading-images-to-s3-via-the-browser/

    https://docs.aws.amazon.com/AmazonS3/latest/userguide/upload-objects.html

    Per that page, you can upload files up to 5TB with multipart upload.
    Then you can create a CDN that pulls the data from that S3 bucket; no need for Bunny Stream. I would use Cloudflare's CDN here for the free traffic.
    If you want BunnyCDN, then you create a pull zone and enable the video delivery feature.
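
    For the server-side leg, a minimal sketch with the AWS SDK for PHP's high-level MultipartUploader, which splits large files into parts automatically; the bucket and file path are placeholder assumptions:

        // Sketch: multipart-upload a large local file to S3.
        // Bucket name and file path are hypothetical placeholders.
        require 'vendor/autoload.php';

        use Aws\S3\S3Client;
        use Aws\S3\MultipartUploader;
        use Aws\Exception\MultipartUploadException;

        $s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

        try {
            $uploader = new MultipartUploader($s3, '/tmp/large-video.mp4', [
                'bucket' => 'example-upload-bucket',
                'key'    => 'videos/large-video.mp4',
            ]);
            $result = $uploader->upload();
            echo "Uploaded to {$result['ObjectURL']}\n";
        } catch (MultipartUploadException $e) {
            // Failed parts can be retried/resumed; see the SDK docs.
            error_log($e->getMessage());
        }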

  • @maxwell_ said:

    @cnbeining said:
    AWS EFS would be your best bet in this case. I've asked their support to confirm that this use case is definitely supported and that the cost incurred should be minimal.

    Having said that, you should note that EFS has some I/O limitations for smaller file systems, so you may have to either pre-warm the EFS or spin up a new one to refresh the I/O burst credits.

    Thank you for your reply. I have never worked with EFS, or any AWS service for that matter.
    So would I be able to upload files via the browser (JavaScript) straight to EFS, then, via my server, read said file and move it to my CDN provider? Wouldn't I need to run PHP on the upload target?

    Yeah, you do need to run a server for handling uploads. Call me old school, but I don't feel very comfortable exposing S3 endpoints directly to the public.

  • Why do you want "dynamic storage"? It is definitely not dynamic for whoever provides it.

    If it is for cost reasons, it will likely cost more than just provisioning for your peak buffer size, if you even find someone offering a niche feature like this.

    How big are your files, and in what scenario do you envision your application being available but BunnyCDN unreachable?

  • @quanhua92 said:

    @maxwell_ said:
    Thank you for your reply. I talked to Bunny, and they said one currently can't upload straight to their Stream service. That's the one I need the files to end up at.

    If you want to avoid the middleman on your server, you can use serverless functions to handle that at edge servers, something like Cloudflare Workers or AWS Lambda. The normal server workload can still stay on your normal VPS.
    However, I am not sure it can handle large file uploads. I think Cloudflare Workers limits request sizes to 100-500MB:
    https://walshy.dev/blog/21_09_10-handling-file-uploads-with-cloudflare-workers

    For large files, I would upload to an S3 bucket directly.

    https://blog.rocketinsights.com/uploading-images-to-s3-via-the-browser/

    https://docs.aws.amazon.com/AmazonS3/latest/userguide/upload-objects.html

    Per that page, you can upload files up to 5TB with multipart upload.
    Then you can create a CDN that pulls the data from that S3 bucket; no need for Bunny Stream. I would use Cloudflare's CDN here for the free traffic.
    If you want BunnyCDN, then you create a pull zone and enable the video delivery feature.

    Thank you. Yeah, I need to use Bunny, as I'm using their Stream service. My only issue is that I don't want to run out of RAM or disk space on my server between the browser and Bunny Stream.

  • @cnbeining said:

    Yeah, you do need to run a server for handling uploads. Call me old school, but I don't feel very comfortable exposing S3 endpoints directly to the public.

    Yeah, that's my thinking too; that's why I'm looking for some other solution. The only one I can think of is getting another server or VPS to handle just the uploads and ship them off to Bunny Stream when finished. I don't want to tank the whole site just because uploads eat up all the RAM.

  • @cadddr said:
    Why do you want "dynamic storage"? It is definitely not dynamic for whoever provides it.

    If it is for cost reasons, it will likely cost more than just provisioning for your peak buffer size, if you even find someone offering a niche feature like this.

    How big are your files, and in what scenario do you envision your application being available but BunnyCDN unreachable?

    I meant dynamic in the sense that it would not run out if many uploads happened at the same time; it would just grow with them. I've got no experience with this and wanted to see if such a thing exists and/or is standard. The file size limit is probably going to be around 3-5GB; I need to test properly. But it's all multimedia files.

  • I talked to my hosting provider, and I think I will just get a VPS with 8? 16? GB of RAM and a 100GB SSD just for this purpose.

  • Previously I thought big files meant > 10 GB per file... so 100 GB is meh... you need at least 2 TB of storage.

  • @chocolateshirt said:
    Previously I thought big files meant > 10 GB per file... so 100 GB is meh... you need at least 2 TB of storage.

    I talked to my host, and they said that an upload is not stored in RAM the whole time, but is progressively written to disk. And as soon as the file (say, 5GB) has finished uploading, my script makes a database entry and moves the file to Bunny, so it's not going to reside on my server for long. And it's all paying customers using it, so I don't think that many people will be uploading at the very same time. But yeah, maybe I need a bigger drive; we'll see what prices they offer.
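
    That handoff, receive the upload, record it, push it to Bunny Stream, then free the disk, could look roughly like the sketch below. The Stream calls (POST to create a video entry, then PUT the raw file) follow Bunny's documented pattern, but the library ID, API key, paths, and recordUpload() helper are assumptions:

        // Sketch: move the finished browser upload off the server to Bunny Stream.
        // Library ID, API key, paths, and recordUpload() are hypothetical.
        $libraryId = 12345;
        $apiKey    = 'YOUR-STREAM-API-KEY';

        // 1. Persist the browser upload (PHP streams it to a temp file, not RAM).
        $local = '/var/uploads/' . basename($_FILES['video']['name']);
        move_uploaded_file($_FILES['video']['tmp_name'], $local);

        // 2. Create the video entry in the Stream library.
        $ch = curl_init("https://video.bunnycdn.com/library/{$libraryId}/videos");
        curl_setopt_array($ch, [
            CURLOPT_POST           => true,
            CURLOPT_POSTFIELDS     => json_encode(['title' => basename($local)]),
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_HTTPHEADER     => ["AccessKey: {$apiKey}", 'Content-Type: application/json'],
        ]);
        $video = json_decode(curl_exec($ch), true);
        curl_close($ch);

        // 3. PUT the file contents to the new video GUID.
        $fh = fopen($local, 'r');
        $ch = curl_init("https://video.bunnycdn.com/library/{$libraryId}/videos/{$video['guid']}");
        curl_setopt_array($ch, [
            CURLOPT_UPLOAD         => true,
            CURLOPT_INFILE         => $fh,
            CURLOPT_INFILESIZE     => filesize($local),
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_HTTPHEADER     => ["AccessKey: {$apiKey}"],
        ]);
        curl_exec($ch);
        curl_close($ch);
        fclose($fh);

        // 4. Record the upload (hypothetical DB helper) and free the disk.
        recordUpload($video['guid'], $local);
        unlink($local);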

  • @maxwell_ said:

    Yeah, you do need to run a server for handling uploads. Call me old school, but I don't feel very comfortable exposing S3 endpoints directly to the public.

    You have a misunderstanding: you don't expose the S3 endpoint for public usage. For each upload request from your website, your user needs to call your web API to be authenticated, and your API returns a URL and a temporary access key for that specific upload request.
    That way, your server only needs to generate the key, and serving the upload is the responsibility of the S3 endpoint.
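
    With the AWS SDK for PHP, that per-request URL is a presigned request; a minimal sketch, with the bucket, key scheme, and expiry as placeholder assumptions:

        // Sketch: an authenticated API endpoint that returns a short-lived
        // presigned PUT URL; the browser then PUTs the file straight to S3.
        // Bucket and key scheme are hypothetical placeholders.
        require 'vendor/autoload.php';

        use Aws\S3\S3Client;

        $s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

        $cmd = $s3->getCommand('PutObject', [
            'Bucket' => 'example-upload-bucket',
            'Key'    => 'uploads/' . bin2hex(random_bytes(8)) . '.mp4',
        ]);

        // The URL stops working once the signature expires.
        $request = $s3->createPresignedRequest($cmd, '+20 minutes');

        header('Content-Type: application/json');
        echo json_encode(['uploadUrl' => (string) $request->getUri()]);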

  • Take a look at the following tutorial and find the part that reads:

    The client-side code is responsible for achieving two things:
    • Retrieve a signed request from the app with which the image can be PUT to S3
    • Actually PUT the image to S3 using the signed request

    https://devcenter.heroku.com/articles/s3-upload-node#initial-setup

  • @quanhua92 said:

    @maxwell_ said:

    Yeah, you do need to run a server for handling uploads. Call me old school, but I don't feel very comfortable exposing S3 endpoints directly to the public.

    You have a misunderstanding: you don't expose the S3 endpoint for public usage. For each upload request from your website, your user needs to call your web API to be authenticated, and your API returns a URL and a temporary access key for that specific upload request.
    That way, your server only needs to generate the key, and serving the upload is the responsibility of the S3 endpoint.

    That is interesting for sure. Who's a good S3 provider? I hate Amazon, so I want to stay away from them if possible.

  • @maxwell_ said:

    @quanhua92 said:

    @maxwell_ said:

    Yeah, you do need to run a server for handling uploads. Call me old school, but I don't feel very comfortable exposing S3 endpoints directly to the public.

    You have a misunderstanding: you don't expose the S3 endpoint for public usage. For each upload request from your website, your user needs to call your web API to be authenticated, and your API returns a URL and a temporary access key for that specific upload request.
    That way, your server only needs to generate the key, and serving the upload is the responsibility of the S3 endpoint.

    That is interesting for sure. Who's a good S3 provider? I hate Amazon, so I want to stay away from them if possible.

    Take a look at Bunny Edge Storage and figure out if it can transfer to Stream. Otherwise, Backblaze or Wasabi.

  • @maxwell_ said: Who's a good S3 provider?

    Well, S3 is part of AWS, so I would say AWS is best.

    However, for your specific use case, I would recommend using something like Linode, DO, Scaleway, etc., as intra-datacenter transfers won't use up your traffic allotment. Scaleway's base pricing might be a bit expensive at 7.2EUR/mo, but their object storage is free up to 75GB, and egress is also free (though capped at 100Mbps), so that might fit your needs.

  • @quanhua92 said:

    Take a look at Bunny Edge Storage and figure out if it can transfer to Stream. Otherwise, Backblaze or Wasabi.

    Just sent them a support ticket with exactly what you suggested :smile:

  • @sanvit said:

    @maxwell_ said: Who's a good S3 provider?

    Well, S3 is part of AWS, so I would say AWS is best.

    However, for your specific use case, I would recommend using something like Linode, DO, Scaleway, etc., as intra-datacenter transfers won't use up your traffic allotment. Scaleway's base pricing might be a bit expensive at 7.2EUR/mo, but their object storage is free up to 75GB, and egress is also free (though capped at 100Mbps), so that might fit your needs.

    That's interesting; I will have to read up on that. But you are saying I could possibly upload straight from the customer's browser to Scaleway/DO and then move it to Bunny? (If they support it.)
