New on LowEndTalk? Please Register and read our Community Rules.
Comments
What problem did you have with Nextcloud?
Nextcloud could not connect to their S3 service...just failed silently. Same with s3fs.
I can use Nextcloud normally.
Mind sharing your settings? I only put in bucket name, server endpoint, access key, and secret, enable SSL. Nextcloud failed to connect to the two regions I've used so far.
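Concretely, that configuration via Nextcloud's occ CLI would look roughly like this (the endpoint, bucket, and keys are placeholders, and the option names are from Nextcloud's files_external docs, so double-check them against your version):

```shell
# Hypothetical external-storage mount of an iDrive e2 bucket in Nextcloud.
# Endpoint/bucket/keys are placeholders.
php occ files_external:create /e2 amazons3 amazons3::accesskey \
  -c bucket=bucket-name \
  -c hostname=s3.example.idrivee2.com \
  -c port=443 \
  -c use_ssl=true \
  -c use_path_style=true \
  -c key=YOUR_ACCESS_KEY \
  -c secret=YOUR_SECRET_KEY
```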
I'll PM you.
Don't forget the pricing surprise: if you host files for download and use more than 3x the amount stored (store 100GB, egress 300GB), you pay per GB outgoing.
From their pricing page:
I don't consider the amount stored to be the "storage volume", especially given how they reference "storage" in their pricing plan tables. I'd call the storage volume the 1TB, so if you have 100GB in files, you still get 3TB of egress.
If you were right and it were 3x the stored data, that would be complicated and stupidly fucked up, and I wouldn't sign up for that (don't want to be charged for egress? Add more data to be stored... that's fucked up).
Which region did you test?
Store 100GB, egress 300GB? Not 3 x 1TB?
And now?
Nice alternative to B2 and Wasabi. Will use the pay-as-you-go plan of $4/TB. Those annual plans seem sketchy.
I jumped into their pay-as-you-go plan for my personal backups and as a storage solution.
For my mastodon instance, I'm proxying the protected bucket's files through a docker container. I'm loving it so far!
My initial plan was to use Backblaze, but apparently it's banned in my country, for the stupidest reason. Ugh..
Wasabi was charging $6/month with no pay-as-you-go plan, which is overkill for me.
When I was struggling to find a cheap and decent solution, I stumbled upon iDrive. While I was discussing it, their Twitter account admin found my tweet and offered public access for my bucket. After a quick support request (kudos to their support agent, btw; he was upfront, knew about the issue, and handled it nicely) and about 10 hours of ticket time (timezone differences; the support representative created a ticket on my behalf), they enabled public access for my account.
However, I found that although files were publicly accessible by URL (which can be fetched from the info section once the public option is enabled), I could also upload files to the bucket publicly, which is not good at all.
I could simply run
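something along these lines (the endpoint and bucket name here are placeholders):

```shell
# Unauthenticated PUT straight to the public bucket -- no credentials needed.
curl -X PUT --upload-file dummy.jpg \
  "https://s3.example.idrivee2.com/bucket-name/dummy.jpg"
```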
from a random place and I could upload this dummy.jpg. That's not what I wanted.
So I ended up sticking with private buckets and using a proxy to access assets for my CDN solution.
Nevertheless, I've been trying the service for about two days and have liked it so far. The company has also been around since 1995, so hopefully I won't have huge issues.
Ireland
Check Servarica.com. They offer 1TB for 29 USD a year.
I think I solved the public-read-only thing! It looks like I can set the bucket policy with the AWS CLI. Public reads work, but I cannot upload without authenticating.
I am now trying to sort out the certificate stuff with Cloudflare.
Got it all working.
E2 has enabled public access for your account?
Their Twitter account confirmed to me that @vitobotta 's approach is currently the way to allow only public reads.
However, sadly I could not apply that policy through the AWS CLI. I'd appreciate it if you guys have some ready-to-copy-paste AWS CLI commands to disable public writes and enable public reads.
Oh, and they also mentioned this to me:
So maybe I could just wait for a week (assuming it'd take just a week) and we'll see how it'll go.
I didn't do that because it makes the bucket completely public. Instead, I applied a bucket policy with only public reads enabled.
Amazing!
How did you do this?
Create a JSON file for the policy containing this:
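a standard S3 public-read policy should do it — the statement below allows only s3:GetObject (reads) to anonymous callers, nothing else (this is the generic S3 policy format; adjust to taste):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": "*",
      "Action": ["s3:GetObject"],
      "Resource": ["arn:aws:s3:::bucket-name/*"]
    }
  ]
}
```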
Of course, change "bucket-name" to the name of your bucket. Then apply it with the AWS CLI:
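for example like this, assuming the policy is saved as policy.json (the endpoint URL is a placeholder — use your region's e2 endpoint):

```shell
# Apply the public-read policy to the bucket on the e2 endpoint.
aws s3api put-bucket-policy \
  --bucket bucket-name \
  --policy file://policy.json \
  --endpoint-url https://s3.example.idrivee2.com
```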
PERFECT!!!
I never tried the aws command before, but I found a solution with rclone:
https://rclone.org/s3/#idrive-e2
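For reference, a minimal rclone setup per that page might look like this (the remote name, endpoint, and keys are placeholders):

```shell
# Append a hypothetical e2 remote to rclone's config (values are placeholders).
cat >> ~/.config/rclone/rclone.conf <<'EOF'
[e2]
type = s3
provider = IDrive
access_key_id = YOUR_ACCESS_KEY
secret_access_key = YOUR_SECRET_KEY
endpoint = abc123.xyz.idrivee2-00.com
EOF

# List buckets, then copy a file up.
rclone lsd e2:
rclone copy ./dummy.jpg e2:bucket-name/
```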
Thanks @vitobotta for this, much appreciated!
Not related to this, but I'll try to monitor how much traffic is cached with default iDrive headers from the Cloudflare CDN (I'm modifying/adding expires and cache-control headers to try to force Cloudflare to cache assets for a long period of time). If any of you analyze this, I'd appreciate it if you could share your experiences. Or maybe some custom rules through the Cloudflare panel?
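One way to get those headers onto the objects without a proxy rewriting them is to set them at upload time — rclone's S3 backend can attach headers to uploaded objects (remote and bucket names here are placeholders):

```shell
# Upload assets with a long-lived Cache-Control header baked into the objects,
# so Cloudflare (and browsers) cache them for up to a year.
rclone copy ./assets e2:bucket-name/assets \
  --header-upload "Cache-Control: public, max-age=31536000"
```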
--s3-acl and --s3-bucket-acl don't work for me. Can anybody confirm whether the PutBucketPolicy API works with new accounts and buckets?

Ugh, can't edit my post. Apparently it's quite easy to create a cache rule on Cloudflare, and you get 10 cache rules on the free plan.
Used for website?
Yup. I'm currently serving my Mastodon instance's public assets through iDrive, running through a proxy for public access to static assets.
It's roughly something like this currently:
WWW -> Cloudflare + Proxy -> My Host Webserver -> Proxy docker container -> Private Idrive Upstream.
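The proxy-container step in that flow can be sketched roughly like this (hostnames, bucket, and container names are placeholders, and it assumes the bucket already allows public reads):

```shell
# Hypothetical nginx reverse proxy in front of the e2 endpoint.
cat > nginx.conf <<'EOF'
events {}
http {
  server {
    listen 8080;
    location /assets/ {
      # Map /assets/foo to the bucket path on the e2 endpoint.
      proxy_pass https://s3.example.idrivee2.com/bucket-name/;
      proxy_set_header Host s3.example.idrivee2.com;
      proxy_ssl_server_name on;
    }
  }
}
EOF

docker run -d --name e2-proxy -p 8080:8080 \
  -v "$PWD/nginx.conf:/etc/nginx/nginx.conf:ro" nginx:alpine
```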
I'd rather remove the host webserver + proxy container from this flow, which I'm planning to do.
I also have another private bucket for my personal backups, which is running nicely so far.
They didn't enable public for me. It now shows "this feature is temporarily unavailable".
There was also another problem. I had around 1000 videos in a bucket without folders, and after some time I noticed the bucket would lock up completely when trying to load the file list; even the files themselves wouldn't load. Then it would randomly load up, and then lock up again after a while. I moved the files into different folders, and that seems to have fixed it.
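Listing with a prefix and delimiter, instead of pulling the whole flat namespace at once, may be why folders helped — e.g. (bucket, prefix, and endpoint are placeholders):

```shell
# List only one "folder" of the bucket, a page at a time, instead of all
# ~1000 keys in one go.
aws s3api list-objects-v2 \
  --bucket bucket-name \
  --prefix "videos/2022/" \
  --delimiter "/" \
  --max-items 100 \
  --endpoint-url https://s3.example.idrivee2.com
```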
Try Workers.