New on LowEndTalk? Please Register and read our Community Rules.
All new Registrations are manually reviewed and approved, so a short delay after registration may occur before your account becomes active.
Comments
There have been similar questions like yours.
Just rename the file so the name ends in .jpg. I did, and it's working perfectly.
Do these renamed .rar files count against your general storage?
Yes, I just packed all of my backup files together into one archive.
And it was just a test to see if it's working or not.
So it doesn't really work, then: the upload succeeds, but the file still counts against your general storage.
I mean, on my student Prime trial I've got:
- "unlimited" photo storage
- 5 GB general storage
If you're on the fully unlimited promo, then it doesn't matter.
> And it was just a test to see if it's working or not.
Aaah, now I understand.
It worked, and it didn't count against general storage. I could download the .jpg file on another computer, rename it back to .rar, and then unpack my files, which weren't corrupted by the renaming.
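The trick above boils down to a pure rename: only the filename changes, never the bytes, which is why the archive survives the round trip. A minimal sketch (the filenames are made-up examples):

```shell
# Stand-in for your real archive, just for demonstration.
printf 'archive-bytes' > backup.rar

# Disguise the archive as an image before uploading to photo storage.
mv backup.rar backup.jpg

# ...upload backup.jpg, later download it on another machine...

# Restore the extension; the contents are byte-for-byte identical.
mv backup.jpg backup.rar
```

Whether the service actually accepts a file that merely claims to be a JPEG (and whether it keeps counting it as a photo) is a separate question, as the thread shows.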
I've got the student "free" package too with unlimited photo and video storage.
Interesting to know.
I believe it's not easy to upload there with scripts.
@nikki got something. amirite?
What exactly do you mean by this?
I am experimenting with acd_cli (works great!), also in combination with encfs (not so great), at the moment. You should have no problems thanks to the public API!
acd_cli + encfs works fine for me. If you upload too many unencrypted Linux ISOs, they may check what you use the storage for and whether you host any obviously copyrighted ISOs.
It needs some tweaking on my side, I think; the default block size causes too many small packets, which kills my CPU and throughput.
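For anyone curious, the acd_cli + encfs combination described above might look roughly like this. Treat it as an untested sketch: the directory names are assumptions, and acd_cli and EncFS must already be installed and authorized.

```shell
acd_cli sync                     # refresh acd_cli's local cache of the drive metadata
acd_cli mount ~/acd              # FUSE-mount Amazon Cloud Drive at ~/acd
encfs ~/acd/encrypted ~/plain    # files written to ~/plain land encrypted inside ~/acd/encrypted
```

Note that EncFS asks about cipher and block size when the encrypted volume is first created (in its expert mode), which is where the block-size tuning mentioned above would happen.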
And don't worry, I'm not going to use it for anything like that. Not my business...
I was evaluating this as an alternative to encrypting my servers' backups with duply, so that the weaker ones in particular don't hit any limits while backing up. My main backup host would then just collect the backups from duply / rsync and push them to ACD.
I may or may not be sitting on a node server that accepts files over HTTP and now FTP (which also supports listing, renaming, moving, etc.), auto-encrypts them, and sends them up to Cloud Drive >.>
I'll consider putting it somewhere, possibly on github, but for now I need to do some cleanup on it. It'll also have no instructions, though it's pretty simple to figure out I hope.
Meant no offense.
I know as much about coding as a chicken knows about tying its shoelaces.
That'll be awesome!
How would one move your storage off Amazon at a later stage, if need be? Would you be able to rsync to a new backup host, or ... ?
Sure, if you mount it via FUSE - should be no problem then.
Otherwise I would suggest rclone, but there are many more tools that can download from Amazon Cloud Drive.
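As a rough sketch of the rclone route: the remote name "acd" and the paths below are assumptions (you would first define the remote with `rclone config`), so adapt them to your setup.

```shell
# One-off pull of everything from Amazon Cloud Drive to a new backup host.
rclone copy acd:backups /srv/backups

# Or keep the local copy in lockstep with the remote (deletes local extras).
rclone sync acd:backups /srv/backups
```

Because rclone speaks many backends, the same commands would later work against a different provider by swapping the remote name.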
Update to this, my account was approved and is working fine.
No promises as to how good it is, but it does support HTTP uploads (useful for curl) and FTP (for other predefined services, even backup software).
https://github.com/nikkiii/backupd
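Since the repo ships without instructions, here is only a guess at what an HTTP upload via curl could look like. The host, port, and URL path are pure assumptions, not taken from backupd's actual routes; only the `curl -T` mechanics are standard.

```shell
# -T sends the local file to the given URL with an HTTP PUT.
curl -T backup.tar.gz http://localhost:8080/backup.tar.gz
```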