Automating downloading / syncing. Help?!
So anyway... I'm thinking about making a small script that will periodically download files from this location:
http://www.arma2.com/beta-patch.php/
which has files on:
ftp://downloads.bistudio.com/arma2.com/update/beta/
However, when I try to access this area anonymously through an FTP client, I can't view or download anything; but if I connect directly to a file, e.g. ftp://downloads.bistudio.com/arma2.com/update/beta/ARMA2_OA_Build_95208.zip, then I can.
My question is: is there a way for me to pick up every file name from arma2.com/beta-patch.php, wget them all onto my server, and only download 'new' ones?
The reason for this is that when a new beta patch is released, DayZ (a mod) tends to update to support it, which means popular servers are soon running the latest patch and the beta patch site becomes incredibly slow under the load. If I could run a script that checks every 6 hours and wgets any new patch, it would save me time by letting me download from a less-used location, and it would also let me advertise my "unofficial" mirror online.
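Roughly, I'm picturing a cron entry along these lines (the script name and paths below are just placeholders, since I don't have the actual download command yet):

    # Run the check every 6 hours; sync-beta.sh stands in for whatever download command ends up working.
    0 */6 * * * /home/chris/sync-beta.sh >> /home/chris/sync-beta.log 2>&1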
Thanks,
Chris.
Initial thoughts were to just wget recursively using a lengthy command I found, but that doesn't seem to work, because it can't get a directory listing of ftp://downloads.bistudio.com/arma2.com/update/beta/; I've contacted the devs to see if there's a way I can do this as a secondary mirror for myself, but as of yet I've heard nothing back.
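(The lengthy command itself isn't quoted in the post; purely as an assumption of what such an attempt usually looks like, it would be something along the lines below, and it fails here because the server refuses anonymous directory listings.)

    # Assumed shape of the failed attempt: a recursive FTP mirror with timestamping.
    # This relies on the server answering LIST requests, which it won't do anonymously.
    wget -r -N -np ftp://downloads.bistudio.com/arma2.com/update/beta/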
Comments
That will grab all the files linked to on that page, though I have no idea how to get timestamping working, since you can't access the FTP root. Could try messing with this?
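(The command being referred to isn't preserved in the thread; one plausible form, with every flag here an assumption, is a one-level recursive wget against the HTML page that spans hosts and follows the FTP links on it.)

    # Hypothetical reconstruction: fetch the page, follow its links one level deep,
    # span to the FTP host, keep only .zip files, and drop them into the current directory.
    wget -r -l1 -H -nd -np --follow-ftp -A.zip -e robots=off http://www.arma2.com/beta-patch.php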
Wow, that's actually working. Thanks a lot, mate! What would be the command to grab the .logs too?
Nvm.
@ihatetony it won't let me use the -N option to check whether a file is newer on my end. Is this something that will work after the first download?
Throw in -A.log (I think, anyway) to get the logs as well. -N won't work for FTP downloads unless wget can access the FTP listing, it seems. If nothing changes in the old files, you could try -nc to just have it skip downloading existing files?
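(Putting that together, a combined form, again with the exact flags as assumptions, might be the line below; note that -A replaces the whole accept list, so .zip has to be restated alongside .log, and -nc simply skips any file that already exists locally.)

    # Hypothetical combined command: grab zips and logs, skipping files already present on disk.
    wget -r -l1 -H -nd -np -nc --follow-ftp -A.zip,.log -e robots=off http://www.arma2.com/beta-patch.php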
Ah; that looks like fun, so it just matches the filenames?
Nothing will be changing, since they're just builds, not edited builds.
Also, I tried to get --timestamping working, but that's obviously irrelevant if I'm just using -nc.
:'] I'll try and let you know. @ihatetonyy