Usenet 'suckup' feed?
This is just another brainfart of mine, so feel free to say 'You're nuts'.
I read somewhere* that a guy managed to set up a 'Usenet suckup feed' that turns your server into a Usenet feed/mirror. It basically downloads the files from another Usenet provider and forwards them through your server to the end user.
I'm interested in how he accomplished this. What's the correct terminology for this, i.e., what should I search for?
Giganews seems to call it 'IHave and Suck Feed service' but I am curious if this can be accomplished with other providers as well.
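The 'IHave' part of that name refers to the standard NNTP article-transfer exchange (RFC 3977): a peer offers an article by message-ID with `IHAVE`, and the server answers 335 (send it), 435 (not wanted), or 436 (try again later). As a rough illustration of that exchange, here is a sketch that only builds and interprets the protocol strings; it does not open a network connection, and the message-ID is a made-up example:

```python
# Illustrative sketch of the NNTP IHAVE exchange (RFC 3977) that a
# peer/"suck" feed is built on. No networking here: we only format the
# command a peer would send and interpret the server's reply code.

def make_ihave_command(message_id: str) -> str:
    """Build the IHAVE command a peer sends to offer an article."""
    return f"IHAVE {message_id}\r\n"

def wants_article(response: str) -> bool:
    """Interpret the server's reply to IHAVE.

    335 = send the article, 435 = article not wanted (e.g. duplicate),
    436 = transfer not possible right now, try again later.
    """
    code = response.split(maxsplit=1)[0]
    return code == "335"

cmd = make_ihave_command("<example-article@news.example.com>")
print(cmd.strip())                     # IHAVE <example-article@news.example.com>
print(wants_article("335 Send it"))    # True
print(wants_article("435 Duplicate"))  # False
```

A suck feed flips the direction: instead of waiting for peers to offer articles, your server actively pulls them from the upstream provider.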
Comments
If you have enough bandwidth and hard drives to suck up 10 TB/day, go for it!
Creating an index with Newznab is one thing - mirroring all the content seems like quite another.
Not planning on deploying this commercially/at large scale, or even setting it up. Just interested in the technology behind it.
'For science'
You're not mirroring everything; you're basically a tunnel to their service. Nothing is permanently stored on your server, just a small cache.
I'm not sure it does what you think it does. From the Giganews description, it seems like this is just an ordinary Usenet feed, but instead of having to make peering agreements with all the big providers, you only make one with Giganews.
Read up on the archive.org people
How about this: grab an RSS feed of a particular group (or groups) you want to 'mirror'.
i.e. http://rss.binsearch.net/rss.php?max=50&g=alt.binaries.british.drama
Feed that into your newsgroup client, and make it request the feed every hour or so, and download everything.
Make the completed folder accessible via HTTP (with directory listings?)
Cron a command to delete all files that are over x days old within the web directory.
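The cleanup step above could be sketched like this. The directory path and retention period are placeholders for whatever your setup uses; in practice a one-liner in cron such as `find /var/www/usenet -type f -mtime +7 -delete` does the same job:

```python
# Sketch of the age-based cleanup step: delete files older than
# max_age_days from the directory served over HTTP. Path and retention
# are hypothetical placeholders, not values from the thread.
import os
import time

def prune_old_files(web_dir: str, max_age_days: int) -> list[str]:
    """Remove files whose mtime is older than max_age_days; return deleted paths."""
    cutoff = time.time() - max_age_days * 86400
    deleted = []
    for root, _dirs, files in os.walk(web_dir):
        for name in files:
            path = os.path.join(root, name)
            if os.path.getmtime(path) < cutoff:
                os.remove(path)
                deleted.append(path)
    return deleted

# Example cron entry to run it daily at 03:00:
#   0 3 * * * /usr/bin/python3 /opt/prune.py
```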
Seems like a waste of bandwidth to me though
Ouch, that is indeed a big difference from what I thought it was!
They don't index usenet, do they?
VPSes come with tons of bandwidth nowadays. I never even managed to exceed 250 GB in a month, so 'why can't I hold all these TBs'?
Nice idea; sadly it's not very practical, as your usenet client cannot pull/use it.