
Help with spammers/bots

vladinc Member
edited March 2013 in Help

I got a dedicated server from OVH and have a few sites hosted there.
What I have noticed over the last few days is increased load on the server and a lot of registrations on the web sites from spammers.
I am looking for a way to block these, something like a dynamically updated list of IPs which are then blocked with iptables or CSF firewall rules.
I found this: http://www.zoobey.com/index.php/resources/all-articles-list/417-spam-reduction-using-spamhausorg-and-iptables which is exactly what I am after, but the Spamhaus web site seems to have been down for the last few days, so it's not really usable.
Any ideas how I can get something like this working?
Thanks a lot

Comments

  • jar Patron Provider, Top Host, Veteran

    My opinion: http://rules.emergingthreats.net/

    Gonna be a large rule list.

  • That's nice, but how do I implement this rule set in the firewall,
    or update the firewall / iptables with this ruleset?
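One way to wire such a list into iptables, as a sketch (not from the thread): the Emerging Threats block-IP lists are generally plain text with one IP or CIDR per line and `#` comments, so a small script can turn a downloaded copy into iptables commands.

```python
# Sketch: turn a plain-text block list into iptables DROP commands.
# Assumption: the list format is one IP or CIDR per line, '#' starts a comment.

def iptables_rules(blocklist_text, chain="INPUT"):
    rules = []
    for line in blocklist_text.splitlines():
        entry = line.split("#", 1)[0].strip()  # drop comments and blank lines
        if entry:
            rules.append(f"iptables -A {chain} -s {entry} -j DROP")
    return rules

sample = """# sample block list
192.0.2.1
198.51.100.0/24
"""
for rule in iptables_rules(sample):
    print(rule)
```

For a list as large as this one, appending one iptables rule per IP gets slow to load and to match; loading the entries into an ipset set and matching the whole set with a single iptables rule is the usual approach.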

  • DomainBop Member
    edited March 2013

    something like dynamic list of updated IP's which after that are blocked with IP tables or CSF firewall rules..

    I found this : http://www.zoobey.com/index.php/resources/all-articles-list/417-spam-reduction-using-spamhausorg-and-iptables which is exactly what I am after

    CSF has Spamhaus built in (SECTION: Global Lists/DYNDNS/Blacklists)

    If the sites are php sites you could also install Zbblock
    http://www.spambotsecurity.com/zbblock.php

  • One word: captcha

  • vladinc Member
    edited March 2013

    @BronzeByte - got it already :)
    @DomainBop - how to enable it in CSF?
    edit - found it :)

  • DomainBop Member
    edited March 2013

    @vladinc go to section "Global Lists/DYNDNS/Blacklists" (in /etc/csf/csf.conf) and change the value LF_SPAMHAUS to "86400" and LF_SPAMHAUS_EXTENDED to "1" (86400 updates the spamhaus list every 24 hours)

    I'd also suggest enabling the dshield and bogon lists (same section, LF_DSHIELD and LF_BOGON, change the values to "86400" to update lists every 24 hours)
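For reference, a sketch of how that section of /etc/csf/csf.conf might look after those changes (the option names and values are per the post above; the comments are mine, not from the stock file):

```
# SECTION: Global Lists/DYNDNS/Blacklists  (in /etc/csf/csf.conf)
LF_SPAMHAUS = "86400"           # refresh the Spamhaus list every 86400s (24h)
LF_SPAMHAUS_EXTENDED = "1"      # also use the extended Spamhaus list
LF_DSHIELD = "86400"            # DShield block list, refreshed every 24h
LF_BOGON = "86400"              # bogon (unallocated) ranges, refreshed every 24h
```

After editing, reload with `csf -r` so the new lists are fetched.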

  • Kris Member

    @vladinc said: @BronzeByte - got it already :)

    Use a captcha software like SolveMedia / flash powered ones that require more than a typed word if you want to avoid registration.

    There are too many easy-to-use, cheap external captcha typers for a simple old-school PHP-generated captcha to do anything.

  • Yes, @DomainBop - found it and enabled it.
    Let's see what happens in the next day or so.
    Could all these things slow down my server significantly?

  • @Kris said: Use a captcha software like SolveMedia / flash powered ones that require more than a typed word if you want to avoid registration.

    This.

    Try out different captcha systems to see what works best for you; be sure to use some form of analytics to make sure legitimate users trying to sign up aren't being scared away too.

    Are you using WordPress by any chance? If so, I may have a couple suggestions for you.

  • natestamm Member
    edited March 2013

    I like the more application-level solutions in this regard. Depending on your volume, a combination of measures will help. Major content-driven sites not only employ a multitude of these techniques but also evaluate users differently. I am really torn on RBLs and other published lists. I like what seem to me more progressive trends toward establishing sender IP reputation, but that's a little off track. At the end of the day I would want to give what may have been a compromised IP another chance, because that can happen to VM buyers too. With any list you're going to have to do more reconciliation yourself, at least using a solid script, or risk being unfair. Meh, there's always a but, you just have to look for it.

  • Awmusic12635 Member, Host Rep

    Cloudflare is always an option. Great at reducing spam

  • klikli Member
    edited March 2013

    @Kris said: flash powered one

    Go HTML5

  • Depending on your situation, BlockScript may be another solution for you. BlockScript is also equipped with a built-in API which can query the blacklists contained within the software.

  • Is BlockScript by the same owner as Glype?

  • erhwegesrgsr Member
    edited March 2013

    @BlockScript said: Depending on your situation, BlockScript may be another solution for you. BlockScript is also equipped with a built-in API which can query the blacklists contained within the software.

    BlockScript is really worthless software... blocks so much legit traffic and there are many free alternatives

  • BlockScript Member
    edited March 2013

    @BronzeByte said: BlockScript is really worthless software... blocks so much legit traffic and there are many free alternatives

    BlockScript is unique in that there is no other similar software or blacklists like it. If you aren't interested in blocking Tor, open proxies, VPNs, or bots then I would not suggest it for your site. I'm not interested in interrupting this thread, just making a comment.

  • @BlockScript said: Tor, open proxies, VPNs

    Just check if the originating ISP is a home provider, check proxy ports, and Tor has a list of exit nodes...

    @BlockScript said: bots

    Checking the user agent, I guess? Malicious bots will fake that, so I don't see why you wouldn't just use robots.txt, as the only legit bots are search engines.

  • BlockScript Member
    edited March 2013

    @BronzeByte said: Just check if the originating ISP is a home provider, check proxy ports, and Tor has a list of exit nodes...

    @BronzeByte said: Checking the user agent, I guess? Malicious bots will fake that, so I don't see why you wouldn't just use robots.txt, as the only legit bots are search engines.

    Let me clear up some confusion because there is more to it than that. Port scanning each visitor to your site to see if they are connecting from a suspected proxy port is not a good approach as a sole method to detect a proxy. There are many types of proxies so there has to be a rhyme and reason to what you are doing.

    A robots.txt file will not protect your site from bad bots, especially ones that masquerade as a legitimate bot. This is why accurate and complete blacklists are important.

    Note that in addition to the customizable features of the software, the following proprietary IP address blacklists are contained within it and are updated (at least) once per day:

    • Ranges of hosting providers and proxies.
    • Ranges of "bad bot" networks.
    • Ranges used by Opera proxies.
    • Open proxies.
    • PlanetLab CoDeen proxies.
    • Tor exit nodes.

    Each BlockScript installation comes equipped with a powerful suite of APIs to enable developers to integrate BlockScript with external systems. One of the APIs enables you to query BlockScript for information about an IP address. For each query, BlockScript will perform all relevant tests against the target IP address as per the preferences set in your BlockScript control panel. You can decide if you want to block the traffic, send it elsewhere, give an error, or whatever you want.

    The queries you send will be to your BlockScript installation. This makes queries extremely efficient and fast. A lot of free RBLs are unreliable (as pointed out by the OP), contain bad or wrong information (as anyone can see), and do not contain information that BlockScript's blacklists contain.

    I really do appreciate feedback and suggestions. If you are serious about identifying and/or blocking suspect traffic to your site, I would suggest trying BlockScript, or at least take a moment to learn about how it works.

  • the following proprietary IP address blacklists
    Ranges of hosting providers and proxies.
    Ranges of "bad bot" networks.
    Tor exit nodes.

    zbblock also blocks those... and it's free

  • @BlockScript said: A robots.txt file will not protect your site from bad bots, especially ones that masquerade as a legitimate bot. This is why accurate and complete blacklists are important.

    Actually, certain levels of PCI are harder on a viewable robots.txt, as it can give a potential attacker a window into your file structure. Needless to say I agree with you. I employ a method of reverse host checking, UA confirmation, and ultimately a hard exit for a request to robots.txt upon 'bad bot failure', which can simultaneously be logged and referenced at a later date. It does require some attention to permissions on the involved scripts or you can open a nasty hole, but done properly it will at minimum put a stop to unnecessarily growing access logs.
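The reverse host check plus forward confirmation described above can be sketched roughly like this (a hypothetical example, not the poster's actual script; the resolver functions are injectable so the logic can run offline, and the allowed suffixes would come from whichever crawler the user agent claims to be, e.g. Googlebot hosts live under googlebot.com):

```python
import socket

# Sketch of reverse host checking for a claimed crawler:
# 1. reverse-resolve the visitor IP to a hostname,
# 2. check the hostname ends with a domain the claimed crawler uses,
# 3. forward-resolve that hostname and confirm it maps back to the IP.

def verify_crawler(ip, allowed_suffixes,
                   reverse=lambda ip: socket.gethostbyaddr(ip)[0],
                   forward=socket.gethostbyname):
    try:
        host = reverse(ip)
    except OSError:
        return False
    if not any(host.endswith(sfx) for sfx in allowed_suffixes):
        return False
    try:
        return forward(host) == ip
    except OSError:
        return False

# Offline example with fake resolvers, for a bot claiming to be Googlebot:
fake_rev = lambda ip: "crawl-66-249-66-1.googlebot.com"
fake_fwd = lambda host: "66.249.66.1"
print(verify_crawler("66.249.66.1", (".googlebot.com",),
                     reverse=fake_rev, forward=fake_fwd))  # True
```

A bot faking its user agent from a random IP fails at step 2 or 3, which is the point of the forward confirmation: reverse DNS alone can be spoofed by whoever controls the IP's PTR record.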

  • @DomainBop said: go to section "Global Lists/DYNDNS/Blacklists" (in /etc/csf/csf.conf) and change the value LF_SPAMHAUS to "86400" and LF_SPAMHAUS_EXTENDED to "1" (86400 updates the spamhaus list every 24 hours)

    I'd also suggest enabling the dshield and bogon lists (same section, LF_DSHIELD and LF_BOGON, change the values to "86400" to update lists every 24 hours)

    Thanks, I need that too.

  • @Jarland,

    What are you importing those blocks into?
