Question for Web Server Gurus and Security Ninjas.

How do you prevent bots, crawlers, and spammers from sending endless requests to your web servers?

There have been numerous requests to routes like /admin, /ssh, /phpmyadmin, and all kinds of other stuff on the web server.

Is there a way to automatically block those stupid IPs :/

  • 10
    Fail2ban is simple and reasonably effective
  • 7
    With ipset you can create a list of IPs and IP blocks to ban (it works in combination with iptables).
    The idea is:
    1) get an incoming packet
    2) check if it comes from the list of banned ips; if yes, discard it
    3) otherwise, keep traversing the other iptables rules
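    The three steps above can be sketched with a few commands (run as root; the set names and addresses here are examples, and the addresses are from the documentation ranges):

```shell
# Create one set for single IPs and one for CIDR blocks
ipset create banned_ips hash:ip
ipset create banned_nets hash:net

# Populate the sets with the addresses you want to ban
ipset add banned_ips 203.0.113.7
ipset add banned_nets 198.51.100.0/24

# Step 2: drop packets whose source matches either set, before other rules
iptables -I INPUT -m set --match-set banned_ips src -j DROP
iptables -I INPUT -m set --match-set banned_nets src -j DROP
# Step 3: anything not matched keeps traversing the rest of the INPUT chain
```

    The advantage over plain iptables rules is that matching against a set is fast even with thousands of entries, and you can add or remove IPs without touching the rule set.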
  • 6
    Unix: fail2ban

    Windows: wail2ban (I’ve never used it but it’s out there)
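    A minimal fail2ban configuration sketch, assuming the stock `sshd` and `nginx-http-auth` jails that ship with fail2ban (paths, ports, and log locations can differ per distro):

```ini
# /etc/fail2ban/jail.local -- overrides jail.conf; values here are examples
[DEFAULT]
# Never ban your own management addresses (example address)
ignoreip = 127.0.0.1/8

[sshd]
enabled  = true
maxretry = 3      ; ban after 3 failures...
findtime = 600    ; ...within 10 minutes...
bantime  = 3600   ; ...for one hour

[nginx-http-auth]
enabled = true
```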
  • 8
    Two things I do. First, block the SSH port, the admin path, and any other paths you don't want crawled, like .git. You can allow only certain IPs through, so whitelist the IPs that need access to those things and block all others. I use nginx rules to block paths, and iptables to block ports and let only a whitelist of IPs through.

    In nginx it's something like

    location /admin/ {
        deny all;
    }

    Or you can block via HTTP auth too, so a password is needed to access it. But then that would just get hit constantly, I guess; I don't use this method myself, just mentioning it.
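    The HTTP-auth variant mentioned above, combined with an IP whitelist, looks roughly like this (the allowed address and htpasswd path are examples; the password file can be created with the `htpasswd` tool from apache2-utils):

```nginx
location /admin/ {
    allow 192.0.2.10;   # your own IP
    deny  all;
    auth_basic           "Restricted";
    auth_basic_user_file /etc/nginx/.htpasswd;
}
```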

    Second thing is I block user agents I don't want accessing the site, like the Baidu spider, SEMrush, etc., in the nginx config. Lastly, I use nginx throttling; I forget what it's called, but if there's a certain number of hits from the same IP in a certain amount of time, it will temporarily throttle that traffic. I'll have to look up what it's actually called. I also parse access logs and just outright block certain IP blocks with iptables if they are problematic.
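    The user-agent blocking can be done with a `map` in the `http {}` context; the bot names below are just examples of the kind mentioned above:

```nginx
# Set $blocked_agent to 1 for unwanted crawlers (case-insensitive regex)
map $http_user_agent $blocked_agent {
    default         0;
    ~*baiduspider   1;
    ~*semrushbot    1;
}

server {
    # ... other directives ...
    if ($blocked_agent) {
        return 403;
    }
}
```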

    For my personal servers I use iptables rules that block brute-force attempts.

    The Arch Wiki has a good article on how to implement a stateful firewall.
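    A minimal sketch of that kind of stateful firewall with a simple SSH brute-force limit, using the `conntrack` and `recent` iptables modules (run as root; the ports and thresholds are examples, so adapt before using, or you may lock yourself out):

```shell
iptables -P INPUT DROP
iptables -A INPUT -i lo -j ACCEPT
# Stateful part: allow replies to connections we initiated
iptables -A INPUT -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT
# Brute-force limit: drop a source making a 4th new SSH connection within 60s
iptables -A INPUT -p tcp --dport 22 -m conntrack --ctstate NEW \
         -m recent --name ssh --set
iptables -A INPUT -p tcp --dport 22 -m conntrack --ctstate NEW \
         -m recent --name ssh --update --seconds 60 --hitcount 4 -j DROP
iptables -A INPUT -p tcp --dport 22 -j ACCEPT
iptables -A INPUT -p tcp --dport 80 -j ACCEPT
iptables -A INPUT -p tcp --dport 443 -j ACCEPT
```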


    I've used fail2ban before with success, but it can also block legitimate traffic if not set up properly.
  • 4
    Here is the nginx rate limiting feature; almost no one knows about it or uses it, but it's awesome.
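    This is presumably nginx's `ngx_http_limit_req_module`; a sketch, where the zone name, size, and rates are example values:

```nginx
# In the http {} context: track each client IP, allow 10 requests/second
limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

server {
    location / {
        # Allow short bursts of up to 20 requests; excess requests are
        # rejected (HTTP 503 by default, configurable via limit_req_status)
        limit_req zone=perip burst=20 nodelay;
    }
}
```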

  • 2
    As others have mentioned, fail2ban, and lock down certain pages to specific addresses.
  • 1
    Ok, probably the most noob question you have ever seen regarding this topic.

    The devs said they have set up everything and are also using Cloudflare.

    So why do I still see those requests in the logs?
  • 1
    @cursee cloudflare enterprise?
  • 3
    Nginx rate limiting and CSF?
  • 1

    Are they 200 responses or 503/403 responses? If they're still 200, it's not locked down. As far as ports are concerned, if they're getting hit at all from non-authorized IPs, they're definitely not locked down.
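    A quick way to check which status codes those requests are getting is to summarize the status field of an access log in the common/combined format. A sketch, with a fabricated log excerpt in a here-doc standing in for your real access.log:

```shell
# Field 9 of a combined-format log line is the HTTP status code.
# Prints one line per status code with its count, e.g. "403 2".
awk '{ counts[$9]++ } END { for (s in counts) print s, counts[s] }' <<'EOF'
203.0.113.7 - - [01/Jan/2024:00:00:01 +0000] "GET /admin HTTP/1.1" 403 153 "-" "-"
203.0.113.7 - - [01/Jan/2024:00:00:02 +0000] "GET /phpmyadmin HTTP/1.1" 403 153 "-" "-"
198.51.100.9 - - [01/Jan/2024:00:00:03 +0000] "GET / HTTP/1.1" 200 612 "-" "-"
EOF
```

    Against a real log, replace the here-doc with the path to access.log.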

    Do a port scan with nmap from a VPN IP, or from some other IP outside the ones you usually use that are whitelisted.
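    A typical invocation for that kind of check (the hostname is a placeholder; only scan hosts you are authorized to test):

```shell
# -Pn skips host discovery; -p- scans all 65535 TCP ports (slow but thorough)
nmap -Pn -p- example.com
```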

    As far as people using Cloudflare, I think CDNs are just a waste of money and do absolutely nothing. It also depends how much control is given to Cloudflare; if it's just used as a CDN, then it won't really block or throttle any traffic. I've never once in my career seen a CDN or Cloudflare make any difference to a site's performance or do anything useful at all.
  • 1
    ipset, Fail2ban, Imunify360, CSF/LFD
  • 0
    Thanks guys. I have advised them to learn about fail2ban and implement it.

    Let's see 😁

    I'll share update here if there is any.

    Personally I just use someone's services for my projects if it's something I'm not strong at. This advice is for a friend.