How to block bad bots, crawlers & scrapers using a list file

Hi Jurgenhaas,
Thank you for your response.
The backend consists of three separate apache2 servers, all serving the same content. The only way into the environment is via HAProxy. I have tried your change and it seems to be working now. Thank you very much.

I think the difference is that I had `hdr_reg(User-Agent)` where you have `hdr_sub(user-agent)`. Since HTTP header names are case-insensitive, the casing of "User-Agent" shouldn't matter; the real change must be the matcher itself: `hdr_reg` treats the list entries as regular expressions, while `hdr_sub` does a plain substring match. That would also explain why `haproxy -f /etc/haproxy/haproxy.cfg -c` didn't report any errors: both forms are syntactically valid, so the config check can't catch a matching problem that only shows up at runtime. So I would assume that `hdr_sub` is the one that helped.
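For anyone else following this thread, here is roughly what the working setup looks like for me now. This is a minimal sketch; the frontend name, backend name, and list file path are placeholders for my actual config:

```
frontend http-in
    bind *:80
    # badbot is true if any line of the list file appears as a
    # substring of the User-Agent header; -i makes the match
    # case-insensitive, -f loads one pattern per line from the file
    acl badbot hdr_sub(user-agent) -i -f /etc/haproxy/badbots.lst
    http-request deny if badbot
    default_backend apache_servers
```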

Do you know of any way I could stop new, as-yet-unlisted bots from crawling our site? I find they add load to the overall environment, if you know what I mean.
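One thing I was wondering about myself: would a per-IP rate limit via a stick-table be a reasonable way to catch bots that aren't in the list yet? Something like this rough sketch, where the window and threshold are just guesses on my part and untested:

```
frontend http-in
    bind *:80
    # track each client IP and its request rate over a 10s window
    stick-table type ip size 100k expire 10m store http_req_rate(10s)
    http-request track-sc0 src
    # reject clients making more than 100 requests per 10 seconds
    http-request deny deny_status 429 if { sc_http_req_rate(0) gt 100 }
    default_backend apache_servers
```

I'd be curious whether that is sensible in practice or whether it risks catching legitimate users behind shared IPs.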

Many thanks