Does anyone have a suggestion on how to block cloud crawlers/bots? Obviously I'd
like search engine bots to have access, but all the other crap I want to
lose. Only 'real users'.
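
For the "let the search engines through" part, I know Google and Bing both
document reverse-plus-forward DNS verification for their crawlers: a genuine
Googlebot IP reverse-resolves to a hostname under googlebot.com or google.com,
and the forward lookup of that hostname returns the same IP. So I could do
something like this minimal sketch in Python (the hostname suffixes are the
documented ones; the rest is illustrative):

    import socket

    # Documented rDNS suffixes for Googlebot and Bingbot.
    VERIFIED_SUFFIXES = (".googlebot.com", ".google.com", ".search.msn.com")

    def is_search_engine_bot(ip: str) -> bool:
        """Reverse-resolve the IP, check the suffix, then confirm
        the forward lookup maps back to the same IP."""
        try:
            host, _, _ = socket.gethostbyaddr(ip)
        except socket.herror:
            return False
        if not host.endswith(VERIFIED_SUFFIXES):
            return False
        try:
            forward_ips = socket.gethostbyname_ex(host)[2]
        except socket.gaierror:
            return False
        return ip in forward_ips

Anything that fails that check could then fall through to the cloud-range
blocking below.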

What is best practice for this? Just grabbing the Amazon, Google Cloud
(googleusercontent), DigitalOcean, and Azure IP ranges and putting them in
something like ipset, or are there currently better ways of doing this?
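
For the ipset route, I'm thinking of something along these lines. A sketch
assuming AWS's published ip-ranges.json feed (the URL below is Amazon's
documented one); the set name "cloudblock" and the iptables rule are
illustrative:

    #!/usr/bin/env python3
    # Fetch AWS's published IPv4 ranges and load them into an ipset,
    # then drop traffic sourced from that set.
    import json
    import subprocess
    import urllib.request

    AWS_RANGES_URL = "https://ip-ranges.amazonaws.com/ip-ranges.json"
    SET_NAME = "cloudblock"  # hypothetical set name

    with urllib.request.urlopen(AWS_RANGES_URL) as resp:
        data = json.load(resp)
    cidrs = {p["ip_prefix"] for p in data["prefixes"]}

    # hash:net holds CIDR blocks; -exist makes re-runs idempotent.
    subprocess.run(["ipset", "create", SET_NAME, "hash:net", "-exist"],
                   check=True)
    for cidr in sorted(cidrs):
        subprocess.run(["ipset", "add", SET_NAME, cidr, "-exist"],
                       check=True)

    # Drop anything sourced from the set. In practice, check whether
    # the rule already exists before inserting it again.
    subprocess.run(["iptables", "-I", "INPUT", "-m", "set",
                    "--match-set", SET_NAME, "src", "-j", "DROP"],
                   check=True)

Google Cloud publishes a similar feed (cloud.json) and DigitalOcean a CSV,
while Azure's ranges come as a downloadable JSON, so each provider would need
its own small fetch-and-parse step, but the ipset loading would be the same.
Is this workable, or is maintaining these lists by hand a losing game?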

