How to filter out fake traffic on a website?
Bots are coming to the site that imitate clicks on links, browse pages, and so on.
Their IP addresses vary, their User-Agent strings are indistinguishable from ordinary users', and they come from different countries too.
The only things that give them away are zero activity on the site and their arrival pattern: all of them, roughly 15 bots, show up within the same one-hour window.
The bots can execute JavaScript, because Google Analytics counts them as normal users. I tried turning on its spam-bot filtering, but it had no effect.
Banning by IP address, by country, or by browser name is not an option. How else can fake bot traffic be detected and kept off the site?
Define precisely what "zero activity" means; that definition is your filter condition.
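For example, "zero activity" can be made measurable with a small client-side beacon: real visitors produce at least one interaction event, while bots that merely load pages usually do not. A minimal TypeScript sketch, assuming a hypothetical /activity-beacon collection endpoint (the endpoint name is an illustration, not part of the question):

```typescript
// Runs in the browser. Sessions that never hit the (hypothetical)
// /activity-beacon endpoint are "zero activity" candidates.
function reportActivityOnce(): void {
  let reported = false;
  const report = (): void => {
    if (reported) return;
    reported = true;
    // sendBeacon is fire-and-forget and survives page unloads.
    navigator.sendBeacon(
      "/activity-beacon",
      JSON.stringify({ path: location.pathname, ts: Date.now() })
    );
  };
  // Any of these events is a reasonable proxy for a real visitor.
  for (const ev of ["mousemove", "scroll", "keydown", "touchstart"]) {
    window.addEventListener(ev, report, { once: true, passive: true });
  }
}

reportActivityOnce();
```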
P.S. There is no general solution to this problem; it is the eternal arms race between shield and sword.
They almost certainly draw on a single shared pool of IP addresses (nobody has an infinite supply), so sooner or later you can collect all of the addresses and ban them by IP.
But first you have to build that list by identifying what these visitors have in common, and the more shared parameters you collect, the better.
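A minimal log-analysis sketch of that idea in TypeScript (Node.js), assuming a simplified tab-separated access log of "ip, ISO timestamp, path" lines and the /activity-beacon convention from the snippet above; the log format, file name, and burst threshold are all illustrative assumptions:

```typescript
import { readFileSync } from "node:fs";

interface Visit { ip: string; ts: number; active: boolean }

// Parse a simplified access log: one "ip<TAB>ISO-timestamp<TAB>path" per line.
const visits: Visit[] = readFileSync("access.log", "utf8")
  .split("\n")
  .filter(Boolean)
  .map((line) => {
    const [ip, ts, path] = line.split("\t");
    return { ip, ts: Date.parse(ts), active: path === "/activity-beacon" };
  });

// Group visits by IP.
const byIp = new Map<string, Visit[]>();
for (const v of visits) {
  const arr = byIp.get(v.ip);
  if (arr) arr.push(v); else byIp.set(v.ip, [v]);
}

// An IP is suspicious if it never triggered the activity beacon.
const suspicious = [...byIp.entries()]
  .filter(([, vs]) => vs.every((v) => !v.active))
  .map(([ip, vs]) => ({ ip, first: Math.min(...vs.map((v) => v.ts)) }));

// The bots arrive in a burst, so only keep IPs whose first visit shares a
// one-hour window with at least 10 suspicious first visits (itself included).
const HOUR = 3_600_000;
const blocklist = suspicious.filter(
  (s) => suspicious.filter((o) => Math.abs(o.first - s.first) <= HOUR).length >= 10
);

console.log(blocklist.map((b) => b.ip).join("\n"));
```

Re-running this over fresh logs gradually grows the blocklist as new addresses from the pool show up.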
The answer is simple, as always:
1. Maintain a whitelist of known search-engine bot IPs.
2. For everyone else, verify through JS that a captcha has been solved (your own home-made captcha, not reCAPTCHA!) on the first internal navigation within a session or on a guest's first POST request. A sketch of such a gate follows this list.
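Here is one way that gate could look as Express middleware; the SEARCH_BOT_IPS list, the /captcha page, and the session setup are placeholders, not a ready implementation (in production, crawler IPs should be verified, e.g. via reverse DNS, rather than hard-coded):

```typescript
import express from "express";
import session from "express-session";

// Placeholder whitelist; real search-bot IPs must be collected and verified.
const SEARCH_BOT_IPS = new Set(["66.249.64.1"]);

const app = express();
app.use(session({ secret: "change-me", resave: false, saveUninitialized: true }));

app.use((req, res, next) => {
  // 1. Whitelisted search bots pass straight through.
  if (SEARCH_BOT_IPS.has(req.ip ?? "")) return next();

  const s = req.session as session.Session & {
    captchaPassed?: boolean;
    pageViews?: number;
  };
  if (s.captchaPassed) return next(); // already solved the captcha

  s.pageViews = (s.pageViews ?? 0) + 1;
  // 2. Challenge on the first internal navigation within the session
  //    or on a guest's first POST request.
  if (s.pageViews > 1 || req.method === "POST") {
    return res.redirect("/captcha?next=" + encodeURIComponent(req.originalUrl));
  }
  next();
});
```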
I also recommend the filter from my blog (the Toster administration does not allow posting the link), so search Google for: "Universal STOP HOTLINKING for .htaccess".