How do I filter out bots for analytics?
Hello!
There is a script that, whenever someone follows the link to it, records their phone model and IP address.
After I upload the script to my site, the text file where this data is collected fills up with lots of system IPs, the same Googlebot that follows the links, as I understand it, to index the site. These search-engine bots do this constantly, every day. They also access my site through phone emulators.
I created a website and uploaded the data-collection script. I never visit it myself, yet when I open the file after a while I see data that supposedly came from a desktop computer, a Samsung phone, an iPhone. Perhaps search engines open my site from different devices to check how it loads on each of them.
As a result, I can't tell which visits come from a real person and which from search-engine bots; I would have to check every IP to see whether it belongs to a search engine.
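That per-IP check can be scripted with forward-confirmed reverse DNS, which is the method the major search engines themselves document for verifying their crawlers: resolve the IP to a hostname, check that the hostname ends in the engine's domain, then resolve that hostname back and confirm it returns the same IP. A minimal Python sketch, assuming the standard socket module is enough for the traffic volume (the suffix list and the sample IPs are illustrative only):

```python
import socket

# Hostname suffixes used by the big crawlers (illustrative list, extend as needed).
CRAWLER_SUFFIXES = (
    ".googlebot.com", ".google.com",          # Googlebot
    ".search.msn.com",                        # Bingbot
    ".yandex.ru", ".yandex.net", ".yandex.com",  # YandexBot
)

def is_verified_crawler(ip: str) -> bool:
    """Forward-confirmed reverse DNS: IP -> hostname -> IP must round-trip."""
    try:
        hostname = socket.gethostbyaddr(ip)[0]          # reverse lookup (PTR record)
    except OSError:
        return False                                    # no PTR record at all
    if not hostname.endswith(CRAWLER_SUFFIXES):
        return False                                    # not a known crawler domain
    try:
        forward_ips = socket.gethostbyname_ex(hostname)[2]  # forward lookup
    except OSError:
        return False
    return ip in forward_ips                            # confirm the PTR was not spoofed

# Example: split a list of client IPs into "robot" and "user" buckets.
ips = ["66.249.66.1", "203.0.113.7"]                    # sample values only
for ip in ips:
    print(ip, "robot" if is_verified_crawler(ip) else "user (or unverified bot)")
```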
For example, if I need this data for business statistics, it turns out that half of the clicks on my site are made by bots, not people.
Is there a script that helps filter all this garbage into "robot" and "user"?
I built my own. I look exclusively at the server-side access log, with no JS analytics on the pages at all.
Timings and request patterns: if the page and the resources it references are loaded as usual, it is more likely a real browser; a page fetched without its resources ever being loaded is 100% a bot.
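A rough sketch of that heuristic over a standard combined-format access log (the log path, the regexes and the "no static assets = bot" rule are simplified assumptions, not the actual implementation):

```python
import re
from collections import defaultdict

# Combined log format: IP ... [date] "METHOD /path HTTP/x.x" status ...
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) ([^ "]+)[^"]*" \d+')
ASSET_RE = re.compile(r'\.(?:css|js|png|jpe?g|gif|svg|woff2?|ico)(?:\?|$)', re.I)

pages, assets = defaultdict(int), defaultdict(int)

with open("access.log", encoding="utf-8", errors="replace") as log:  # path is an assumption
    for line in log:
        m = LINE_RE.match(line)
        if not m:
            continue
        ip, path = m.groups()
        if ASSET_RE.search(path):
            assets[ip] += 1   # CSS/JS/images: what a real browser fetches after the page
        else:
            pages[ip] += 1    # HTML pages (anything that is not a static asset)

for ip in pages:
    if assets[ip] == 0:
        print(ip, "-> looks like a bot: pages requested, resources never loaded")
    else:
        print(ip, "-> looks like a browser:", pages[ip], "pages,", assets[ip], "resources")
```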
On top of that there is a lot more: headers, IP + rDNS, and historically accumulated knowledge ("weights") fed into real-time analysis of the whole metric...
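The "weights" idea can start out as a simple scored checklist; the signals and numbers below are invented for illustration and would be tuned from accumulated statistics in a real setup:

```python
# Toy scoring sketch: each suspicious signal adds weight; thresholds and
# weights here are invented examples, not measured values.
def bot_score(visit: dict) -> float:
    if visit.get("verified_crawler"):          # forward-confirmed rDNS said "search engine"
        return 1.0                             # no need to weigh anything else
    score = 0.0
    if not visit.get("loaded_resources"):      # page fetched, CSS/JS/images never requested
        score += 0.6
    if not visit.get("accept_language"):       # real browsers almost always send this header
        score += 0.2
    if "python-requests" in visit.get("user_agent", "").lower():
        score += 0.3                           # a library user agent, not a browser
    return min(score, 1.0)

visit = {
    "verified_crawler": False,
    "loaded_resources": False,
    "accept_language": "",
    "user_agent": "python-requests/2.31",
}
label = "robot" if bot_score(visit) >= 0.5 else "user"
print(label, bot_score(visit))
```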