How to protect the registration page in MediaWiki from robots?
I run a small wiki project on the MediaWiki engine. For some time it has been exceeding its CPU-time quota, and the hoster is threatening to cut off access. Judging by the logs, the cause is bots from many different IPs hammering the registration page. Most of the traffic goes to /index.php?title=Special:UserLogin&action=submitlogin&type=signup&returnto=Title page. For example, a single IP made 339 such requests in one day, and requests to Special:UserLogin accounted for 88% of all requests.
How are such problems usually dealt with?
There is a ReCaptcha on the registration page, but what good does it do? The cost of generating the captcha is offloaded to a third-party server, yet that does nothing to stop a bot from hitting my page over and over and consuming my resources.
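One common way such problems are handled is to throttle the offending URL at the web-server level, so repeated submissions are rejected before they ever reach PHP. Below is a minimal nginx sketch using limit_req, keyed on the title query parameter seen in the requests above; the zone name, rate, and PHP-FPM socket path are assumptions to adapt to your setup:

# Sketch only: zone size, rate, and the PHP-FPM socket path are assumptions.

# Key on the client IP only when the request targets Special:UserLogin;
# nginx does not rate-limit requests whose key evaluates to an empty string.
map $arg_title $signup_key {
    default               "";
    "Special:UserLogin"   $binary_remote_addr;
}

# Allow at most 2 login/signup submissions per minute per IP.
limit_req_zone $signup_key zone=signup:10m rate=2r/m;

server {
    # ... existing server configuration ...

    location = /index.php {
        limit_req zone=signup burst=3 nodelay;
        limit_req_status 429;

        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/run/php/php-fpm.sock;  # assumption: your PHP-FPM socket
    }
}

With this in place, excess signup submissions are answered with a cheap 429 by nginx instead of spinning up MediaWiki's PHP stack on every hit.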
I ran into the same problem when deploying a wiki for a small group of developers. I solved it by adding an authorization layer in front of the wiki at the nginx level, and the load on the server dropped sharply. Would that option work for you?
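For reference, a minimal sketch of what that kind of nginx-level authorization can look like, using HTTP Basic Auth so unauthenticated clients are turned away before MediaWiki runs at all; the host name, paths, and PHP-FPM socket are placeholders, not values from the answer above:

# Sketch only: server_name, root, and socket path are assumptions.
server {
    listen 80;
    server_name wiki.example.com;
    root /var/www/mediawiki;

    # Reject anonymous clients with 401 before any PHP executes.
    auth_basic           "Private wiki";
    auth_basic_user_file /etc/nginx/.htpasswd;

    location / {
        try_files $uri $uri/ /index.php?$query_string;
    }

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/run/php/php-fpm.sock;
    }
}

The credentials file can be created with htpasswd -c /etc/nginx/.htpasswd someuser (from apache2-utils). Note that this locks down the entire wiki, which suits a private developer group; a public wiki would instead scope auth_basic to just the Special:UserLogin location.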