A fast parser of server response headers for 1 million sites — is PHP the right choice?
Good day. I'm going to build a simple but fast parser that fetches the server response headers and the main pages of roughly 1 million sites. So far the choice has fallen on PHP, since it has the convenient cURL extension (for fetching the main pages). What worries me is that a PHP script quickly runs into the execution-time limits imposed when it is served through nginx/PHP-FPM. The parsing would have to be split into several iterations, which puts extra load on the database. And PHP is not really designed for such long-running tasks in the first place.
Ideally the parse should run every day. Do you think it's worth trying something else, or is PHP the right choice?
If speed is the priority, I would choose Go: tasks like this are easy to make asynchronous and to parallelize there.
Node.js would be my second choice, paired with something like µWebSockets.
Look into asynchronous cURL in PHP, and if possible run the script from the console. When launched via the CLI there is no execution-time limit, so that load problem goes away; I did this myself on a very weak VDS.