How do I structure code when one of the functions in it does not execute immediately?
Let's say the task is: fetch a page, strip the tags, return the result.
Here is how it's done the usual, synchronous way:
function get_url_without_tags($url) {
    $content = download_page($url);
    $content_without_tags = remove_tags($content);
    return $content_without_tags;
}

And here is how I would like to do it, with the download deferred to a task queue:

function get_url_without_tags($url) {
    task::add('download_page', $url, function ($content) {
        $content_without_tags = remove_tags($content);
        return $content_without_tags;
    });
}
Inside the anonymous function there is no access to the outer function's parameters (in this case, to $url), so you have to pull it in with use:
function get_url_without_tags($url) {
    task::add('download_page', $url, function ($content) use ($url) {
        $content_without_tags = remove_tags($content);
        return $content_without_tags;
    });
}
Make a second queue for parsing the results and add tasks to it from the download handler.
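A minimal sketch of that two-queue idea. Only task::add() appears in the question; task::handler() and save_result() are hypothetical names used here for illustration:

// Queue 1: downloading. The handler does not parse the page itself;
// when the fetch is done it only enqueues a task in the second queue.
task::handler('download_page', function ($url) {
    $content = download_page($url);       // blocking fetch inside the worker
    task::add('parse_result', $content);  // hand the result over to queue 2
});

// Queue 2: parsing. Its workers run independently of the download workers.
task::handler('parse_result', function ($content) {
    $clean = remove_tags($content);
    save_result($clean);                  // hypothetical: store the result wherever the caller will look for it
});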
Or do this:
- the download_page($url) function adds the URL to a download queue and receives some ID for that URL in the queue;
- the queue also stores the download status of each URL;
- then, in a loop, check the status by ID: while the status is "waiting", keep waiting; once it is "completed", run the rest of the code (see the sketch after this list).
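A rough sketch of that polling loop; the queue class and its add()/status()/result() methods are hypothetical names, not part of the answer above:

// The download is enqueued once; the loop then polls its status by ID.
function get_url_without_tags($url) {
    $id = queue::add('download_page', $url);   // enqueue the download, get its ID back

    while (queue::status($id) === 'waiting') { // status is "waiting": keep waiting
        sleep(1);
    }

    $content = queue::result($id);             // status is "completed": run the rest of the code
    return remove_tags($content);
}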
You need to use a queue, but you can't pass a closure into it, so the task has to be a wrapper (a named handler) plus a serializable array of arguments, so that the task worker can restore it:
Task::add('task_name', ['url' => $url, /* ... */]);
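For illustration, a sketch of what the worker side of such a wrapper might look like; the handler registry and Task::run() are assumptions, only Task::add() with an array payload comes from the answer above:

// Hypothetical worker side: the payload is a plain array, so it can be
// serialized into the queue and restored here without any closures.
class Task {
    private static $handlers = [];

    public static function register($name, callable $handler) {
        self::$handlers[$name] = $handler;
    }

    // Called by the queue worker with the stored name and payload.
    public static function run($name, array $payload) {
        return call_user_func(self::$handlers[$name], $payload);
    }
}

// The wrapper for this concrete task: everything it needs is in the array.
Task::register('task_name', function (array $payload) {
    $content = download_page($payload['url']);
    return remove_tags($content);
});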
You've already been given an answer, but I have actually built exactly this kind of thing: download a URL, process it, and show the data in the user's account. It was a while ago, so I can't give specific code right now, but the essence is this:
The script that receives the parameters (a rough sketch follows this list):
1. inserts all the data into MySQL with the status "waiting" and gets an id;
2. locks the .lock file (PHP waits if it is currently locked);
3. opens the urls file and appends ID:URL to it (the ID received from the database);
4. releases the lock on the file;
5. returns "ID: waiting" to the frontend;
6. finishes its work.
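What that receiving script might look like; the table, column and file names here are made up for illustration:

// 1. Insert the task with status "waiting" and get its id.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare("INSERT INTO tasks (url, status) VALUES (?, 'waiting')");
$stmt->execute([$_POST['url']]);
$id = $pdo->lastInsertId();

// 2-4. Lock the .lock file, append "ID:URL" to the urls file, release the lock.
$lock = fopen('/var/spool/myapp/.lock', 'c');
flock($lock, LOCK_EX);                       // PHP blocks here if another process holds the lock
file_put_contents('/var/spool/myapp/urls', "$id:{$_POST['url']}\n", FILE_APPEND);
flock($lock, LOCK_UN);
fclose($lock);

// 5-6. Report the id and status to the frontend and finish.
echo json_encode(['id' => $id, 'status' => 'waiting']);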
A cron job runs a bash script every 3 seconds that:
1. locks the .lock file;
2. reads the urls file, cuts the first 10 URLs from it, and writes the rest back;
3. releases the lock;
4. downloads each URL with lynx and dumps it straight to disk (lynx has a flag for that, although you could also use wget, depending on your task).
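The answer describes this stage as a bash script; for consistency with the rest of the page, here is roughly the same logic sketched in PHP (the paths are made up, the batch size of 10 comes from the description):

// 1-3. Take the lock, cut 10 urls off the top of the file, put the rest back, unlock.
$lock = fopen('/var/spool/myapp/.lock', 'c');
flock($lock, LOCK_EX);

$lines = @file('/var/spool/myapp/urls', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) ?: [];
$batch = array_slice($lines, 0, 10);
file_put_contents('/var/spool/myapp/urls', implode("\n", array_slice($lines, 10)) . "\n");

flock($lock, LOCK_UN);
fclose($lock);

// 4. Download each url and dump it to disk; the ID becomes the file name.
foreach ($batch as $line) {
    [$id, $url] = explode(':', $line, 2);
    file_put_contents("/var/spool/myapp/pages/$id.html", file_get_contents($url));
}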
A cron job runs a PHP script every 10 seconds that (sketched below):
1. opens the folder with the downloaded pages and gets its listing;
2. selects all tasks whose status is "waiting";
3. marks 30 of them with the status "processing";
4. processes those 30 files; for each one, the ID (taken from the file name) gets the status "done" in MySQL, and all the data obtained during processing is inserted there as well;
5. finishes, deleting all processed files.
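A sketch of that processing stage under the same assumptions as above (made-up paths, table and column names; remove_tags() stands in for whatever the processing actually is):

// 1-3. List the downloaded pages, take up to 30, mark them "processing".
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$batch = array_slice(glob('/var/spool/myapp/pages/*.html'), 0, 30);
foreach ($batch as $file) {
    $id = (int) basename($file, '.html');              // the ID is the file name
    $pdo->prepare("UPDATE tasks SET status = 'processing' WHERE id = ? AND status = 'waiting'")
        ->execute([$id]);
}

// 4-5. Process each file, store the result with status "done", delete the file.
foreach ($batch as $file) {
    $id = (int) basename($file, '.html');
    $clean = remove_tags(file_get_contents($file));
    $pdo->prepare("UPDATE tasks SET status = 'done', result = ? WHERE id = ?")
        ->execute([$clean, $id]);
    unlink($file);
}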
An Ajax script on the frontend periodically checks whether the queued tasks are ready.
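A minimal sketch of the status endpoint such an Ajax poll could call; the table and column names are the same assumptions as above:

// status.php: report the state of one task by its id.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare("SELECT status, result FROM tasks WHERE id = ?");
$stmt->execute([(int) $_GET['id']]);
$row = $stmt->fetch(PDO::FETCH_ASSOC) ?: ['status' => 'unknown', 'result' => null];

header('Content-Type: application/json');
// Only include the result once the task is done.
echo json_encode($row['status'] === 'done'
    ? ['status' => 'done', 'result' => $row['result']]
    : ['status' => $row['status']]);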