Algorithm for reading a large number of files in php
Good afternoon!
I can't figure out how to build an algorithm for creating a backup. There are a large number of files and folders; how do I first collect them all, for example into a file, so that I can then add them to a ZIP archive bit by bit? The problem is that the time allotted for executing a PHP script may not be enough, so the work needs to be broken into steps.
At the very least, a link to material that sheds light on this problem, or a skeleton of a script, would be appreciated.
Thanks in advance!
Another option is possible:
1. From the browser we send an AJAX request in the background to start the archiving process (without waiting for the response). In the PHP script on the server we call set_time_limit(0) so that the script does not die by timeout; the script archives the files, periodically writes progress to the session (how much is packed, how much is left, what is being packed now), and also reads a value from the session that can tell it that the archiving process must stop.
2. Periodically make an AJAX request from the browser, read the progress values from the session on the server, and send them to the browser.
3. If the process needs to be interrupted, send a request to the server via AJAX; on the server, set a value in the session that tells the archiving script to stop.
This is convenient because even if you close the browser the process will not stop, and on a repeated visit you can find out what is currently being archived and get the status.
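The scheme above can be sketched in a single server-side script. This is only a sketch: the session keys ('progress', 'stop'), the file paths, and the use of a throwaway directory (so the example runs as-is) are all assumptions, not part of the original answer.

```php
<?php
// Sketch of the background "archiver" endpoint from steps 1-3 above.
// Session keys and paths are illustrative assumptions.
session_start();
set_time_limit(0);          // step 1: the script must not die by timeout
ignore_user_abort(true);    // keep running even if the browser disconnects

// In a real script this would be the folder to back up; here we create
// a throwaway file so the sketch is self-contained and runnable.
$dir = sys_get_temp_dir() . '/backup_demo_' . uniqid();
mkdir($dir);
file_put_contents("$dir/a.txt", 'hello');
$files = glob("$dir/*");

$_SESSION['progress'] = ['total' => count($files), 'done' => 0, 'current' => ''];
$_SESSION['stop'] = false;
session_write_close();      // release the session lock so AJAX polls are not blocked

$zip = new ZipArchive();
$zip->open("$dir/backup.zip", ZipArchive::CREATE | ZipArchive::OVERWRITE);

foreach ($files as $file) {
    session_start();        // re-open briefly: read the stop flag, write progress
    if (!empty($_SESSION['stop'])) { session_write_close(); break; }
    $_SESSION['progress']['current'] = $file;
    $_SESSION['progress']['done']++;
    session_write_close();

    $zip->addFile($file, basename($file));
}
$zip->close();
echo "packed {$_SESSION['progress']['done']} of {$_SESSION['progress']['total']}\n";
```

One subtlety worth noting: PHP locks the session file for the duration of a request, so the long-running worker must call session_write_close() between iterations, or the polling AJAX requests from step 2 would block until the archiver finishes.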
There is no getting around it: PHP was created to die!
There is such a thing as queues: you add a task like "collect such-and-such files into an archive", and then every 5 seconds, for example, you check whether it is ready or not.
There is also an option with sockets and long polling, but I think that is overkill for this task.
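The queue idea can be sketched with a plain status file standing in for a real queue backend; the file names and JSON shape here are assumptions for illustration only.

```php
<?php
// Minimal file-based stand-in for a job queue: the web request only
// enqueues "pack these files", a worker (cron/CLI) does the packing,
// and the browser polls the status every few seconds.
$job = sys_get_temp_dir() . '/backup_job_' . uniqid() . '.json';

// 1. Enqueue (called from the web request):
file_put_contents($job, json_encode(['status' => 'queued', 'files' => ['a.txt']]));

// 2. Worker picks the task up (normally a separate CLI process):
$task = json_decode(file_get_contents($job), true);
// ...archive $task['files'] here...
$task['status'] = 'done';
file_put_contents($job, json_encode($task));

// 3. Status endpoint the browser polls every ~5 seconds:
echo json_decode(file_get_contents($job), true)['status'], "\n";  // done
```

In production this role is usually played by a real queue (a database table, Redis, RabbitMQ), but the division of labor stays the same.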
There is no need to try to implement everything in PHP.
You need to know about the existence of specialized tools.
In particular, if you need to archive a directory with files, then this is done with the command
tar -czf archive.tgz /path/to/catalog
This command is added to the cron.
And there is no need to reinvent anything in PHP.
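To make the tar-plus-cron suggestion concrete, here is a sketch; it packs a throwaway directory so it can be run as-is, and the backup paths in the crontab line are examples, not prescriptions.

```shell
# Self-contained demonstration: archive a throwaway directory.
# In production, point the command at the real folder and backup path.
SRC=$(mktemp -d)
echo "data" > "$SRC/file.txt"
DEST="$SRC/archive.tgz"
tar -czf "$DEST" -C "$SRC" file.txt
tar -tzf "$DEST"            # lists the archive contents: file.txt

# Corresponding crontab entry (nightly at 03:00; paths are examples).
# Note that % is special in crontab and must be escaped as \%:
# 0 3 * * * tar -czf /backups/site-$(date +\%F).tgz /path/to/catalog
```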
1. From the browser, via AJAX, we request the list of files to be packed. Meanwhile, on the server we prepare it and set a marker that the process has started (a semaphore, for example a lock file, which other scripts must check so that they do not change the directory and file structure, and so that the archiving process is not started again while one is already running).
2. If a message comes from the server that the process is running, display a message in the browser.
3. If we received a list of files, iterate over it and send an AJAX request to add each file to the archive, one by one; meanwhile, progress can be displayed in the browser (how much is packed, how much is left, what is being packed now).
4. When all the files have been processed, we send an AJAX request telling the server that we have finished; the server removes the semaphore and sends back the URL for downloading the archive (if needed).
The only problem arises if the page is closed in the browser: the semaphore will not be removed and the script will refuse to run again. As a solution, you can check when the lock was set and ignore it after some time, or add a forced-start button.
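The lock file with the suggested staleness check can be sketched as follows; the path, the MAX_AGE value, and the function names are assumptions.

```php
<?php
// Sketch of the lock-file "semaphore" with a staleness check: a lock
// older than MAX_AGE seconds is assumed abandoned (browser closed).
const LOCK_FILE = '/tmp/backup.lock';
const MAX_AGE   = 3600;   // after an hour, assume the owner never cleaned up

function acquire_lock(): bool {
    clearstatcache();
    if (file_exists(LOCK_FILE)) {
        if (time() - filemtime(LOCK_FILE) < MAX_AGE) {
            return false;              // a packing process is (probably) running
        }
        unlink(LOCK_FILE);             // stale lock: ignore and take over
    }
    return touch(LOCK_FILE);           // mark "process started"
}

function release_lock(): void {
    @unlink(LOCK_FILE);                // step 4: remove the semaphore when done
}

// Usage:
if (acquire_lock()) {
    // ...prepare the file list / add files to the archive...
    release_lock();
    echo "done\n";
} else {
    echo "archiving already in progress\n";
}
```

Note that a file_exists-then-touch pair is not atomic; for a more robust semaphore, flock() on an open file handle avoids the race between two simultaneous requests.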
I apologize for the inaccuracies. To answer:
1. it will be called from the browser;
2. PHP is the language in which I am trying to implement this algorithm.
How to get a list of directories and files
What is recursion
How to work with ZIP in PHP
How to increase script execution time
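The four points above can be tied together in one short sketch: recursively list a directory, add everything to a ZIP, and raise the time limit. The demo directory and file names are assumptions so the example runs as-is.

```php
<?php
// A sketch combining the four points above. Paths are examples.
set_time_limit(300);                       // "how to increase execution time"

$src = sys_get_temp_dir() . '/zip_demo_' . uniqid();
mkdir("$src/sub", 0777, true);             // demo tree so the sketch runs as-is
file_put_contents("$src/a.txt", 'a');
file_put_contents("$src/sub/b.txt", 'b');

$zip = new ZipArchive();                   // "how to work with ZIP in PHP"
$zip->open("$src.zip", ZipArchive::CREATE);

// The SPL iterators do the recursion for us ("what is recursion" /
// "how to get a list of directories and files"):
$it = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($src, FilesystemIterator::SKIP_DOTS)
);
foreach ($it as $file) {
    if ($file->isFile()) {
        // store each entry with a path relative to $src
        $zip->addFile($file->getPathname(), substr($file->getPathname(), strlen($src) + 1));
    }
}
echo $zip->count(), " files added\n";      // 2 files added
$zip->close();
```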
Connect via FTP from a local machine using Python and make a dump.
If a person really wants to complicate things, why not? I am not considering the situation where "another 100500 files were added to the archive during the packing process"; I assume that from the moment the script starts until it finishes, the list of files is unchanged.
To complete the task we need two scripts: a server-side "packer" and a client-side "questioner".
1. From the browser we call the "questioner"; it determines by some sign, for example by a lock file or by the process list, that:
a. the "packer" is not running
b. the "packer" is running (go to step 3)
2. If the "packer" is not running, start it.
3. Read the files list.log and done.log and count the lines in each; the first is the list of all files, the second is the list of files already packed.
4. We show the user how many files are packed, how many are left.
5. Refresh the page (via JavaScript, via meta-refresh, or, at worst, by resetting the computer) and go to step 1.
The "packer":
1. Collects the list of all files to be archived into list.log.
2. In a cycle:
a. reads list.log line by line
b. adds each file to the archive
c. appends the file name to done.log
3. Upon completion, it dies, as any normal PHP script should.
Well, we are not looking for easy ways, right? It has to be done in PHP, step by step, so that the user can see the whole process. With a strong desire, the "packer" can also record not only each file's name but its size, and then you can write a tricky magic formula for estimating the packing time. But that is a separate topic, which the author will have to raise in a separate question.
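The "packer" from the scheme above can be sketched like this. The file names list.log and done.log follow the answer; the demo directory and files are assumptions so the sketch runs standalone. The point is resumability: the script can be killed and re-run, and it continues where it stopped.

```php
<?php
// Sketch of the "packer": list.log holds all files, done.log the
// already-packed ones, so a re-run skips what is already in the archive.
$dir = sys_get_temp_dir() . '/packer_demo_' . uniqid();
mkdir($dir);
file_put_contents("$dir/f1.txt", '1');
file_put_contents("$dir/f2.txt", '2');

$listLog = "$dir/list.log";
$doneLog = "$dir/done.log";

// 1. Collect the list of all files to be archived into list.log.
file_put_contents($listLog, implode("\n", glob("$dir/*.txt")) . "\n");

// 2. In a cycle: read list.log line by line, skip files already in
//    done.log, add the rest to the archive, append each name to done.log.
$done = file_exists($doneLog) ? file($doneLog, FILE_IGNORE_NEW_LINES) : [];
$zip  = new ZipArchive();
$zip->open("$dir/backup.zip", ZipArchive::CREATE);
foreach (file($listLog, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $file) {
    if (in_array($file, $done, true)) continue;   // packed on a previous run
    $zip->addFile($file, basename($file));
    file_put_contents($doneLog, $file . "\n", FILE_APPEND);
}
$zip->close();

// The "questioner" compares the two line counts to report progress:
printf("packed %d of %d\n", count(file($doneLog)), count(file($listLog)));
```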