PHP - the complete file is not being read. Enough memory and time?
There is a script that has worked fine on localhost for two years across different PHP versions: it loads and parses data from a CSV file. The files are not that big; the largest is 22 MB.
After a recent OS reinstall (Win7 64-bit Ultimate replaced with a similar Home Premium) and a web server reconfiguration (new minor versions of Apache, PHP and MySQL), the import suddenly broke: file(), fgets(), file_get_contents() and the like started returning only part of the file. The reads hover around 65 KB; it is never more, sometimes less.
The config contains memory_limit = 512M, max_execution_time = 1800
PHP full config: pastebin.com/rTiRr53t
Apache full config: pastebin.com/uSmpP684
The files are read like this:
$file_loc = 'pathtofile/file.dat';
$lines = file($file_loc);

foreach ($lines as $line_num => $line)
{
    echo $line;
}
The problem, it seems to me, is not in the configuration.
Check whether filesize($file_loc) returns the correct size.
If it does, maybe try the old-fashioned way:
$fh = fopen($file_loc, "rb");             // binary mode avoids newline translation on Windows
$data = fread($fh, filesize($file_loc));
fclose($fh);
Or even read in parts?
while (!feof($fh)) {            // $fh opened with fopen() as above
    $mytext = fgets($fh, 1024); // reads until a newline or at most 1023 bytes
}
1. Do you use encoding conversion functions, directly or indirectly?
2. Which version of PHP was there before?
It would really help to see the code itself, or at least a fragment of it.
Are you editing the right php.ini? PHP has several of them for the different launch modes (Apache, CLI, FastCGI).
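A quick way to confirm which ini file the web server actually loads (a sketch; php_ini_loaded_file() is available since PHP 5.2.4, and the CLI and Apache SAPIs may load different files):
// Run this through Apache, not the command line - each SAPI can use its own php.ini.
echo 'Loaded ini: ' . php_ini_loaded_file() . "\n";
echo 'memory_limit: ' . ini_get('memory_limit') . "\n";
echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n";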
You were reading binary files with the line-oriented file()? Very interesting. Do also try file_get_contents().
I once struggled with a problem like this for a long time (everything looked correct, yet nothing worked), and it turned out to be the Suhosin patch.
Could that be the issue here?
Also try comparing the results of filesize($file_loc) and strlen(file_get_contents($file_loc)).
Is the amount of data read always the same?
Could it be the data itself?
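A minimal sketch of that comparison:
$expected = filesize($file_loc);
$actual   = strlen(file_get_contents($file_loc));
printf("on disk: %d bytes, read: %d bytes\n", $expected, $actual);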
> Now the question is: what kind of miracle is this, and why did it work with both types of paths before?
Again, we are guessing. But it is almost certainly a matter of the file access protocol, and consequently of transcoding. PHP has problems with transcoding, especially under Windows; a 64 KB limit on the conversion result is a normal situation for PHP. If you dig deeper, there are transcoding settings and a choice of transcoding library.
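If transcoding is the suspect, the mbstring settings are worth a look (a sketch; note that a non-zero mbstring.func_overload silently replaces strlen() and other string functions with multibyte-aware versions):
// Check PHP's implicit transcoding knobs.
var_dump(ini_get('mbstring.func_overload'));     // non-zero means strlen() etc. are overloaded
var_dump(ini_get('mbstring.internal_encoding'));
if (function_exists('mb_internal_encoding')) {
    var_dump(mb_internal_encoding());
}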
This may not be it, but I'll throw in an idea anyway.
I had something similar when downloading video with a Flash client; it also cut off at around 40 KB. I found out that if the file was short, the server sent a Content-Length HTTP header, and if it was longer, there was no such header. The culprit was mod_deflate, which compressed the served files on the fly. If the file fit into the buffer, the server compressed it first and could report the correct size.
Try rolling back Apache. If that helps, compare the responses carefully in Firebug.
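If rolling back is not convenient, a quick check from PHP itself: request the file over HTTP and look at the headers (a sketch; the URL below is hypothetical, adjust it to your setup):
$url = 'http://localhost/pathtofile/file.dat';  // hypothetical URL to the same file
$headers = get_headers($url, 1);                // 1 = return an associative array
var_dump(isset($headers['Content-Length']) ? $headers['Content-Length'] : 'no Content-Length');
var_dump(isset($headers['Content-Encoding']) ? $headers['Content-Encoding'] : 'no Content-Encoding');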