PHP
lanabel, 2010-12-14 18:28:56

PHP - Complete file not being read. Enough memory and time?

A script that has worked fine on localhost for two years across different PHP versions loads and parses data from a CSV file. The files are not that big: the maximum is 22 MB.
After the latest reinstallation of the system (Win7 64-bit Ultimate replaced with a similar Home Premium) and reconfiguration of the web server (new minor versions of Apache, PHP and MySQL), the import suddenly broke: file(), fgets(), file_get_contents() and others started returning only a piece of the file. Everything hovers around a figure of about 65 KB; it definitely never reads more, sometimes less.
The config contains memory_limit = 512M, max_execution_time = 1800
PHP full config: pastebin.com/rTiRr53t
Apache full config: pastebin.com/uSmpP684
The files are read like this:

$file_loc = 'pathtofile/file.dat';
$lines = file($file_loc);

foreach ($lines as $line_num => $line)
{
    echo $line;
}

At some point, mid-line, the output simply stops. Google points to memory_limit and to reading the file in chunks with various tricks, but reading it all at once worked before, since the files are not huge. Reverting to a previous PHP version didn't help.
What am I missing?

Answer the question


10 answer(s)
W
webscout, 2010-12-14
@webscout

The problem, it seems to me, is not in the configuration.
Check whether filesize($file_loc) reports the correct size.
If so, maybe try the old-fashioned way?
$fh = fopen($file_loc, "rb");
$data = fread($fh, filesize($file_loc));
fclose($fh);
Or even read in parts?
while (!feof($fh)) {
    $mytext = fgets($fh, 1024);
}
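A fuller sketch of the chunked read suggested above, with basic error checks, could look like this (the function name is mine; totalling the bytes shows exactly where reading stops):

```php
<?php
// Read a file in fixed-size binary chunks and return its contents.
// Comparing strlen() of the result with filesize() shows whether
// the truncation happens during reading or somewhere else.
function read_in_chunks($file_loc)
{
    $fh = fopen($file_loc, 'rb'); // binary mode matters on Windows
    if ($fh === false) {
        return false;
    }
    $data = '';
    while (!feof($fh)) {
        $chunk = fread($fh, 8192);
        if ($chunk === false) {
            break;
        }
        $data .= $chunk;
    }
    fclose($fh);
    return $data;
}
```

If even this stops around 65 KB, the problem is below fread() (a stream filter, extension, or the server), not in how the file is read.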

E
eternals, 2010-12-14
@eternals

1. Do you use encoding-conversion functions, directly or indirectly?
2. What version of PHP was there before?
It would really help to see the code itself, or at least a fragment of it.

N
Naps, 2010-12-14
@Naps

Try file_get_contents() instead of file().

D
Dimonich, 2010-12-15
@Dimonich

Are you editing the right php.ini? PHP can have several of them, for different ways of running it (Apache, CLI, FastCGI).

S
Sannis, 2010-12-15
@Sannis

You've been reading binary files line by line with file()? Very interesting. Indeed, try file_get_contents().

M
merlin-vrn, 2010-12-15
@merlin-vrn

I once struggled with a problem for a long time (everything was correct, yet nothing worked), and the cause turned out to be the Suhosin patch.
Could that be the issue here?

H
hayk, 2010-12-15
@hayk

Try comparing the results of filesize($file_loc) and strlen(file_get_contents($file_loc)).
Is the size of the read data always the same?
Could it be the data itself?
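That comparison can be wrapped in a small helper, as a sketch (the function name is mine):

```php
<?php
// Compare the size the filesystem reports with the number of bytes
// PHP actually reads. Returns the difference: 0 means the whole file
// came through; a positive value means PHP's read was truncated.
function read_size_gap($file_loc)
{
    $reported = filesize($file_loc);                  // size per the OS
    $actual   = strlen(file_get_contents($file_loc)); // bytes PHP got
    printf("filesize: %d, read: %d\n", $reported, $actual);
    return $reported - $actual;
}
```

If the gap is nonzero and the "read" figure is always the same (~65 KB) regardless of content, the data itself is probably innocent and something in the read path is cutting it off.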

E
eternals, 2010-12-16
@eternals

> Now the question is: what kind of miracle is this, and why did it work with both types of paths before?
We're guessing again, but it is almost certainly a matter of the file-access protocol, and hence of re-encoding. PHP has problems with transcoding, especially under Windows; a 64 KB limit on the conversion result is a normal situation for PHP. If you dig deeper, there are transcoding settings and a choice of transcoding library.

S
SwampRunner, 2010-12-14
@SwampRunner

set_time_limit(0);?

@
@resurection, 2010-12-18

Maybe it's not this, but at least I'll throw in an idea.
I had something similar when serving video to a Flash player: it also cut off at around 40 KB. I found out that if the file was shorter, the server sent a Content-Length header, and if it was longer, the header was missing. The culprit was mod_deflate, which was compressing the served files on the fly. If the file fit in the buffer, the server compressed it first and could report the correct size.
Try rolling back Apache. If that helps, compare the responses carefully with Firebug.
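Instead of rolling Apache back entirely, you could first exempt just the data files from on-the-fly compression and see whether the truncation disappears. A minimal sketch, assuming the standard mod_deflate/SetEnvIf setup and the asker's .dat/.csv extensions:

```apache
# Hypothetical: disable gzip for the data files only, to isolate mod_deflate.
# no-gzip tells mod_deflate to skip the response; dont-vary suppresses the
# matching Vary header adjustment.
SetEnvIfNoCase Request_URI "\.(dat|csv)$" no-gzip dont-vary
```

If reads come through whole after this, mod_deflate (or its interaction with Content-Length) is the culprit.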
