PHP
Sergey Nozdrin, 2016-01-30 23:45:41

Why isn't cURL downloading the entire page?

Hello!
For a long time I have had a script on my site that parses a partner site (another website). All the parsing does is download the content of one of the partner's pages a couple of times a day and search it for keywords.
Today everything suddenly broke. I'm asking the community to help me figure out the cause.
The parsing was done like this:

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://www.vkbn.ru/rostov/suvorovskiy/flats/1526/');
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; Windows NT 5.1; ru-RU; rv:1.7.12) Gecko/20050919 Firefox/1.0.7');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return the page body as a string
$flat_page = curl_exec($ch);
curl_close($ch);

As a result, I now get only a fragment of the page I need instead of the whole page. The result is the same when trying to download the page via file_get_contents(). Has anyone run into a similar situation?
P.S. By the way, the situation is exactly the same if you download this page from the desktop with wget: it cuts the page off in the same place.
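
A rough diagnostic variant of the same request (the CURLOPT_FOLLOWLOCATION line is only a guess in case the page now redirects) shows whether cURL itself reports an error and how many bytes actually arrive compared to the Content-Length the server declares:

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://www.vkbn.ru/rostov/suvorovskiy/flats/1526/');
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; Windows NT 5.1; ru-RU; rv:1.7.12) Gecko/20050919 Firefox/1.0.7');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1); // guess: follow a possible redirect
$flat_page = curl_exec($ch);
if ($flat_page === false) {
    echo 'cURL error: ' . curl_error($ch) . "\n";
} else {
    // bytes actually downloaded vs. the Content-Length the server announced (-1 if absent)
    echo 'received: ' . curl_getinfo($ch, CURLINFO_SIZE_DOWNLOAD) . " bytes\n";
    echo 'declared: ' . curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD) . " bytes\n";
    echo 'HTTP code: ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . "\n";
}
curl_close($ch);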


1 answer
Andrzej Wielski, 2016-01-31
@light204

I don't see any problems; everything loaded fine on my local machine.
The same via wget.
Poke your hosting provider. Maybe there is some kind of limit on their side.
P.S. That array in the page source hidden with display:none is really something. At the very least, make an API for exchanging this data instead of littering the page source with it. It's bad for SEO, to say the least.
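
As a rough illustration of that suggestion (the endpoint URL and field names below are invented), the exchange could look something like this instead of the hidden array:

// Hypothetical partner-side endpoint: return the data as JSON
// rather than embedding a display:none array in the HTML.
header('Content-Type: application/json; charset=utf-8');
echo json_encode(['flat_id' => 1526, 'status' => 'available'], JSON_UNESCAPED_UNICODE);

// Hypothetical consumer side: fetch and decode the JSON,
// no HTML scraping or keyword search needed.
$json = file_get_contents('http://www.vkbn.ru/api/flats/1526'); // invented URL
$data = json_decode($json, true);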
