How to get output from urllib2 (or something similar) in real time?
I need the response to come through without buffering: urllib2.urlopen('http://localhost/upload.php')
If I make the same request with curl from the console, I see one line, then a pause, then another line.
The script needs to see the same thing: not both lines arriving at once, but one after the other, a few seconds apart. If I add a timeout (to at least get the first line), it crashes with an error. In general, is there a sane way to run the request in the background and check, say, once a second, whether it has produced any output yet? Has anyone run into this?
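For what it's worth, one way to watch the body arrive line by line is to read the response a byte at a time, so urllib2's internal buffering never waits for more data than has already been received. A minimal sketch, assuming the server-side script actually flushes its output and using the URL from the question:

import sys
import urllib2

response = urllib2.urlopen('http://localhost/upload.php')

line = ''
while True:
    byte = response.read(1)      # blocks only until the next byte arrives
    if not byte:                 # empty string: the server closed the connection
        break
    line += byte
    if byte == '\n':             # a complete line has arrived
        sys.stdout.write(line)
        sys.stdout.flush()       # show it immediately, without output buffering
        line = ''
if line:                         # last line without a trailing newline
    sys.stdout.write(line)

Reading a single byte at a time is not efficient, but for a handful of progress lines it is usually enough; if the server never flushes on its side, no client-side trick will make the lines appear early.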
If I understand correctly, you just need to ping the URL? In that case you don't have to download the whole response to see whether the resource is available or whether there is any content:
import urllib2

# Send a HEAD request: only the headers are fetched, not the body
request = urllib2.Request('http://localhost:8080')
request.get_method = lambda: 'HEAD'
response = urllib2.urlopen(request)
print response.info()  # response headers, including Content-Length
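As a small usage sketch building on the answer above (the localhost URL is just the same example), a specific header can be pulled out of response.info() directly, and getcode() gives the HTTP status, so availability can be checked without touching the body:

import urllib2

request = urllib2.Request('http://localhost:8080')
request.get_method = lambda: 'HEAD'
response = urllib2.urlopen(request)
# getheader() returns None if the server did not send the header
length = response.info().getheader('Content-Length')
print 'status:', response.getcode(), 'content-length:', length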