Python
sumi3sew, 2020-01-06 02:06:38

Why do HTTP requests with proxies require so much RAM?

There is a script that sends requests to different sites; it looks something like this:

import random
import requests

def get_proxy():
    proxy_list = ['first proxy', 'second proxy']
    proxy = random.choice(proxy_list)
    return proxy


while True:
    proxy = get_proxy()
    r = requests.get(url, proxies=proxy)

    proxy = get_proxy()
    r = requests.get(url, proxies=proxy)

    proxy = get_proxy()
    r = requests.get(url, proxies=proxy)

In other words, an ordinary loop that sends requests (there are several conditions for exiting the loop, but they are not relevant here).
And here is the problem.
If I gut the proxy-selection method and simply write

proxy = {}
return proxy

then the script uses 30-35 MB of RAM at startup and still does after several hours of work.
But if I return not an empty value and instead pick a proxy from the list (as in the first example), then an hour after launch the script starts using 500 MB of RAM or more. What could cause this, and how can I fix it?
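One detail worth checking before anything else: requests expects the `proxies` argument to be a dict mapping a URL scheme to a proxy URL, not a bare string. A minimal sketch of a scheme-keyed proxy list (the hostnames and port are placeholders, not values from the question):

```python
import random

# requests expects `proxies` as {"scheme": "proxy-url"}; the hosts below
# are hypothetical placeholders standing in for real proxy endpoints.
PROXY_LIST = [
    {"http": "http://proxy1.example:8080", "https": "http://proxy1.example:8080"},
    {"http": "http://proxy2.example:8080", "https": "http://proxy2.example:8080"},
]

def get_proxy():
    # Pick one scheme->URL mapping at random, as in the question's loop.
    return random.choice(PROXY_LIST)
```

This only changes the shape of what `get_proxy()` returns; the surrounding loop stays the same.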


1 answer
SagePtr, 2020-01-06
@SagePtr

Apparently, there is a memory leak in the requests library.
Is it this one: https://github.com/psf/requests/issues/4601 ?
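Whatever the root cause in the library turns out to be, a common mitigation for growing memory in long-running request loops is to reuse a single `requests.Session` (so urllib3 keeps one connection pool instead of creating fresh state per call) and to close each response explicitly. A hedged sketch, not a confirmed fix for the linked issue; `fetch` and its parameters are illustrative names:

```python
import requests

# One shared Session: connection pools are reused across requests
# instead of being rebuilt (and potentially retained) on every call.
session = requests.Session()

def fetch(url, proxy, timeout=10):
    # Closing the response returns the connection to the pool promptly.
    with session.get(url, proxies=proxy, timeout=timeout) as r:
        return r.status_code
```

If memory still grows with a shared session, that would point more firmly at the library-level leak discussed in the issue above.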
