Problem with the multiprocessing module?
I need to parse some data: open a file, read the URLs from it, and build a list of them. Since there are a lot of URLs, I use the multiprocessing module to create worker processes (or whatever they are called). The problem is that everything runs fine until near the end of the file; when there are about 30-40 URLs left, this error is raised.
It happens no matter what pool size I set, 20 or 3, and no matter how many URLs I process, 100 or 5000. What could be the problem?
P.S. I added try/except wherever possible; it doesn't help.
Part of the source code:
from multiprocessing import Pool

def make_all(url):
    get_data(get_html(url), url)

def main():
    with open('urls.txt') as file:
        temp_urls = file.readlines()
    urls = []
    for i in temp_urls:
        url = i.strip()
        urls.append(url)
    with Pool(20) as pool:
        pool.map(make_all, urls)