Python

Ruslan Mordovanech, 2021-12-14 15:07:45

How can I speed up the process of writing CSV to JSON?

import requests
import csv
import urllib.request
import json

link = 'https://dsa.court.gov.ua/open_data_json.php?json=532'

response = requests.get(link).json()
urls = []
for item in response['Файли']:
    urls.append(list(item.values())[0])
    for url in urls:
        url = url
        response = urllib.request.urlopen(url)
        lines = [l.decode('utf-8') for l in response.readlines()]
        cr = csv.reader(lines, delimiter='\t')
        data = []
        for row in cr:
            if 'Херсонський міський суд Херсонської області' in row[0]:
                data.append({
                    'court_name': row[0], 'case_number': row[1], 'case_proc': row[2],
                    'registration_date': row[3], 'judge': row[4], 'judges': row[5],
                    'participants': row[6], 'stage_date': row[7], 'stage_name': row[8],
                    'cause_result': row[9], 'cause_dep': row[10], 'type': row[11],
                    'description': row[12]})

        with open('12.json', 'a', encoding='utf-8') as f:
            json.dump(data, f, ensure_ascii=False, indent=13)


Everything works as it should, but the output and writing process itself takes a lot of time. Can it be sped up?
Thanks in advance!

1 answer
Sergey Gornostaev, 2021-12-14
@Hery1

The most obvious optimization is not to open the output file on every iteration of the two nested loops.
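
A minimal sketch of that restructuring, assuming the goal is one JSON file containing every matching row: collect the URLs first, download and parse each CSV exactly once, accumulate all rows in a single list, and open the output file only once at the end. Note that the original appends ('a') a new JSON document on every pass; writing once with 'w' (assumed here) produces a single valid JSON array instead.

import csv
import json
import urllib.request

import requests

link = 'https://dsa.court.gov.ua/open_data_json.php?json=532'
response = requests.get(link).json()

# Collect the file URLs once, before downloading anything.
urls = [list(item.values())[0] for item in response['Файли']]

data = []
for url in urls:
    # Download and decode each CSV exactly once.
    with urllib.request.urlopen(url) as resp:
        lines = [l.decode('utf-8') for l in resp]
    cr = csv.reader(lines, delimiter='\t')
    for row in cr:
        if 'Херсонський міський суд Херсонської області' in row[0]:
            data.append({
                'court_name': row[0], 'case_number': row[1], 'case_proc': row[2],
                'registration_date': row[3], 'judge': row[4], 'judges': row[5],
                'participants': row[6], 'stage_date': row[7], 'stage_name': row[8],
                'cause_result': row[9], 'cause_dep': row[10], 'type': row[11],
                'description': row[12]})

# Open and write the output file a single time, after all rows are gathered.
with open('12.json', 'w', encoding='utf-8') as f:
    json.dump(data, f, ensure_ascii=False)

In the original code the inner for url in urls: loop sits inside the loop that builds urls, so every file collected so far is downloaded and parsed again on each pass; flattening that nesting, as in the sketch above, is likely an even bigger saving than moving the file writes.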
