JavaScript
nurzhannogerbek, 2018-11-26 13:51:58

How to create a large JSON file and output it as a response in Node.JS?

I have a fairly simple controller that fires when a specific URL is accessed. As you can see, the controller takes the parameter values (start_date and end_date) specified in the URL and uses them in a MySQL database query. In other words, my task is to fetch the data for a given period and send it back as JSON.

async function get_all_information(req, res, next) {
    try {
        let start_date = req.query.start_date;
        let end_date = req.query.end_date;

        const binds = {};
        binds.start_date = start_date;
        binds.end_date = end_date;

        let query = `SOME LONG SQL STATEMENT`;

        await pool.query(query, binds, function (error, results) {
            if (error) throw error;

            console.log(results); // ~ 10 seconds

            res.send(JSON.stringify(results)); // ~ 15 seconds
        });
    } catch (error) {
        next(error);
    }
}

For a one-month period, the request takes about 10 seconds to process and returns a large pool of data. To be precise, the SQL query returns 227,011 rows. I tried to convert this whole result to JSON using the stringify() method. The Postman application I used for testing simply hangs and the program stops responding. When I open the URL in the browser, the data arrives in parts and takes a very long time to load. In the terminal I see the following:
GET /api/information?start_date=2018-09-01%2000:00:00&end_date=2018-09-30%2023:59:59 200 15490.389 ms - 144219608

I did some analysis and noticed that one day's worth of data produces a JSON file of about 13 MB. It is reasonable to assume that a month's worth of data will produce a JSON file of roughly 400 MB.
How do you usually create large JSON files in Node.JS? What is the best practice? What is usually done when a large amount of data needs to be returned?


3 answer(s)
Mikhail Osher, 2018-11-26
@miraage

Just stream the response. If the load is heavy at all, generate the files ahead of time with a cron job, but still serve them through a stream to keep the server's RAM usage down.
https://www.npmjs.com/package/mysql#streaming-quer...
https://medium.freecodecamp.org/node-js-streams-ev...
// EDIT
https://www.npmjs.com/package/mysql#piping-results...
Of course, you'll have to sit down and tinker with it for a while. The example is a rough sketch)

const { Transform } = require('stream');

function getExpensiveDataFromDb(req, res) {
    // rows come off the stream as objects, so serialize each one
    // before writing it to the byte-oriented response stream
    const stringifier = new Transform({
        writableObjectMode: true,
        transform(row, _enc, cb) { cb(null, JSON.stringify(row) + '\n'); },
    });

    // run the query without the second argument (no callback)
    connection.query('YOUR HUGE SQL QUERY')
        // get a readable stream of rows out of it
        // per the docs example, buffer 5 rows at a time
        .stream({ highWaterMark: 5 })
        // pipe it into res, which is a writable stream
        // (http.ServerResponse)
        .pipe(stringifier)
        .pipe(res);
}
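
Note that this emits newline-delimited JSON rather than one valid JSON document, which is fine for log-style consumers but not for clients that expect a single array. A minimal sketch of a transform that frames the rows as one JSON array (the helper name is illustrative; writableObjectMode and flush are standard Node stream APIs):

const { Transform } = require('stream');

// wrap an object-mode row stream into a single valid JSON array
function jsonArrayStream() {
    let first = true;
    return new Transform({
        writableObjectMode: true,
        transform(row, _enc, cb) {
            cb(null, (first ? '[' : ',') + JSON.stringify(row));
            first = false;
        },
        flush(cb) {
            // emit '[]' for an empty result set, otherwise close the array
            cb(null, first ? '[]' : ']');
        },
    });
}

// usage: .stream({ highWaterMark: 5 }).pipe(jsonArrayStream()).pipe(res)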

Roman Mirilaczvili, 2018-11-26
@2ord

Options:

  1. Return the data broken into parts (say, 100+100+100+37 rows). The API should let the client retrieve the data in chunks (see the sketch after this list).
  2. Upload the data to some cloud storage and notify the client when the report is ready.
  3. Return only aggregated data.
  4. If we are talking about database synchronization, set up replication or use method #1.
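
For option 1, a minimal sketch of a chunked endpoint in the style of the question's controller, assuming the same Express handler and mysql pool (the limit/offset query parameters and their caps are illustrative):

function getChunk(req, res, next) {
    // hypothetical paging parameters; cap the page size server-side
    const limit = Math.min(parseInt(req.query.limit, 10) || 100, 1000);
    const offset = parseInt(req.query.offset, 10) || 0;

    pool.query(
        'SOME LONG SQL STATEMENT LIMIT ? OFFSET ?',
        [limit, offset],
        function (error, results) {
            if (error) return next(error);
            // the client keeps requesting the next offset until
            // fewer than `limit` rows come back
            res.json({ rows: results, limit, offset });
        }
    );
}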

4uTePoK, 2018-11-27
@4uTePoK

The data depends on the input parameters, so it makes no sense to pre-generate it as static files.
The solution, of course, depends on the problem you are trying to solve.
From what your question makes clear, the answer is:
it is better to batch the data in chunks of about 100 rows and stream the results in a loop, as sketched below.
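
A minimal sketch of that loop, staying with the question's Express handler and mysql pool and promisifying pool.query (the batch size and SQL placeholders are illustrative):

const util = require('util');
const queryAsync = util.promisify(pool.query).bind(pool);

async function streamInBatches(req, res, next) {
    try {
        const batchSize = 100; // illustrative chunk size
        res.setHeader('Content-Type', 'application/json');
        res.write('[');
        let offset = 0;
        let first = true;
        while (true) {
            const rows = await queryAsync(
                'SOME LONG SQL STATEMENT LIMIT ? OFFSET ?',
                [batchSize, offset]
            );
            for (const row of rows) {
                res.write((first ? '' : ',') + JSON.stringify(row));
                first = false;
            }
            if (rows.length < batchSize) break;
            offset += batchSize;
        }
        res.end(']');
    } catch (error) {
        next(error);
    }
}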
