How to create a large JSON file and output it as a response in Node.js?
I have a fairly simple controller that fires when a specific URL is requested. As you can see, the controller takes the parameter values (start_date and end_date) specified in the URL and uses them in a MySQL query. In other words, my task is to fetch a set of data for a given period and send it back as JSON.
async function get_all_information(req, res, next) {
  try {
    let start_date = req.query.start_date;
    let end_date = req.query.end_date;

    const binds = {};
    binds.start_date = start_date;
    binds.end_date = end_date;

    let query = `SOME LONG SQL STATEMENT`;

    await pool.query(query, binds, function (error, results) {
      if (error) throw error;
      console.log(results); // ~ 10 seconds
      res.send(JSON.stringify(results)); // ~ 15 seconds
    });
  } catch (error) {
    next(error);
  }
}
GET /api/information?start_date=2018-09-01%2000:00:00&end_date=2018-09-30%2023:59:59 200 15490.389 ms - 144219608
Just stream the response. If the load is at all significant, pre-generate the files with a cron job, but still serve them through a stream to save RAM on the server (a sketch of serving such a file follows after the links below).
https://www.npmjs.com/package/mysql#streaming-quer...
https://medium.freecodecamp.org/node-js-streams-ev...
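A minimal sketch of the cron-file variant, i.e. serving an already-generated file through a stream. The file path, the Express-style res, and the handler name are assumptions for illustration, not from the answer above:

const fs = require('fs');

function sendPrebuiltReport(req, res) {
  // assumption: a cron job has already written the query result to this path
  const filePath = '/var/cache/reports/report.json';
  res.setHeader('Content-Type', 'application/json');
  fs.createReadStream(filePath)
    // if the file is missing or unreadable, fail the request
    .on('error', () => res.status(500).end())
    // the file is sent chunk by chunk, never held in RAM as a whole
    .pipe(res);
}

Because pipe() handles backpressure automatically, the process only ever holds a few chunks of the file in memory, no matter how large the JSON is.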
// EDIT
https://www.npmjs.com/package/mysql#piping-results...
Of course, you'll have to sit down and tinker with it a bit. Here's a quick example, sketched on the knee)
const { Transform } = require('stream');

async function getExpensiveDataFromDb(req, res) {
  // run the query without a callback (no second argument)
  connection.query('YOUR HUGE SQL QUERY')
    // turn it into a readable stream of rows;
    // per the docs example, buffer 5 rows at a time
    .stream({ highWaterMark: 5 })
    // rows arrive as plain objects, so serialize each one before writing;
    // note: this produces newline-delimited JSON, not a single array
    .pipe(new Transform({
      writableObjectMode: true,
      transform(row, _encoding, done) {
        done(null, JSON.stringify(row) + '\n');
      }
    }))
    // pipe into res, which is an http.ServerResponse (a Writable stream)
    .pipe(res);
}
Options:
The data depends on the input parameters, so there is no point in pre-generating it as static files.
The right solution, of course, depends on the problem you are actually solving. Judging from your question, my answer is:
It is better to fetch the data in batches of about 100 rows and stream the results to the client in a loop, as in the sketch below.
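A rough sketch of that batching approach, assuming the mysql2 promise API; the table and column names are placeholders, not from the question:

// batched streaming: fetch ~100 rows at a time and write them out immediately
async function streamInBatches(req, res, next) {
  try {
    const batchSize = 100;
    let offset = 0;
    let first = true;
    res.setHeader('Content-Type', 'application/json');
    res.write('[');
    while (true) {
      // placeholder query; LIMIT/OFFSET pulls one batch per iteration
      const [rows] = await pool.query(
        'SELECT * FROM events WHERE created_at BETWEEN ? AND ? LIMIT ? OFFSET ?',
        [req.query.start_date, req.query.end_date, batchSize, offset]
      );
      if (rows.length === 0) break;
      for (const row of rows) {
        res.write((first ? '' : ',') + JSON.stringify(row));
        first = false;
      }
      offset += batchSize;
    }
    res.end(']');
  } catch (error) {
    next(error);
  }
}

Each iteration holds only one batch in memory, and the client starts receiving data after the first batch instead of waiting for the whole result set to be serialized.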