Flask

ArtiomK, 2020-10-26 19:15:48

What do you consider a long request on the web server side?

Processing a request on the HTTP side of the Flask web server takes 0.36–0.4 s with Redis involved (not on localhost), of which 0.26 s is spent converting the results from Redis in Python code (converting bytes to strings, strings to a nested list structure, walking through these lists, and so on). The GET request body is 835 KB.

The user receives the result (on localhost) in 0.5–0.8 s.

Should a request-processing time of 0.36–0.4 s on the HTTP server be considered a bad result worth improving? How long does an HTTP request have to take before it counts as long and gets offloaded, for example, to Celery?


2 answers
Alexey Cheremisin, 2020-10-26
@ArtiomK

My criterion is that a request should not take longer than 100 milliseconds to process. If it takes longer, we optimize, cache, and redo things, wherever needed. In fact, in most cases 100 ms is quite a lot, but it all depends on the load on the server, the number of RPS, and other such stuff.
One tenth of a second is a comfortable response time for any user, and on top of it plenty of extra time gets added for data transfer, browser rendering, JS startup, painting, and so on.
Bottom line: it would be good to keep everything, all included, within 200–400 ms.
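The 100 ms budget above is easy to enforce in practice with a timing wrapper around handlers. A minimal stdlib-only sketch (the handler name, threshold, and simulated work are illustrative, not from the thread):

```python
import time
import functools

SLOW_THRESHOLD_MS = 100  # the budget suggested in the answer above


def timed(fn):
    """Record how long fn takes and warn when it exceeds the budget."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        wrapper.last_ms = (time.perf_counter() - start) * 1000
        if wrapper.last_ms > SLOW_THRESHOLD_MS:
            print(f"SLOW: {fn.__name__} took {wrapper.last_ms:.1f} ms")
        return result
    wrapper.last_ms = 0.0
    return wrapper


@timed
def handle_request():
    time.sleep(0.01)  # simulated 10 ms of work
    return "ok"


result = handle_request()
```

In Flask the same idea is usually hooked into `before_request`/`after_request` instead of decorating each view, but the measurement logic is the same.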

Roman Mirilaczvili, 2020-11-16
@2ord

You need to look not only at an individual response, but at the number of requests and the time they take. If one request that produces a large report is answered in 500 ms, that is quite acceptable. But if such requests make up only a tenth of all traffic and still slow down rendering on the client, that is no good. Optimally, a response should reach the client within 100 ms; above that, the client starts to notice the latency.

GET request body 835kb
That is a rather large response, so it first has to be built and only then delivered to the client.
Instead, you can build the answer ahead of time (in the background, via queues) and serve the ready-made result on request. Or, if possible, put the task into a queue, tell the client to wait, and once the task completes, hand out a download link or serve the finished answer on the spot.
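The "pre-form the answer in the background" idea can be sketched with just the stdlib; in production this role would be played by Celery plus a result store such as Redis. All names and the tiny payload below are illustrative:

```python
import threading

cache = {}
cache_lock = threading.Lock()


def build_report():
    """Expensive report generation, done ahead of time by a background worker."""
    data = [[i, i * i] for i in range(5)]  # stand-in for the real large payload
    with cache_lock:
        cache["report"] = data


def get_report():
    """Request handler: serve the ready-made answer, or ask the client to wait."""
    with cache_lock:
        if "report" in cache:
            return {"status": "ready", "data": cache["report"]}
    return {"status": "pending", "retry_after": 1}  # client should poll again


# Simulate: the first request arrives before the background job has run.
first = get_report()
worker = threading.Thread(target=build_report)
worker.start()
worker.join()
second = get_report()
```

The key point is that the request handler never does the expensive work itself; it only checks whether a finished result exists.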
it takes 0.36–0.4 s, of which 0.26 s to convert results from Redis in Python code (converting bytes to strings, strings to a nested list structure, walking through these lists, etc.)
Perhaps Redis stores the data in a suboptimal way. Think about how to store it so that processing does not take so long. Or perhaps store the data in a relational DBMS instead, using a materialized view so that no time is wasted building the response.
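One concrete way to cut the 0.26 s conversion step is to store the nested structure as a single serialized blob, so reading it back is one `json.loads` call instead of manual bytes-to-string-to-list rebuilding. A sketch with the stdlib only (the Redis calls are shown as comments; the key name and data are illustrative):

```python
import json

# The nested structure the question describes having to rebuild by hand.
nested = [["user", 1, [0.36, 0.4]], ["user", 2, [0.26]]]

# Write side, once: r.set("report:latest", json.dumps(nested))
blob = json.dumps(nested).encode("utf-8")  # what Redis would hand back as bytes

# Read side: one deserialization call replaces the manual conversion loop.
restored = json.loads(blob)
```

If JSON is still too slow at this size, a binary format such as msgpack or pickle (the latter only for trusted data) trades readability for speed, but the principle is the same: serialize once on write, deserialize once on read.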
