How to wrap HTTP requests in a message queue?
We have the following infrastructure: several instances of the mobile application server are deployed, and they call another internal service over SOAP. The internal service also runs several instances, each of which can execute a limited number of requests concurrently. Load is currently distributed across the service instances by nginx, using the ngx_http_upstream_module and ngx_http_limit_req_module modules.
The project manager has set the task of wrapping HTTP requests into messages for a queue broker (any queue manager) and using those queues to balance the load on the internal service, because at peak times its performance is not sufficient for normal operation of the application.
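As a rough illustration of the pattern being asked about, here is a minimal sketch: incoming requests are enqueued as messages, and a fixed-size worker pool drains the queue, so the backend never sees more concurrent requests than it can handle. This uses an in-process `queue.Queue` as a stand-in for the broker; in production the queue would live in RabbitMQ or similar, and `call_backend` would be the real SOAP call (both names here are hypothetical).

```python
import queue
import threading

MAX_CONCURRENT = 2          # requests one backend instance can sustain at once
requests_q = queue.Queue()  # stand-in for the broker queue
results = {}
results_lock = threading.Lock()

def call_backend(payload):
    # placeholder for the real SOAP call to the internal service
    return f"processed:{payload}"

def worker():
    while True:
        msg = requests_q.get()
        if msg is None:              # shutdown sentinel
            requests_q.task_done()
            break
        msg_id, payload = msg
        result = call_backend(payload)
        with results_lock:
            results[msg_id] = result
        requests_q.task_done()

# exactly MAX_CONCURRENT consumers: however many clients enqueue,
# the backend sees at most that many simultaneous requests
workers = [threading.Thread(target=worker) for _ in range(MAX_CONCURRENT)]
for w in workers:
    w.start()

for i in range(10):                  # a burst of 10 incoming HTTP requests
    requests_q.put((i, f"req-{i}"))

requests_q.join()                    # wait until the burst is drained
for _ in workers:
    requests_q.put(None)
for w in workers:
    w.join()

print(len(results))                  # → 10
```

The key property is that the burst is absorbed by the queue rather than by the backend: clients enqueue at any rate, while the consumption rate is capped by the worker count.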
I have not found any information on this, so I would like to hear others' opinions: perhaps someone has had similar experience, and in general, how appropriate is such a solution?
To assess the feasibility, you need to understand the goal. If the goal is "not to lose a single request", then it makes sense. But I doubt that all these requests actually need to be processed: they will lose their relevance faster than they can be worked off. Some intelligent throttling could help here.
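One hedged sketch of such "intelligent throttling": stamp each message with its enqueue time, and have the consumer drop anything older than a TTL instead of doing work whose caller has long since timed out. The `TTL_SECONDS` value and the `consume` helper are illustrative assumptions, not part of any particular broker's API.

```python
import time
from collections import deque

TTL_SECONDS = 2.0  # assumed client-side relevance window

def consume(queue_of_msgs, now=None):
    """Return payloads still worth processing; silently drop expired ones."""
    now = time.monotonic() if now is None else now
    fresh = []
    while queue_of_msgs:
        enqueued_at, payload = queue_of_msgs.popleft()
        if now - enqueued_at <= TTL_SECONDS:
            fresh.append(payload)  # still relevant, process it
        # else: drop — the client stopped waiting long ago

    return fresh

q = deque()
q.append((0.0, "stale"))      # enqueued 5 s "ago" relative to now=5.0
q.append((4.0, "fresh"))      # enqueued 1 s "ago"
print(consume(q, now=5.0))    # → ['fresh']
```

Many brokers offer this natively (e.g. per-message or per-queue TTL), so under real load the expiry can happen in the broker rather than in the consumer.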
There is a good talk from SKB Kontur on this topic:
Igor Lukanin - How to survive under load: failure ...
You can execute tasks via a queue if they can be processed deferred. With queue-based processing there is no guarantee of execution within a given time, and the interaction mechanisms become more complicated.
Managing peak load requires scaling out instances of your application when the load exceeds a given limit; along the way, you may also need to scale the DBMS.