Is forcing global.gc to run good or bad?
I work on a team that develops a fairly large platform, and we have been tasked with integrating the platform's clients.
While the integration bus is still in development, we have to integrate several clients using the existing, ready-made methods.
This temporary integration has been built, but serious problems have appeared.
The data volume is large, and these methods are not well suited to handling that much data.
We need a solution that does not involve rewriting the methods, because too much is tied to them.
When the integration runs, RAM fills up completely, the CPU runs at its limit, and eventually the server crashes.
My next step was to break the methods' work into sequential batches, so that a large array of data is not held in RAM all at once and mongodb is not hit too hard (a simplified sketch of this is below).
This helped, but not by much: RAM usage still grew, and the server eventually went down again.
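Roughly what I mean by sequential execution (a simplified sketch; the db/collection names and processBatch are placeholders, not our real integration methods):

// Run with: node integrate-batched.js
// Sketch of the batched approach: stream documents with a cursor and process
// them in small chunks instead of loading one huge array into RAM.
const { MongoClient } = require('mongodb');

async function processBatch(docs) {
  // placeholder: the existing integration method would be called here
}

async function integrateSequentially(mongoUrl, batchSize = 500) {
  const client = new MongoClient(mongoUrl);
  await client.connect();
  try {
    const cursor = client.db('platform').collection('clients').find({});
    let batch = [];
    for await (const doc of cursor) {
      batch.push(doc);
      if (batch.length >= batchSize) {
        await processBatch(batch); // one small chunk at a time
        batch = [];                // drop references so the old chunk can be collected
      }
    }
    if (batch.length > 0) {
      await processBatch(batch);
    }
  } finally {
    await client.close();
  }
}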
I then tried forcing the garbage collector: I start node with --expose-gc and call global.gc() after each call to mongo. The result is positive: even with a small number of frames, the integration runs remarkably well.
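Roughly what this workaround looks like (simplified; processDoc stands in for the real call to mongo):

// Run with: node --expose-gc integrate-forced-gc.js
// Sketch of the forced-GC workaround: trigger a collection after each mongo call.
async function processDoc(doc) {
  // placeholder: the existing integration call to mongo goes here
}

async function runWithForcedGc(cursor) {
  for await (const doc of cursor) {
    await processDoc(doc);
    if (typeof global.gc === 'function') {
      global.gc(); // only defined when node was started with --expose-gc
    }
  }
}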
But now I need to draw conclusions: does this actually solve the problem, is it just a crutch, or is the real issue in the methods themselves, which are leaking memory? Could forcing GC end up clearing data that the server needs in order to keep working?
Please share your feedback and any possible solutions to this problem.