MongoDB has been unresponsive for a long time. What could be the reason?
Good afternoon.
We've been fighting MongoDB for a long time and can't find the cause of this behavior.
We have a collection with more than 16 million documents, most of which belong to one client. On certain days (I can't tell what it depends on, maybe magnetic storms), when we query this collection to fetch a day's worth of data for each company separately, it takes a very long time to get the cursor back, and eventually we hit the exception "Read timed out after reading 0 bytes, waited for 120.000000 seconds.".
We have already tried everything we could think of: various compound indexes, a different way of fetching the data (we switched from find to aggregateCursor with a batch size of 50 documents; a sketch of that is below). And still, for this client, everything hangs: in a series of queries, 4-5 succeed, then one fails with the timeout, and the remaining queries succeed again. Even catching the error and retrying the query after 10 minutes did not help.
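Roughly, the aggregateCursor variant looks like this. This is only a sketch: the database and collection names are placeholders, and the legacy PHP mongo driver is assumed (that driver is what produces the "Read timed out..." message quoted above).

```php
<?php
// Sketch of the aggregateCursor variant (legacy PHP mongo driver assumed;
// database and collection names are placeholders).
$mongo      = new MongoClient();
$collection = $mongo->mydb->events;

$pipeline = [
    ['$match' => $query],   // $query is the filter shown further below
];

$cursor = $collection->aggregateCursor($pipeline);
$cursor->batchSize(50);     // pull 50 documents per batch

foreach ($cursor as $doc) {
    // process $doc
}
```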
Moreover, for some reason, every morning queries for the same client against even a different collection (also served by an index) fail, until I manually run explain on that index. The first such explain takes a long time, the following ones run quickly, and after that the database responds normally again.
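For context, the manual "warm-up" explain looks roughly like this. Again a sketch only: the collection name and index key are assumptions, and the legacy driver's MongoCursor::hint()/explain() are used.

```php
<?php
// Sketch of the manual "warm-up" explain (legacy PHP mongo driver;
// collection name and index key are placeholders).
$mongo = new MongoClient();
$other = $mongo->mydb->other_collection;

$plan = $other->find(['client_id' => 215])
              ->hint(['client_id' => 1])   // force the index in question
              ->explain();

var_dump($plan);
```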
MongoDB: 3.0.4 on CentOS Linux
Query:
[
    'timestamp' => [
        '$gte' => 1482624000,
        '$lte' => 1482710399,
    ],
    'type' => [
        '$in' => ['one', 'two', ...],
    ],
    'client_id' => 215,
    'status' => [
        '$ne' => -100,
    ],
]
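For reference, a compound index matching this filter shape (equality fields before the range field) could be declared roughly as below. The question does not show the exact indexes that were tried, so the key order and the collection name here are assumptions.

```php
<?php
// Sketch: a compound index with the equality fields (client_id, type) before
// the range field (timestamp). Key order and collection name are assumptions.
$mongo      = new MongoClient();
$collection = $mongo->mydb->events;

$collection->createIndex([
    'client_id' => 1,
    'type'      => 1,
    'timestamp' => 1,
]);
```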
With 16 million records, most of which belong to one client, a lot can also depend on which fields you select in the query. If you pull many fields, or large ones, and the query returns many documents, the request can in theory take a very long time, which is why the cursor dies with a timeout.
Try playing with the number of returned fields: if everything goes back to normal once you pull less data, that's the problem.
You can also try increasing the cursor timeout; again, if the problem disappears with a larger timeout, it comes down to the amount of data being transferred. A sketch of both tweaks is below.
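A minimal sketch of both suggestions, assuming the legacy PHP mongo driver; the database/collection names and the field list are placeholders, and `$query` is the filter from the question.

```php
<?php
// Sketch: project only the needed fields and raise the client-side cursor
// timeout (legacy PHP mongo driver; names and field list are placeholders).
$mongo      = new MongoClient();
$collection = $mongo->mydb->events;

$fields = ['timestamp' => 1, 'type' => 1, 'status' => 1];   // only what you need

$cursor = $collection->find($query, $fields)   // $query: the filter from the question
                     ->timeout(600000);        // 10 minutes, in milliseconds

foreach ($cursor as $doc) {
    // process $doc
}
```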