I have a collection with 763 documents which I fetch 200 at a time using cursor-based pagination.
I am using the MongoDB `Find` function and then `cursor.All` to unmarshal the response into the corresponding data structures.
The first 3 pages (200 × 3 = 600 documents) work smoothly; however, on the last page I see a context deadline exceeded error:
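For reference, the page arithmetic implied above can be checked with a few lines of Go (a sketch only; the numbers come from the post):

```go
package main

import "fmt"

func main() {
	total, limit := 763, 200

	// Ceiling division: how many pages a cursor walking 200 at a time needs.
	pages := (total + limit - 1) / limit

	// Documents remaining on the final page.
	last := total - (pages-1)*limit

	fmt.Println(pages, last) // 4 163
}
```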

incomplete read of message header: context deadline exceeded

Upon further inspection it’s coming from the following line of code

cursorErr := dbCursor.All(childctx, result)

This happens when the limit is set as 200, however if I change it to 163 (the exact number of documents on the last page) it works perfectly.

I tried the exact same query on Compass and it works fine.
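Independent of the driver, the cursor-based paging pattern described above can be sketched against an in-memory slice of IDs (all names here are illustrative, not the actual query code):

```go
package main

import (
	"fmt"
	"sort"
)

// fetchPage simulates a cursor-paginated query over a sorted slice:
// return up to limit IDs strictly greater than after.
func fetchPage(ids []int, after, limit int) []int {
	i := sort.SearchInts(ids, after+1)
	end := i + limit
	if end > len(ids) {
		end = len(ids)
	}
	return ids[i:end]
}

func main() {
	// 763 documents with IDs 1..763, mirroring the collection size in the post.
	ids := make([]int, 763)
	for i := range ids {
		ids[i] = i + 1
	}

	after, pages := 0, 0
	for {
		page := fetchPage(ids, after, 200)
		if len(page) == 0 {
			break
		}
		pages++
		after = page[len(page)-1] // advance the cursor past the last ID seen
		fmt.Println("page", pages, "size", len(page))
	}
	// Prints four pages of sizes 200, 200, 200, 163.
}
```

The logic itself is fine with a limit of 200 on a 163-document final page, which is why the error points at the context deadline rather than the pagination.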

@Hrishikesh_Thakkar Thank you for the question! It is possible that the context deadline is exceeded after the driver receives the server response but before it can finish reading it, which would result in the error:

incomplete read of message header: context deadline exceeded

In this particular case, the underlying net error would be “i/o timeout”. Does increasing the deadline on the childctx context resolve the issue?

Hi @Preston_Vasquez, I tried doing that, but the issue isn't with the timeout, right? I also increased the timeout from 30 to 60 seconds and the same issue persists.

I observed that if there are 163 documents left to query and the limit is set to 163, it works. However, even changing the limit to 164 causes the context deadline exceeded issue.

@Hrishikesh_Thakkar This error indicates that a context deadline has been exceeded: the context times out the I/O, causing a network error. Here is an example gist that will occasionally reproduce the issue (you may have to adjust the deadline). In the case of this example, the solution is to extend the deadline. Does your implementation look similar?

Hi @Preston_Vasquez, it actually does; thank you very much for sharing the gist. My only concern is that even without a context deadline (i.e. using context.Background()) there is still no data returned. I also created a Python implementation of the same query, and it returns the first batch of 101 documents, after which the service just hangs with no prompt. I waited for a couple of minutes.
Sharing the Python snippet as well:

filter_criteria = {'address': '0x9c8ff314c9bc7f6e59a9d9225fb22946427edc03', '_id': {'$gt': '10x9c8ff314c9bc7f6e59a9d9225fb22946427edc03639'}}  # Example filter
sort_criteria = [('_id', pymongo.ASCENDING)]  # Example sort criteria
result = collection.find(filter_criteria, cursor_type=pymongo.CursorType.NON_TAILABLE).limit(200).sort(sort_criteria)
print(result)

# Print the fetched data
for document in result:
    print("Printing Id")
    print(document["_id"])

# Close the MongoDB connection
client.close()

Not sure if this helps, but when I exported the sample set to a separate collection, the queries worked as expected. It seems the issue arises when the documents are part of larger collections.

2 years later

Hi Community,
We are facing this issue too. Does anyone from the community have a solution?