Hey everyone,
I’m currently updating some of our legacy Python code to use the Atlas Data API instead of raw driver connections. I’m having no issues on that side; in fact, the code seems to work perfectly. The problem I’m facing is that no matter how many write operations I send, those writes never actually appear in our database and collection. I have the data source “PiCloudSA” linked to our only cluster, also named “PiCloudSA”. If I check the usage/analytics and logs in the app, all of the requests show as “OK”, and my code receives responses with status code 201 along with the id of the inserted document (when calling insertOne).
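For reference, here is a minimal sketch of the kind of insertOne call described above. The endpoint URL, API key, collection name, and document are placeholders, not values from the actual app; only the “PiCloudSA” data source and the 201/insertedId behaviour come from the post.

```python
import json
import urllib.request

# Placeholder values -- substitute the real ones from your App Services app.
BASE_URL = "https://data.mongodb-api.com/app/<app-id>/endpoint/data/v1"
API_KEY = "<your-data-api-key>"

body = {
    "dataSource": "PiCloudSA",        # the linked data source (the cluster)
    "database": "<database-name>",    # the database inside that cluster
    "collection": "<collection-name>",
    "document": {"date": "2024-06-27", "item": "example"},  # hypothetical document
}

req = urllib.request.Request(
    BASE_URL + "/action/insertOne",
    data=json.dumps(body).encode(),
    headers={"Content-Type": "application/json", "api-key": API_KEY},
)
# resp = urllib.request.urlopen(req)  # on success: 201 with {"insertedId": "..."}
```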
I am verifying this by filtering the documents in the collection by a date formatted as a string (in this case the code produces “2024-06-27”), sorted in descending order, and I am unable to find any documents inserted in the past week. Even disregarding the date filter, searching by the inserted id returned by the API call yields no results.
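A sketch of the verification queries described above, assuming the same Data API request shape (collection name and date window are placeholders). One common pitfall worth ruling out: the Data API speaks Extended JSON, so a lookup by the returned insertedId generally needs the `$oid` wrapper rather than a bare string.

```python
# Hypothetical find body: documents from the past week, newest first.
find_body = {
    "dataSource": "PiCloudSA",
    "database": "<database-name>",
    "collection": "<collection-name>",
    "filter": {"date": {"$gte": "2024-06-20"}},  # placeholder one-week window
    "sort": {"date": -1},                        # descending, as in the post
}

# Lookup by the insertedId returned from the 201 response (EJSON ObjectId).
lookup_body = {
    "dataSource": "PiCloudSA",
    "database": "<database-name>",
    "collection": "<collection-name>",
    "filter": {"_id": {"$oid": "<insertedId-from-the-201-response>"}},
}
```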
Any help would be appreciated.
Hey everyone. It’s been a long while since I originally opened this post, only for me to realize how stupid a mistake I’d been making. It turns out the requests were going through, but the Data API had created a second database, also called “PiCloudSA”, instead of writing to our main database, “pishop”, within the “PiCloudSA” cluster.
steevej (Steeve Juneau)
It is good to know that everything is fine.
So it looks like either the database field was missing from the body of the request, or it was specifying the value PiCloudSA rather than pishop.
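A sketch of the two bodies implied above (collection name and document are placeholders). Since MongoDB creates a database implicitly on first write, an insert with database set to "PiCloudSA" quietly spawns a new database of that name rather than failing:

```python
# What the request was effectively doing: writing to a new database
# named after the cluster.
wrong_body = {
    "dataSource": "PiCloudSA",
    "database": "PiCloudSA",   # implicitly creates a second database
    "collection": "<collection-name>",
    "document": {"date": "2024-06-27"},
}

# The fix: name the intended database inside the cluster.
fixed_body = dict(wrong_body, database="pishop")
```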