
Hi everyone,

I’m Darshan Hiranandani, working with a Time Series collection in MongoDB and I’m facing an issue when trying to delete documents between two dates. Here are the parameters for my collection:

timeseries: {
  timeField: "timestamp",
  granularity: "seconds"
}
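
For context, this is roughly how such a collection would be created with the official Go driver ("marketdata" and "prices" are placeholder names):

// Sketch, assuming an existing *mongo.Client named client.
tsOpts := options.TimeSeries().
	SetTimeField("timestamp").
	SetGranularity("seconds")

collOpts := options.CreateCollection().SetTimeSeriesOptions(tsOpts)
if err := client.Database("marketdata").CreateCollection(context.TODO(), "prices", collOpts); err != nil {
	log.Fatal(err)
}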

The collection contains market price data for an asset, and due to a change in the price calculation, I need to delete documents within a specific date range before reinserting them with corrections.

However, I’m encountering a limitation when trying to delete documents based on the timestamp field. According to the documentation, delete commands on Time Series collections only support queries matching metaField values.

This is where I get confused:

  • The timestamp field is the main index for the time series. Why can’t I use it to match documents for deletion?
  • Am I misunderstanding the purpose of Time Series collections in MongoDB?
  • How can I delete documents between two dates? I’m considering adding the timestamp to the metaField, but that feels redundant, and I’m unsure of the performance impact. (A delete that filters only on a metaField is what the restriction allows; see the sketch after this list.)
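
For illustration only, a delete that the restriction does accept filters solely on the metaField. The sketch below assumes a hypothetical collection created with metaField: "metadata", where the asset symbol is stored; my actual collection has no metaField at all:

// Hypothetical: only valid on a time series collection created with
// metaField: "metadata"; the symbol value "BTC-USD" is a placeholder.
metaFilter := bson.D{{Key: "metadata.symbol", Value: "BTC-USD"}}
result, err := coll.DeleteMany(context.TODO(), metaFilter)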

For reference, this is how I’ve attempted the operation in Go:

filter := bson.D{{
	Key: "timestamp",
	Value: bson.D{
		{Key: "$gte", Value: timestamp_start},
		{Key: "$lt", Value: timestamp_end},
	},
}}
result, err := coll.DeleteMany(context.TODO(), filter)

But I’m getting this error:
“Cannot perform an update or delete on a time-series collection when querying on a field that is not the metaField.”

Has anyone else run into this issue or found a workaround? Any help would be greatly appreciated!

Regards
Darshan Hiranandani

As of right now, this is not possible. Time series collections store data in buckets behind the scenes, and allowing documents to be deleted at will based on measurement fields would fragment those buckets and defeat the purpose of the optimization.

A time series collection is highly optimized for ingesting a constant stream of time-series-like data, where each new document represents the value of the measurement at a later point in time.

In your case I would suggest inserting the corrected values as new documents and then using an aggregation pipeline to derive the latest value (see the sketch below).
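
A minimal sketch of what that could look like with the Go driver, assuming each document stores the asset identifier in a field called symbol and the value in a field called price (both field names are assumptions), with coll being the same *mongo.Collection as in the question:

// Insert the corrected prices as new documents, then derive the most
// recent value per asset with an aggregation instead of deleting.
pipeline := mongo.Pipeline{
	// Newest measurement first.
	{{Key: "$sort", Value: bson.D{{Key: "timestamp", Value: -1}}}},
	// One result per asset, keeping the latest price and its timestamp.
	{{Key: "$group", Value: bson.D{
		{Key: "_id", Value: "$symbol"},
		{Key: "latestPrice", Value: bson.D{{Key: "$first", Value: "$price"}}},
		{Key: "asOf", Value: bson.D{{Key: "$first", Value: "$timestamp"}}},
	}}},
}

cursor, err := coll.Aggregate(context.TODO(), pipeline)
if err != nil {
	log.Fatal(err)
}

var latest []bson.M
if err := cursor.All(context.TODO(), &latest); err != nil {
	log.Fatal(err)
}

If a corrected document ends up with the same timestamp as the one it replaces, you would also need something like an insertedAt or version field to break the tie in the $sort.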

Miloš.