If I apply $densify using day or month as the range unit over a period that contains a daylight saving adjustment, MongoDB generates samples with wrong dates, because there is no way to apply a timezone in the $densify stage.
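
For reference, this is the kind of stage I mean (a minimal sketch; the collection name and the "full" bounds are just for illustration):

    db.samples.aggregate([
        // Densify "ts" by one month per series. $densify has no "timezone"
        // option, so the generated dates are stepped in plain UTC regardless
        // of the local (Europe/Rome) month boundaries.
        { "$densify" : {
            "field" : "ts",
            "partitionByFields" : [ "meta.series" ],
            "range" : {
                "step" : 1,
                "unit" : "month",
                "bounds" : "full"
            }
        } }
    ])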

For example, if I have two samples like this (the base timezone is Europe/Rome):

{
    "ts" : ISODate("2000-12-31T23:00:00.000+0000"),
    "meta" : {
        "device" : "custom",
        "series" : "custom:1"
    },
    "v" : 100.84432598221046
}
{
    "ts" : ISODate("2001-06-30T22:00:00.000+0000"),
    "meta" : {
        "device" : "custom",
        "series" : "custom:1"
    },
    "v" : 100.14378032120374,
}

When I apply $densify, the result is:

{
    "ts" : ISODate("2000-12-31T23:00:00.000+0000"),
    "meta" : {
        "device" : "custom",
        "series" : "custom:1"
    },
    "v" : 100.84432598221046
}
{
    "ts" : ISODate("2001-01-31T23:00:00.000+0000")
}
{
    "ts" : ISODate("2001-02-28T23:00:00.000+0000")
}
{
    "ts" : ISODate("2001-03-28T23:00:00.000+0000")
}
{
    "ts" : ISODate("2001-04-28T23:00:00.000+0000")
}
{
    "ts" : ISODate("2001-05-28T23:00:00.000+0000")
}
{
    "ts" : ISODate("2001-06-28T23:00:00.000+0000")
}
{
    "ts" : ISODate("2001-06-30T22:00:00.000+0000"),
    "meta" : {
        "device" : "custom",
        "series" : "custom:1"
    },
    "v" : 100.14378032120374,
}

You can see it starts using day 28! Once the monthly step lands on February 28, every following step keeps day 28, and after the DST change the real local month boundary moves from 23:00 to 22:00 UTC, so none of the generated dates line up with real month starts in Europe/Rome.
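
The DST part can be seen concretely with $dateAdd, which (unlike $densify) does accept a timezone. A sketch comparing one month after 2001-02-28T23:00Z (= 2001-03-01T00:00 in Rome) stepped in UTC versus in Europe/Rome; DST starts in Rome on 2001-03-25:

    db.aggregate([
        { "$documents" : [ {} ] },
        { "$project" : {
            // Plain UTC step: keeps day 28 and the 23:00 wall time.
            "utcStep" : { "$dateAdd" : {
                "startDate" : ISODate("2001-02-28T23:00:00Z"),
                "unit" : "month",
                "amount" : 1
            } },
            // Timezone-aware step: lands on the real local month start.
            "romeStep" : { "$dateAdd" : {
                "startDate" : ISODate("2001-02-28T23:00:00Z"),
                "unit" : "month",
                "amount" : 1,
                "timezone" : "Europe/Rome"
            } }
        } }
    ])
    // utcStep  : 2001-03-28T23:00:00Z (what $densify generates above)
    // romeStep : 2001-03-31T22:00:00Z (the real local month start)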

If I have samples like this:

{
    "ts" : ISODate("2001-01-31T23:00:00.000+0000"),
    "meta" : {
        "device" : "custom",
        "series" : "custom:1"
    },
    "v" : 100.50917810392826,
}
{
    "ts" : ISODate("2001-02-28T23:00:00.000+0000"),
    "meta" : {
        "device" : "custom",
        "series" : "custom:1"
    },
    "v" : 100.84507500021624,
}
{
    "ts" : ISODate("2001-03-31T22:00:00.000+0000"),
    "meta" : {
        "device" : "custom",
        "series" : "custom:1"
    },
    "v" : 100.51826982331077,
}

and apply $densify, the result is:

{
    "ts" : ISODate("2001-01-31T23:00:00.000+0000"),
    "meta" : {
        "device" : "custom",
        "series" : "custom:1"
    },
    "v" : 100.50917810392826,
}
{
    "ts" : ISODate("2001-02-28T23:00:00.000+0000"),
    "meta" : {
        "device" : "custom",
        "series" : "custom:1"
    },
    "v" : 100.84507500021624,
}
{
    "ts" : ISODate("2001-03-28T23:00:00.000+0000")
}
{
    "ts" : ISODate("2001-03-31T22:00:00.000+0000"),
    "meta" : {
        "device" : "custom",
        "series" : "custom:1"
    },
    "v" : 100.51826982331077,
}

You can see MongoDB creates a document that is not useful: 2001-03-28T23:00Z is not a real month boundary in Europe/Rome.


I am facing the same issue when trying to densify my data on the date field. Have you found a solution to this problem? I have tried several approaches, but the underlying issue is that $densify is not aware of the timezone, so it will never "catch" the right bucket when the date range spans a DST change.
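
One way to sidestep it, sketched below on the assumption that the data has the ts / meta.series shape from the first post: compute an integer month offset with a timezone-aware $dateDiff, densify that plain number (numeric densify has no timezone problem), then rebuild the missing dates with a timezone-aware $dateAdd:

    // ORIGIN is a local midnight in Europe/Rome (2001-01-01T00:00 local).
    const ORIGIN = ISODate("2000-12-31T23:00:00Z");
    db.samples.aggregate([
        // 1. Whole months between ORIGIN and ts, counted in the local
        //    timezone so the count is stable across the DST change.
        { "$set" : { "monthIndex" : { "$dateDiff" : {
            "startDate" : ORIGIN,
            "endDate" : "$ts",
            "unit" : "month",
            "timezone" : "Europe/Rome"
        } } } },
        // 2. Densify the integer index (numeric range: no unit, no DST).
        { "$densify" : {
            "field" : "monthIndex",
            "partitionByFields" : [ "meta.series" ],
            "range" : { "step" : 1, "bounds" : "full" }
        } },
        // 3. Rebuild ts only for the generated documents ($ifNull keeps the
        //    original timestamps); the timezone-aware $dateAdd lands on real
        //    local month starts (22:00Z or 23:00Z as appropriate).
        { "$set" : { "ts" : { "$ifNull" : [ "$ts", { "$dateAdd" : {
            "startDate" : ORIGIN,
            "unit" : "month",
            "amount" : "$monthIndex",
            "timezone" : "Europe/Rome"
        } } ] } } }
    ])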

Has there been a solution to this? We're facing it too.


Any news on this? It’s a showstopper for us.

Same issue. A simple $group by $dateTrunc followed by $densify: if your timezone offset is not zero, the result contains many extra dates. It seems the $densify stepping does not match what $dateTrunc produces.
Example code below; pay attention to the timezone:

    [
        {
            "$match" : {
                "createdAt" : {
                    "$gt" : ISODate("2023-12-31T22:00:00.000+0000")
                }
            }
        },
        {
            "$group" : {
                "_id" : {
                    "$dateTrunc" : {
                        "date" : "$createdAt",
                        "unit" : "month",
                        "timezone" : "+0200"
                    }
                }
            }
        },
        {
            "$densify" : {
                "field" : "_id",
                "range" : {
                    "step" : 1.0,
                    "unit" : "month",
                    "bounds" : [
                        ISODate("2023-12-31T22:00:00.000+0000"),
                        ISODate("2024-12-31T22:00:00.000+0000")
                    ]
                }
            }
        }
    ]
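
With a fixed +0200 offset the extra dates are not even a DST problem; they come from day-of-month sticking. The first post's output suggests $densify steps from the previous generated value rather than from the lower bound, the way chained $dateAdd calls do, so once the sequence hits February 29 it stays on day 29 while $dateTrunc keeps returning true month starts. A sketch of that assumption:

    db.aggregate([
        { "$documents" : [ { "d0" : ISODate("2023-12-31T22:00:00Z") } ] },
        // Chain month steps from the previous value, clamping the day
        // when the target month is shorter.
        { "$set" : { "d1" : { "$dateAdd" : { "startDate" : "$d0", "unit" : "month", "amount" : 1 } } } },
        { "$set" : { "d2" : { "$dateAdd" : { "startDate" : "$d1", "unit" : "month", "amount" : 1 } } } },
        { "$set" : { "d3" : { "$dateAdd" : { "startDate" : "$d2", "unit" : "month", "amount" : 1 } } } },
        // What $group would emit for any April document at +0200.
        { "$set" : { "groupId" : { "$dateTrunc" : {
            "date" : ISODate("2024-04-15T10:00:00Z"),
            "unit" : "month",
            "timezone" : "+0200"
        } } } }
    ])
    // d1 = 2024-01-31T22:00Z, d2 = 2024-02-29T22:00Z, d3 = 2024-03-29T22:00Z;
    // groupId = 2024-03-31T22:00Z. Once the step sticks to day 29, the
    // densified values and the $group _id values stop coinciding, so every
    // subsequent month contributes an extra document.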