Description
Hello, I have sent a handful of `TelemetryDeck.Referral.userRatingSubmitted` signals with various `TelemetryDeck.Referral.ratingValue` values. However, when I try to visualize these values in a histogram, they all seem to be dumped into the first bucket, and I cannot figure out why.
My query is as follows:
```json
{
  "aggregations": [
    {
      "fieldName": "TelemetryDeck.Referral.ratingValue",
      "name": "RatingsSketch",
      "type": "histogram",
      "splitPoints": [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
    },
    {
      "type": "doubleMin",
      "fieldName": "TelemetryDeck.Referral.ratingValue",
      "name": "Min"
    },
    {
      "type": "doubleMax",
      "fieldName": "TelemetryDeck.Referral.ratingValue",
      "name": "Max"
    }
  ],
  "filter": {
    "dimension": "type",
    "type": "selector",
    "value": "TelemetryDeck.Referral.userRatingSubmitted"
  },
  "granularity": "all",
  "queryType": "timeseries"
}
```
This gives the following result:
As you can see from the Min and Max aggregations, the range of values I've sent is being identified correctly; however, all the values in the histogram seem to end up in bucket 0.
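For what it's worth, here is roughly how I expected the split points above to partition the values. This is just a generic Python sketch of split-point bucketing, not TelemetryDeck's or Druid's actual implementation:

```python
import bisect

# The splitPoints from my query: values below splits[0] go to bucket 0,
# values in [splits[i-1], splits[i]) go to bucket i, and values >= splits[-1]
# go to the final overflow bucket.
split_points = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10]

def bucket_index(value, splits):
    """Return the index of the histogram bucket a numeric value falls into."""
    return bisect.bisect_right(splits, value)

# Ratings between 1 and 10 should spread across buckets,
# not all collapse into bucket 0.
ratings = [1, 3, 3, 7, 10]
counts = {}
for r in ratings:
    idx = bucket_index(r, split_points)
    counts[idx] = counts.get(idx, 0) + 1
```

Under that interpretation, only values below the first split point should ever land in bucket 0, which is why the observed result surprises me.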
I'm very new to this, so I'm not sure if I'm missing something or if this is a bug.
If it helps, the signals were sent with a custom SDK I am developing (modeled on the Swift SDK). The histogram works correctly when I check the `TelemetryDeck.Signal.durationInSeconds` field of duration signals that are constructed and sent the same way, so I'm not sure why it behaves differently for this field.
Thanks for any help :)