I'm using a custom data feed, and when I switch resolutions in the chart widget (e.g., 1m, 5m, 1h), the getBars method always receives '1D'. The UI updates to show the selected timeframe, but the resolution argument passed to getBars never changes.
Can someone help me understand exactly where the resolution is set, or how getBars gets called, so I can make sure it receives the correct resolution for the selected timeframe?
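For reference, here's a minimal sketch of how I understand the datafeed contract: the widget is supposed to pass the currently selected resolution as the second argument of getBars (this is the recent-version signature with a periodParams object; older versions pass from/to directly). The symbol name and resolution strings below are just placeholders, and the simulated calls stand in for what the widget would do:

```javascript
// Sketch of the datafeed contract: the widget passes the selected
// resolution into getBars; it should not be hardcoded anywhere.
const received = [];

const datafeed = {
  onReady: (cb) => cb({ supported_resolutions: ['1', '5', '60', '1D'] }),
  resolveSymbol: (symbolName, onResolved) => {
    // supported_resolutions on the resolved symbol must include every
    // resolution the UI offers; if it doesn't, the library can fall
    // back to a default such as '1D'.
    onResolved({ name: symbolName, supported_resolutions: ['1', '5', '60', '1D'] });
  },
  getBars: (symbolInfo, resolution, periodParams, onResult) => {
    received.push(resolution); // the selected timeframe should arrive here
    onResult([], { noData: true });
  },
};

// Simulate the widget switching timeframes:
for (const res of ['1', '5', '60']) {
  datafeed.getBars({ name: 'AAPL' }, res, { from: 0, to: 0 }, () => {});
}
console.log(received); // ['1', '5', '60'] in this simulation -- in my app I only ever see '1D'
```

In my actual setup, the equivalent of `received` only ever contains '1D', which makes me suspect either my `supported_resolutions` lists or something in how the widget's interval option interacts with the datafeed.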
Thanks in advance!