Help with slow time-series queries

I have dashboard chart data that needs to be stored as an array, so it's being stored as a JSON string in Losant. My time-series queries are now slowing down because there is so much data coming from more than 800 devices, and I can't aggregate the string data. Any tips on how to get around this?

Can you please describe your data in more detail? Are you reporting arrays (as strings) to your device attributes? Are you able to provide some examples?

Yes, we are reporting arrays as strings in device state. We are using a Plotly Sankey diagram to graph the order of button presses on a consumer-facing display. Each reported device state includes source and target arrays; if a user pressed buttons 1, 2, then 3, the state would look like:

{
  "source": [1, 2, 3],
  "target": [2, 3, 0]
}

The workflow gathers all of the source and target arrays and calculates a value array, which counts, for example, how many users went from button 1 to button 2 on step 1, and so on.
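Roughly, in Python terms (the names here are just for illustration, not our actual workflow code), the calculation looks like:

from collections import Counter

# Illustrative input: one dict per reported device state, each holding
# the parallel source and target arrays described above.
device_states = [
    {"source": [1, 2, 3], "target": [2, 3, 0]},
    {"source": [1, 3], "target": [3, 0]},
]

# Tally how many times each (source, target) transition occurred
# across every reported state.
transitions = Counter()
for state in device_states:
    for src, tgt in zip(state["source"], state["target"]):
        transitions[(src, tgt)] += 1

# Flatten into the parallel arrays a Plotly Sankey trace expects.
source = [src for src, _ in transitions]
target = [tgt for _, tgt in transitions]
value = list(transitions.values())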

Ok, I think I understand. You've reached an amount of data where you'll need to switch to Notebooks. As you're experiencing, Workflows and Dashboards aren't designed for batch processing. Notebooks, on the other hand, are specifically designed to batch process data from virtually any number of devices.

It sounds like you're calculating an aggregation across all users/devices to generate a big picture of what's going on. Something like that likely only needs to be calculated hourly, or maybe even once a day, rather than recalculated every time the dashboard is visited.

I recommend creating a Notebook that runs hourly or daily to continually recalculate the data for this view, and storing the results in a Data Table, which can then be queried directly by your Custom HTML Block (I'm assuming you're using a Custom HTML Block).
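As a rough sketch of what that Notebook could look like (assuming a device-data input file named states.csv with the source/target attributes stored as JSON strings, and a Data Table output named sankey.csv; both file names are hypothetical), the core of it is the same transition count run in batch. Notebooks expose their input and output directories through the INPUT_DIR and OUTPUT_DIR environment variables:

import os
import json
from collections import Counter

import pandas as pd

# Losant provides these paths to the notebook at execution time.
input_dir = os.environ['INPUT_DIR']
output_dir = os.environ['OUTPUT_DIR']

# Hypothetical device-data input: one row per reported state, with the
# source/target arrays stored as JSON strings in their attribute columns.
states = pd.read_csv(os.path.join(input_dir, 'states.csv'))

# Count every (source, target) transition across all rows.
transitions = Counter()
for _, row in states.iterrows():
    for src, tgt in zip(json.loads(row['source']), json.loads(row['target'])):
        transitions[(src, tgt)] += 1

# Write one row per Sankey link; configure this file as the notebook's
# Data Table output so each run refreshes the table.
result = pd.DataFrame(
    [(src, tgt, count) for (src, tgt), count in transitions.items()],
    columns=['source', 'target', 'value'],
)
result.to_csv(os.path.join(output_dir, 'sankey.csv'), index=False)

If the Data Table output is configured to replace the table's rows on each run, your Custom HTML Block will always query the latest aggregate instead of recomputing it per page view.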

To run the Notebook on an interval, use a workflow with a Timer Trigger connected to a Notebook Execute Node.