Hi everyone.
I need to export time series device data at a 1-minute resolution over a 30-day duration to CSV.
But when I use the Losant API Node's Time Series Query action, the debug panel returns this error:
“Workflow payload is not available because it exceeded the maximum debug message size of 5MB.”
How can I get around this?
My workflow works fine if I use a 60-minute resolution, for example.
Here is my Losant API query:
{
  "aggregation": "LAST",
  "attributes": {{ jsonEncode working.attributesName }},
  "deviceIds": [{{ jsonEncode working.machine.deviceId }}],
  "duration": 2.592e+9,
  "end": 0,
  "resolution": 60000
}
Here, attributesName is an array of the attributes I'm querying.
After this query, I just save the CSV file in Application Files.
The end goal is to have a button in my application that the user clicks to export the data; for now, I've left it saving to Application Files.
Thanks! I hope you guys understand my problem.
The message you are seeing only applies to the output of the workflow debug log, not to the actual, in-flight payload generated by the workflow. So though you cannot see the payload output because it is too large, the flow is executing and the values are available. I would recommend using a smaller duration / larger resolution while doing active development and then bumping the value back to its desired level once you are happy with the workflow.
Also, you mentioned the need to "export time series device data … to CSV". If you simply want that data delivered to an inbox as a CSV, you could use the Data: Export API endpoint instead, which would kick the request to an asynchronous job. That would speed up your workflow execution and also simplify it tremendously.
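For reference, here is a rough sketch of what that could look like in a Losant API Node set to the Data: Export action. The field names and payload paths below are illustrative assumptions on my part, so confirm them against the Data: Export documentation:

{
  "deviceIds": [{{ jsonEncode working.machine.deviceId }}],
  "attributes": {{ jsonEncode working.attributesName }},
  "start": {{ working.startTime }},
  "end": {{ working.endTime }},
  "email": {{ jsonEncode working.userEmail }}
}

The export then runs as a background job, and a link to the generated CSV arrives in that inbox when it completes.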
Thank you, Dylan.
First of all, I'm from Brazil and my English is not that good, so again, I hope you understand.
I've tried Data: Export before, but my idea is for my application to have a button called "Export Data" or something like that on the device page; when the button is clicked, the file would be downloaded in the browser.
What is a good way to do that?
I know that I can save the file in Application Files and then in HTML do something like:
<a href="path_to_file" download="proposed_file_name">Download</a>
But I don't want to save these files in my Application Files because, that way, they would be public static files, right? Is FTP the best way?
@Igo_Monteiro there are two different ways you could accomplish this …
- User clicks your “Export Data” button
- Doing so submits an asynchronous request to an Experience Endpoint that fires an Experience Workflow.
- That workflow queries the data you’re requesting using a Data: Time Series Node (or, if necessary based on the nature of your query, the Data: Time Series Query action of the Losant API Node).
- Convert points from the JSON object to a CSV string using a CSV: Encode Node. You may need to insert one or more nodes before this point to modify the data for your specific purposes. (You could also do this conversion to CSV browser-side, skipping this step in the workflow and using a library to do the conversion in the user’s browser.)
- Using an Endpoint: Reply Node, send the CSV string back to the browser.
- On receipt in the browser, make the browser download the result (a browser-side sketch follows this list).
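For that last step, here is a minimal browser-side sketch. The endpoint path, query parameter, and file name are all illustrative; it assumes your Experience Endpoint replies with the CSV string:

// Fetch the CSV from the Experience Endpoint and trigger a browser download.
async function downloadExport(deviceId) {
  const response = await fetch("/export-data?deviceId=" + encodeURIComponent(deviceId));
  if (!response.ok) throw new Error("Export failed: " + response.status);

  // Wrap the CSV reply in a Blob and click a temporary link to download it.
  const blob = await response.blob();
  const url = URL.createObjectURL(blob);
  const link = document.createElement("a");
  link.href = url;
  link.download = "device-data.csv";
  document.body.appendChild(link);
  link.click();
  link.remove();
  URL.revokeObjectURL(url);
}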
If the size of the data you are attempting to request is too big to fit into a single API call, you could instead do the following:
- In Losant, create a webhook. You’ll need the URL of the webhook in a later step.
- On click of “Request Data Export”, fire an Experience Endpoint as before. You can provide immediate feedback to the user that the request is being worked on and that they will get a link in their email.
- In the workflow run, use a Losant API Node and execute a Data: Export request, making sure to include a callback URL in the body of the request. That callback URL should be the URL of the webhook you created. You will also need to include the email address of the user who made the request as a query parameter on that URL (a sketch follows this list).
- Create an application workflow and, within it, use a Webhook Trigger and set it to execute on requests to the webhook you created.
- In the body of the webhook request will be a temporary signed URL for the generated CSV. Using the email address passed through the query parameter, you can send this URL to the user with an Email Node or a SendGrid Node.
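Roughly, that Data: Export request could look like the following. The field names and payload paths here are illustrative, so double-check them against the Data: Export documentation, and replace YOUR_WEBHOOK_ID with the token from the webhook you created in the first step:

{
  "deviceIds": [{{ jsonEncode working.machine.deviceId }}],
  "attributes": {{ jsonEncode working.attributesName }},
  "start": {{ working.startTime }},
  "end": {{ working.endTime }},
  "callbackUrl": "https://triggers.losant.com/webhooks/YOUR_WEBHOOK_ID?email={{ experience.user.email }}"
}

When the export job finishes, the webhook fires with the signed CSV URL in its body and the email address in its query parameters, which is everything the Email Node needs.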
Hi Dylan, I was doing almost exactly what you said in the first solution, but I'm still having the problem I mentioned in my first question: "Workflow payload is not available because it exceeded the maximum debug message size of 5MB."
You replied that this message was just from the debug log and that the workflow would run anyway, but that's not what is happening; the workflow is stopping after running the Time Series API Node.
The workflow is stopping execution? Can you post a screenshot of the debug messages?
If so, you will need to go with the second method I described.
Yes, the workflow stops.
[screenshot of the workflow debug messages]
I'll stick with the first solution; it fits better.
I'll talk with my team, and I guess we will just use a smaller duration / larger resolution like you said before, and in the future make these parameters adjustable by the user.
I see; I was thinking you were seeing the “payload is too large” message being output by a regular Debug Node, but it’s actually the Function Node throwing an error and failing to display the payload at that point. The Function Node can only operate on 5MB of data in one execution, which is why the workflow is stopping.
I don't know what you are doing inside of that Function Node, but you may be able to avoid this error in one of two ways:
- Use native nodes to do your data transformations (such as the Object Node, Array Node, or Mutate Node) if your use case allows it.
- If possible, limit the size of the payload you are passing into the Function Node by setting a function scope. This is a new feature we released a couple of weeks ago.
OK, I'll explain what I'm doing.
My workflow is this (screenshot attached):
I hid the node names because the company I work for is using a white label (enterprise license), so I don't know how much of it I'm allowed to show.
Below is my time series query; before this node, I'm just limiting the attributes I'll query and storing them in the "working.attributesName" array.
{
  "aggregation": "LAST",
  "attributes": {{ jsonEncode working.attributesName }},
  "deviceIds": [{{ jsonEncode working.machine.deviceId }}],
  "duration": 1.21e+9,
  "end": 0,
  "resolution": 300000
}
I've limited it to 14 days and a 5-minute resolution, but the original idea was to use 30 days and a 1-minute resolution.
After that, I'm mapping the result to encode in the CSV node:
// Grab the device ID and build the CSV headers: one column per attribute, plus "time".
const { deviceId } = payload.working.machine;
// Copy the array rather than pushing onto it, so working.attributesName is not mutated.
const csvHeaders = [...payload.working.attributesName, "time"];

// Flatten each time series point into a row object for the CSV: Encode Node.
const csv = { "in": [] };
payload.working.tsResult.result.devices[deviceId].points.forEach(point => {
  csv["in"].push({
    ...point.data,
    time: point.time
  });
});

payload.working.csvData = csv;
payload.working.csvHeaders = csvHeaders;
I'll give the "function scope" a try.
Attached is an example workflow that accomplishes all the same things without the use of a Function Node. I tested it in an application I have that generates random data points for 3 attributes on one device every 5 seconds. Using this I was able to submit your full request: 30 days of data at a 1-minute resolution, and I was able to get the CSV string out as well. Please review this and let me know if it works for your use case, and if you have any other questions. (You will need to integrate it with your existing workflow and maybe change some of my variable names.)
time-series-to-csv-develop.flow (5.6 KB)
Hi Dylan,
that worked for me: 30 days and 1-minute resolution. Thank you so much!
One last question: I noticed that you used "lookup" in the Mutate template; I didn't know this "shortcut" existed. Where can I find this and more functions of this type in the documentation?
The lookup helper is actually a built-in Handlebars helper, and you are not the first person who was unaware it existed. For that reason we are actually updating our Accessing Payload Data documentation page to include it and other native Handlebars helpers alongside the ones special to the Losant platform.
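In the meantime, a quick illustration of lookup (the payload paths here are made up): it reads a property whose name is only known at runtime, so if working.attributeName holds the string "temperature", then

{{ lookup working.point.data working.attributeName }}

resolves to the same value as working.point.data.temperature would.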
Hi @Dylan_Schuster,
when I executed the workflow (without modifying it), I got this error:
[screenshot of the error]
The issue seems to be this part of the Mutate Node:
[screenshot of the Mutate Node configuration]
Do you have any idea how this issue can be resolved?
You’ll need to adjust the values in the workflow’s Virtual Button to match the values for your application.
deviceId is a unique ID specific to the device in the application I originally developed this workflow in; this should become the ID of the device you wish to export data for in your application. (The workflow must be in the same application as the device you are trying to export.)
You probably also want to change the attributes array to the names of the attributes on your device that you wish to export.
The other settings can remain the same unless you want a different time range, resolution, end date, or aggregation method.
Thanks. Got it.
This workflow does, however, only work for a single deviceId.
Correct, but it can easily be adapted to support multiple devices by ID or by tag query, with some adjustments to the request body in the Losant API Node and in the Virtual Button that feeds data into the workflow execution.
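As a rough sketch, the request body could become something like the following, where the working.deviceIds array (my naming) would be supplied by the Virtual Button; the Time Series Query also accepts a deviceTags query if you would rather match devices by tag:

{
  "aggregation": "LAST",
  "attributes": {{ jsonEncode working.attributesName }},
  "deviceIds": {{ jsonEncode working.deviceIds }},
  "duration": 2.592e+9,
  "end": 0,
  "resolution": 60000
}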