I am working on a new set of requests for a client, and I am looking for help/tips on the most efficient way to set up the workflows/exports.
I would like to collect specific device states when the device sends an alert as well as when that alert is cleared.
For example:
System Shutdown = 0 at 14:00:00 UTC
System Shutdown = 1 at 15:00:00 UTC
The report would run every hour and send the durations between device state changes, for multiple devices, as a CSV file to an FTP site.
Any help in making this process clearer would be much appreciated!
Are you trying to export Losant events, or device state data, to this FTP site? I’m unclear on that from your explanation.
Also, do you just want to put up the raw CSV data, or do you need to do some sort of processing / calculation on that data in a Losant workflow first?
We do have a Stream Exports to AWS S3 template that I think would be useful to you; it demonstrates how to …
- Trigger an asynchronous task (in your case, you’d use a Timer Trigger hooked up to a Losant API Node that calls the Data: Export action).
- Provide a callback URL in the request, with the value being the URL of a Webhook from your application.
- Trigger another workflow execution off the request to the callback URL, using the Webhook Trigger.
- Pipe the CSV directly to your FTP server using the FTP: Put Node. (This replaces the AWS S3: Put Node in the template.)
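To make the callback piece concrete, here’s a rough sketch of how the Data: Export request might be shaped, with your Webhook’s URL passed as the callback. The field names (`callbackUrl`, `start`, `end`, `deviceIds`) and the URL below are illustrative assumptions, not the exact schema — check the Data: Export documentation for the real request body:

```python
import json

def build_export_request(application_id, callback_url, start_ms, end_ms, device_ids):
    # Hypothetical request shape for illustration only; field names are
    # assumptions. Consult the Losant Data: Export docs for the real schema.
    return {
        "url": f"https://api.losant.com/applications/{application_id}/data/export",
        "body": {
            "callbackUrl": callback_url,  # the URL of your application's Webhook
            "start": start_ms,            # export window start (ms since epoch)
            "end": end_ms,                # export window end (ms since epoch)
            "deviceIds": device_ids,      # limit the export to these devices
        },
    }

req = build_export_request(
    "APP_ID",
    "https://triggers.losant.com/webhooks/EXAMPLE",
    0,
    3600000,
    ["dev1"],
)
print(json.dumps(req, indent=2))
```

When the export finishes, Losant makes a request to that callback URL, which is what fires the second workflow via the Webhook Trigger.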
Let me know if that points you in the right direction.
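As for the “durations between changes” part of your report: however you end up doing it (in the workflow or in a script on your side), the calculation is just pairwise differences between consecutive state-change timestamps. A minimal sketch, with made-up records and column names based on your System Shutdown example:

```python
import csv
import io
from datetime import datetime, timezone

def state_durations(records):
    """Given (timestamp, state) pairs sorted by time, return one row per
    interval: (state, start, end, duration in seconds)."""
    rows = []
    for (t0, s0), (t1, _s1) in zip(records, records[1:]):
        rows.append((s0, t0.isoformat(), t1.isoformat(), (t1 - t0).total_seconds()))
    return rows

def to_csv(rows):
    # Render the duration rows as CSV text, ready to upload.
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["state", "start", "end", "duration_s"])
    writer.writerows(rows)
    return buf.getvalue()

# Example data matching the System Shutdown scenario above.
records = [
    (datetime(2024, 1, 1, 14, 0, tzinfo=timezone.utc), 0),  # System Shutdown = 0
    (datetime(2024, 1, 1, 15, 0, tzinfo=timezone.utc), 1),  # System Shutdown = 1
]
print(to_csv(state_durations(records)))  # one row: state 0 held for 3600.0 s
```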