I have 1 Hz sensor data available to me as one CSV file per day.
I imported this to a table and can report on it, but maybe I’m thinking of this wrong. Would I be better off converting this to time series data?
If so, how do I do that?
A good general rule of thumb is that time series data is best stored on devices. The underlying storage mechanism and the dashboards are much better suited for that kind of data.
When you report device state, you can backdate the timestamp up to 30 days into the past. This feature is specifically designed for batch reporting like the one in your example.
I’ve actually implemented an automated approach almost identical to this, though it does involve a few moving pieces. In my case, the report data was an .xls (Excel) file, but CSV is essentially the same. Handling either format is not particularly easy in a Losant workflow, so I used a Google Cloud Function (AWS Lambda would also work) to parse the data and then used the Losant JavaScript API client to report it via the Losant API. The daily reports were sent via email, and I used Mailgun to receive the email and invoke the Cloud Function. So the flow was:
Email -> Mailgun -> Cloud Function -> Losant API
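The parsing step inside that Cloud Function might look something like the sketch below. This is not the original script; the function and column names are illustrative, and it assumes the first CSV column holds the timestamp with the remaining columns holding numeric attribute values. It turns raw CSV text into the array of `{ time, data }` objects that Losant device-state reporting expects.

```javascript
// Sketch of the Cloud Function's CSV-parsing step (illustrative names,
// not the original code). Converts CSV text into Losant device-state
// objects of the shape { time, data }.
function csvToDeviceStates(csvText) {
  const lines = csvText.trim().split('\n');
  const headers = lines[0].split(',').map((h) => h.trim());
  return lines.slice(1).map((line) => {
    const cells = line.split(',').map((c) => c.trim());
    const row = {};
    headers.forEach((h, i) => { row[h] = cells[i]; });
    // Assumption: first column is the timestamp; the rest are attributes.
    const { [headers[0]]: time, ...attrs } = row;
    const data = {};
    for (const [key, value] of Object.entries(attrs)) {
      data[key] = Number(value); // assumes numeric attributes
    }
    return { time: new Date(time).toISOString(), data };
  });
}
```

Each resulting object can then be handed to the Losant API as a device-state report.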
When sending device state using the Losant API, you can send multiple individually timestamped states in a single call. You just have to keep the entire request under 256KB, so the number of states you can send at once depends on how many attributes you have. So instead of looping 86,400 times (one per second for 24 hours), you may be able to send 100 or more at a time, reducing both the time and the number of API calls.
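The batching described above can be sketched as a helper that groups state objects by their serialized size. The 256KB limit comes from the post; the headroom margin and function name here are my own assumptions.

```javascript
// Sketch: group individually timestamped state objects into batches that
// stay under Losant's 256KB-per-request limit. The default leaves some
// headroom below 256KB; tune to taste.
function batchStates(states, maxBytes = 200 * 1024) {
  const batches = [];
  let current = [];
  let currentSize = 2; // accounts for the enclosing "[]"
  for (const state of states) {
    // +1 approximates the "," separator between array elements.
    const size = Buffer.byteLength(JSON.stringify(state)) + 1;
    if (current.length > 0 && currentSize + size > maxBytes) {
      batches.push(current);
      current = [];
      currentSize = 2;
    }
    current.push(state);
    currentSize += size;
  }
  if (current.length > 0) batches.push(current);
  return batches;
}
```

Each batch would then be sent as the `deviceState` array of a single state-reporting API call.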
The number of pieces that had to come together to solve this problem definitely points to a good feature idea, which is providing the ability to import CSV data as device state.
Great feature idea, I think.
I already know of several customer systems that pump out CSV files that someone later has to import into Excel, run some macros on, and graph.
If their existing systems could then redirect their CSV output to Losant via FTPS or similar, it would be an easy transition.
So if I wanted to convert data that lives in a Losant data table to device data (at least for a test), I would have to loop through each row with a workflow and report device state?
That could work, depending on the amount of data. Even though each Device State node is quick, when you multiply that by tens of thousands of executions, it’s very easy to hit the 60-second workflow timeout with large looping operations like this.
I might recommend using a local script (Node.js, Python, Ruby, etc) to do the import using the Losant API. You can export the Data Table to a CSV from the Losant UI, or just use your original CSV files you started with.
I wrote a working Node.js script that can be found here:
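The core of such a script might look like the sketch below. This is not the linked script; it assumes the `losant-rest` client's `device.sendState` call, and everything else (function name, batch size, parameters) is illustrative. The API client is passed in as a parameter so the send loop can be exercised with a stub.

```javascript
// Hypothetical sketch of a bulk-import loop using the Losant API client.
// 'client' is assumed to be a losant-rest client instance; the batching
// size of 100 is an illustrative choice, not a documented limit.
async function importStates(client, applicationId, deviceId, states, batchSize = 100) {
  let sent = 0;
  for (let i = 0; i < states.length; i += batchSize) {
    const deviceState = states.slice(i, i + batchSize);
    // One API call reports a whole batch of individually timestamped states.
    await client.device.sendState({ applicationId, deviceId, deviceState });
    sent += deviceState.length;
  }
  return sent;
}
```

Feeding it the parsed rows of your exported CSV, 100 at a time, keeps the call count down compared with one call per row.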
I was able to test this successfully yesterday, but now I’m getting this error in the Node.js console:
(node:4680) UnhandledPromiseRejectionWarning: Error: deviceState no (or more than one) schemas match
Never mind. I had the timestamp formatted wrong.
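For anyone hitting the same schema error: an unparseable `time` value is a common cause. A defensive sketch (the helper name is my own; check the Losant docs for the exact accepted formats, but ISO-8601 strings and epoch milliseconds are safe bets) is to normalize every timestamp before sending:

```javascript
// Normalize a timestamp to an ISO-8601 string before sending device state.
// Accepts anything the JS Date constructor can parse, including epoch ms.
function toIsoTime(value) {
  const d = new Date(value);
  if (isNaN(d.getTime())) {
    throw new Error(`Unparseable timestamp: ${value}`);
  }
  return d.toISOString();
}
```

Failing fast on a bad timestamp locally is much easier to debug than an UnhandledPromiseRejectionWarning from the API call.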