I’m assuming that none of the graphics in this demo are standard blocks or Losant dashboards? I was curious which technologies were used to create them, and whether it was hosted on the Losant Platform.

https://cdn2.hubspot.net/hubfs/742943/Website/Images/losant-ar-demo-ipad.jpg
It is in fact hosted within Losant using Losant Experiences. And you are correct, this demo is not using standard blocks or Losant dashboards.
In this case, this is a client-side web app hosted within an Experience, which is what gives the app access to the devices’ data. In addition, this demo has a really nice AR component that was done using AR.js.
For more info, here are some good resources on Experiences:
The video was great. Do you have a Bootstrap or GitHub repository for the Asset Tracking example or the Smart Office example in the video?
At the moment, I don’t think anything is published. I’ll talk with the team to see how we can get some of those examples out there. Are you looking for anything specific?
The filtering for the asset tracking is pretty similar to what I had in mind.
It would be great to get additional Experience examples published. Brandon’s earlier example would be especially handy.
I for one have been struggling to get traction with Experiences, and demos would be appreciated (a new University session on Experiences would be awesome).
Just a quick correction to Taron’s post above: the application in that screenshot used ARToolKit for the marker detection. We’ve since updated the application to use js-aruco instead, a JavaScript port of the OpenCV-based ArUco library, which offers more performant detection with smaller markers.
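For anyone curious what the js-aruco side of this looks like, here is a minimal sketch of a detection loop. It assumes js-aruco’s `AR.Detector`, whose `detect(imageData)` returns markers carrying an integer `id` and four `corners`; the marker-to-device lookup table is hypothetical, and the browser wiring is guarded so the pure helper can run anywhere.

```javascript
// Pure helper: map a detected marker id to a device id.
// The lookup table contents here are made up for illustration.
function deviceForMarker(markerId, table) {
  return Object.prototype.hasOwnProperty.call(table, markerId)
    ? table[markerId]
    : null;
}

// Browser-only wiring (skipped outside a browser / without js-aruco loaded).
if (typeof window !== 'undefined' && typeof AR !== 'undefined') {
  const detector = new AR.Detector();
  const video = document.querySelector('video');
  const canvas = document.createElement('canvas');
  const ctx = canvas.getContext('2d');

  function tick() {
    // Copy the current camera frame into a canvas so we can read pixels.
    canvas.width = video.videoWidth;
    canvas.height = video.videoHeight;
    ctx.drawImage(video, 0, 0);
    const imageData = ctx.getImageData(0, 0, canvas.width, canvas.height);

    // Detect markers in the frame; each marker has an id and corners.
    const markers = detector.detect(imageData);
    markers.forEach((m) => {
      const deviceId = deviceForMarker(m.id, { 7: 'dev-asset-7' });
      if (deviceId) {
        // subscribe to this device's state stream here
      }
    });
    requestAnimationFrame(tick);
  }
  requestAnimationFrame(tick);
}
```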
When a marker is detected, it uses the jquery-sse library to open a streaming connection to the appropriate device’s state stream. Since the graphs required a very specific visual style, they were created directly using D3.
We use the browser’s Fullscreen API to put the web page into full screen after it loads. Full screen gives the app a nice native feel.
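A minimal sketch of that fullscreen step, assuming it is triggered on the first tap (browsers only grant fullscreen from a user gesture, so it can’t fire purely on load); the vendor-prefixed fallback is for older Safari:

```javascript
// Request fullscreen on an element, falling back to the
// WebKit-prefixed method where the standard one is missing.
function enterFullScreen(el) {
  if (el.requestFullscreen) return el.requestFullscreen();
  if (el.webkitRequestFullscreen) return el.webkitRequestFullscreen();
  return Promise.reject(new Error('Fullscreen API unavailable'));
}

// Browser-only wiring: go fullscreen on the first user interaction.
if (typeof window !== 'undefined') {
  document.addEventListener('click', () => {
    enterFullScreen(document.documentElement).catch(() => {
      // User or browser declined; the app still works windowed.
    });
  }, { once: true });
}
```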