I use the Slack output workflow trigger for debugging and monitoring JSON/payload messages (yes, I know debug does the same). The free version of Slack stores the latest 10K messages. For $8/month, Slack will store all messages. One of our clients’ 800+ devices have generated over 5 million payloads/messages, all of which are searchable. Slack claims there is no limit in the paid version. Interesting, as I bet there really is one. If I ever reach it, I will update this post.
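For anyone curious what this looks like outside of a Losant workflow node, the same effect boils down to POSTing the payload to a Slack incoming webhook. Here is a minimal sketch; the webhook URL, device ID, and helper names are my own illustrations, not anything from Losant or Slack:

```python
import json
import urllib.request

def format_slack_message(device_id, payload):
    """Format a device payload as the JSON body for a Slack incoming-webhook POST.

    Putting the whole payload in the message text is what makes it
    full-text searchable in Slack later.
    """
    text = f"Device {device_id}: {json.dumps(payload, sort_keys=True)}"
    return {"text": text}

def post_to_slack(webhook_url, body):
    """POST the message body to a Slack incoming webhook (requires network access)."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)

if __name__ == "__main__":
    # Hypothetical example values for illustration only.
    body = format_slack_message("device-042", {"temp": 72, "humidity": 41})
    # post_to_slack("https://hooks.slack.com/services/...", body)
    print(body["text"])
```

The Losant Slack node handles all of this for you; the point of the sketch is just that each payload becomes one searchable message.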
This is an interesting approach to indexing debug messages. Have you looked into a service like Loggly? Their free tier allows up to 200 MB per day of data going into the service. Although it does not have the data retention that Slack may have, it may provide some better options than Slack itself.
Is there a reason you are using Slack to store your payload messages instead of something like Application Archiving, which saves your data beyond your Losant data retention limit? Is it solely for the indexing and searchability of messages in Slack?
We love seeing the creative solutions you come up with!
I haven’t, so thanks for the link. I realize my post may not be that practical. The “real” reason for Slack is that during presentations/demos it is nice to have a Slack window displaying messages as I jump from screen to screen. It just occurred to me that the paid version of Slack retains every message ever sent, and that even though my messages come from machines rather than humans, they still qualify. Searches across millions of messages take less than a second, so Slack’s indexing is clearly very efficient. It is also fun to tell people that all of my Slack messages are from machines and not humans…