Guys, I am trying to understand if there are any technical or practical limits on the number of workflow storage items that can be leveraged in a given workflow. I am using the On Change node with a change identifier to parse some heavily nested API payloads. Obviously this results in an explosion of storage items, since a key is created for each parent object plus each child object as the payload is parsed. Any guidance on when this design becomes unmanageable with workflow storage / On Change?
The Workflow Storage UI (the area that lets you manage, delete, etc., your storage keys) will have some difficulty after a few hundred entries, depending on your browser's performance. This is purely a front-end consideration and will not impact workflow execution. The fix on our end would be to add filtering and pagination to that list.
An expected use-case for On Change and Throttle identifiers is a device ID, so the number of keys could get quite large depending on the number of devices reporting data.
As long as your implementation does not create a continuously growing list of unique keys, you should be good to go. In the device ID example, there will only be as many keys as there are devices.
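To illustrate the point above, here is a minimal Python sketch of the per-identifier On Change pattern (class and method names are hypothetical; in practice the platform manages workflow storage for you). The key takeaway is that the storage key count is bounded by the number of unique identifiers, such as device IDs, rather than growing without limit:

```python
class OnChangeStore:
    """Tracks the last seen value per change identifier and reports changes."""

    def __init__(self):
        # One storage entry per identifier, e.g. per device ID.
        self._storage = {}

    def has_changed(self, identifier, value):
        """Return True (and record the new value) only when value differs."""
        if identifier in self._storage and self._storage[identifier] == value:
            return False
        self._storage[identifier] = value
        return True

    def key_count(self):
        """Number of stored keys == number of unique identifiers seen."""
        return len(self._storage)


store = OnChangeStore()
print(store.has_changed("device-1", 72))  # first report for this device: True
print(store.has_changed("device-1", 72))  # same value again: False
print(store.has_changed("device-1", 75))  # value changed: True
print(store.key_count())                  # still one key per device: 1
```

Because each device reuses its own key, reporting more data does not create more keys; only onboarding more devices does.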
Thanks for the guidance on the expected use-case. I don't think the UI pagination / filtering limitation will be a problem for this workflow, but I'll let you know if the device count gets out of hand.