Global Variable Too Large

I’m currently using a global variable to house configuration parameters for all of my fleets. There are so many parameters that I’ve reached the maximum size allowed for the variable, so I must break that information apart into independent per-fleet configurations.

I could make a global variable for each fleet to eliminate this issue, but it almost feels like a better approach would be to load from a file on demand based on the fleet. Done that way, I could also iterate over those files periodically for integrity checks (file changed dates, SHA hashes, etc.).
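Roughly what I mean by the integrity check, sketched in Python outside of any workflow; the file path and expected hash here are hypothetical:

```python
import hashlib
import json
from pathlib import Path

def load_fleet_config(path: str, expected_sha256: str) -> dict:
    """Load one fleet's config file, verifying its SHA-256 before use."""
    raw = Path(path).read_bytes()
    digest = hashlib.sha256(raw).hexdigest()
    if digest != expected_sha256:
        raise ValueError(f"Integrity check failed for {path}: got {digest}")
    return json.loads(raw)
```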

Since there is no Read File operation available at the Application level, what would be the recommended approach?

Thanks, Jason

Application Files could work, though they are hosted on publicly available URLs, which may have security implications for your solution. There’s no application-level read operation because each file is accessible through its application-specific files URL:

https://files.onlosant.com/APP_ID/path/to/your/file.json

You can use the HTTP Node to directly download files in a workflow.
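The HTTP Node just performs a GET against that URL. As a rough sketch of the equivalent request outside a workflow (the APP_ID and path below are placeholders from the pattern above):

```python
import json
import urllib.request

# Placeholder URL following the pattern above; substitute your APP_ID and path.
FILE_URL = "https://files.onlosant.com/APP_ID/path/to/your/file.json"

def fetch_file_config(url: str = FILE_URL) -> dict:
    """Fetch a JSON application file with a plain GET, as the HTTP Node would."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)
```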

Alternatively, you could store these values in a Data Table with two columns: key and value. You can query up to 10,000 rows with a single Table: Get Rows Node. The Data Table might ultimately be the better solution since it lets you directly query specific configuration values by their keys.
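As a rough sketch of that (key, value) shape with a made-up nested config: one way (of several) to fit nested objects into two columns is to flatten them into dotted keys:

```python
def flatten(obj: dict, prefix: str = "") -> dict:
    """Flatten a nested config into dotted (key, value) pairs for a
    two-column data table, e.g. {"fleet-a.thresholds.temp": 75}."""
    rows = {}
    for k, v in obj.items():
        key = f"{prefix}.{k}" if prefix else k
        if isinstance(v, dict):
            rows.update(flatten(v, key))
        else:
            rows[key] = v
    return rows

# Hypothetical nested fleet config, for illustration only.
config = {"fleet-a": {"reportingInterval": 60, "thresholds": {"temp": 75}}}
print(flatten(config))
# {'fleet-a.reportingInterval': 60, 'fleet-a.thresholds.temp': 75}
```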

Thanks Brandon. These are the options I expected. The files being public does cause a security concern, so that’s out. The two-column data table, though an option, gets complicated since my globals have nested objects.

I might try a hybrid approach: a data table mapping fleet names to fleet globals. It feels a little obtuse, but it will let me keep the fleets tied together and iterable. Is there a limit to the number of globals I can have in an application?
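Roughly what I’m picturing, sketched in Python with made-up names (the dicts stand in for the data table and the per-fleet globals):

```python
# Stand-ins for the real structures: fleet_index mimics the data table
# (fleet name -> global name); app_globals mimics per-fleet app globals.
fleet_index = {"fleet-a": "fleetAConfig", "fleet-b": "fleetBConfig"}
app_globals = {
    "fleetAConfig": {"reportingInterval": 60},
    "fleetBConfig": {"reportingInterval": 30},
}

def get_fleet_config(fleet: str) -> dict:
    """Resolve a fleet name to its config through the index table."""
    global_name = fleet_index[fleet]   # Table: Get Rows equivalent
    return app_globals[global_name]    # global variable lookup equivalent
```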

Looks like the maximum number of application globals is 100. Also looks like our docs don’t mention that limit, so I’ll make a ticket to get that added.


Thanks Brandon. I’ve marked this entry as Solved, but I have a follow-up question. Are there any plans to add private, secure storage and file read access in workflows at any point?

Thanks, Jason

It’s certainly on our radar and we’ve brainstormed implementations a handful of times. It’s unlikely to be delivered this year.

We do have workflow nodes that interface with cloud storage from AWS, Azure, and GCP. Those nodes make it fairly easy to use any of those providers for secure storage.
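For example, pulling a private config object from S3 looks roughly like this outside a workflow (the bucket and key names are hypothetical; the S3 workflow nodes cover the same operation for you):

```python
import json

import boto3  # AWS SDK for Python; the S3 workflow nodes do this in-platform

def fetch_secure_config(bucket: str, key: str) -> dict:
    """Download and parse a private JSON config object from S3."""
    s3 = boto3.client("s3")
    obj = s3.get_object(Bucket=bucket, Key=key)
    return json.loads(obj["Body"].read())

# Hypothetical bucket and key, for illustration only.
# config = fetch_secure_config("my-fleet-configs", "fleet-a.json")
```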

That’s a really good idea in general, but it seems like the global variables plus data table approach is less overhead for this specific situation. I’d hate to reach into another cloud as frequently as this dataset is accessed.

Thanks again for the prompt replies.