We have a Resource Job that validates rows in a datatable. When a row is acknowledged, a "Success" counter is incremented; when a row fails, a "Fail" counter is incremented. These counters are stored in workflow storage. Currently the job iterates serially, so it gets slow once the table has more than 5K rows.
I need to optimize this. Is there a better approach we can take? Can this be done by iterating in parallel, or will that cause dirty reads, since up to 10 iterations run in parallel at once?
Hi @jignesh_shah ,
Running your Resource Job iterations in parallel will definitely decrease the job execution time. The Increment Value operation in workflow storage is atomic, so counters in your workflow storage should still work as expected.
(Note that you don’t want to use something like Storage: Get Value → Math Node → Storage: Set Value, as that sequence is not atomic: two parallel iterations can read the same value, and one of the increments will be lost.)
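To make the distinction concrete, here is a small plain-Python sketch (not the platform's own nodes, just an illustration of the concept): a get → math → set pattern can drop counts under parallel execution, while an atomic increment cannot.

```python
# Conceptual illustration in plain Python (not the platform's API):
# a get -> add -> set sequence loses updates when run in parallel,
# while an atomic (locked) increment does not.
import threading
import time

counter = 0
lock = threading.Lock()

def non_atomic_increment():
    # Mirrors Storage: Get Value -> Math Node -> Storage: Set Value.
    global counter
    current = counter          # read
    time.sleep(0)              # yield, letting another iteration interleave
    counter = current + 1      # write back, possibly overwriting a concurrent update

def atomic_increment():
    # Mirrors a single Increment Value operation: read-modify-write as one step.
    global counter
    with lock:
        counter += 1

def run(worker, total=5000, parallel=10):
    global counter
    counter = 0
    per_thread = total // parallel
    threads = [
        threading.Thread(target=lambda: [worker() for _ in range(per_thread)])
        for _ in range(parallel)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter

print("non-atomic:", run(non_atomic_increment))  # typically far less than 5000
print("atomic:    ", run(atomic_increment))      # always 5000
```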
That said, it sounds like a Job: Complete Trigger might be an even better fit. Each time the associated Resource Job completes, the trigger fires with a payload containing an execution summary of the iterations. You could then write to your workflow storage, or perform whatever other tasks are needed, based on that single report instead of incrementing counters once per row.
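For example, assuming the completion payload exposes per-iteration results (the field names below are hypothetical, so check the trigger's documented payload schema), the handler only needs two storage writes rather than one per row:

```python
# Hypothetical completion-payload handler. The payload shape and field names
# ("iterations", "status") are assumptions for illustration, not the
# platform's documented schema.
from collections import Counter

def handle_job_complete(payload: dict) -> dict:
    # e.g. payload = {"iterations": [{"status": "acknowledged"}, {"status": "failed"}, ...]}
    statuses = Counter(item.get("status") for item in payload.get("iterations", []))
    totals = {
        "success": statuses.get("acknowledged", 0),
        "fail": statuses.get("failed", 0),
    }
    # Two storage writes total, instead of one increment per row, e.g.:
    # storage.set("success", totals["success"])
    # storage.set("fail", totals["fail"])
    return totals

print(handle_job_complete({"iterations": [{"status": "acknowledged"},
                                          {"status": "failed"},
                                          {"status": "acknowledged"}]}))
# -> {'success': 2, 'fail': 1}
```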