# Concurrency Control in Power Automate: Preventing Duplicate Runs
If your scheduled Power Automate flow processes records from a shared data source, concurrent runs can cause duplicate processing, data corruption, or throttling. Here is how to control it.
## The Default Behavior
By default, Power Automate allows up to 50 concurrent runs for trigger-based flows. For scheduled flows, if a previous run hasn't finished when the next schedule triggers, both run simultaneously.
This is fine for stateless notifications. It is a problem for anything that reads, transforms, and writes data.
## Setting Concurrency Control

### On Triggers
- Select the trigger (e.g., Recurrence or When an item is created)
- Go to Settings
- Toggle Concurrency Control to On
- Set Degree of Parallelism to 1
With parallelism set to 1, only one instance of the flow runs at a time. Additional triggers queue up and execute sequentially.
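The queueing behavior can be sketched in plain Python. This is a hypothetical analogy, not Power Automate's implementation: a semaphore plays the role of the Degree of Parallelism setting, and each thread stands in for a triggered run.

```python
import threading
import time

# Hypothetical sketch: a gate analogous to the trigger's Concurrency
# Control setting. With degree_of_parallelism=1, overlapping "runs"
# queue on the semaphore and execute one at a time.
class ConcurrencyGate:
    def __init__(self, degree_of_parallelism):
        self._sem = threading.Semaphore(degree_of_parallelism)
        self._lock = threading.Lock()
        self._active = 0
        self.max_observed = 0  # peak number of simultaneous runs

    def run(self, flow):
        with self._sem:  # extra runs wait here until a slot frees up
            with self._lock:
                self._active += 1
                self.max_observed = max(self.max_observed, self._active)
            try:
                flow()
            finally:
                with self._lock:
                    self._active -= 1

gate = ConcurrencyGate(degree_of_parallelism=1)
threads = [threading.Thread(target=gate.run, args=(lambda: time.sleep(0.01),))
           for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(gate.max_observed)  # 1: no two runs ever overlapped
```

Raising `degree_of_parallelism` above 1 lets that many runs proceed at once, which is exactly the trade-off the setting exposes.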
### On Apply to Each Loops
The same setting exists on Apply to each actions:
- Click Settings on the Apply to each
- Toggle Concurrency Control to On
- Set the degree of parallelism (1 for sequential, higher for parallel)
Setting this to 1 processes items one at a time — slower but safe when order matters or when you are hitting API rate limits.
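A thread pool makes the loop setting concrete. Again a hypothetical sketch, not the product's internals: `max_workers` stands in for the loop's degree of parallelism, and with a single worker the items are guaranteed to be handled strictly in order.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical sketch of "Apply to each" with configurable parallelism.
# max_workers=1 mirrors the sequential (degree 1) setting.
def apply_to_each(items, action, degree_of_parallelism=1):
    with ThreadPoolExecutor(max_workers=degree_of_parallelism) as pool:
        return list(pool.map(action, items))

processed = []

def action(item):
    processed.append(item)  # order is only guaranteed when parallelism is 1
    return item * 2

results = apply_to_each([1, 2, 3, 4], action, degree_of_parallelism=1)
print(processed)  # [1, 2, 3, 4]: items handled one at a time, in order
```

With a higher degree, `results` still comes back in input order (like the flow's loop outputs), but `processed` could interleave, which is why rate-limited or order-sensitive work wants degree 1.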
## When to Use Each Setting
| Scenario | Trigger Concurrency | Loop Concurrency |
|---|---|---|
| Batch processing records | 1 | 1-5 |
| Sending notifications | Default (50) | 10-20 |
| Updating a shared counter | 1 | 1 |
| Independent API calls | Default | 20-50 |
## The Queue Behavior
When concurrency is set to 1 and multiple triggers fire, the extra runs enter a queue. Important details:
- The queue holds up to 100 waiting runs by default
- Runs beyond the queue limit are rejected (not retried)
- You can monitor queued runs in the flow's run history
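The reject-past-the-limit behavior is worth internalizing, because rejected runs are silently lost. A minimal Python analogy, assuming the 100-run queue depth described above:

```python
from queue import Queue, Full

# Hypothetical sketch of the trigger queue: one run is in flight, a
# bounded queue holds waiting runs, and anything past the limit is
# rejected rather than retried.
waiting = Queue(maxsize=100)  # queue depth, per the default above

accepted, rejected = 0, 0
for run_id in range(150):  # 150 triggers fire while a run is executing
    try:
        waiting.put_nowait(run_id)
        accepted += 1
    except Full:
        rejected += 1  # these runs are dropped, not retried

print(accepted, rejected)  # 100 50
```

If your trigger can plausibly fire more than 100 times while one run executes, concurrency control alone is not enough; you need to slow the trigger down or batch the work.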
## Combining with Duplicate Detection
Even with concurrency control, design your flows defensively:
- Idempotent operations: make sure running the same action twice produces the same result
- Status fields: mark records as "processing" before starting, "completed" when done
- Timestamp checks: skip records that were already processed within a time window
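The three patterns above compose naturally. Here is a hedged sketch in Python; the record shape, field names, and one-hour window are illustrative assumptions, not a Power Automate API:

```python
from datetime import datetime, timedelta, timezone

WINDOW = timedelta(hours=1)  # assumed reprocessing window

def should_process(record, now):
    if record.get("status") == "processing":   # status field: another run owns it
        return False
    done_at = record.get("processed_at")
    if done_at and now - done_at < WINDOW:     # timestamp check: handled recently
        return False
    return True

def process(record, now):
    if not should_process(record, now):
        return record
    record["status"] = "processing"            # claim the record before working
    record["total"] = 42                       # idempotent: set a value, never increment
    record["status"] = "completed"
    record["processed_at"] = now
    return record

now = datetime.now(timezone.utc)
rec = {"id": 7, "status": "new"}
process(rec, now)
process(rec, now)  # second call is a no-op thanks to the timestamp check
print(rec["total"], rec["status"])  # 42 completed
```

Note the idempotent write: because `total` is set rather than incremented, even a run that slips past the guards leaves the record in the same final state.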
## Key Takeaway
Set trigger concurrency to 1 for any flow that modifies shared data. The performance trade-off is almost always worth the data integrity guarantee. Parallel execution is an optimization — reach for it only when you have confirmed your logic is safe to run concurrently.