Power Automate - the Best Way to Write to Excel?

david wyatt

A colleague of mine asked me for some advice on exporting a larger dataset to an Excel table. His flow was using the add row action in an apply to each loop, and for 4000 rows it was taking over 4 hours. My immediate thought was to use an Office Script, but I realised not everyone is fluent with them, and it's an added complexity that might not be wanted.
So I thought, what is the best way?
I thought of 2 possible optimizations that might improve the process:

  • Concurrency
  • Version Control

The concurrency was the obvious choice, as surely writing 50 rows in parallel would be around 50 times faster. The version control, well, that was a guess: it shocked me that 4000 rows was taking so long, and my only explanation was that possibly every add row was creating a version in SharePoint (why else would it be taking over 3 seconds to add each row?).


So it was time for an experiment: I was going to create 5 test cases and have a flow drag race.


The 5 test cases were:

  • Simple Add row
  • Add row with concurrency set to 50
  • Simple Add row in library with no version controls
  • Add row with concurrency set to 50 with no version controls
  • Office Script

For the test I set up a simple flow which would generate a 4-field, 1000-row array of random data, then run the 5 test cases in series to write the array to an empty Excel table, one scope for each case.
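For illustration, here is a rough TypeScript sketch of the kind of array the flow produced. In the real test the array was built with Power Automate actions, and the field names below are purely made up.

```typescript
// Illustrative only - the real test data was built in Power Automate,
// and these field names are hypothetical.
type TestRow = {
  field1: string;
  field2: string;
  field3: number;
  field4: string;
};

function buildTestData(rowCount: number): TestRow[] {
  const rows: TestRow[] = [];
  for (let i = 0; i < rowCount; i++) {
    rows.push({
      field1: `item-${i}`,                         // sequential id
      field2: Math.random().toString(36).slice(2), // random text
      field3: Math.floor(Math.random() * 1000),    // random number
      field4: new Date().toISOString(),            // timestamp
    });
  }
  return rows;
}

const testData = buildTestData(1000); // 4 fields, 1000 rows
```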


The Office Script used the same pattern, with a loop creating a row array and then adding the rows.
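For reference, a minimal Office Script along those lines might look something like the sketch below. The table name and column names are assumptions, and the exact script used in the test may differ.

```typescript
// A minimal sketch, assuming a table called "Table1" and made-up field names.
interface InputRow {
  field1: string;
  field2: string;
  field3: number;
  field4: string;
}

function main(workbook: ExcelScript.Workbook, data: InputRow[]) {
  const table = workbook.getTable("Table1");

  // Loop over the input, building a 2D array of cell values in column order
  const rows: (string | number)[][] = [];
  for (const item of data) {
    rows.push([item.field1, item.field2, item.field3, item.field4]);
  }

  // Append all the rows to the end of the table in one call
  table.addRows(-1, rows);
}
```

Because the whole write happens inside a single script run, the flow only makes one connector call, which is also why it only uses 1 Power Platform API request.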



So what was the result? Well, first, it took 5 minutes just to create the 1000 rows of dummy data; Power Automate really isn't optimized for appending to arrays in any format! And I was right, Office Script is the best by a long shot (only 9 seconds). But what was interesting was the other optimizations.


Version control seemed to have an impact, but not as significant as I thought. And why was the concurrency case slower than the simple add row? It had 50 calls running in parallel, yet it took longer.

At row 130 I found the issue: a 12 minute wait due to 7 retry attempts. The reason for the 6 failures was a 429 "Rate limit is exceeded" error.

We had hit the API limit (and this wasn't the only affected row); the Excel connector limit is 100 calls per minute (SharePoint's is 600, and I had thought Excel was the same).


But why hadn't the no-version-control concurrency case had the same issue? Well, it had, but the timeouts were significantly shorter.

So I re-ran the tests. This time both took 15 minutes, then both 26, then one 23 and one 24, and so on. It turns out it was very much the luck of the draw whether Microsoft enforced the API limit, and how long the timeouts lasted. The version control was a red herring and had no real impact.

So my next thought was, can I optimize the concurrency? With a 100 calls per minute API limit, if I set the concurrency to 50 and added a 29 second delay to each iteration, that should enforce roughly 100 calls per minute (the theoretical consistent maximum). I ran this multiple times and consistently got just over the 10 minute mark.
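The back-of-the-envelope maths behind that, assuming the add row call itself takes around a second:

```typescript
// Rough throughput estimate for the throttled concurrency case.
// The per-call time is an assumption; the rest comes from the flow settings.
const rows = 1000;
const concurrency = 50;    // rows written in parallel per batch
const delaySeconds = 29;   // delay added in each iteration
const callSeconds = 1;     // assumed time for the add row call itself

const batches = rows / concurrency;                 // 20 batches of 50 rows
const secondsPerBatch = callSeconds + delaySeconds; // ~30 seconds per batch
console.log((batches * secondsPerBatch) / 60);      // ~10 minutes total
```

Which lines up with the just-over-10-minute runs, and is also the floor the 100 calls per minute limit puts on 1000 rows however you parallelise them.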



So the results were in:

  1. Office Script was by far the best (with the additional benefit of only using 1 Power Platform API request).
  2. Optimizing the concurrency to hit 100 calls per minute was second best, and good for anyone not wanting to use Office Scripts (the only negative being that the data table wasn't in the original order).
  3. Using concurrency above that is very inconsistent, with API timeouts sometimes almost doubling the entire run time (I also suspect even the simple add row may occasionally get some timeouts too).
  4. And my guess that version control had an impact was totally wrong.
