Important: This post was written and published in 2020, and the content below may no longer represent the current capabilities of Power BI. Please consider this post to be more of an historical record and less of a technical resource. All content on this site is the personal output of the author and not an official resource from Microsoft.
Back in August I highlighted the new dataflows PowerShell script repo on GitHub. These scripts provide an accelerated starting point for working with the dataflows REST APIs and for automating common dataflows tasks.
This week the dataflows team has released two new REST APIs for managing dataflows transactions (think “refresh”) and a new parameterized PowerShell script to make using the APIs easier.
You can find the APIs documented here and the PowerShell script here.
The most exciting thing about these APIs and scripts is that they enable one of the most frequently requested capabilities: you can now trigger a dataset refresh (or the refresh of a dataflow in another workspace) when a dataflow refresh completes.
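To make the shape of this workflow concrete, here's a minimal Python sketch of the pattern (the published scripts use PowerShell, but the idea is the same): check the dataflow's transactions endpoint to see whether the latest refresh reported success, then POST a refresh for the downstream dataset. The endpoint paths follow the documented Power BI REST API; the helper names, the newest-first ordering assumption, and token acquisition (not shown) are illustrative.

```python
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def transactions_url(group_id: str, dataflow_id: str) -> str:
    """Dataflow transactions endpoint (Get Dataflow Transactions API)."""
    return f"{API_BASE}/groups/{group_id}/dataflows/{dataflow_id}/transactions"

def refresh_url(group_id: str, dataset_id: str) -> str:
    """Dataset refresh endpoint (Refresh Dataset In Group API)."""
    return f"{API_BASE}/groups/{group_id}/datasets/{dataset_id}/refreshes"

def latest_transaction_succeeded(transactions: list) -> bool:
    """Given the 'value' list from the transactions response, return True
    when the most recent transaction finished with status 'Success'.
    Assumes the service returns transactions newest-first."""
    if not transactions:
        return False
    return transactions[0].get("status") == "Success"

def trigger_dataset_refresh(token: str, group_id: str, dataset_id: str):
    """POST a refresh request for the downstream dataset."""
    req = urllib.request.Request(
        refresh_url(group_id, dataset_id),
        data=json.dumps({"notifyOption": "NoNotification"}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    return urllib.request.urlopen(req)
```

In practice you would poll `transactions_url` on a schedule (or after kicking off the dataflow refresh) and call `trigger_dataset_refresh` only once the latest transaction succeeds; the parameterized PowerShell script linked above wraps the same two API calls.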
If you’re reading this and are thinking “that’s all well and good, but I don’t want to use an API or PowerShell,” please don’t fret. The Power BI team is working on non-API-based experiences to make the end-to-end refresh experience simpler and more complete. I can’t share dates or details here before they’re publicly announced, but I did want to share proactively before anyone asked about a code-free UX for refresh orchestration.
Update December 2020: There’s now a Power Automate connector that will let you do this without writing any code.
 And the best thing to happen in 2020 so far, if James Ward is to be believed.
 Or even “this should be ‘clicky-clicky-draggy-droppy’ and not ‘typey-typey-scripty-codey’.”
15 thoughts on “Automatically refresh dataset when dataflow refresh completes”
Haha! I made the blog footnotes 🙂
I figured no one would make it that far.
Hey buddy! Imagine someone is allergic to PowerShell. Is it possible for them to leverage Power Automate for this? And be able to do this before the Power BI team figures out their non-API experience? Asking for a friend. 🙂
Keep on being awesome!
Short answer: I don’t know.
I have only the most trivial Power Automate knowledge (I did a tutorial once, back before it got its fancy new name), so I’m not qualified to say yes or no. I think/assume that you can use Power Automate to simplify calling the REST API (something like this: https://docs.microsoft.com/en-us/power-automate/get-started-flow-dev) but I haven’t used it myself, and honestly don’t know if I’m giving you a solution by sharing this link or just sending you down a dead-end path…
Yes, you can. You can use Power Automate to create a custom connector, which can use the API definition file (containing all Power BI APIs), and then call any API from that one connector. This is really useful for orchestration.
Pingback: Automatically refresh dataset when dataflow refresh completes — BI Polar | ERP and BI
Pingback: Refresh a Dataset when Dataflow Refresh Completes – Curated SQL
What about using a service principal to perform this, and also to import the JSON file as a dataflow into a workspace (https://ssbipolar.com/2019/10/03/importing-model-json-to-create-a-new-dataflow/)? Apparently that throws an error: “Import Failed with code [DataflowUnauthorizedError] and details ”.
I believe the Post Import In Group API is not an admin API, so it shouldn’t have this restriction around service principals. Long term, to fully automate things in production, it doesn’t make sense to use an actual user account rather than a service principal with its secret stored in AKV. Also, I can’t find any documentation on limitations of using service principals with the Post Import In Group REST API; the only statement about the required scope is Dataset.ReadWrite.All.
Hi Catalin – if you’re looking for assistance, your best bet is to post this on community.powerbi.com. I do not believe that this API supports service principals today, but I’m also not an expert in this area, so my belief is not guaranteed to be correct…
Pingback: Automatically refresh dataset when dataflow refresh completes – ScienceGeek
The best practice is to have an extraction dataflow followed by a transformation dataflow, connected with linked entities, with the refresh command issued against the extraction dataflow.
However, these PowerShell scripts don’t recognize the dataset (which gets its data from the transformation dataflow) as dependent on the extraction dataflow. There’s still some work to be done.
I have managed to automate this process by calling PowerShell from R through Power Query.
What I get with this is a dataset that, when refreshed, starts the PowerShell process that refreshes the dataflow and, when it finishes, refreshes the dependent datasets.
It’s fantastic, and as long as there’s nothing better, it’s one way to achieve this automation.
Soon I will document it.
It’s now documented on my blog, in a post with videos in Spanish.
Pingback: Automatización de las actualizaciones de dataflows y datasets -
Pingback: Automatically refresh dataset when dataflow refresh completes – now with 100% less code! – BI Polar