If you’re a frequent reader of this blog, or a frequent user of Power BI dataflows, you probably know these things already:
- Power BI dataflows are workspace-level objects in the Power BI service that enable data reuse through self-service data preparation.
- Power BI dataflows store data in the CDM folder format in Azure storage.
- CDM folders use text files to store data, and a model.json file to store metadata.
- Workspace admins can export the metadata for a dataflow as a json file.
- Workspace admins can use the Power BI API to create a new dataflow from a CDM model.json file.
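The last bullet refers to the Power BI REST API's dataflow endpoints. As a rough sketch (the endpoint shape follows the documented "Get Dataflow" operation; the GUIDs and token acquisition are placeholders you'd supply yourself), exporting a dataflow's model.json looks like this:

```python
# Sketch: download a dataflow's model.json via the Power BI REST API.
# Assumes you already have an Azure AD access token; group_id and
# dataflow_id are placeholder GUIDs for your workspace and dataflow.
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def export_url(group_id: str, dataflow_id: str) -> str:
    """Build the 'Dataflows - Get Dataflow' endpoint URL."""
    return f"{API_BASE}/groups/{group_id}/dataflows/{dataflow_id}"

def export_model_json(group_id: str, dataflow_id: str, token: str) -> dict:
    """Fetch the dataflow definition (model.json) as a Python dict."""
    req = urllib.request.Request(
        export_url(group_id, dataflow_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

This is the same export the workspace UI performs for you; the rest of this post sticks to the UI.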
In this week’s Power BI service update, there’s something new to add to the list: You can now create a new dataflow from a previously-saved model.json using the Power BI web UI.
Let’s walk through how this new capability works, and where you might use it.
I’m starting with an “AdventureWorks” dataflow that already exists in one of my workspaces. It pulls in data from an Azure SQL Database source, and does some transformations along the way. I want this dataflow to live in a different workspace than the one in which I created it.
I begin by exporting the json for the dataflow.
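If you're curious what's inside the exported file, it's the CDM model.json metadata: a model name plus an array of entities. A quick sketch to list the entities in an exported file (the `sample` below is a made-up minimal example of the shape, not a real export):

```python
# Sketch: list the entities declared in an exported model.json.
import json

def list_entities(model: dict) -> list:
    """Return the entity names declared in a CDM model.json document."""
    return [e.get("name", "") for e in model.get("entities", [])]

# Made-up minimal example of the shape an exported model.json takes:
sample = {
    "name": "AdventureWorks",
    "entities": [
        {"$type": "LocalEntity", "name": "Customer", "attributes": []},
        {"$type": "LocalEntity", "name": "SalesOrder", "attributes": []},
    ],
}

print(list_entities(sample))  # ['Customer', 'SalesOrder']
```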
Once this is done, I will create a new workspace, and add a new dataflow to it. After I select “new dataflow” I am presented with four options.
I’ll choose “Import Model” from this screen, and will browse to the saved json file on my hard drive. Power BI will import the json, and create a new dataflow in my new workspace.
And that’s pretty much that. I can refresh the dataflow and users can start connecting to it from Power BI Desktop or from their own dataflows. I can also go back into my first workspace and clean up by deleting the no-longer-needed dataflow I started with. Boom!
On the surface this may seem pretty anticlimactic, but a few key scenarios are enabled by this new capability.
The most obvious scenario is the one we just looked at: moving a dataflow from one workspace to another. Without the ability to import a model.json I would need to either save the M query for each entity in the source dataflow and then create new blank queries in the destination dataflow, or use the API to perform the import programmatically.
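For completeness, that programmatic route goes through the imports endpoint. A hedged sketch of building the request (per the documented "Post Import In Group" operation, a dataflow import sets `datasetDisplayName` to `model.json`; the actual POST would send the file as multipart/form-data with a bearer token, which I've left out here):

```python
# Sketch: build the URL for importing a model.json into a workspace.
# group_id is a placeholder GUID for the destination workspace.
import urllib.parse

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def import_url(group_id: str) -> str:
    """Build the 'Imports - Post Import In Group' endpoint URL.
    For a dataflow import, datasetDisplayName must be 'model.json'."""
    query = urllib.parse.urlencode({"datasetDisplayName": "model.json"})
    return f"{API_BASE}/groups/{group_id}/imports?{query}"
```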
The most powerful and important scenario is user-centric application lifecycle management (ALM). Although the programmatic import and export of dataflows’ json definitions will be most useful for IT-driven DevOps and CI/CD processes, having this functionality exposed through the Power BI portal UI will allow “ALM lite” scenarios to be performed by self-service BI users. This could involve versioning dataflow definitions, promoting dataflows from dev workspaces to test or production workspaces, or other ALM needs.
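Even the versioning piece of “ALM lite” needs nothing fancier than keeping dated copies of the exported file. A minimal sketch, using my own made-up naming convention of model-name plus date:

```python
# Sketch of "ALM lite" versioning: keep date-stamped snapshots of a
# dataflow's exported model.json so you can diff or roll back later.
# The folder layout and file naming are my own convention.
import json
from datetime import date
from pathlib import Path

def snapshot(model: dict, folder: Path) -> Path:
    """Write the model.json dict to a date-stamped file; return its path."""
    folder.mkdir(parents=True, exist_ok=True)
    name = model.get("name", "dataflow")
    target = folder / f"{name}-{date.today().isoformat()}.json"
    target.write_text(json.dumps(model, indent=2))
    return target
```

Promoting from a dev workspace to production is then just importing the right snapshot through the UI shown above.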
I’ve been using this functionality for a while now and it’s great to see it available for everyone. I hope you enjoy it as much as I have!
Update: MVP Marc Lelijveld has published a great blog post on how to use the Power BI REST API and PowerShell to automate this task. Be sure to check it out too!
 Please note that as I’m writing this blog post, the dataflows API documentation does not mention this capability. Hopefully by the time you’re reading it, that will no longer be the case.
 Before the introduction of the new feature covered in this post, there were only three options on the screen.
 I’ve done this enough times to know how painful it can be.
 We roll out most new Power BI capabilities to Microsoft’s production Power BI tenant weeks before they’re deployed to customers. This is awesome, but also means it’s sometimes difficult for me to tell when it’s safe to blog about something new…