Importing model.json to create a new dataflow

Important: This post was written and published in 2019, and the content below may no longer represent the current capabilities of Power BI. Please consider this post to be an historical record and not a technical resource. All content on this site is the personal output of the author and not an official resource from Microsoft.

If you’re a frequent reader of this blog, or a frequent user of Power BI dataflows, you probably know these things already:

    1. Power BI dataflows are workspace-level objects in the Power BI service that enable data reuse through self-service data preparation.
    2. Power BI dataflows store data in the CDM folder format in Azure storage.
    3. CDM folders use text files to store data, and a model.json file to store metadata.
    4. Workspace admins can export the metadata for a dataflow as a json file.
    5. Workspace admins can use the Power BI API to create a new dataflow from a CDM model.json file.[1]


In this week’s Power BI service update, there’s something new to add to the list: You can now create a new dataflow from a previously-saved model.json using the Power BI web UI.

Let’s walk through how this new capability works, and where you might use it.

I’m starting with an “AdventureWorks” dataflow that already exists in one of my workspaces. It pulls in data from an Azure SQL Database source, and does some transformations along the way. I want this dataflow to live in a workspace other than the one where I created it.

I begin by exporting the json for the dataflow.
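The same export can also be scripted. As a minimal sketch (not an official sample — the workspace ID, dataflow ID, and access token are placeholders you would supply, and acquiring the Azure AD token is out of scope here), the “Get Dataflow” REST endpoint returns the dataflow’s model.json:

```python
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def dataflow_export_url(group_id: str, dataflow_id: str) -> str:
    """URL of the 'Get Dataflow' endpoint, which returns the model.json."""
    return f"{API_BASE}/groups/{group_id}/dataflows/{dataflow_id}"

def export_dataflow(group_id: str, dataflow_id: str, token: str) -> dict:
    """Download a dataflow's model.json; requires a valid AAD bearer token."""
    req = urllib.request.Request(
        dataflow_export_url(group_id, dataflow_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Save the exported metadata to disk, e.g.:
# model = export_dataflow("<workspace-id>", "<dataflow-id>", "<access-token>")
# with open("model.json", "w") as f:
#     json.dump(model, f, indent=2)
```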


Once this is done, I will create a new workspace, and add a new dataflow to it. After I select “new dataflow” I am presented with four options[2].


I’ll choose “Import Model” from this screen, and will browse to the saved json file on my hard drive. Power BI will import the json, and create a new dataflow in my new workspace.



And that’s pretty much that. I can refresh the dataflow and users can start connecting to it from Power BI Desktop or from their own dataflows. I can also go back into my first workspace and clean up by deleting the no-longer-needed dataflow I started with. Boom!


On the surface this may seem pretty anticlimactic, but a few key scenarios are enabled by this new capability.

The most obvious scenario is the one we just looked at: moving a dataflow from one workspace to another. Without the ability to import a model.json, I would need either to save the M query for each entity in the source dataflow and then recreate them as new blank queries in the destination dataflow[3], or to use the API to perform the import programmatically[4].
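For the programmatic route, the import goes through the Imports endpoint with the display name set to model.json. Since (per footnote [1]) the API documentation didn’t yet cover this at the time of writing, treat the following as a sketch rather than a definitive implementation — it only builds the multipart POST request; the group ID and token are placeholders:

```python
import json
import urllib.request
import uuid

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def build_import_request(group_id: str, model: dict, token: str) -> urllib.request.Request:
    """Build the multipart POST that creates a dataflow from a model.json payload."""
    boundary = uuid.uuid4().hex
    body = (
        f"--{boundary}\r\n"
        'Content-Disposition: form-data; name="file"; filename="model.json"\r\n'
        "Content-Type: application/json\r\n\r\n"
        f"{json.dumps(model)}\r\n"
        f"--{boundary}--\r\n"
    ).encode("utf-8")
    url = f"{API_BASE}/groups/{group_id}/imports?datasetDisplayName=model.json"
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": f"multipart/form-data; boundary={boundary}",
        },
        method="POST",
    )

# To actually create the dataflow you would send the request, e.g.:
# urllib.request.urlopen(build_import_request("<workspace-id>", model, "<access-token>"))
```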

The most powerful and important scenario is user-centric application lifecycle management (ALM). Although the programmatic import and export of dataflows’ json definitions will be most useful for IT-driven DevOps and CI/CD processes, having this functionality exposed through the Power BI portal UI will allow “ALM lite” scenarios to be performed by self-service BI users. This could involve versioning dataflow definitions, promoting dataflows from dev workspaces to test or production workspaces, or other ALM needs.
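As one hypothetical shape for that “ALM lite” versioning, a self-service user or a small script could simply keep timestamped snapshots of each exported model.json — this is my own illustration, not a Power BI feature:

```python
import json
from datetime import datetime, timezone
from pathlib import Path

def snapshot_model(model: dict, history_dir: str = "dataflow-history") -> Path:
    """Save a timestamped copy of an exported model.json for lightweight versioning."""
    Path(history_dir).mkdir(exist_ok=True)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    name = model.get("name", "dataflow")
    path = Path(history_dir) / f"{name}-{stamp}.json"
    path.write_text(json.dumps(model, indent=2))
    return path
```

Any snapshot in the history folder can then be imported into a dev, test, or production workspace through the UI option described above.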

I’ve been using this functionality for a while now[5] and it’s great to see it available for everyone. I hope you enjoy it as much as I have!

Update: MVP Marc Lelijveld has published a great blog post on how to use the Power BI REST API and PowerShell to automate this task. Be sure to check it out too!

[1] Please note that as I’m writing this blog post, the dataflows API documentation does not mention this capability. Hopefully by the time you’re reading it, that will no longer be the case.

[2] Before the introduction of the new feature covered in this post, there were only three options on the screen.

[3] I’ve done this enough times to know how painful it can be.

[4] See [1].

[5] We roll out most new Power BI capabilities to Microsoft’s production Power BI tenant weeks before they’re deployed to customers. This is awesome, but also means it’s sometimes difficult for me to tell when it’s safe to blog about something new…

21 thoughts on “Importing model.json to create a new dataflow”

  1. Pingback: Importing model.json to create a new dataflow — BI Polar | MS Excel | Power Pivot | DAX | SSIS |SQL

  2. Hi Matthew,

    This is really, really great! Before this UI option I found out it was only possible via the Power BI dataflow import REST APIs.

    Question though: in ALM and DevOps scenarios, is it possible to overwrite existing dataflow entities with an updated JSON import? Use case: to add additional tables/columns automatically from DevTest to UAT to Production environments. This requires ‘overwriting’ the current JSON file with an updated one.

    I saw in the UI that if I import a dataflow with the same name (A-MDW for example) it does not replace the current DF but creates a secondary A-MDW(1) dataflow. Rollback scenarios can also be tricky in this case of a duplicate.

    If this is possible we can fully automate our M queries in our DTAP environments based on source parameters 🙂

    Thanks a lot


    1. That’s a great question! Unfortunately, I do not have an equally great answer in return. ;-(

      I have not tested this scenario, and there is no “Update” API for dataflows. The building blocks may be there, but I can’t say offhand how you would assemble them.


    2. As a test I exported a dataflow to a json file, deleted the dataflow, and then imported the file to create the dataflow again. That all worked great, except of course the dataflow id had changed, so reports that connected to the original dataflow no longer worked.

      Would be great to be able to move a dataflow through dev, uat and prod work spaces.



  3. This is really helpful and I will be using it to meet my immediate need. Thanks!!

    Can this be used to import a model in a different tenant? I can test it, but I thought I’d ask in case you already have.




    1. I would expect this to work – the workspace owner in the destination tenant would need to enter new valid credentials for the data sources, and refresh it.

      BUT… I haven’t tested it either. Please let me know! 😉


  5. It worked. I exported a sample DF from one tenant and imported it into the other one. After the model import, it came up with a message saying “edit credentials”. I opened the new DF, entered the credentials, and everything worked like a charm.

    A few hiccups: the import didn’t work using Edge, so I used Chrome. Also, on the import model option there is a “learn more” link which opens the Microsoft home page.

    Overall the functionality worked and I’m super happy, as I’m doing this massive dataflow project for a client and next week need to move to their environment. This option came at the right time. Awesome awesome awesome.



  6. colin robinson

    I can confirm it doesn’t work with Edge, but it’s fine with Chrome. It’s got something to do with the “Importing” dialog, I imagine. We have a group policy on third-party cookies; it may be related to that.


  7. Pingback: Power BIte: Creating dataflows by importing model.json – BI Polar

    1. I do not know if this is planned, and honestly this is the only time I’ve heard the request.

      What is your use case? If you have already defined a dataflow, why do you want to replicate the logic in a PBIX file, instead of simply connecting to the dataflows you’ve created?


  8. Raj

    Hello. I have some dataflows; when I try to import them, I get the following error:


    Can anyone provide any assistance?


  9. Pingback: Sharing individual Power BI dataflows, but how? – Data – Marc
