In this post I introduce scripts to improve copying a Direct Lake semantic model to another workspace using Microsoft Fabric Git integration.
This is a follow-up to my previous post about my initial tests to copy a Direct Lake semantic model to another workspace using Microsoft Fabric Git integration.
I want to show how you can work with scripts locally to create the repository that contains the Direct Lake semantic model, and how to do this in a way that includes the new Tabular Model Definition Language (TMDL) semantic model file format.
In addition, I want to show how you can create a repository with multiple branches before you upload it to Azure DevOps, so that you can use those branches to deploy to multiple workspaces.
To manage expectations, this post covers the below concepts:
- Create a new folder to host your Direct Lake semantic model.
- Export the Direct Lake semantic model to the new folder using the Tabular Editor 2 command line.
- Transform the contents of the files to rename the Direct Lake semantic model.
- Initialize the folder to be a Git repository and prepare it so that you can work with multiple branches properly right from the start.
Feel free to check my Microsoft Fabric Git integration jargon guide for Fabricators post to help with any terminology/jargon in this post.
Create a folder
First you must create a folder locally to host your model.
If you intend to use a folder that contains a Power BI project you can create a new subfolder there that contains the name of the semantic model you wish to use, along with “.dataset” at the end.
For example, using “md DLTest.Dataset\definition” to create a “DLTest.Dataset” subfolder at the root of your existing project.
If you are creating a new project, you can use the command “md ScriptedModel\DLTest.Dataset\definition” to create the following:
- The root Power BI project folder (in this case ScriptedModel).
- The subfolder that contains all the files for the semantic model (in this case DLTest.Dataset).
- The definition subfolder, which will contain the TMDL files.
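The same structure can be created in one command. A minimal sketch using the cross-platform `mkdir -p` equivalent of the `md` commands above:

```shell
# Create the project root, the dataset subfolder and the definition subfolder in one go
mkdir -p ScriptedModel/DLTest.Dataset/definition
```

On Windows, `md` creates the intermediate folders in the same way.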
Exporting the Direct Lake semantic model using Tabular Editor 2
You can export the Direct Lake semantic model using the Tabular Editor 2 Command Line.
I downloaded Tabular Editor 2 and then ran the command “.\TabularEditor.exe /?”, which shows the below TMDL options.
-TMDL Saves the model (after optional script execution) as a TMDL folder structure.
output Full path of the TMDL folder to save to. Folder is created if it does not exist.
id Optional id/name to assign to the Database object when saving.
To export the Direct Lake semantic model, I ran the below command.
$p = Start-Process -filePath ./TabularEditor.exe -Wait -NoNewWindow -PassThru -ArgumentList "`"Provider=MSOLAP;Data Source=powerbi://api.powerbi.com/v1.0/myorg/GIDLSemModelDev;User ID={ENTRA ACCOUNT WITH ACCESS};Password={PASSWORD FOR ACCOUNT};Persist Security Info=True;Impersonation Level=Impersonate`" DLTestDev -TMDL C:\repos\ScriptedModel\DLAutTest.dataset\definition"
Doing this exported the Direct Lake Semantic Model with the new TMDL file format.
However, as you can see above I still needed to copy the below three files like I did in my previous post.
- definition.pbidataset
- item.config.json
- item.metadata.json
I copied the files from another Power BI Project by running the below code.
copy "C:\repos\TMDLTest\Azure DevOps example.Dataset\*.*" C:\repos\ScriptedModel
However, I did delete the “diagramLayout.json” file which was also copied over as I did not require it.
Once done, I updated the displayName value in the ‘item.metadata.json’ file, and changed the name of the database itself by running the below code to update the ‘database.tmdl’ file.
cd C:\repos\ScriptedModel\DLTest.dataset\definition\
$Old_end = 'database DLTestDev'
$New_end = 'database DLTest'
(Get-Content "database.tmdl") -replace $Old_end, $New_end | Set-Content "database.tmdl"
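The ‘item.metadata.json’ rename can be scripted in a similar way. A minimal sketch, assuming the file contains a displayName entry with the old model name (the JSON below is a simplified stand-in rather than the exact file Fabric generates):

```shell
# Simplified stand-in for item.metadata.json; the real file has more properties
printf '{\n  "type": "dataset",\n  "displayName": "DLTestDev"\n}\n' > item.metadata.json
# Swap the old display name for the new one
sed -i 's/"displayName": "DLTestDev"/"displayName": "DLTest"/' item.metadata.json
grep '"displayName"' item.metadata.json
```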
Resolving UTF issue
As I mentioned in my previous post, the TMDL files that are exported by Tabular Editor 2 are being recognized as ‘UTF-8 with BOM’.
Luckily, I found an old Stack Overflow post on how to resolve this, so I ran the below code.
$Path = "C:\repos\ScriptedModel\DLTest.dataset\definition"
# Loop through every TMDL file and rewrite it in place.
# [System.IO.File]::WriteAllLines writes UTF-8 without a BOM by default,
# so reading each file and writing it back removes the BOM.
foreach ($file in ls -Name $Path *.tmdl -Recurse)
{
    $file_content = Get-Content "$Path\$file";
    [System.IO.File]::WriteAllLines("$Path\$file", $file_content);
    echo "encoding done : $file"
}
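You can also check for the BOM outside of an editor by inspecting the first three bytes of a file (a UTF-8 BOM is the byte sequence EF BB BF). A minimal sketch using a sample file:

```shell
# Create a sample file that starts with a UTF-8 BOM, as Tabular Editor 2 exports do
printf '\xef\xbb\xbfdatabase DLTest\n' > sample.tmdl
# Strip the BOM by keeping everything from the fourth byte onwards
tail -c +4 sample.tmdl > sample.tmp && mv sample.tmp sample.tmdl
# Print the first three bytes as hex; "ef bb bf" here would mean the BOM is still present
head -c 3 sample.tmdl | od -An -tx1
```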
Which I verified afterwards in Visual Studio Code.
Initialize the folder to be a Git repository
I wanted to initialize the folder to be a Git repository. However, to prepare it for Git integration I wanted an empty main branch and a populated dev branch.
With this in mind, I first ran the below code. It initializes the folder as a Git repository and allows me to commit to the main branch with no files, by using the “--allow-empty” option.
# Navigate to directory
cd C:\repos\ScriptedModel
# Initialize directory as a new repository with main as the default branch
git init -b main
# Perform initial commit with no files selected for main branch
git commit --allow-empty -m "Initial commit"
Afterwards, I ran the below code to create a new dev branch and commit the existing files into it. From there, I added the new repository I had created in the Azure Repos service in Azure DevOps and pushed the local repository there.
# Checkout dev branch
git checkout -b dev
# Add all the files to the dev branch in repository
git add .
# Commit all the files added to the dev branch
git commit -m "Commit files dev branch"
# Add remote
git remote add origin https://{AZURE DEVOPS ORG}@dev.azure.com/{REPOSITORY}/Fabric/_git/ScriptedModel
# Push changes and both branches to new repository in Azure DevOps
git push --all origin
Which means that the repository in the Azure Repos service in Azure DevOps contained two branches: one empty main branch and a populated dev branch.
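Putting the Git steps together, the two-branch layout can be reproduced and verified in a throwaway folder. A minimal sketch (user details are passed inline so the commits work without a global Git config, and `git branch --list` confirms both branches exist):

```shell
mkdir demo-repo
git -C demo-repo init -q -b main
# Empty commit so the main branch exists without any files
git -C demo-repo -c user.email=demo@example.com -c user.name=Demo commit -q --allow-empty -m "Initial commit"
git -C demo-repo checkout -q -b dev
echo "database DLTest" > demo-repo/database.tmdl
git -C demo-repo add .
git -C demo-repo -c user.email=demo@example.com -c user.name=Demo commit -q -m "Commit files dev branch"
# dev carries the files while main stays empty
git -C demo-repo branch --list
```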
My final test was to configure a Microsoft Fabric workspace to use Git integration. As you can see below, it added the Direct Lake semantic model fine, even though it says that it is unsupported.
Final words about scripts to improve copying a Direct Lake semantic model
I hope that sharing scripts to improve copying a Direct Lake semantic model has inspired some of you.
In this post I wanted to highlight how to use Tabular Editor 2 to export a Direct Lake semantic model, plus how to set up a Git repository with two branches locally before synchronizing it with Azure DevOps.
Of course, if you have any comments or queries about this post feel free to reach out to me. Especially if you test this yourself.