In this post I want to cover one way that you can automate testing Microsoft Fabric Data Pipelines with Azure DevOps, by implementing the Data Factory Testing Framework within Azure Pipelines in Azure DevOps.
I previously shared the results of my initial tests of the Data Factory Testing Framework in the Fabric community blogs.
Now I want to show how you can automate testing Microsoft Fabric Data Pipelines in an efficient manner, so that you can run automated tests whilst performing CI/CD to identify potential issues. Plus, I show how you can view your test results in Azure DevOps.

One thing I want to make clear is that even though the example is based on performing CI/CD with fabric-cicd, you can implement this level of testing with a variety of CI/CD options for Microsoft Fabric, including Fabric-CLI and PowerShell. This is because you can introduce the relevant testing tasks wherever necessary in Azure DevOps.
To manage expectations, this post shows how to do it with the classic GUI-based Releases feature for deployments. Along the way I share plenty of links.
Example in this post
For this post I updated a release pipeline that I configured in a previous post, where I showed how to operationalize fabric-cicd to work with Microsoft Fabric and Azure DevOps.
For simplicity, I added the testing of my data pipeline in my development workspace as an additional stage in my release pipeline, so that it is easier to visualize the flow of activities.
You are more than welcome to customize where you perform your testing to cater for your needs.
I made some changes to the fabric-cicd samples for the benefit of this post. For instance, I added the WithoutSchema sample Lakehouse that is included as part of the samples in the fabric-cicd GitHub repository.
In addition, I made the below changes to the sample “Run Hello World” data pipeline:
- I added a new pipeline parameter called DirectoryName, in order to dynamically change a folder location.
- Plus, I added a new copy data activity which downloads the sample NYC Taxi data to a folder in the Lakehouse, specifying the created parameter as the folder location.
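To show what that parameter is doing, below is a toy illustration (not the framework's code) of how a pipeline parameter reference like DirectoryName resolves to a concrete folder path. The expression syntax mirrors Fabric/Data Factory dynamic content; the tiny evaluator is a simplified assumption for illustration only.

```python
# Dynamic content expression of the kind used for the sink folder location.
folder_path_expression = "@pipeline().parameters.DirectoryName"

def evaluate_expression(expression: str, parameters: dict) -> str:
    # Toy evaluator: handles only the "@pipeline().parameters.<name>" shape.
    prefix = "@pipeline().parameters."
    if expression.startswith(prefix):
        return parameters[expression[len(prefix):]]
    return expression

print(evaluate_expression(folder_path_expression, {"DirectoryName": "SampleData"}))
# prints SampleData
```

At runtime the real evaluation is done by the Data Pipeline itself, and in tests by the Data Factory Testing Framework, as shown later in this post.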

Automate testing Microsoft Fabric Data Pipelines with Azure DevOps
After doing all the necessary updates I was then ready to configure automated testing of the changes to my Microsoft Fabric Data Pipeline with Azure DevOps.
To do this, I added a new stage to my release pipeline and changed the deployment stage so that it depends on the new test stage, like in the below example.

I also changed the alias of my release artifact to “fabric-cicd-sample”, in order to make the call to pytest easier to read.
Anyway, in my new stage I opted to work with the windows-latest Azure Pipeline agent to make deployments easier. Plus, I added the below four tasks.

Below is a breakdown of each task.
Use Python 3.12
The first task selects the version of Python to work with via the Python version task. I encountered issues when selecting Python 3.13, so opted for Python 3.12 instead.
Install necessary libraries
Afterwards I had to install the necessary libraries in a PowerShell task. In this task I ran the below code.
pip install data-factory-testing-framework
pip install pytest
Running the above code installs the Data Factory Testing Framework and pytest libraries, both of which are required to test Data Pipelines this way.
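As an optional sanity check (my own addition, not part of the original pipeline), you can confirm that both packages resolved on the agent before pytest runs, using only the standard library:

```python
import importlib.util

def check(package: str) -> str:
    # find_spec returns None when the top-level package cannot be resolved.
    return "installed" if importlib.util.find_spec(package) else "missing"

for package in ("data_factory_testing_framework", "pytest"):
    print(package, check(package))
```

Running this in an extra PowerShell task makes "works on the agent" failures easier to diagnose than a pytest import error.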
Run sample pipeline test
Once the libraries are installed, the next task runs the Python test script that I cover later in this post. In addition, I specify that the test results should be output to a JUnit XML file so that I can display the results in Azure DevOps.
pytest fabric-cicd-sample\Tests\simple-pipeline-tests.py --junitxml=simple-pipeline-test-results.xml
One thing I need to highlight here is that I had to change the control options for the subsequent publish task, so that it would run even if this task had failed, unless the deployment was cancelled. I needed to do this so that any test failures would still be published.
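For context, the file that --junitxml produces is plain XML that the publish task reads. Below is a minimal sketch of its shape, parsed with the standard library; the counts, class name and timing are illustrative assumptions, not real results from my release pipeline.

```python
import xml.etree.ElementTree as ET

# Illustrative JUnit XML, similar in shape to what pytest's --junitxml emits.
sample_junit = """<testsuites>
  <testsuite name="pytest" tests="1" failures="0" errors="0" skipped="0">
    <testcase classname="simple-pipeline-tests" name="test_directory_parameter" time="0.12"/>
  </testsuite>
</testsuites>"""

suite = ET.fromstring(sample_junit).find("testsuite")
print(suite.get("tests"), suite.get("failures"))  # the counts Azure DevOps surfaces
# prints 1 0
```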
Publish Pipeline Test Results
The last task that I configured in the new stage was a Publish Test Results v2 task, which publishes the XML output that contains the test results into Azure DevOps. I specified that it should look for the below file(s) with the JUnit result format.
**/simple-*.xml
Python script to perform Data Pipeline test
Once everything else was configured, the final piece of the puzzle was the Python script to perform the testing. So I added the below Python script to a new Tests folder.
import os

import pytest
from data_factory_testing_framework import TestFramework, TestFrameworkType
from data_factory_testing_framework.models import Pipeline
from data_factory_testing_framework.state import PipelineRunState, RunParameter, RunParameterType


@pytest.fixture
def test_framework(request: pytest.FixtureRequest) -> TestFramework:
    return TestFramework(
        framework_type=TestFrameworkType.Fabric,
        root_folder_path=os.path.dirname(request.fspath.dirname),
    )


@pytest.fixture
def pipeline(test_framework: TestFramework) -> Pipeline:
    return test_framework.get_pipeline_by_name("Run Hello World")


def test_directory_parameter(request: pytest.FixtureRequest, pipeline: Pipeline) -> None:
    # Arrange
    activity = pipeline.get_activity_by_name("Copy sample data")
    state = PipelineRunState(
        parameters=[
            RunParameter(RunParameterType.Pipeline, name="DirectoryName", value="SampleData")
        ],
    )

    # Act
    activity.evaluate(state)

    # Assert to check correct directory name is used
    assert (
        activity.type_properties["sink"]["datasetSettings"]["typeProperties"]["location"]["folderPath"].result
        == "SampleData"
    )
In reality, this script is very similar to my original post on the Fabric Community blog, where I test that the output directory is set correctly, with some slight changes as below.
- I needed to import os in order to work with os.path.dirname to navigate between folders.
- I had to change the name of the data pipeline to the “Run Hello World” sample.
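To show what that os.path.dirname call is doing, here is a minimal stand-alone sketch. The folder names are illustrative of my repository layout, not values the framework requires.

```python
import os

# request.fspath.dirname is the folder containing the test file; one more
# os.path.dirname climbs up to the folder holding the pipeline definitions.
test_file = "fabric-cicd-sample/Tests/simple-pipeline-tests.py"
tests_folder = os.path.dirname(test_file)   # fabric-cicd-sample/Tests
repo_root = os.path.dirname(tests_folder)   # fabric-cicd-sample
print(repo_root)
# prints fabric-cicd-sample
```

This is why the script works regardless of where the release agent checks out the artifact: the root folder is derived from the test file's own location.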
Test results to automate testing Microsoft Fabric Data Pipelines with Azure DevOps
When I ran the pipeline, the three stages completed like in the below example.

In my new “Tests Data Pipeline” stage an icon appeared showing completed tests. Clicking on the stage and changing the filter to “Passed” allowed me to see more information.

Of course, my testing would not be complete without testing for failure. So, I updated my Python script so that the expected directory name in my assertion was SampleData2.
Doing this caused my release pipeline to fail. However, because I had changed the control option for the publish task, I was still able to view the test results.
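As a simplified stand-in for what happens (my own toy code, not the framework's): pytest catches the AssertionError, records the test as failed in the JUnit XML, and exits with a non-zero code that fails the stage.

```python
# The evaluated folder path no longer matches the expected value, so the
# assertion raises and the test is recorded as failed rather than crashing.
actual_folder_path = "SampleData"   # what the activity evaluates to

def run_check(expected: str) -> str:
    try:
        assert actual_folder_path == expected
        return "passed"
    except AssertionError:
        return "failed"

print(run_check("SampleData"))   # passed
print(run_check("SampleData2"))  # failed, and recorded in the JUnit XML
```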

Final words
I hope that showing one way you can automate testing Microsoft Fabric Data Pipelines with Azure DevOps helps some of you get started. Plus, that it has introduced a lot of you to the Data Factory Testing Framework, because there are a lot of possibilities with it.
Of course, if you have any comments or queries about this post feel free to reach out to me.