My T-SQL Tuesday contribution for this month is about using Azure DevOps in a hybrid data world. This month's edition is hosted by Ben Weissman.
Ben has invited us all to talk about our hybrid and edge experiences. For example, where we are on our journey and what challenges we have faced. You can find out more by clicking this link about his T-SQL Tuesday 139 invitation or on the image below.
I am sure today you are going to see some really great posts about various cloud services, with some good insights about how to use them in a hybrid environment.
With this in mind, I am covering my Azure DevOps journey and how I learned to deploy updates for some of these services in an easier, more automated way.
That way, you can look to do the same for deployments to various environments, whether they are hosted on-premises, in the cloud or a combination of both (aka hybrid), and give yourself more time to learn about these great services.
Just to clear up some jargon: when I say deployments, I mean either new installations of services or updates to existing ones. For example, a new SQL Server database or a schema update to an existing Azure Synapse Analytics dedicated SQL Pool.
Hybrid data world
In the past, a large number of professionals worked with Data Platform environments hosted on servers on-premises. However, that all changed once Data Platform services started being introduced in the cloud.
Especially Platform as a Service (PaaS) offerings like Azure SQL Database, which appeal to people for various reasons. For example, you do not have to look after an underlying operating system.
In reality, this has led to a lot of companies hosting services in a hybrid way, with some hosted on-premises and some in the cloud.
Personally, I love using cloud services. One of the many advantages is that it's a lot faster to deploy cloud services than it is to order new servers to host services on-premises.
I have spent a lot of time over the years working with cloud services in order to get up to speed with them.
In fact, just the other day I was thinking that a lot of the Azure Data Platform services are like the different pieces of cooking equipment you can get for camping, like the Roadii and the Frontier Stove.
There is a lot of crossover in what you can use the different ones for, yet a lot of them can also be very useful in niche situations.
Before I cover Azure DevOps, I do want to add one other vital point. I know a lot of people say they do not have cloud experience yet. However, bear in mind that if you are using Azure DevOps or GitHub you are probably working in a hybrid way already, because the cloud-based versions of these services are the most popular ones in use.
Azure DevOps in a hybrid data world
One of the challenges I have faced is how to manage hybrid Data Platform deployments, where there is a need to deploy updates both on-premises and in the cloud.
With this in mind, I decided to vastly improve my knowledge of Azure DevOps. One of my biggest challenges was balancing learning Azure DevOps with keeping on top of the Azure Data Platform offerings.
However, learning more about using Azure DevOps for hybrid deployments helped me guide a team to adopt it, to the stage where they were able to update a single SQL Server database across a four-figure number of SQL Servers. All of the updates were done using a state-based deployment method in one single deployment pipeline.
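To give you a rough idea of what that looks like, below is a simplified sketch of a state-based deployment job in YAML. It is not the team's actual pipeline, and the environment, artifact and database names are just placeholders. It assumes the dacpac has already been built and published as a pipeline artifact called database, and that the target servers are registered as virtual machine resources in an environment, so the same job runs against every registered server.

```yaml
# Simplified sketch only - 'sql-servers', 'database' and 'MyDatabase' are placeholder names.
# Assumes the dacpac was built earlier in the pipeline and published as an artifact called 'database'.
jobs:
- deployment: DeployToSqlServers
  displayName: State-based deploy to all registered SQL Servers
  environment:
    name: sql-servers            # environment with the target servers registered as VM resources
    resourceType: VirtualMachine
  strategy:
    runOnce:
      deploy:
        steps:
        - task: SqlDacpacDeploymentOnMachineGroup@0
          displayName: Publish dacpac to bring the database to the desired state
          inputs:
            TaskType: dacpac
            DacpacFile: '$(Pipeline.Workspace)/database/MyDatabase.dacpac'
            TargetMethod: server
            ServerName: localhost
            DatabaseName: MyDatabase
            AuthScheme: windowsAuthentication
```

Because the deployment job fans out to every server registered in the environment, adding another SQL Server to the rollout is a registration task rather than a pipeline change.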
It also helped me come up with ways to use Azure DevOps for Data Platform migration scenarios as well.
For example, I came up with a way to update both on-premises SQL Server databases and Azure SQL Databases using one commit. That means you can start to migrate databases in different environments to the cloud and still update all of them using one deployment pipeline.
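To show the idea, here is a cut-down sketch of that kind of pipeline, with placeholder names throughout rather than a real setup. One commit to the main branch builds the database project once, and then the same dacpac is published both to an on-premises SQL Server via a self-hosted agent pool and to an Azure SQL Database via a service connection.

```yaml
# Cut-down sketch - agent pool, service connection, server and database names are all placeholders.
trigger:
- main                                # one commit to main kicks off the whole pipeline

stages:
- stage: Build
  jobs:
  - job: BuildDacpac
    pool:
      vmImage: windows-latest
    steps:
    - task: VSBuild@1
      displayName: Build the database project into a dacpac
      inputs:
        solution: 'MyDatabase/MyDatabase.sqlproj'
        configuration: Release
    - publish: 'MyDatabase/bin/Release'
      artifact: database

- stage: DeployOnPremises
  dependsOn: Build
  jobs:
  - deployment: OnPremisesSqlServer
    pool:
      name: OnPremAgents              # self-hosted agents that can reach the on-premises server
    environment: on-premises
    strategy:
      runOnce:
        deploy:
          steps:
          - task: SqlDacpacDeploymentOnMachineGroup@0
            displayName: Publish dacpac to the on-premises SQL Server
            inputs:
              TaskType: dacpac
              DacpacFile: '$(Pipeline.Workspace)/database/MyDatabase.dacpac'
              TargetMethod: server
              ServerName: 'OnPremServer'
              DatabaseName: MyDatabase
              AuthScheme: windowsAuthentication

- stage: DeployAzureSql
  dependsOn: Build
  jobs:
  - deployment: AzureSqlDatabase
    pool:
      vmImage: windows-latest
    environment: azure
    strategy:
      runOnce:
        deploy:
          steps:
          - task: SqlAzureDacpacDeployment@1
            displayName: Publish the same dacpac to Azure SQL Database
            inputs:
              azureSubscription: 'MyServiceConnection'      # placeholder service connection
              ServerName: 'myserver.database.windows.net'
              DatabaseName: MyDatabase
              AuthenticationType: server
              SqlUsername: '$(SqlUsername)'                 # supplied via pipeline variables
              SqlPassword: '$(SqlPassword)'
              deployType: DacpacTask
              DacpacFile: '$(Pipeline.Workspace)/database/MyDatabase.dacpac'
```

Because both deployment stages consume the same build artifact, the on-premises and cloud copies of the database always end up on the same version of the schema. When a database finishes migrating to Azure, you retire its on-premises stage rather than rebuilding the pipeline.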
I have also used what I have learned for other Data Platform services. For example, I discovered that using the above method allows me to run unit tests for an Azure Synapse Analytics dedicated SQL Pool.
Using Azure DevOps to automatically deploy updates in a hybrid data world has made my life a lot easier. Because it makes managing deployments easier, it allows me to focus more on other things, like the various Azure Data Engineering services.
Other Azure DevOps advantages
In addition, it has provided other advantages. For example, it has allowed me to plan work a lot more easily thanks to Azure Boards.
After learning so much about using Azure DevOps, I was able to pass the AZ-400 Azure DevOps exam. Plus, it has made learning things like GitHub Actions so much easier.
Nowadays, I balance learning new things with sharing my Azure DevOps knowledge with others.
I really enjoy sharing my knowledge about using Azure DevOps. It has led me into some interesting situations, including being asked to provide customized training days.
It is a lot to learn. My advice is to start simple, learn what you need for the challenge at hand and build your knowledge up from there.
By all means, when you start looking at Azure DevOps, watch or read the material that shows how to create a pipeline using a GUI so that you understand how pipelines work. If you are new to Azure Pipelines, using the Classic Editor can help you understand them more.
However, for real-life situations, use YAML for your pipelines. It might seem scary at first, but it brings many advantages. You can find some examples in my public GitHub repositories.
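If it helps, this is roughly as small as a YAML pipeline gets. It is just an illustrative azure-pipelines.yml with placeholder steps, to show that the moving parts are the same ones you see in the Classic Editor: a trigger, an agent pool and some steps.

```yaml
# A minimal azure-pipelines.yml - the steps are placeholders.
trigger:
- main                        # run the pipeline on every commit to main

pool:
  vmImage: windows-latest     # Microsoft-hosted agent

steps:
- checkout: self              # get the repository contents (implicit anyway, shown for clarity)
- script: echo "Build or test your database project here"
  displayName: Placeholder build step
- script: echo "Deploy it to your chosen environment here"
  displayName: Placeholder deploy step
```

Once you are comfortable with that shape, multi-stage pipelines like the sketches earlier in this post are just the same building blocks repeated.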
Final word
I hope this month's contribution about using Azure DevOps in a hybrid data world has been of use to some of you.
Learning how to do this using Azure DevOps has been an interesting journey for me. In fact, it's one of the reasons I enjoy sharing my knowledge about it with others, even if it means spending two hours answering questions about Azure Boards.
Of course, if you have any comments or queries relating to this post feel free to reach out to me.