To try something new, I thought I would write a fictional tale over a few blog posts about lessons learned by a time travelling SQL Server DBA. Because at the start of a new year people tend to look into both the past and the future, I decided it would be a good idea to cover some topics in a different way. In addition, there are some important points I will share towards the end of the series.
All characters are entirely fictional, and any similarity to real people with the same names is purely coincidental.
The following morning
Our DBA woke up and put the previous events down to a bad dream. So, he got ready to start his new job for real this time.
When he got to the train station he got a nice surprise, because the train turned up exactly on time. It was a very clean train with comfortable seats.
Once he sat down he saw there were lots of different USB ports he could plug his phone into. Another nice surprise was that the Wi-Fi connection on the train was fast and reliable.
He looked up at the office building after getting off the train. It looked older than it had in his dream. When he went inside everybody seemed really friendly. However, one colleague asked him if they had worked together somewhere else, because there was something very familiar about him.
He was amazed at how smooth his induction process was. Everything appeared to be so well planned for his arrival, apart from meeting his new boss, who was away on a business trip.
In addition, he was shown how all the DBAs do their deployments using a deployment pipeline. He found their DevOps implementation very impressive, because it seemed clear and easy.
One of his colleagues explained that years ago they used to do their deployments using a waterfall process. However, the company outgrew it and had to adapt with the times. So, they decided to look into implementing DevOps. Luckily, they had somebody in the company with the drive and vision to help others adopt it. Because of this they now do multiple deployments during the day, instead of having to stay behind in the office and do them in the evenings.
He offered to help with some performance issues. So, his new colleague asked if he could work with somebody to help resolve one. Whilst working with this new colleague it was explained that some time ago they had looked to automate fixes for their most common performance issues, which meant they had more time to work on other things.
In addition, due to all the new self-healing features in SQL Server, like automatic index management levels, they no longer had to fix that many performance issues.
Of course, now and again performance issues still happened. However, because everything was now in Azure they could maintain their exabytes of data in databases and Big Data Clusters with ease.
After this statement alarm bells were ringing, and our DBA experienced a bit of déjà vu. So, he asked his colleague what year it was. His colleague told him it was 2030.
Just like his experience in the past he was feeling very confused again. However, he decided that the best thing to do was to go with it and see what happened.
During the day our DBA was presented with further training as part of his induction. For example, how all the deployment pipelines worked within Azure DevOps and how to easily get feedback from any deployments.
Because Managed Instances were heavily used, they were also included in the induction. Our DBA was a bit nervous when he was told he would have to wait until a deployment had finished before moving onto the next topic. So, expecting the deployment to take six hours, you can imagine the look on his face when the whole thing was deployed in thirty minutes.
It was also made very clear that using third-party tools to perform backups was a thing of the past, because all backups were managed by Microsoft in Azure, including long-term retention. However, Log Shipping was still in use so that some secondary databases could have delayed updates. Log Shipping was more popular in the business now that you could back up the secondary databases it used.
In addition, a Visual Studio Subscription had already been arranged with a large amount of Azure credit. It was explained that if he needed more credit he could have some transferred to him from another colleague with ease.
It was highlighted that everybody in the company used PowerShell 15 and dbatools wherever possible to make their lives easier, due to the hundreds of features that had been added over the years.
In addition, he was given a schedule of what he would go through the next day. Topics included further Azure DevOps training, Kubernetes and Network Monitor. Towards the end of the day our DBA was left to get used to things on his new Windows 14 laptop.
It was surprising to see that Azure Data Studio was the preferred application for managing SQL Server. Of course, Management Studio was still in use for certain things. However, some people were still calling it Enterprise Manager.
Learning yet another new browser was another interesting learning curve. Good old Notepad was still part of the operating system though.
However, he decided to do some online browsing to see what else had happened in the last decade. Things had certainly changed a lot. For example, there were now a lot more cloud services, and a lot of the older ones had been renamed and gained a lot more functionality.
SQL Server on Linux had become more widespread in Production, mostly because people were using it with Kubernetes to improve management. Due to this, more Data Platform experts now had Linux certifications.
Data Platform user groups had appeared in most cities around the world. In addition to this, there were also a lot of new speakers at these events.
Apparently, there had been an initiative put in place to help new speakers develop which had become very popular.
Furthermore, Microsoft’s learning materials had increased to epic proportions, with enough material to empower data platform experts to share a career’s worth of knowledge with others.
After looking at the final site our DBA was feeling a bit overwhelmed and decided to go home. Just like before, he fell asleep almost immediately after getting home that evening.
To be continued…
I hope you all enjoyed ‘Lessons learned by a time travelling SQL Server DBA – Part Two’.
‘Lessons learned by a time travelling SQL Server DBA – Part Three’ will be published in the near future. When it is published I will also update this post with a link to it.
Final words (for part two)
I hope you’ve all enjoyed this potential insight into the future. Of course, this is just my take on some things that could happen. For example, more companies looking into improving their DevOps processes, and making sure that inductions of new colleagues are done as smoothly as possible as part of those improvements.
It’s my personal opinion that performance tuning will still be required to some degree in the future.
However, I think in most cases issues will be investigated only once, and then automated steps will be introduced to prevent them happening again, with only the more complex systems requiring help fine-tuning databases. In addition, I suspect Microsoft is going to put more features in place to help with performance tuning, which is one reason why I previously posted about the possibility of using automatic index management levels here.
Of course, Microsoft already offers a great deal of documentation to help with SQL Server. For instance, the SQL Ground to Cloud workshop, which you can read about in detail here. So, you can imagine how powerful more learning material could be.
I am aware people are going to have a variety of views about the topics covered in this post.
With this in mind, I am interested in knowing people’s thoughts on this content and the format. So please feel free to leave a comment.