Redgate’s DLM Dashboard: The awesome continues

The artist formerly known as SQL Lighthouse underwent a HUGE overhaul, and now goes by the name DLM Dashboard. This tool allows you to “monitor your database schemas and alerts you to schema changes, including database drift”. And even though I find Redgate’s tools awesome, I’m still going to be brutally honest in this blog post (and I know they appreciate that!).

In the second half of this blog post, there is also a Q&A with a Product Marketer at Redgate. In that Q&A, he answers a few open questions that clarify the product and its future.

 
Download and install
When you download and install the Redgate DLM Dashboard, you get a number of things. The most important things in the installation folder (C:\Program Files (x86)\Red Gate\DLM Dashboard 1) are a webserver and a monitoring service. These are used to host the web interface of the DLM Dashboard, and to run the service that monitors your environments and databases. There are also 3 Windows services installed: “DLM Dashboard Monitoring Service”, “DLM Dashboard Storage Service” and “DLM Dashboard Web Server”.

 
Configuring your environment
After the installation, the DLM Dashboard is started:

 
When you click on “Set up DLM Dashboard”, you are redirected to the configuration page. First you need to add a project:

 
I created a Sandbox project, in which all my environments will eventually be configured. So my project will contain my 4 environments (DTAP). DTAP stands for Development, Test, Acceptance and Production. More on DTAP can be found here. In this test, I’m going to start with configuring a Development environment:

 
After adding that, you need to add a server:

I’m using the sa account to connect to this server, because it’s on a stand-alone server. But it’s bad practice to use the sa account when you start using the DLM Dashboard (or any other software) for real. So if you want to use this in your environment, don’t use the sa account!!!
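As a minimal sketch of that principle, you could create a dedicated login along these lines. Note that the login name, the password placeholder and the single permission below are only an illustration, and not the exact permission set DLM Dashboard needs (check Redgate’s documentation for that):

USE master
GO

-- Hypothetical dedicated login, instead of connecting as sa
CREATE LOGIN DLMDashboardUser WITH PASSWORD = N'<a strong password here>'
GO

USE Sandbox
GO

CREATE USER DLMDashboardUser FOR LOGIN DLMDashboardUser
GO

-- Illustration only: enough to read the schema of this database
GRANT VIEW DEFINITION TO DLMDashboardUser
GO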

 
The next screen asks whether you want to run the monitoring script yourself, or let the DLM Dashboard run it for you:

 
When the script completes, you end up with a new database on the server you configured in the dashboard. The database is named RedGate, and that name can’t be changed at this point. For more information on this, check out the Q&A at the bottom of this post.

This database contains only 1 table, named “DDL_Events”, which stores all the changes that happen in your database(s).

The DLM Dashboard also creates a server DDL trigger called “RG_SQLLighthouse_DDLTrigger” (which can be found under “Server Objects” -> “Triggers” in SSMS). This trigger registers all DDL changes on your databases, and stores those changes in the RedGate database.
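If you’re curious about what the monitoring script created on your server, you can check it with a regular metadata query. The object names below (the RedGate database, the DDL_Events table and the RG_SQLLighthouse_DDLTrigger trigger) are the ones mentioned above; I’m deliberately not listing the columns of DDL_Events, because that layout is Redgate’s own:

-- The server-level DDL trigger created by the monitoring script
SELECT name, is_disabled
FROM sys.server_triggers
WHERE name = 'RG_SQLLighthouse_DDLTrigger'

-- A peek at the DDL events that were captured
SELECT TOP 10 *
FROM RedGate.dbo.DDL_Events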

Now you need to combine all the configured items (Environment, Server and Database) in 1 last configuration step:

 
I’ve executed these steps (configure an environment, add a server, and add a database) twice, to add a Test environment as well. I’ve skipped Acceptance and Production (which would complete the DTAP) in this test. So now my dashboard looks like this:

 
This is the default view right after you’ve configured your databases. The schema shown for each database is basically a snapshot of its current state.

 
Making changes, and marking versions
After you’ve configured the dashboard with all your environments (in my case 2 environments, just for testing purposes), it’s time to make some changes in your databases. Let’s add a table to the Sandbox database on Development:

USE Sandbox
GO


CREATE TABLE dbo.TriggerSchemaChange
  (ID INT IDENTITY(1,1),
   Value VARCHAR(10))

 
If you wait a few seconds (the dashboard refreshes automatically), you’ll see that one of your databases is starting to “drift”:

 
If you click on “Review”, you get an overview of the changes that were made, and by whom they were made:

 
In this case, I’ll tag this version as 1.0.0.1, and save that change. The dashboard changes, and the new version is shown. And if we execute the same change on the other database, the DLM Dashboard automatically recognizes the version:

 
You just need to acknowledge the “new” version on the test environment, just like on the development environment, and your dashboard looks all okay again:

 
The world ain’t all sunshine and rainbows…
Even though I think this is a wonderful way to keep an eye on your environment, there are still a few things missing in my opinion. I’ve asked Redgate these questions, and Andrew Denty (Blog | @andrewdenty), who is a Product Marketer at Redgate, was kind enough to answer them. This Q&A is published with permission from Andrew and Redgate. Thanks again Andrew!

 
1) What is the delay in tracking changes and refreshing the webpage, and can we change that?

The DDL trigger we install on the SQL Server puts entries into a table each time a database is changed.

Currently we poll the server every 13 seconds and query that table to see if there have been any changes.

This is not configurable at the moment, but it should be, as it is not a lot of work (and will probably be done before we leave beta). It’s also worth pointing out that you don’t need to hit refresh or anything – the dashboard live refreshes on its own.

 
2) Can we change the port number on which the webinterface runs (default 19528)?

Yes see http://documentation.red-gate.com/display/SL1/Changing+the+port+SQL+Lighthouse+uses

 
3) Can we change the database name (rename from RedGate to DBA for example), or log events to a dedicated database or server? And how does it react when the target server isn’t available?

At the moment no, the Redgate database is hard-coded into the application, and is expected to be on the same server as the databases we are monitoring. The DDL trigger lives on the server, so it always runs if the server is running. If DLM Dashboard cannot reach the SQL Server, it will show you in the UI that the server is not available. When it becomes available again, DLM Dashboard will check if there have been changes whilst it was unavailable.

 
4) Why do you use a DDL trigger, instead of Extended Events?

In SQL Lighthouse, we use a DDL Trigger because it runs when a change happens, and crucially because it runs synchronously with the change, i.e. the change is temporarily blocked until we return. Therefore we not only get the information that we’re given, but we can also ask for other information, like:

1. The accurate date it happened
2. The application that made the change
3. The IP address that made the change
4. The nest level, e.g. was this a change made directly, or did a change fire a DDL Trigger and that DDL Trigger made this change
5. The transaction ID to help us group changes together

In contrast, extended events run asynchronously, so by the time we see the message on the queue, the information above *may* still be queryable, but will probably not be available, and we can’t rely on it. The plus side of events is that they can’t break anything, because they run asynchronously, whereas if the DDL Trigger throws an error it’ll break what the user is doing.

In summary, we want the extra information, so we had to go with DDL Triggers.
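To make that answer a bit more concrete, here is a minimal sketch of the technique Andrew describes: a server DDL trigger that logs changes synchronously. All the names below (the ChangeLog database, the DDLChangeLog table and the trigger itself) are made up for illustration, so this is not Redgate’s actual trigger:

USE ChangeLog
GO

CREATE TABLE dbo.DDLChangeLog
  (EventTime     DATETIME      NOT NULL,
   LoginName     SYSNAME       NOT NULL,
   HostName      SYSNAME       NULL,
   ProgramName   NVARCHAR(128) NULL,
   ClientAddress VARCHAR(48)   NULL,
   NestLevel     INT           NULL,
   EventData     XML           NOT NULL)
GO

CREATE TRIGGER TR_LogDDLChanges
ON ALL SERVER
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
  -- Because the trigger runs synchronously with the change, session details
  -- like the login, host, application and client IP address are still available
  INSERT INTO ChangeLog.dbo.DDLChangeLog
    (EventTime, LoginName, HostName, ProgramName, ClientAddress, NestLevel, EventData)
  SELECT GETDATE(),
         ORIGINAL_LOGIN(),
         HOST_NAME(),
         APP_NAME(),
         CONVERT(VARCHAR(48), CONNECTIONPROPERTY('client_net_address')),
         TRIGGER_NESTLEVEL(),
         EVENTDATA()
END
GO

This sketch also shows the downside Andrew mentions: unless you add an EXECUTE AS clause, everyone who runs DDL needs permission to insert into the log table, and if the trigger fails, the user’s change fails with it.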

 
5) The configuration limitations, are these just in the beta? The current configuration limit is 5 databases per environment

At the moment the beta isn’t designed to scale. We’re going to be working on improving the performance and scalability over the coming months. When we leave beta, we’re planning to have a pro version and the current free version. The pro version will allow you to monitor more instances and databases than the free version, which will have some limitations. We’re still working out the exact details though.

 
6) Is there any SQL Compare integration planned? For example, an overview of the differences between 2 environments (Development and Test), or syncing of 2 databases on different environments?

Yes, we’d like to add this to the Dashboard (assuming you have SQL Compare as well). We’d like to do this later this year – but we’re being agile and responding to feedback, so we might change our plans and do something else first. Please do vote this up on UserVoice, as we’re using that to help us decide what to do next.

 
7) What is the plan on Octopus Deploy integration? We’re looking into that for automatic deploys, and it would be great to deploy a version from DLM Dashboard for example

Triggering deployments from the Dashboard isn’t something we have planned, though we do want to be part of your release management process. I’d like us to be able to report back to Octopus if there has been drift away from the release it thinks is deployed (e.g. a hotfix has been made directly on production), so that you can get those changes back into the process. Our SQL Release product (which is a database deployment plugin for Release Management tools like Octopus) may well use elements of the dashboard in the future.

 
 
Bugs
There are still a few bugs in the software:

– If you remove your project, the whole dashboard is empty, but the database monitoring is still active, and you’re not asked whether you want to disable/remove it completely
– If you remove your database, the database monitoring is still active, and the RedGate database isn’t removed
– If you add a column to a table, 2 columns are marked as changed (bugfix in progress)
– The Cancel button doesn’t work on the email notification screen (bugfix in progress)
– You can’t tag your initial version with a specific name (for example, version 1.0.0.0) (bugfix in progress)
– If your machine name is long, it’s trimmed in the review changes screen (bugfix in progress)
– Sometimes the webservice doesn’t start after a reboot of the machine, and you need to start it manually (bugfix in progress)

As mentioned, most of these bugs are already fixed, and ready for the next release. Redgate is really fast at picking up bugs and fixing them. So if you find any bugs, please let them know!

 
Recap
Looking at the possibilities of the DLM Dashboard, and the version that is available now, I’m pretty confident this can become a great product! Redgate haven’t announced the differences between the free and paid versions yet, but there will always be a free version. Also, knowing Redgate, getting the paid version won’t be the biggest problem to overcome. Their licenses are pretty affordable, and the tools are indispensable if you ask me.

I think Redgate has discovered a whole new market with this. They are jumping on the “release early, release often” train, and they do it quite well on the first try! I’ve never seen database professionals do this at any of the companies I’ve worked with in the past. Only the .NET developers did that, and we all mocked them for it. But now we’re starting to see a shift in the market, and Redgate is willing to help us with this. And for that, I want to thank the people of Redgate!

T-SQL Tuesday #57 – SQL Family and community

T-SQL Tuesday is a recurring blog party that was started by Adam Machanic (Blog | @AdamMachanic). Each month a blog will host the party, and everyone who wants to can write a blog post about a specific subject.

This month the subject is “SQL Family and community”. If you want to read the opening post, please click the image below to go to the party-starter: Jeffrey Verheul (Blog | @DevJef).



 
This month’s topic is SQL Family and community. The reason I chose this topic is the great SQL Server community. As I mentioned in my opening post, I didn’t know what to expect from the community when I first started working with SQL Server. But it didn’t take long to notice why everyone was so enthusiastic about it. They say the SQL Server community is friendlier and more helpful than any other community. But is it?

It’s all about helping
A few years back, I attended a developer conference in the Netherlands called TechDays. One of the sessions I got to see was a session about spatial data by Bob Beauchemin (Blog | @bobbeauch). And yes, I think this is where the obsession with spatial data started. After the conference I started to play around with spatial data, and eventually I ran into a problem. At that moment I couldn’t find any useful resources online, and I knew Bob would know the answer.

After thinking about it for a day or 2, I finally decided to mail Bob. This was a big step for me, because I was about to email one of the people I highly respect and look up to. My expectations weren’t too high, because he couldn’t possibly have the time to answer my stupid email. So I took the bull by the horns anyway and pressed send. But to my surprise, it didn’t even take an hour before I received an answer. From Bob. THE Bob. I couldn’t believe it! The big Bob Beauchemin took the time to not only read my email, but even provide me with an answer to my question. And of course the answer was spot on, and precisely what I needed to overcome my problem.

After that, I regularly emailed Bob with questions, interesting cases I came across, and many other things. But I haven’t sent him an email in far too long now, and I still hope I get to thank him for this in person one day. This is where the community vibe hit me.

 
From knowledge base to blog
When you work on all different kinds of projects, you end up with notepad files all over your desktop, USB hard drives, etc. At least, in my case that’s how it started. In order to create some structure in all these small solutions and notes, I decided to put them online. So basically I wanted to create a small knowledge base: one location where I could save the scripts I had written before and might need again in the future. But after a few months, people started to thank me for my posts. I couldn’t understand why, because the posts were so basic, and sometimes total gibberish in my opinion.

But after a while, I started to like the “blogging” (which was just brain-dumping until then). The blog posts became more coherent, written for a broader audience, and easier for readers to follow and implement. So the brain dumps grew into something new: useful information that other people could actually use! One of the things I’m most proud of so far is the spatial data series I wrote (yes, again spatial data). It grew from just 1 post (the basics of spatial data) to a series of 11 posts in which I guide readers from the basics to building their own reports. This series was inspired by questions from readers and colleagues, and questions I had myself.

 
Online community
One of the greatest discoveries for me was the community on Twitter. Anyone you can imagine from the community is on Twitter, and available within a few clicks. And they all take the time to answer questions! This came as a big surprise to me at first. There are so many interesting people you can follow, interact with, and who can help you with the technical challenges you face. In most cases, when you post a question with the hashtag #SQLHelp, you’ll get an answer within minutes. And your answer could come from anybody: an MCM from the UK, an MVP from Australia, a DBA from the US, a BI guy from South Africa, etc. All kinds of people with different skill sets and technical interests. And that variety of people makes it worthwhile to follow and interact with them.

 
When strangers become friends
After being “active” in the community for a while, I started talking to some people from the SQL Server community online. At one point I asked one of them to review a blog post, to see what she thought. That was the start of a friendship that has brought me a lot over the last couple of years. She introduced me to some other SQL Server professionals, whom I now consider close friends. These friends support me, push me to do stuff when I need it, and help me in any way possible. They inspire me to do better, and to take the extra step to achieve the goals I set for myself. They are there when I need them, and that is the most important thing. And all of that because of SQL Family and the community! The community that I couldn’t believe was more than a bunch of people with the same job and technical interests.

 
FORG
Being involved in the community is really cool, and it has its benefits. For example, Red Gate gave me the opportunity to become a member of the Friends of Red Gate program, which was and is a great honor for me. I’ve loved using their tools for years, and now they have given me the opportunity to help make these tools better, by being part of this program. And hopefully there are a lot of cool things coming up, because I still have a lot of dreams and goals that I set for myself when I became a part of this program.

 
If you’re not involved yet, get involved!
The reason you should get involved in the community is that it’s really nice to help people, and you get so much in return. Getting help from someone is great, but helping other people makes you feel even better. At least, that’s my experience. So if you’re not already involved, start visiting local events like SQL Saturdays, or online events like virtual user groups. It’ll give you insight into other SQL Server subjects, and show you that other people struggle with the same problems you do!

Become a T-SQL Hero with SQL Prompt

Since 1999, Red Gate Software has produced ingeniously simple and effective tools for over 500,000 technology professionals worldwide. From their HQ in Cambridge UK, they create a number of great tools for MS SQL Server, .NET, and Oracle. The philosophy of Red Gate is to design highly usable, reliable tools that solve the problems of DBAs and developers.

Every year Red Gate selects a number of active and influential community members (such as popular blog writers and community site owners), as well as SQL and .NET MVPs who are experts in their respective fields, to be part of the Friends of Red Gate (FORG) program. I’m proud to announce that I’m part of the 2014 FORG selection. This post is part of a series of posts in which I try to explain and show you why the tools of Red Gate are so loved by the community.



 
What SSMS misses
The tool that Microsoft provides you with when you install SQL Server is pretty nice. It’s nicely designed (even though I’ve heard other opinions), it’s stable, and it does what it should do: it allows you to administer your servers. But that’s not the only thing it should do, in my opinion. Take Visual Studio as an example: it contains many more options that help developers do their job. And remember, SQL Server Management Studio (SSMS) is actually a Visual Studio instance with a different layout (just check the Ssms.exe.config)…

So why doesn’t SSMS have a schema compare option, like Visual Studio has? Visual Studio is no longer an environment used only by developers working with ASP.NET and C#; it has evolved into much more over the last few years. It’s now also the tool for working with Data Quality Services (DQS) and SQL Server Integration Services (SSIS). So let’s talk about some other features that SSMS misses in my opinion, and let’s see how SQL Prompt can fill that gap.

 
IntelliSense
SSMS ships with a default IntelliSense, but it isn’t the implementation I would like to see. It misses a few vital features. For example, the fact that SSMS IntelliSense doesn’t take relations between objects into account is one of its biggest shortcomings. One of the companies that created a tool to fix that is Red Gate. If you install SQL Prompt, you get IntelliSense 2.0, or IntelliSense on steroids if you like.

Once you’ve installed SQL Prompt, it gives you suggestions when you write a JOIN clause. This means that it scans column names, and traces primary and foreign key relationships between the tables you are joining. The join suggestions based on keys can be recognized by the little key symbol in front of them:
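To give you an idea of the kind of relationship SQL Prompt picks up on here, below is a hypothetical pair of tables with a foreign key between them (the names are made up for this example). When you type the JOIN, SQL Prompt suggests the ON clause based on that key:

CREATE TABLE dbo.Customer
  (CustomerID INT IDENTITY(1,1) PRIMARY KEY,
   Name       VARCHAR(50))

CREATE TABLE dbo.CustomerOrder
  (OrderID    INT IDENTITY(1,1) PRIMARY KEY,
   CustomerID INT FOREIGN KEY REFERENCES dbo.Customer(CustomerID),
   OrderDate  DATETIME)

-- When you type this JOIN, SQL Prompt suggests the ON clause
-- (marked with the key symbol) based on the foreign key above
SELECT C.Name, O.OrderDate
FROM dbo.Customer C
JOIN dbo.CustomerOrder O ON O.CustomerID = C.CustomerID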

 
Object discovery
Whenever you’re working in a database and writing your queries, there comes a point where you can’t remember a column name or datatype. In SSMS you need to navigate the object explorer to the object (let’s say a table), and generate a create script, or click through to the table’s column list. SQL Prompt allows you to hover your mouse over an object, and see some extra information:

 
If you click on the popup, you’ll get another popup window with the creation script (by default), or a summary of the object:
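By the way, without SQL Prompt the query-based way to look up the same column and datatype information is a call to sp_help (the table name below is the made-up table from the IntelliSense example above):

-- Returns the columns, datatypes, keys and constraints of an object
EXEC sp_help 'dbo.Customer'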

 
Scripting options
Whenever you need to script an object, or want to see the contents of, for example, a Stored Procedure, you need to navigate to the object in your object explorer. With SQL Prompt, you can also use the right-click context menu to script objects. Just right-click an object you referenced in your query, and choose the “Script Object as ALTER” option:

 
This will generate an alter script for the object you selected. This makes it a lot easier to see the contents of a Stored Procedure or View, and change it when needed.
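Without SQL Prompt, the closest query-based alternatives I know of are OBJECT_DEFINITION() and sp_helptext (the procedure name below is made up for this example):

-- Both return the T-SQL definition of a module, such as a Stored Procedure or View
SELECT OBJECT_DEFINITION(OBJECT_ID('dbo.usp_GetCustomers'))

EXEC sp_helptext 'dbo.usp_GetCustomers'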

 
Useful functions
The last feature I want to show you is the SQL Prompt menu, which gives you another set of useful tools and functions. For example, how do you format your T-SQL query? SQL Prompt can do that for you with a few mouse clicks, or by pressing a hotkey combination. Another great feature is “Find Unused Variables and Parameters”. This saves you time when you’re trying to find out which declared variables you don’t use anymore in a very large query. All of these options can be found in the SQL Prompt menu:
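As a trivial example of what “Find Unused Variables and Parameters” catches, the (made-up) snippet below declares a variable that is never referenced; SQL Prompt highlights it for you, which is a lot harder to do yourself in a query of a few hundred lines:

DECLARE @MinOrderDate DATETIME = '2014-01-01',
        @UnusedValue  INT = 42    -- declared, but never used below

SELECT C.Name, O.OrderDate
FROM dbo.Customer C
JOIN dbo.CustomerOrder O ON O.CustomerID = C.CustomerID
WHERE O.OrderDate >= @MinOrderDate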

 
If you want, you can also create a style-export for all your colleagues, so your entire department or company formats queries according to the same layout. You can find out more about this in the SQL Prompt menu, under Options -> Format -> Styles. You can export your formatting options as a .sqlpromptstyle file, or import one.

 
Is it worth it?
If you would ask me, my answer would be: yes! Even though it’ll cost you about €285,- (or $390,-), it’s definitely worth it. It saves you a lot of time, and it adds a lot of useful (and needed) features to SSMS.

If you want to try it out, just go to Red-Gate.com, or the product site for SQL Prompt. You can download a trial there that contains all features, for a limited time.

 
If you want to read more about this topic, don’t forget to check out these blog posts:

Julie Koesmarno: Clean And Tidy SQL With SQL Prompt
Mickey Stuewe: Becoming a SQL Prompt Power User
Chris Yates: SQL Prompt – The Power Within

Friends of Red Gate Project

A few weeks ago I received the official confirmation that I’ll be part of the Friends of Red Gate 2014 program. This program allows the participants to communicate directly with the product teams, provide feedback to those teams, and try out beta builds of the tools.

 


 
The Red Gate tools are easy to use, and give you the upper hand in administering servers and databases. So together with other Friends of Red Gate, we decided to write a series of blog posts about the tools. The first one will be posted tomorrow. In these blog posts, we will try to show you why you should use their tools, and what problems they can solve for you.

 
Besides me, the group that will write these posts consists of 3 other people:

 
Mickey Stuewe (Blog | @SQLMickey) from Orange County, USA
Chris Yates (Blog | @YatesSQL) from Kentucky, USA
Julie Koesmarno (Blog | @MsSQLGirl) from Canberra, Australia

 
So if you’re interested in reading about the tools from Red Gate, or you want to see how people use the same tools in different jobs, you should definitely check out this series!