SQL Server Source Control for Bandits and Anarchists
A friend recently described to me how he had to give administration privileges to developers on production. I was appalled, because I had lived through a similar experience at a ‘dot com’. My ‘dot com’ went even further: they gave ‘sa’ privileges to everyone in the company, including the receptionist, and then encouraged everyone to try things out because “This database stuff isn’t that hard.” Ah, good old ‘outside-the-box’ Harvard Business School thinking. As a consequence, it was a constant battle to find out why the production system kept going offline: once, a salesman even dropped tempdb because “we don’t want temporary objects in our system.” After this experience, I’ve always cracked down on letting anyone into my production servers with too many rights.
But every day on the forums, I read about people dealing with users who have ‘sa’ privileges on production, developers developing in production, and the poor database professionals who, like my friend, are forced to deal with these situations. Columns, tables, and whole databases appear and disappear. Functionality changes and no one knows why. Figuring out how to keep track of the changes in a production system, just to help put out the latest fire, becomes an imperative.
The best way to fix this issue? Put in place a well-documented development and deployment process that limits who has access to production. Unfortunately, not everyone is going to be able to do this. There may even be valid reasons (although I can’t imagine what they might be at the moment) why your company would allow that sort of extremely risky behavior. This means you have to deal with the situation, come what may.
Then there’s the other side of the coin. If you’re managing financial data in a publicly-traded company, or you’re working with credit card data, you don’t have that ‘Dodge City on a Saturday night’ feeling in your system. Instead, you’re dealing with Sarbanes-Oxley compliance, where every change to the structure has to be recorded. You have to meet Section 10.2 of the Payment Card Industry Data Security Standard (PCI DSS), which reads in part:
10.2 Implement automated audit trails for all system components to reconstruct the following events:
10.2.7 Creation and deletion of system-level objects
Whether you’re dodging the bullets of all-‘sa’-all-the-time access, or handcuffed to a rigid regulatory regime, you need the ability to know what’s changed on your production systems and to keep track of those changes over time. There are plenty of ways of monitoring DDL changes. You can set up DDL triggers that record changes to a set of tables. You could set up a task to monitor the default trace and, again, try to capture changes into a table. You could use Profiler to set up a server-side trace that captures events showing structural changes, and outputs them to a file. You might, if you’re feeling terribly smart and brave, use Extended Events. All of these are ‘intrusive’ in that they require actions that make changes on the server and possibly impact performance. Further, while they all give you a list of changes, they are a long way from allowing you to compare ‘before’ and ‘after’ scripts to see what’s changed, or letting you explore backups to find out retrospectively when a change was made. They can’t help you check immediately that the code in production is exactly the code in the proper, labeled version in source control. I’d like to suggest that Red Gate’s SQL Compare Professional could help you with tasks like this.
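To be clear about what I mean by ‘intrusive’, here’s roughly what the DDL trigger option involves. This is a minimal sketch only; the table and trigger names (dbo.DDLAudit, trgAuditDDL) are placeholders of my own invention, not anything from a particular system:
## A minimal sketch of the DDL trigger alternative. The table and trigger
## names are placeholders; run it against the database you want to watch.
$auditDDL = @'
CREATE TABLE dbo.DDLAudit (
    EventTime DATETIME NOT NULL DEFAULT GETDATE(),
    LoginName SYSNAME NOT NULL,
    EventData XML NOT NULL);
GO
CREATE TRIGGER trgAuditDDL ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
INSERT INTO dbo.DDLAudit (LoginName, EventData)
VALUES (ORIGINAL_LOGIN(), EVENTDATA());
'@;
## sqlcmd reads the batches from the pipeline; the server and database
## names are the same placeholders used throughout this article
$auditDDL | sqlcmd -S "grant-red1\gfr1" -d MovieManagement;
Every DDL event lands in that table, but notice what you’ve done: you’ve added objects to the very database you’re auditing, and you still only get a list of events, not scripts you can compare.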
Automating SQL Compare
No, I don’t think you should fire up the SQL Compare GUI every morning when you get into work and compare your development and production servers. Let’s face it, you might not have a development server if your development takes place on production (there’s that Saturday night gunfire again). But you don’t have to use the GUI to use SQL Compare. The Professional edition comes with a command line that you can put to work. With the command line, you can automate a comparison to capture the changes that have been made to your production system. But what do you compare to?
Here, we get some interesting choices. Again, you could compare your development database against your production database. The problem with this is that your development database is likely to be different from the production database, because that’s where development is taking place. New and different objects will be, and should be, all over the place. That makes it a poor choice for auditing changes in your production system. This leaves you with two good options. First, you could use backups: SQL Compare can compare a live database and a backup (or a backup and a backup or… you get the idea). The other option, and frankly the better one in my opinion, is to compare your production database against a set of scripts, because SQL Compare Pro can use scripts to compare to a live database (and a backup and… I’ll shut up now). This adds one more type of software to toss into the mix for a complete solution: source control. And what if you automated all of this using Microsoft’s excellent automation language, PowerShell? Now we’re talking…
[Editor: Why would this be better than using the command line?]
Yes, that was really my editor asking that question. I know it’s because he wants me to tell you that by using PowerShell you get a full scripting language that can handle large numbers of objects using set-based operations through the pipe. PowerShell also provides more control of flow than is possible in a batch file, and it’s easier to automate through SQL Agent. There’s more! You can also enable remoting so that you can run these processes on multiple servers at the same time, along with all the other great stuff that’s possible through PowerShell. That’s what he wants me to say, but my short answer is: “Dude! Come on! It’s PowerShell, duh! Of course we’re going to use it.”
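Just to give you a taste of that remoting point, here’s a minimal sketch. It assumes you’ve already run Enable-PSRemoting on each target, that SQL Compare is installed in the same path on all of them, and the server names are placeholders:
## A sketch of fanning the comparison out to several servers at once.
## Assumes remoting is enabled on each target and SQL Compare is
## installed there in the standard path; server names are placeholders.
$servers = "ProdSql01", "ProdSql02", "ProdSql03";
Invoke-Command -ComputerName $servers -ScriptBlock {
    set-Location "C:\Program Files (x86)\Red Gate\SQL Compare 9\";
    ./sqlcompare /s1:$env:COMPUTERNAME /db1:MovieManagement /mkscr:C:\dbscripts\MovieManagement /q;
};
Try doing that from a batch file.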
Using Source Control and Scripts to Audit Schema Changes
In order to use SQL Compare, you have to have something to compare against (even if it’s just a blank folder). Hopefully, you already have your database in some form of source control as part of your development and deployment processes. If not, SQL Compare provides a mechanism for making it happen. Either way, this code in source control will need to be stored separately from your development code, in order to keep the tracking of production changes isolated from your development processes. There will be no use of the pretty GUI in any of these examples. Everything will be done through the command line. More specifically, all calls will go through PowerShell invocations of the command lines. You can use any source control system; I’m using SourceGear’s Vault for the examples. There are attachments that use SVN and TFS. First, I want to generate the scripts for my entire database (if you want to do the whole server all at once, there’s a sketch for that below the switch list):
$DBName = "MovieManagement";
$Server = "grant-red1\gfr1";
$Scripts = "c:\dbscripts";
$Command = "C:\Program Files (x86)\Red Gate\SQL Compare 9\sqlcompare.exe";
[Array]$arguments = "/s1:$Server","/db1:$DBName","/mkscr:$Scripts\$DBName","/q";
& $Command $arguments;
This will create the scripts for your database in the location specified. The switches for the SQL Compare Pro command line are very straightforward:
- /s1 – That’s the server we are comparing from
- /db1 – That’s the database that is the source
- /mkscr – Creates a script out of all objects in the database
- /q – Makes it “quiet” or less verbose
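As for scripting out the whole server in one pass, a minimal sketch might loop over sys.databases. The connection string and the exclusion of the four system databases are my own assumptions, so adjust to taste:
## A sketch that scripts out every user database on the instance
$Server = "grant-red1\gfr1";
$Scripts = "c:\dbscripts";
$Command = "C:\Program Files (x86)\Red Gate\SQL Compare 9\sqlcompare.exe";
## Pull the database names from sys.databases, skipping the system databases
$conn = New-Object System.Data.SqlClient.SqlConnection("Server=$Server;Integrated Security=SSPI");
$conn.Open();
$cmd = $conn.CreateCommand();
$cmd.CommandText = "SELECT name FROM sys.databases WHERE database_id > 4";
$reader = $cmd.ExecuteReader();
$databases = @();
while ($reader.Read()) { $databases += $reader["name"] };
$conn.Close();
## Run /mkscr once per database, each into its own folder
foreach ($DBName in $databases)
{
    & $Command "/s1:$Server" "/db1:$DBName" "/mkscr:$Scripts\$DBName" "/q";
}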
To get the code from that location into source control, you’d need to use the command line from your source control system. The one for the Vault software I’m using looks like this:
$DBName = "MovieManagement";
$Scripts = "c:\dbscripts";
set-location "C:\Program Files (x86)\SourceGear\Vault Client\";
./vault REMEMBERLOGIN -user admin -password 12345 -host VaultHost -repository MyRepo
./vault.exe add $ $Scripts\$DBName;
./vault.exe commit;
./vault.exe checkin;
If you’re working with lots of servers or lots of databases, you might want to combine these scripts and do a little more automation around getting everything into the system in one pass. The important point is that I now have a database that is checked into source control. With that, it’s a trivial task to create an automated script that compares my database and my scripts in source control nightly.
What I’m going to do is set up a script that will get the latest code from source control. It will then compare this code to the production server. If there are changes in production, first, SQL Compare will generate a report of these changes. Second, an email will be sent to you, alerting you to the issue. Finally, the changes will be applied to source control and labeled with the current date so that you can retrieve them over time and so that any new changes tomorrow will get their own report and email:
## 1. Set Variables
$DBName = "MovieManagement";
$Server = "MyServer\MyInstance";
$Scripts = "C:\dbscripts";
$Date = Get-Date;
$pswd = ConvertTo-SecureString "12345" -AsPlainText -Force
$Creds = New-Object System.Management.Automation.PSCredential ("login", $pswd)
## 2. A little construct to deal with a shortcoming in Vault
$Date = $Date.AddMinutes(-2);
## 3. Get the latest from source control (could get a version or label instead)
set-location "C:\Program Files (x86)\SourceGear\Vault Client\";
./vault REMEMBERLOGIN -user admin -password 12345 -host VaultHost -repository MyRepo
./vault GET $/$DBName;
./vault CHECKOUT $/$DBName;
## 4. SQL Compare database to source code
set-Location "c:\Program Files (x86)\Red Gate\SQL Compare 9\";
$MyOutput = ./sqlcompare /s1:$Server /db1:$DBName /scr2:$Scripts\$DBName /sync /report:$Scripts\schemaDiff.html /reporttype:Simple /f;
## 5. Determining if there are changes
set-location "C:\Program Files (x86)\SourceGear\Vault Client\";
IF ($MyOutput[11] -ne "Error: The selected objects are identical or no objects have been selected in")
{
## 6. There are changes, add them all to source control
##These have to be walked & date checked because Vault won’t recognize new files
Get-ChildItem $Scripts\$DBName -Recurse | where-Object{!$_.PsIsContainer} | where-Object{$_.CreationTime -gt $Date} | foreach-object {./vault ADD -commit ("$" + $_.FullName.Replace($Scripts,"").Replace("\" + $_.Name,"").Replace("\","/")) $_.FullName};
## 7. send an email with the report
Send-MailMessage -From "service@red-gate.com" -To "Someone@red-gate.com" -Subject "Schema Changes in database $DBName" -Body "The attached report shows changes that have occurred in the last 24 hours" -Attachments $Scripts\schemaDiff.html -SmtpServer "mail.mydomain.com" -Credential $Creds;
}
## 8. Move everything back into source control
./vault COMMIT $/$DBName;
./vault CHECKIN $/$DBName;
This may look slightly complicated, but there are really only a few commands at work. At the top, in the section labeled #1, we’re setting a number of variables. I defined values for the name of the database, the server, where we’re storing the scripts, a date value to deal with a small glitch in my source control software, and a set of credentials for email. That’s it. Nothing complicated.
After that, at #2, I have to set the date back about two minutes so that my source control software can find the new files that get added because of SQL Compare.
From #3 down, things get interesting. First, I’m connecting up to my source control system and then I’m retrieving files from the system and checking them out for editing. All of that is done using commands from the source control system. If you’re using something other than Vault, your commands will be different, but you get the idea.
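For instance, if you were on Subversion instead, the same get/checkout/commit cycle collapses to something like this rough sketch; it assumes the scripts folder is already a checked-out working copy, and uses svn add --force to quietly pick up any new files:
## A rough Subversion equivalent of the Vault steps; assumes the
## scripts folder is already a checked-out working copy
set-location $Scripts\$DBName;
svn update;
## ...run the SQL Compare sync here, then pick up any new files and commit
svn add --force .;
svn commit -m "Nightly schema audit";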
At #4, all the work is done by SQL Compare. It runs a sync between the scripts folder and the database in question. I also have SQL Compare generate a report. The output from SQL Compare goes into a variable for use later.
#5 is one of the slightly tricky parts. You need to know whether SQL Compare found an identical data structure, or whether there were new or different objects. If you’ve worked with the command line before, you know that it returns an error code through the %ERRORLEVEL% variable. You can check that for the value 86, which means no changes were found. In PowerShell that’s supposed to surface as $LASTEXITCODE, but it wasn’t working for me. Instead, I captured the output from SQL Compare, and line 12 of that output, $MyOutput[11], will match the string specified, “Error: The selected objects are identical or no objects have been selected in.” With this you can determine whether there are changes you need to capture.
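If depending on the message landing on exactly line 12 of the output makes you nervous (it makes me a little nervous), a slightly more defensive sketch scans the whole output for the ‘identical’ message instead:
## A sketch of a less position-dependent check: scan all of the captured
## output for the 'identical' message rather than trusting line 12
$identical = $MyOutput | Select-String -SimpleMatch "The selected objects are identical" -Quiet;
IF (-not $identical)
{
    ## ...changes were found; add them to source control and send the email...
}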
If you do have changes, #6 fires. Because Vault would not automatically add, from the command line, new objects that had been added to the folders under source control, I was forced to come up with a little bit of PowerShell code to do the trick. I get a directory listing through Get-ChildItem. Then I filter that so I’m only looking at files, and filter those files so I’m only looking at ones created after the date I set back in #2. These will be the new files created by SQL Compare. All of this runs through the pipe provided by PowerShell, so it’s a single command. It ends in a call, through ForEach-Object, to the Vault command line to add each file to the appropriate location within Vault.
As part of the changes piece of the script, #7 will send an email using the Send-MailMessage cmdlet provided in PowerShell 2.0. The mail sent will have an attachment of the report file.
Finally, I commit the changes and check them in. Anything that wasn’t changed will be automatically unchecked.
That’s it.
Clearly, there are a few statements that are unique to my system, so paths and names will have to be adjusted for your system. Your source control system may also have completely different commands, so you’ll need to adjust for that too. But, as you can see, this is not hard to set up. With this in place and scheduled through SQL Agent, I can run this once a day and see exactly what has changed on the database in question, whilst making sure that the changes go into source control. I can always compare production against what’s stored in source control. Even better, I can pull a particular day or version from the source control system into a separate folder and then compare that to another day or version out of source control, in order to report on changes between any two moments in time.
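As for the SQL Agent scheduling, a minimal sketch of the job definition might look like the following. The job name, the script path, and the 2AM run time are all placeholders of my own, and I’m assuming the audit script above has been saved as a .ps1 file:
## A sketch of the nightly SQL Agent job; the job name, script path,
## and start time are placeholders
$jobSql = @'
EXEC msdb.dbo.sp_add_job @job_name = N'Audit Schema Changes';
EXEC msdb.dbo.sp_add_jobstep @job_name = N'Audit Schema Changes',
    @step_name = N'Run comparison', @subsystem = N'CmdExec',
    @command = N'powershell.exe -ExecutionPolicy Bypass -File C:\dbscripts\AuditSchemaChanges.ps1';
EXEC msdb.dbo.sp_add_jobschedule @job_name = N'Audit Schema Changes',
    @name = N'Nightly', @freq_type = 4, @freq_interval = 1,
    @active_start_time = 020000;
EXEC msdb.dbo.sp_add_jobserver @job_name = N'Audit Schema Changes';
'@;
$jobSql | sqlcmd -S "grant-red1\gfr1" -d msdb;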
Using Backups to Audit Schema Changes
Since not every development team has source control, it’s worth exploring how you could do this using backups. It won’t be as powerful: most of the functionality can be duplicated, but without source control there’s no history of changes, and you’re completely dependent on the backup being in place. If it’s missing, or old, you won’t be able to run a comparison. However, the script is very straightforward and only entails a couple of changes to the scripts above:
## 1 Setup variables
$DBName = "MovieManagement";
$Server = "myserver\myinstance";
$Backup = "C:\bu";
$pswd = ConvertTo-SecureString "pswd" -AsPlainText -Force
$Creds = New-Object System.Management.Automation.PSCredential ("service@red-gate.com",$pswd)
## 2 Compare database to source code
set-Location "c:\Program Files (x86)\Red Gate\SQL Compare 9\";
$MyOutput = ./sqlcompare /s1:$Server /db1:$DBName /bu2:$Backup\$DBName.bak /sync /report:$Backup\schemaDiff.html /reporttype:Simple /f;
## 3 Check for changes
IF ($MyOutput[11] -ne "Error: The selected objects are identical or no objects have been selected in")
{
## 4 send an email with the report
Send-MailMessage -From "service@red-gate.com" -To "someone@red-gate.com" -Subject "Schema Changes in database $DBName" -Body "The attached report shows changes that have occurred in the last 24 hours" -Attachments $Backup\schemaDiff.html -SmtpServer "mail.domain.com" -Credential $Creds;
}
Basically we’re just removing all the source control settings and then modifying the call to sqlcompare.exe to use /bu2 instead of /scr2, backups instead of scripts. That’s it. Now as long as there is a backup that is at least a day old, you can automate this script to generate an auditing report.
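One wrinkle: this assumes the backup file is named exactly after the database. If your backup names carry timestamps, a sketch like this, dropped into section #2 after the set-Location, would pick up the most recent .bak instead:
## A sketch for picking up the most recent backup when the file names
## include timestamps rather than just the database name
$latest = Get-ChildItem $Backup -Filter "$DBName*.bak" | Sort-Object LastWriteTime -Descending | Select-Object -First 1;
$MyOutput = ./sqlcompare /s1:$Server /db1:$DBName /bu2:$($latest.FullName) /sync /report:$Backup\schemaDiff.html /reporttype:Simple /f;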
[Editor: OK, you've done the basic stuff. What about a few handbrake turns around the square?]
Seriously, that’s exactly what he said. No lie.
Fine.
Let’s assume that you’d still like to use source control, but you don’t actually have direct access to your production servers; you do, however, have access to the backup locations. Let’s also assume that you want to run this for all the backups, for all your servers, for all your databases, AND you’re going to go ahead and add all of this to your source control system, and then run a compare nightly on this mess (he had better like this):
## 1: Basic set up
$Scripts = 'C:\dbscripts'
##A little construct to deal with a shortcoming in Vault
$Date = Get-Date;
$Date = $Date.AddMinutes(-2);
## 2: Load the list of folders
set-location "C:\Program Files (x86)\SourceGear\Vault Client\";
./vault REMEMBERLOGIN -user admin -password 12345 -host VaultHost -repository MyRepo
[xml]$FolderList = ./vault LISTFOLDER -norecursive $
## Arguments with directories where FULL backups are kept are passed in.
get-ChildItem $args | where-Object{($_.Extension -eq '.BAK') -and (!$_.PsIsContainer)} | foreach-object {
## 3: Get the full name which is the path, database name, and update the scripts folder
$fullname = $_.FullName;
$DBName = $_.Name.Replace('.bak','')
$dbscript = $Scripts + '\' + $DBName
## 4: Determine if this folder exists in Vault
set-location "C:\Program Files (x86)\SourceGear\Vault Client\";
$Parsed = select-Xml -Xml $FolderList -XPath "vault/folder/folder[@name='$/$DBName']"
if ($Parsed.Node.Name -eq "$/$DBName")
{
## 5: It does, so get the latest scripts
./vault GET $/$DBName;
./vault CHECKOUT $/$DBName;
}
else
{
## 6: It doesn't, so add the folder in preparation for the objects
./vault ADD -commit $ $dbscript
}
## 7: See if the folder exists, if not create one
if (!(test-Path $dbscript))
{
new-Item $dbscript -ItemType directory
}
## 8: Do the comparison
set-Location 'c:\Program Files (x86)\Red Gate\SQL Compare 9\'
$MyOutput = ./sqlcompare /bu1:$fullname /scr2:$dbscript /sync /report:$dbscript\schemaDiff.html /reporttype:Simple /f
## 9: Do the same trick to check on changes
##Determining if there are changes
##IF ($LASTEXITCODE -ne 0)
IF ($MyOutput[11] -ne "Error: The selected objects are identical or no objects have been selected in")
{
set-location "C:\Program Files (x86)\SourceGear\Vault Client\";
##There are changes, add them all to source control
Get-ChildItem $dbscript -Recurse | where-Object{!$_.PsIsContainer} | where-Object{$_.CreationTime -gt $Date} | foreach-object {./vault ADD -commit ("$" + $_.FullName.Replace($Scripts,"").Replace("\" + $_.Name,"").Replace("\","/")) $_.FullName}
}
##clean up
./vault COMMIT $/$DBName
./vault CHECKIN $/$DBName
}
[Editor: Yeah: cool.]
The way this script works is pretty straightforward. You can pass it multiple directories through the command line, and it will read each directory and pull all the files from them. The assumption is that the backup file name is the same as the database name (and yeah, I know that’s not always the case, but it’s usually part of the name; fix that yourself). We go through the same set-up in #1 that we did in the previous script. At #2, I get a list of the root folders from the source control system so that I can determine which databases need to be added, and which ones I can check out in order to add or update objects. At #3, I set up a few variables to make the code easier to work with, and at #4 I check whether the database already has a folder in Vault, because if there is no folder, Vault isn’t happy. #5 is where I perform GET and CHECKOUT operations if the folder already exists. If no folder exists, then I create one at #6. At #7, I create the physical file folders for the scripts created by SQL Compare. The comparison is done at #8. Finally, at #9, I do the same trick as before to add files to Vault.
That’s it. It really wasn’t that hard to work through all the files. I didn’t bother to set up email this time because I suspect you wouldn’t be happy to get an email with X number of attachments sent to you once a day. However, it wouldn’t be an issue to take care of that as well.
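If you did want one daily summary rather than a pile of per-database emails, a sketch like this would do it: collect each report path inside the loop, then send a single message at the end. It assumes the $Creds setup from the earlier script, and the addresses and SMTP server are placeholders as before:
## A sketch of a single roll-up email: seed an empty list before the loop...
$reports = @();
## ...inside the loop, after changes are found, collect the report path:
## $reports += "$dbscript\schemaDiff.html";
## ...then, after the loop, send everything in one message
if ($reports.Count -gt 0)
{
    Send-MailMessage -From "service@red-gate.com" -To "someone@red-gate.com" -Subject "Schema changes found in $($reports.Count) database(s)" -Body "The attached reports show changes that have occurred in the last 24 hours" -Attachments $reports -SmtpServer "mail.mydomain.com" -Credential $Creds;
}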
Conclusion
Whether you’re dealing with the cow-town Saturday night of all-‘sa’ access or the ordered world of a publicly traded company, you’re going to need to audit changes on your systems. Combine the scripting capabilities of PowerShell, with all its quick and easy power, with the command-line facilities of your source control system and SQL Compare, and you have just the combination to get this problem under control.