Tuesday, March 6, 2012

Automated Script-generation with Powershell and SMO


If you’re a DBA or a database developer, you’ve probably found that point‘n’click script-generation via SSMS becomes an increasingly tedious chore, and one that always increases the chance of error. Now, if you’re happy with the tedium of using SSMS to create your database scripts regularly, then use it, because it works, and it is probably the best way of getting one-off ad-hoc scripts for special purposes.
In my case, however, I want automated scripting, because it gives me the opportunity of getting the result exactly as I want it. I’m then more confident of having an up-to-date database build-script to hand, and I’m more likely to use this technique to perform tedious routine jobs such as integration, source control and creating test databases.
A live database is created from a number of SQL scripts, both DDL and DML. Script-generation is the act of reverse-engineering that live database, or a database component such as a schema or set of routines, into one or more scripts that are capable of creating an exact replica.
You’ll probably want a complete build script, as it is difficult to create a build script for a working database from individual object scripts alone, and impossible unless you are aware of the server settings and have scripted the database settings too. It is therefore wise to regularly save a complete script for building the entire database, and maybe a separate script for the database settings as well. You’ll probably also want to use script-generation to get the source of individual routines, tables and views for source control, for editing and for your own records.
There are a number of choices to be made when creating your database scripts, depending on the problem you’re trying to solve. It’s like buying a frou-frou cup of coffee. You are bombarded with decisions. Are you updating the schema of an existing database, or are you creating it from scratch? Do you only wish to update the routines? Do you want to temporarily disable the constraints in order to import data in the ‘wrong’ order? Do you want to leave out the DRI until you’ve imported all the data?
You need this flexibility in a script-generation solution because you have a fair amount of complexity to deal with. A database application, at any point in time, consists of one or more databases on one or more servers, and may have more than one version or fork being developed. They may be using different data. As well as the code for all the routines, views and tables, there will be database settings and server settings. Finally, there will be data, even if just the basic enumerations and static data without which nothing works. As well as the code in the shared ‘public’ database, you may also have stealth things you are trying out, and sandbox stuff that needs to be preserved. You will also need to script your endpoint configuration and the tasks that will go on the SQL Server agent. If you’re doing serious website work, you’ll have queues managed by Service Broker too.
If the production-DBAs have scripts for all of this, for all current versions, along with migration and rollback scripts, then they are smiling.
As well as squirreling away the code in order to preserve the work of the team, keep track of progress, maintain test-cells, do builds and rollbacks from builds, and to relate bugs to code alterations, you need scripts to understand the database. You can understand a minnow of a database such as AdventureWorks through point‘n’click in SSMS, but for an industrial-strength behemoth, it is far quicker to read down the build scripts. I know of very few ways to generate database scripts, and a lot of them do it wrong, because the scripts are formatted as a machine-to-machine communication. Table build scripts, for example, can be written for legibility or merely in order to get a correct build. In order to quickly learn a database, you need the legible version.
In the end, I had to write my own version, and it has paid dividends. Functions, for example, are easier to understand with a structured comment block listing the parameters, with comments pulled from the extended properties, and even where they are referenced and what they reference. Tables are far better with the simpler constraints written in-line, and with comments on both table and column pulled from the extended properties.

Automated scripting of database DDL

Let’s start with the obvious technique. We’ll use PowerShell and Server Management Objects (SMO). No, don’t panic. SMO is a monster, but that’s because it is written for many purposes, the worst of which is providing an interface between SSMS and the servers. In most cases, there is already a high-level interface for the likes of you and me. I’ll admit that SMO isn’t easy. It is always a bad sign when you've got a problem with a script, and you reach for Google, only to find nothing more than a string of Chinese characters on half a page, and a similar StackOverflow question left unanswered since 2008, save for advice to use SSIS instead.
This sort of thing tends to happen when you're using SMO, which is such a shame, because it is so powerful. In fact, almost any job that you can do via SSMS you can do through SMO. With PowerShell and SMO, you can work magic, but with that strange, lonesome feeling that not many people have ever walked down the same path. It's the awful documentation, combined with the intricate, arcane, multi-layered interface that makes SMO hard work for the uninitiated, but it shouldn't be like this.
Microsoft has always shown its ambivalence in letting users loose on SMO, by neglecting to provide anything remotely resembling adequate documentation. All we get is minimal, and probably machine-generated, documentation of the SMO classes, methods and so on. Even the examples have errors. Microsoft has to keep SMO up to date because it is used for SSMS, but there seems to be a great deal of passive resistance to supporting users who need to use it for scripting. For this reason, I’ll be keeping these examples as simple as I can.

Getting stuck in quickly

There is usually a quick way to do things. Here is the PowerShell to script out the MyDatabase database from MyServer into the local directory E:\MyScriptsDirectory (it will do others, of course, if you change the assignments to the three variables at the head of the script). Note that in these PowerShell scripts I've opted to avoid the terse style, mainly because the terse style is less intelligible for those of us who just want to use PowerShell without getting too absorbed.
$Filepath='E:\MyScriptsDirectory' # local directory to save build-scripts to
$DataSource='MyServer' # server name and instance
$Database='MyDatabase'# the database to copy from
# set "Option Explicit" to catch subtle errors
set-psdebug -strict
$ErrorActionPreference = "stop" # you can opt to stagger on, bleeding, if an error occurs
# Load the SMO assembly, and if we're running the SQL 2008 DLLs, load the SMOExtended library too
$ms='Microsoft.SqlServer'
$v = [System.Reflection.Assembly]::LoadWithPartialName( "$ms.SMO")
if ((($v.FullName.Split(','))[1].Split('='))[1].Split('.')[0] -ne '9') {
[System.Reflection.Assembly]::LoadWithPartialName("$ms.SMOExtended") | out-null
   }
$My="$ms.Management.Smo"
$s = new-object ("$My.Server") $DataSource
if ($s.Version -eq  $null ){Throw "Can't find the instance $Datasource"}
$db= $s.Databases[$Database]
if ($db.name -ne $Database){Throw "Can't find the database '$Database' in $Datasource"};
$transfer = new-object ("$My.Transfer") $db
$transfer.Options.ScriptBatchTerminator = $true # this only goes to the file
$transfer.Options.ToFileOnly = $true # no need of string output as well
$transfer.Options.Filename = "$($FilePath)\$($Database)_Build.sql";
$transfer.ScriptTransfer()
"All done"
I reckon this is the simplest PowerShell script that gets you an executable build script, and it isn’t too painful. Die-hard SMO-heads will notice that I have to write to a file via SMO in order to get the batch-terminator GO into the script. Later on, I’ll move these options into a separate ‘ScriptingOptions’ object; that isn’t entirely necessary yet, but it will be once we increase the complexity of the task. If you run the script, it will successfully build a database, but there will be a lot missing, because we’ve been using the default options for generating the script. You’ll have no DRI: the tables, in other words, won’t have their constraints and indexes, or any dependent objects at all. It will miss out all the extended properties as well.
There is some work to be done. Not all the defaults for the script options are sensible. A quick bit of PowerShell to query the ScriptingOptions object will tell us what the defaults are. I’ll print them all out because this is a useful reference when you’re struggling with a scripting task. This is the equivalent of the options for the frou-frou cup of coffee, and you’ve just been served ‘black without sugar’. Our scripting options are below, along with their defaults.
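As a sketch, that quick bit of PowerShell might look like this; it assumes the SMO assembly has already been loaded, as in the script above:

```powershell
# A sketch: enumerate every scripting option with its default value.
# Assumes the SMO assembly is already loaded, as in the previous script.
$My='Microsoft.SqlServer.Management.Smo'
$Options = new-object ("$My.ScriptingOptions")
$Options.PSObject.Properties |
    ForEach-Object { "$($_.Name) = $($_.Value)" }
```

The output should match the list below, give or take the version of SMO you are running.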

Options: ‘Do you want cinnamon with that coffee, sir?’

These are the scripting options with their default settings. I haven't explained them, because they are either obvious or undocumented.
FileName = (blank)
Encoding = System.Text.UnicodeEncoding
DriWithNoCheck = False
IncludeFullTextCatalogRootPath = False
BatchSize = 1
ScriptDrops = False
TargetServerVersion = Version110
TargetDatabaseEngineType = Standalone
AnsiFile = False
AppendToFile = False
ToFileOnly = False
SchemaQualify = True
IncludeHeaders = False
IncludeIfNotExists = False
WithDependencies = False
DriPrimaryKey = False
DriForeignKeys = False
DriUniqueKeys = False
DriClustered = False
DriNonClustered = False
DriChecks = False
DriDefaults = False
Triggers = False
Statistics = False
ClusteredIndexes = False
NonClusteredIndexes = False
NoAssemblies = False
PrimaryObject = True
Default = True
XmlIndexes = False
FullTextCatalogs = False
FullTextIndexes = False
FullTextStopLists = False
Indexes = False
DriIndexes = False
DriAllKeys = False
DriAllConstraints = False
DriAll = False
Bindings = False
NoFileGroup = False
NoFileStream = False
NoFileStreamColumn = False
NoCollation = False
ContinueScriptingOnError = False
IncludeDatabaseRoleMemberships = False
Permissions = False
AllowSystemObjects = True
NoIdentities = False
ConvertUserDefinedDataTypesToBaseType = False
TimestampToBinary = False
AnsiPadding = False
ExtendedProperties = False
DdlHeaderOnly = False
DdlBodyOnly = False
NoViewColumns = False
SchemaQualifyForeignKeysReferences = False
AgentAlertJob = False
AgentJobId = True
AgentNotify = False
LoginSid = False
NoCommandTerminator = False
NoIndexPartitioningSchemes = False
NoTablePartitioningSchemes = False
IncludeDatabaseContext = False
NoXmlNamespaces = False
DriIncludeSystemNames = False
OptimizerData = False
NoExecuteAs = False
EnforceScriptingOptions = False
NoMailProfileAccounts = False
NoMailProfilePrincipals = False
NoVardecimal = True
ChangeTracking = False
ScriptDataCompression = True
ScriptSchema = True
ScriptData = False
ScriptBatchTerminator = False
ScriptOwner = False
This is a scarily complicated set of options. One can’t help wondering why some of these options would be required. However, we can soon put our script right without too much bother. Your own requirements may be different, but I was aiming for an exact copy of AdventureWorks in my testing!

Making an exact copy of AdventureWorks

$Filepath='E:\MyScriptsDirectory' # local directory to save build-scripts to
$DataSource='MyServer' # server name and instance
$Database='MyDatabase'# the database to copy from
# set "Option Explicit" to catch subtle errors
set-psdebug -strict
$ErrorActionPreference = "stop" # you can opt to stagger on, bleeding, if an error occurs
# Load the SMO assembly, and if we're running the SQL 2008 DLLs, load the SMOExtended library too
$ms='Microsoft.SqlServer'
$v = [System.Reflection.Assembly]::LoadWithPartialName( "$ms.SMO")
if ((($v.FullName.Split(','))[1].Split('='))[1].Split('.')[0] -ne '9') {
[System.Reflection.Assembly]::LoadWithPartialName("$ms.SMOExtended") | out-null
   }
$My="$ms.Management.Smo" #
$s = new-object ("$My.Server") $DataSource
if ($s.Version -eq  $null ){Throw "Can't find the instance $Datasource"}
$db= $s.Databases[$Database]
if ($db.name -ne $Database){Throw "Can't find the database '$Database' in $Datasource"};
$CreationScriptOptions = new-object ("$My.ScriptingOptions")
$CreationScriptOptions.ExtendedProperties= $true # yes, we want these
$CreationScriptOptions.DRIAll= $true # and all the constraints 
$CreationScriptOptions.Indexes= $true # Yup, these would be nice
$CreationScriptOptions.Triggers= $true # This should be included when scripting a database
$CreationScriptOptions.ScriptBatchTerminator = $true # this only goes to the file
$CreationScriptOptions.IncludeHeaders = $true; # of course
$CreationScriptOptions.ToFileOnly = $true #no need of string output as well
$CreationScriptOptions.IncludeIfNotExists = $true # not necessary but it means the script can be more versatile
$CreationScriptOptions.Filename =  "$($FilePath)\$($Database)_Build.sql";
$transfer = new-object ("$My.Transfer") $s.Databases[$Database]

$transfer.options=$CreationScriptOptions # tell the transfer object of our preferences
$transfer.ScriptTransfer()
"All done"
If you test a database built with this script against the original, it gives a pretty good account of itself. All that’s missing are some extended properties on indexes, but there is no switch that one can flip to tickle those out of SMO, so I suspect that someone has made a mistake.
We did this intermediate version because it is simple and demonstrates a clean technique which you can take and expand on. It’s great for archiving a complete build script that you can use in source control alongside the individual object scripts. However, I’m going to include a more complete version that will give you a database build script and an object-deletion script as well as the object-build script, all concatenated into one script. You’ll begin to understand why I like to create a ‘ScriptingOptions’ object to store the options, since it is more efficient for this sort of job.

Getting database settings and object drops into a database-script

 # set "Option Explicit" to catch subtle errors
set-psdebug -strict
$DirectoryToSaveTo='e:\MyScriptsDirectory\' # local directory to save build-scripts to
$servername='MyServer' # server name and instance
$Database='MyDatabase' # the database to copy from
$ErrorActionPreference = "stop" # you can opt to stagger on, bleeding, if an error occurs
Trap {
# Handle the error
$err = $_.Exception
write-host $err.Message
while( $err.InnerException ) {
   $err = $err.InnerException
   write-host $err.Message
   };
# End the script.
break
}
# Load the SMO assembly, and if we're running the SQL 2008 DLLs, load the SMOExtended library too
$v = [System.Reflection.Assembly]::LoadWithPartialName( 'Microsoft.SqlServer.SMO')
if ((($v.FullName.Split(','))[1].Split('='))[1].Split('.')[0] -ne '9') {
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMOExtended') | out-null
   }
$My='Microsoft.SqlServer.Management.Smo'
$s = new-object ("$My.Server") $ServerName # get the server.
$Server=$s.netname -replace '[\\\/\:\.]',' ' # remove characters that can cause problems
$instance = $s.instanceName -replace '[\\\/\:\.]',' ' # ditto
$DatabaseName =$database -replace '[\\\/\:\.]',' ' # ditto
$DirectoryToSaveTo=$DirectoryToSaveTo+$Server+'\'+$Instance+'\' # database scripts are local on client
if (!( Test-Path -path "$DirectoryToSaveTo" )) # create it if not existing
    {$progress ="attempting to create directory $DirectoryToSaveTo"
     Try { New-Item "$DirectoryToSaveTo" -type directory | out-null }
     Catch [system.exception]{
         Write-Error "error while $progress. $_"
         return
         }
    }
<# now we will use the canteen system of SMO to specify what we want from the script. It is best to have a list of the defaults to hand and just override the defaults where necessary, but there is a chance that a later revision of SMO could change the defaults, so beware! #>
$CreationScriptOptions = new-object ("$My.ScriptingOptions")
$CreationScriptOptions.ExtendedProperties= $true # yes, we want these
$CreationScriptOptions.DRIAll= $true # and all the constraints
$CreationScriptOptions.Indexes= $true # Yup, these would be nice
$CreationScriptOptions.Triggers= $true # This should be included when scripting a database
$CreationScriptOptions.ScriptBatchTerminator = $true # this only goes to the file
$CreationScriptOptions.Filename = "$($DirectoryToSaveTo)$($DatabaseName)_Build.sql";
# we have to write to a file to get the GOs
$CreationScriptOptions.IncludeHeaders = $true; # of course
$CreationScriptOptions.ToFileOnly = $true # no need of string output as well
$CreationScriptOptions.IncludeIfNotExists = $true # not necessary but it means the script can be more versatile
$transfer = new-object ("$My.Transfer") $s.Databases[$Database]
$transfer.options=$CreationScriptOptions # tell the transfer object of our preferences
$scripter = new-object ("$My.Scripter") $s # script out the database creation
$scripter.options=$CreationScriptOptions # with the same options
$scripter.Script($s.Databases[$Database]) # do it
"USE $Database" | Out-File -Append -FilePath "$($DirectoryToSaveTo)$($DatabaseName)_Build.sql"
"GO" | Out-File -Append -FilePath "$($DirectoryToSaveTo)$($DatabaseName)_Build.sql"
# add the database object build script
$transfer.options.AppendToFile=$true
$transfer.options.ScriptDrops=$true
$transfer.EnumScriptTransfer()
$transfer.options.ScriptDrops=$false
$transfer.EnumScriptTransfer()
"All written to $($DirectoryToSaveTo)$($DatabaseName)_Build.sql"
This isn’t entirely what we want for other purposes, of course. What about when you want to create a database without indexes, constraints or triggers, import the data in BCP fast-mode, and then add the indexes, constraints and triggers afterwards? Yes, you can squirrel away far more test-runs this way, and load them rapidly, but to do it you need one build script without them, and a second build script containing them alone. With the first script, you can prepend a ‘knock-down’ kill script that deletes everything from the database before you start, but you definitely don’t want that in the second script. You’ll soon be eyeing up all those scripting options, though, believe me. I’ll be covering a lot more about this in future articles.
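As a sketch of that two-phase approach, reusing the $My, $transfer and directory variables from the script above (the filenames are hypothetical), you could run the transfer twice with different option sets. The PrimaryObject switch is supposed to suppress the object definitions themselves, leaving just the dependent DRI; check its effect against your version of SMO before relying on it:

```powershell
# A sketch: two build scripts for a fast BCP data load.
# Phase one: bare tables, with no DRI, indexes or triggers, so the data loads fast.
$PreLoad = new-object ("$My.ScriptingOptions")
$PreLoad.ToFileOnly = $true
$PreLoad.ScriptBatchTerminator = $true
$PreLoad.Filename = "$($DirectoryToSaveTo)$($DatabaseName)_PreLoad.sql"
$transfer.Options = $PreLoad
$transfer.EnumScriptTransfer()
# Phase two: just the DRI, indexes and triggers, to apply after the load.
$PostLoad = new-object ("$My.ScriptingOptions")
$PostLoad.DriAll = $true
$PostLoad.Indexes = $true
$PostLoad.Triggers = $true
$PostLoad.PrimaryObject = $false # suppress the CREATEs themselves (verify on your SMO version)
$PostLoad.ToFileOnly = $true
$PostLoad.ScriptBatchTerminator = $true
$PostLoad.Filename = "$($DirectoryToSaveTo)$($DatabaseName)_PostLoad.sql"
$transfer.Options = $PostLoad
$transfer.EnumScriptTransfer()
```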

Automated scripting of objects

A second task is to save each object to a separate file. You’ll need to do this to get your local (unshared) database into source control if you’re not using SQL Source Control. The simplest way of doing this, if you are lucky enough to have SQL Compare, is
sqlcompare.exe /s1:MyServer /db1:MyDatabase /mkscr:MyNewDirectory /q
You can do it in PowerShell and, again, you will have more knobs to twiddle, so you can get the individual scripts exactly how you like them.
Here is a simplified script that shows you one of the several methods of doing this. Like SQL Compare, it saves each object type into its own subdirectory.
$ServerName='MyServer'# the server it is on
$Database='MyDatabase' # the name of the database you want to script as objects
$DirectoryToSaveTo='E:\MyScriptsDirectory' # the directory where you want to store them
# Load the SMO assembly, and if we're running the SQL 2008 DLLs, load the SMOExtended library too
$v = [System.Reflection.Assembly]::LoadWithPartialName( 'Microsoft.SqlServer.SMO')
if ((($v.FullName.Split(','))[1].Split('='))[1].Split('.')[0] -ne '9') {
    [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMOExtended') | out-null
}
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SmoEnum') | out-null
set-psdebug -strict # catch a few extra bugs
$ErrorActionPreference = "stop"
$My='Microsoft.SqlServer.Management.Smo'
$srv = new-object ("$My.Server") $ServerName # attach to the server
if ($srv.ServerType -eq $null) # if it couldn't find the server
   {
   Write-Error "Sorry, but I couldn't find Server '$ServerName' "
   return
}
$scripter = new-object ("$My.Scripter") $srv # create the scripter
$scripter.Options.ToFileOnly = $true
# we now get all the object types except extended stored procedures
# first we get the bitmap of all the object types we want
$all =[long] [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::all `
    -bxor [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::ExtendedStoredProcedure
# and we store them in a datatable
$d = new-object System.Data.Datatable
# get everything except the servicebroker object, the information schema and system views
$d=$srv.databases[$Database].EnumObjects([long]0x1FFFFFFF -band $all) | `
    Where-Object {$_.Schema -ne 'sys'-and $_.Schema -ne "information_schema" -and $_.DatabaseObjectTypes -ne 'ServiceBroker'}
# and write out each scriptable object as a file in the directory you specify
$d| FOREACH-OBJECT { # for every object we have in the datatable.
   $SavePath="$($DirectoryToSaveTo)\$($_.DatabaseObjectTypes)"
   # create the directory if necessary (SMO doesn't).
   if (!( Test-Path -path $SavePath )) # create it if not existing
        {Try { New-Item $SavePath -type directory | out-null }
        Catch [system.exception]{
            Write-Error "error while creating '$SavePath' $_"
            return
         }
    }
    # tell the scripter object where to write it
    $scripter.Options.Filename = "$SavePath\$($_.name -replace '[\\\/\:\.]','-').sql";
    # Create a single element URN array
    $UrnCollection = new-object ('Microsoft.SqlServer.Management.Smo.urnCollection')
    $URNCollection.add($_.urn)
    # and write out the object to the specified file
    $scripter.script($URNCollection)
}
"Oh wise one, all is written out!"
This time, we use SMO’s EnumObjects method on the current database object, then get the Scripter object, via its Script method, to script out each object individually and save it to a separate file. Each filename is generated from the name of the object, and the directory name is generated from its object type. You can, of course, be very selective about what you script out, and you’ll see that scripting out a single object type, such as tables, is very simple.
In this following script, we'll save just the tables, scripting different schemas into different directories and adding the DRI, indexes, extended properties and triggers to each table. It would be equally simple to script whatever types of objects you want just by 'or'ing the DatabaseObjectTypes values to taste (in the previous script, I specified 'all but...').
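As a sketch, combining tables and views, say, would just be a matter of a bitwise -bor of the two flags:

```powershell
# A sketch: 'or' together the object types you want to script
$objectsToDo = [long] (
      [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::Table -bor
      [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::View )
$d = $srv.databases[$Database].EnumObjects($objectsToDo)
```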
$ServerName='MyServer'# the server it is on
$Database='MyDatabase' # the name of the database you want to script as objects
$DirectoryToSaveTo='E:\MyScriptsDirectory' # the directory where you want to store them
# Load the SMO assembly, and if we're running the SQL 2008 DLLs, load the SMOExtended library too
$v = [System.Reflection.Assembly]::LoadWithPartialName( 'Microsoft.SqlServer.SMO')
if ((($v.FullName.Split(','))[1].Split('='))[1].Split('.')[0] -ne '9') {
   [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMOExtended') | out-null
}
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SmoEnum') | out-null
set-psdebug -strict # catch a few extra bugs
$ErrorActionPreference = "stop"
$My='Microsoft.SqlServer.Management.Smo'
$srv = new-object ("$My.Server") $ServerName # attach to the server
if ($srv.ServerType -eq $null) # if it couldn't find the server
   {
   Write-Error "Sorry, but I couldn't find Server '$ServerName' "
   return
}
$scripter = new-object ("$My.Scripter") $srv # create the scripter
$scripter.Options.ToFileOnly = $true
$scripter.Options.ExtendedProperties= $true # yes, we want these
$scripter.Options.DRIAll= $true # and all the constraints
$scripter.Options.Indexes= $true # Yup, these would be nice
$scripter.Options.Triggers= $true # These should be included too
# first we get the bitmap of all the object types we want
$objectsToDo =[long] [Microsoft.SqlServer.Management.Smo.DatabaseObjectTypes]::Table
# and we store them in a datatable
$d = new-object System.Data.Datatable
# get just the tables
$d=$srv.databases[$Database].EnumObjects($objectsToDo)
# and write out each scriptable object as a file in the directory you specify
$d| FOREACH-OBJECT { # for every object we have in the datatable.
    $SavePath="$($DirectoryToSaveTo)\$($_.DatabaseObjectTypes)\$($_.Schema)"
    # create the directory if necessary (SMO doesn't).
    if (!( Test-Path -path $SavePath )) # create it if not existing
           {Try { New-Item $SavePath -type directory | out-null }
        Catch [system.exception]{
             Write-Error "error while creating '$SavePath' $_"
             return
             }
        }
    # tell the scripter object where to write it
    $scripter.Options.Filename = "$SavePath\$($_.name -replace '[\\\/\:\.]','-').sql";
    # Create a single element URN array
    $UrnCollection = new-object ("$My.urnCollection")
    $URNCollection.add($_.urn)
    # and write out the object to the specified file
    $scripter.script($URNCollection)
    }
"All is written out, wondrous human"

Automated scripting of static data

The last task we’ll set ourselves is to script out static data. This will really just include all those small tables without which your database simply won’t work. Now, there is no way that anyone but you will know which tables this includes, but don’t ‘go ape’ with this script, since it is scripting INSERT statements and that sort of thing doesn’t scale effectively for big tables. No, sir: you’ll want native-mode, fast-mode BCP for that.
This time, I’ve used a slightly different approach, in that I’ve actually constructed the URNs from the (maybe qualified) names of the tables; that means the schema too if you specify it, and also the database if you want that as well. You just specify which tables you want to script, and we go and do it. With SMO there are always several ways of getting to your destination.
# set "Option Explicit" to catch subtle errors
set-psdebug -strict
$DirectoryToSaveTo='MyDirectory' # local directory to save build-scripts to
$servername='MyServer' # server name and instance
$Database='AdventureWorks' # the database to copy from (AdventureWorks here)
$Filename='MyFileName';
$TableList='product, AdventureWorksDW.dbo.DimCustomer, HumanResources.Department, person.countryRegion';
# a list of tables with possible schema or database qualifications
# Adventureworks used for this example
$ErrorActionPreference = "stop" # you can opt to stagger on, bleeding, if an error occurs
# Load the SMO assembly, and if we're running the SQL 2008 DLLs, load the SMOExtended library too
$v = [System.Reflection.Assembly]::LoadWithPartialName( 'Microsoft.SqlServer.SMO')
if ((($v.FullName.Split(','))[1].Split('='))[1].Split('.')[0] -ne '9') {
  [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMOExtended') | out-null
  }
# Handle any errors that occur
Trap {
  # Handle the error
  $err = $_.Exception
  write-host $err.Message
  while( $err.InnerException ) {
   $err = $err.InnerException
   write-host $err.Message
   };
  # End the script.
  break
  }
# Connect to the specified instance
$s = new-object ('Microsoft.SqlServer.Management.Smo.Server') $ServerName
# Create the Database root directory if it doesn't exist
$homedir = "$DirectoryToSaveTo\$Database\"
if (!(Test-Path -path $homedir))
  {Try { New-Item $homedir -type directory | out-null } 
     Catch [system.exception]{
        Write-Error "error while creating '$homedir'  $_"
        return
        } 
  }
$scripter = new-object ('Microsoft.SqlServer.Management.Smo.Scripter') $s
$scripter.Options.ScriptSchema = $False; #no we're not scripting the schema
$scripter.Options.ScriptData = $true; #but we're scripting the data
$scripter.Options.NoCommandTerminator = $true;
$scripter.Options.FileName = $homedir+$Filename #writing out the data to file
$scripter.Options.ToFileOnly = $true #who wants it on the screen?
$ServerUrn=$s.Urn #we need this to construct our URNs.
$UrnsToScript = New-Object Microsoft.SqlServer.Management.Smo.UrnCollection
#so we just construct the URNs of the objects we want to script
$Table=@()
foreach ($tablepath in $TableList -split ',')
  {
  $Tuple = "" | Select Database, Schema, Table
  $TableName=$tablepath.Trim() -split '.',0,'SimpleMatch'
   switch ($TableName.count)
    {
      1 { $Tuple.database=$database; $Tuple.Schema='dbo'; $Tuple.Table=$tablename[0];  break}
      2 { $Tuple.database=$database; $Tuple.Schema=$tablename[0]; $Tuple.Table=$tablename[1];  break}
      3 { $Tuple.database=$tablename[0]; $Tuple.Schema=$tablename[1]; $Tuple.Table=$tablename[2]; break}
      default {throw 'too many dots in the tablename'}
    }
     $Table += $Tuple
   }
foreach ($tuple in $Table)
  {
  $Urn="$ServerUrn/Database[@Name='$($tuple.database)']/Table[@Name='$($tuple.table)' and @Schema='$($tuple.schema)']";
  $UrnsToScript.Add($Urn)
  }
#and script them
$scripter.EnumScript($UrnsToScript) #Simple eh?
"Saved to $($homedir)$($Filename), wondrous carbon-based life form!"
"done!"
Now we’re making progress, and I’m hoping that, in these simplified scripts, I’ve given you some useful clues as to how to generate particular types of build scripts. I’ll let you turn them into robust command-line tools with parameters and all the error handling, logging and other bits and pieces you’ll want. In the next article, I’ll explain how you can extend this functionality by actually executing SQL batches via SMO. This will allow you to automate migration scripts and do integration testing and all manner of other things all from the one script.
