This section works through a number of scripts for tuning SQL Server, covering index maintenance, managing disk space, and extracting DDL scripts, and finishes by looking at how to schedule jobs that run PowerShell scripts automatically.
Index Maintenance
Index maintenance is an important task in ensuring optimal performance. Traditionally, DBAs have written T-SQL to perform index maintenance, but this is not an easy job. PowerShell, combined with the SQL Server SMO library, provides a concise solution. The following script shows how a wide range of routine index maintenance can be accomplished with PowerShell. It iterates over every index on every table in a user database and, based on each index's fragmentation level, performs one of three actions (code file: PS_ManageFragmentation01.PS1):
- If fragmentation is below 5%, do nothing, because the performance impact is negligible.
- If fragmentation is between 5% and 30%, reorganize the index.
- If fragmentation is above 30%, rebuild the index.
[string] $ServerName = $args[0]
[string] $TargetDatabaseName = $args[1]
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null
$server = New-Object Microsoft.SqlServer.Management.Smo.Server $ServerName
$targetDB = $server.Databases[$TargetDatabaseName]
foreach ($table in $targetDB.Tables)
{
    foreach ($index in $table.Indexes)
    {
        #AverageFragmentation is reported as a percentage (0-100)
        $fragmentation = $index.EnumFragmentation()
        $averageFragmentation = $fragmentation.Rows[0].AverageFragmentation
        if ($averageFragmentation -lt 5)
        {
            #Below 5 percent: leave the index alone
            continue
        }
        if ($averageFragmentation -ge 5 -and $averageFragmentation -lt 30)
        {
            #Between 5 and 30 percent: reorganize
            $index.Reorganize()
            continue
        }
        #30 percent or higher: rebuild
        $index.Rebuild()
    }
}
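The script takes the server name and database name as positional arguments. A typical invocation from a PowerShell prompt (the instance and database names here are placeholders for your own) looks like this:

```powershell
#Run the maintenance script against one database on one instance
.\PS_ManageFragmentation01.PS1 "MYSERVER\INSTANCE01" "AdventureWorks"
```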
Managing Disk Space Usage for Backups
Working with the file system from PowerShell is straightforward. In the following example, a script deletes transaction log backups older than 2 days, differential backups older than 8 days, and full backups older than 91 days. The work is split into three queries based on file extension and last write time, with the retention period for each backup type defined by subtracting the appropriate number of days from the current date (code file: PS_DeleteOldBackups01.PS1):
Get-ChildItem .\Backups -Include *.trn -Recurse `
    | Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-2) } `
    | Remove-Item
Get-ChildItem .\Backups -Include *.dif -Recurse `
    | Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-8) } `
    | Remove-Item
Get-ChildItem .\Backups -Include *.bak -Recurse `
    | Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-91) } `
    | Remove-Item
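Because Remove-Item deletes files permanently, it is worth previewing what a given retention rule will match before scheduling the script. Adding the -WhatIf switch makes Remove-Item report what it would delete without touching anything; for example, for the transaction log rule:

```powershell
#Preview which transaction log backups would be deleted, without deleting them
Get-ChildItem .\Backups -Include *.trn -Recurse `
    | Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-2) } `
    | Remove-Item -WhatIf
```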
Extracting DDL Using SMO
A misplaced index or a poorly defined table can both have a serious performance impact, and correcting either means changing the database DDL. With this in mind, I created the following script, which can be scheduled as a job to run once a day; it extracts all the DDL objects from your databases and writes them to disk in a folder named for the day they were extracted. If you notice poor performance, you can compare the files day by day to identify whether a DDL change caused the degradation. The script is long, but it combines all of the concepts and techniques covered earlier to script every database object out to a file (code file: PS_ExtractDDL01.PS1):
#Helper function to script the DDL object to disk
function Write-DDLOutput ($filename, $object)
{
    New-Item $filename -type file -force | Out-Null
    #Specify the filename
    $ScriptingOptions.FileName = $filename
    #Assign the scripting options to the Scripter
    $Scripter.Options = $ScriptingOptions
    #Script the object
    $Scripter.Script($object)
}
#Load the SMO assembly
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null
#Create all the global vars we need
$Server = New-Object ("Microsoft.SqlServer.Management.Smo.Server")
$Scripter = New-Object ("Microsoft.SqlServer.Management.Smo.Scripter")
$ScriptingOptions = New-Object ("Microsoft.SqlServer.Management.Smo.ScriptingOptions")
$Scripter.Server = $Server
#Specify the root folder that we'll store the scripts in. This will
#probably become a param in future
$RootBackupFolder = "C:\SqlBackups\DDL"
#Get the day of the week so that we can create a folder for each day
$Today = [System.DateTime]::Today.DayOfWeek
#Store today's backup folder
$DDLBackupFolder = Join-Path -Path $RootBackupFolder -ChildPath $Today
#Check if today's folder exists
if ([System.IO.Directory]::Exists($DDLBackupFolder))
{
    #If it does, delete its contents
    Remove-Item (Join-Path -Path $DDLBackupFolder -ChildPath *) -Recurse
}
else
{
    #Otherwise create it
    [System.IO.Directory]::CreateDirectory($DDLBackupFolder) | Out-Null
}
#Set up the scripting options; the filename is assigned per object
#in Write-DDLOutput
$ScriptingOptions.AppendToFile = $true
$ScriptingOptions.ToFileOnly = $true
$ScriptingOptions.ScriptData = $false
#Loop through all the databases to script them out
foreach ($database in ($Server.Databases | where {$_.IsSystemObject -eq $false -and $_.IsDatabaseSnapshot -eq $false}))
{
    $databaseBackupFolder = Join-Path -Path $DDLBackupFolder -ChildPath $database.Name
    #This will be the database create script
    Write-DDLOutput (Join-Path -Path $databaseBackupFolder -ChildPath ($database.Name + ".sql")) $database
    $ProgrammabilityBackupFolder = Join-Path -Path $databaseBackupFolder -ChildPath "Programmability"
    $DefaultsBackupFolder = Join-Path -Path $ProgrammabilityBackupFolder -ChildPath "Defaults"
    foreach ($default in $database.Defaults)
    {
        #Generate a filename for the default and script it
        Write-DDLOutput (Join-Path -Path $DefaultsBackupFolder -ChildPath ($default.Schema + "." + $default.Name + ".sql")) $default
    }
    #Create folders to store the functions in
    $FunctionsBackupFolder = Join-Path -Path $ProgrammabilityBackupFolder -ChildPath "Functions"
    $ScalarFunctionsBackupFolder = Join-Path -Path $FunctionsBackupFolder -ChildPath "Scalar-valued Functions"
    $TableValuedFunctionsBackupFolder = Join-Path -Path $FunctionsBackupFolder -ChildPath "Table-valued Functions"
    foreach ($function in $database.UserDefinedFunctions | where {$_.IsSystemObject -eq $false})
    {
        #Script the functions into folders depending upon type. We're
        #only interested in scalar and table-valued functions
        switch ($function.FunctionType)
        {
            Scalar
            {
                #Generate a filename for the scalar function
                $filename = Join-Path -Path $ScalarFunctionsBackupFolder -ChildPath ($function.Schema + "." + $function.Name + ".sql")
            }
            Table
            {
                #Generate a filename for the table-valued function
                $filename = Join-Path -Path $TableValuedFunctionsBackupFolder -ChildPath ($function.Schema + "." + $function.Name + ".sql")
            }
            default
            {
                #'continue' inside a switch would not skip this foreach
                #iteration, so flag other function types with a null filename
                $filename = $null
            }
        }
        #Script the function unless it was skipped
        if ($filename) { Write-DDLOutput $filename $function }
    }
    $RulesBackupFolder = Join-Path -Path $ProgrammabilityBackupFolder -ChildPath "Rules"
    foreach ($rule in $database.Rules)
    {
        #Script the rule
        Write-DDLOutput (Join-Path -Path $RulesBackupFolder -ChildPath ($rule.Schema + "." + $rule.Name + ".sql")) $rule
    }
    #Create a folder to store the sprocs in
    $StoredProceduresBackupFolder = Join-Path -Path $ProgrammabilityBackupFolder -ChildPath "Stored Procedures"
    #Loop through the sprocs to script them out
    foreach ($storedProcedure in $database.StoredProcedures | where {$_.IsSystemObject -eq $false})
    {
        #Script the sproc
        Write-DDLOutput (Join-Path -Path $StoredProceduresBackupFolder -ChildPath ($storedProcedure.Schema + "." + $storedProcedure.Name + ".sql")) $storedProcedure
    }
    #Create folders to store the table scripts
    $TablesBackupFolder = Join-Path -Path $databaseBackupFolder -ChildPath "Tables"
    $TableIndexesBackupFolder = Join-Path -Path $TablesBackupFolder -ChildPath "Indexes"
    $TableKeysBackupFolder = Join-Path -Path $TablesBackupFolder -ChildPath "Keys"
    $TableConstraintsBackupFolder = Join-Path -Path $TablesBackupFolder -ChildPath "Constraints"
    $TableTriggersBackupFolder = Join-Path -Path $TablesBackupFolder -ChildPath "Triggers"
    #Loop through the tables to script them out
    foreach ($table in $database.Tables | where {$_.IsSystemObject -eq $false})
    {
        #Script the table
        Write-DDLOutput (Join-Path -Path $TablesBackupFolder -ChildPath ($table.Schema + "." + $table.Name + ".sql")) $table
        foreach ($constraint in $table.Checks)
        {
            #Script the check constraint
            Write-DDLOutput (Join-Path -Path $TableConstraintsBackupFolder -ChildPath ($table.Schema + "." + $table.Name + "." + $constraint.Name + ".sql")) $constraint
        }
        foreach ($index in $table.Indexes)
        {
            #Generate a filename for the index; primary keys go into the Keys folder
            switch ($index.IndexKeyType)
            {
                DriPrimaryKey
                {
                    $filename = Join-Path -Path $TableKeysBackupFolder -ChildPath ($table.Schema + "." + $table.Name + "." + $index.Name + ".sql")
                }
                default
                {
                    $filename = Join-Path -Path $TableIndexesBackupFolder -ChildPath ($table.Schema + "." + $table.Name + "." + $index.Name + ".sql")
                }
            }
            #Script the index
            Write-DDLOutput $filename $index
        }
        foreach ($trigger in $table.Triggers)
        {
            #Script the trigger
            Write-DDLOutput (Join-Path -Path $TableTriggersBackupFolder -ChildPath ($table.Schema + "." + $table.Name + "." + $trigger.Name + ".sql")) $trigger
        }
    }
    #Create folders to store the view scripts
    $ViewsBackupFolder = Join-Path -Path $databaseBackupFolder -ChildPath "Views"
    $ViewKeysBackupFolder = Join-Path -Path $ViewsBackupFolder -ChildPath "Keys"
    $ViewIndexesBackupFolder = Join-Path -Path $ViewsBackupFolder -ChildPath "Indexes"
    $ViewTriggersBackupFolder = Join-Path -Path $ViewsBackupFolder -ChildPath "Triggers"
    #Loop through the views to script them out
    foreach ($view in $database.Views | where {$_.IsSystemObject -eq $false})
    {
        #Script the view
        Write-DDLOutput (Join-Path -Path $ViewsBackupFolder -ChildPath ($view.Schema + "." + $view.Name + ".sql")) $view
        foreach ($index in $view.Indexes)
        {
            #Generate a filename for the index; primary keys go into the Keys folder
            switch ($index.IndexKeyType)
            {
                DriPrimaryKey
                {
                    $filename = Join-Path -Path $ViewKeysBackupFolder -ChildPath ($view.Schema + "." + $view.Name + "." + $index.Name + ".sql")
                }
                default
                {
                    $filename = Join-Path -Path $ViewIndexesBackupFolder -ChildPath ($view.Schema + "." + $view.Name + "." + $index.Name + ".sql")
                }
            }
            #Script the index
            Write-DDLOutput $filename $index
        }
        foreach ($trigger in $view.Triggers)
        {
            #Script the trigger
            Write-DDLOutput (Join-Path -Path $ViewTriggersBackupFolder -ChildPath ($view.Schema + "." + $view.Name + "." + $trigger.Name + ".sql")) $trigger
        }
    }
}
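Once the job has run for a few days, the day-to-day comparison described above can be done directly in PowerShell with Compare-Object. A minimal sketch, assuming the folder layout the script creates and a hypothetical dbo.Orders table in a database named MyDB:

```powershell
#Load the two versions of the table script
$monday  = Get-Content "C:\SqlBackups\DDL\Monday\MyDB\Tables\dbo.Orders.sql"
$tuesday = Get-Content "C:\SqlBackups\DDL\Tuesday\MyDB\Tables\dbo.Orders.sql"
#Show only the lines that differ between the two days
Compare-Object -ReferenceObject $monday -DifferenceObject $tuesday
```

If the output is empty, the table's DDL did not change between the two extractions.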
Scheduling Script Execution
There are two ways to schedule script execution. The first is the Windows Task Scheduler, which is useful if you want to run PowerShell scripts on a machine that does not have SQL Server installed. You can easily add a new task to the scheduler that executes PowerShell.exe, passing the script you want to run as a parameter.
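For example, a daily task can be registered from an elevated prompt with schtasks.exe; the task name, start time, and script path below are placeholders you would replace with your own:

```powershell
#Register a task that runs the backup cleanup script every day at 1 a.m.
schtasks /Create /TN "DeleteOldBackups" /SC DAILY /ST 01:00 `
    /TR "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\PS_DeleteOldBackups01.PS1"
```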
On servers with SQL Server 2008 R2 or later installed, you can also execute PowerShell through a SQL Server Agent job. This is easily done by creating a new job, selecting PowerShell in the job step's Type drop-down, and then entering the PowerShell script in the Command text box.
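Such a job can also be created programmatically through the SMO Agent classes. The following is a sketch rather than a definitive implementation: it assumes a default local instance and a script path of C:\Scripts\PS_ExtractDDL01.PS1, and the job still needs a schedule attached before it will run automatically:

```powershell
#Load SMO and connect to the local default instance (assumed)
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null
$server = New-Object Microsoft.SqlServer.Management.Smo.Server
#Create the Agent job itself
$job = New-Object Microsoft.SqlServer.Management.Smo.Agent.Job($server.JobServer, "Extract DDL")
$job.Create()
#Add a single step that runs under the PowerShell subsystem
$step = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep($job, "Run extract script")
$step.SubSystem = [Microsoft.SqlServer.Management.Smo.Agent.AgentSubSystem]::PowerShell
$step.Command = "C:\Scripts\PS_ExtractDDL01.PS1"
$step.Create()
#Target the job at the local server so the Agent will run it
$job.ApplyToTargetServer($server.Name)
```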
Unfortunately, the PowerShell job step in SQL Server 2008 R2 is not very useful, because it invokes PowerShell 1.0, so many scripts and modules do not work correctly. If you want to execute PowerShell 2.0 scripts with SQL Server 2008 R2, you are better off using the PowerShell.exe approach described earlier. Fortunately, this problem is resolved in SQL Server 2012, which loads PowerShell 2.0.
An advantage of SQL Server Agent jobs is that you probably already have jobs running, so this approach lets you manage all your jobs in one place. You can also use the logging built into the SQL Agent job engine to monitor the execution of your PowerShell scripts.