PowerShell – Agile BI
http://agilebi.com
A community for sharing ideas about business intelligence development using agile methods.

T-SQL Tuesday #15: Automating XML Updates w/ PowerShell
http://agilebi.com/blog/2011/02/08/t-sql-tuesday-15-automating-xml-updates-w-powershell/
Tue, 08 Feb 2011 23:56:38 +0000

This month’s T-SQL Tuesday is being hosted by Pat Wright (blog | twitter). He selected the topic of automation with T-SQL or PowerShell. As it happened, I’d just had a scenario come up that seemed appropriate for a little PowerShell, so I went with it for this post.

The Varigence toolset is based on a language called BIML (Business Intelligence Markup Language), which is an XML-based language for creating BI solutions. We needed to make some mass updates to some of our sample files, but there were quite a few individual ones to update. I already had an XSLT transform that would update each file individually, so all I needed was a quick way to run it on each file in a directory.

I’m pretty new to PowerShell, but it seemed like a good fit for this. So, after a little trial and error, I came up with the following script:

[sourcecode language="powershell" padlinenumbers="true"]
$path = "C:tempxml*.xml"
get-childItem $path | rename-item -newname { $_.name -replace '.xml','_backup.xml' }
$files = Get-ChildItem $path
$xslt = New-Object System.Xml.Xsl.XslCompiledTransform
$xslt.Load("C:tempxmlDataType.xslt")
foreach ($file in $files) { 
    $oldFileName = $file.FullName
    $newFileName = $file.FullName -replace '_backup.xml', '.xml'
    $xslt.Transform($oldFileName, $newFileName) 
}
[/sourcecode]

Basically, it goes through a directory, renames all the existing .XML files to create a backup, and then gets a list of the files. It then loads an XSLT transformation (used to transform XML documents by inserting or removing markup). Finally, it loops through the files in the folder, and uses the XSLT transform to create an updated version of the file with the original filename.
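If you end up reusing this, the same logic is easy to wrap in a small function so the paths aren’t hard-coded. Here’s a quick sketch (the function name and parameters are just illustrative, not part of the original script):

[sourcecode language="powershell"]
# A parameterized variant of the script above.  Function and parameter names are illustrative.
function Invoke-XsltOnFolder {
    param(
        [string]$Folder,          # folder containing the .xml files to update
        [string]$StylesheetPath   # path to the .xslt transform to apply
    )

    # Back up the originals by renaming them
    Get-ChildItem (Join-Path $Folder '*.xml') |
        Rename-Item -NewName { $_.Name -replace '\.xml$', '_backup.xml' }

    # Load the transform once, then apply it to each backup file
    $xslt = New-Object System.Xml.Xsl.XslCompiledTransform
    $xslt.Load($StylesheetPath)

    foreach ($file in Get-ChildItem (Join-Path $Folder '*_backup.xml')) {
        $newFileName = $file.FullName -replace '_backup\.xml$', '.xml'
        $xslt.Transform($file.FullName, $newFileName)
    }
}

# Example usage:
# Invoke-XsltOnFolder -Folder 'C:\temp\xml' -StylesheetPath 'C:\temp\xml\DataType.xslt'
[/sourcecode]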

There are probably better ways to do this, but it does exactly what I needed, and it let me learn a little more about PowerShell. It also struck me that this could be very useful to transform XML into a friendlier shape for processing in SSIS.

Filtering Objects in PowerShell based on a List of Accepted Values
http://agilebi.com/blog/2009/06/10/filtering-objects-in-powershell-based-on-a-list-of-accepted-values/
Wed, 10 Jun 2009 01:47:15 +0000

I was writing a script the other day where I wanted to return a collection of services based on their names.  It took me a few minutes to figure out how to do this, so I thought I’d jot it down.  Nothing revolutionary, but I’ve definitely found this pattern to be handy.

Let’s start with a comma-separated list of values we want to filter by.  In this case I only want to return objects whose names are “a”, “c”, “e”, or “g”.  I’ll take that list, and split it into an array.

# List we want to filter against
# Split it into an array
$filterList = 'a,c,e,g'.Split(",")

Next, I’ll generate a collection of objects to try and filter.  I’ll just use a function to make life easy:

# Function to generate a collection of objects to test on
function Create-TestCollection {
    # Turn this into an object to filter
    $objectList = 'a,b,c,d,e,f,g'.Split(",")
    
    foreach($object in $objectList)
    {
        $output = new-object PSObject
        $output | add-member noteproperty Name $object
        
        Write-Output $output 
    }
}

# Populate the variable to test with our collection of objects
$collection = Create-TestCollection

Now $collection contains a list of objects with a single Name property containing “a”, “b”, “c”, “d”, etc.

Now let’s filter $collection using a Where:

Write-Host "Objects from the Collection where the Name property exists in an Array"
# Filter the collection of objects based on a CSV list
$collection | Where {$_.Name -and $filterList -eq $_.Name }

And it will return only the objects whose Name appears in our initial comma-separated list:

Name                                                                                     
----                                                                                     
a                                                                                        
c                                                                                        
e                                                                                        
g    

This also works with plain strings:

# An array we want to filter
$collection2 = 'a,d,g,j,m'.Split(",")

Write-Host "Strings in an Array where that exist in an Array"
# Filter the collection of objects based on a CSV list
$collection2 | Where {$_ -and $filterList -eq $_ }

Strings in an Array that exist in another Array
a
g
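As an aside (not from the original post), the same filters can also be written with the -contains operator, which some people find a little easier to read:

[sourcecode language="powershell"]
# Equivalent filters using -contains: keep items whose value appears in $filterList
$collection  | Where { $filterList -contains $_.Name }
$collection2 | Where { $filterList -contains $_ }
[/sourcecode]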

Enjoy…

David

 

Using PowerShell to Manipulate SQL Server Analysis Services Traces
http://agilebi.com/blog/2009/05/29/using-powershell-to-manipulate-sql-server-analysis-services-traces/
Fri, 29 May 2009 02:13:00 +0000

I recently started using SSAS server traces a lot.  This type of trace is basically the same trace you can create with SQL Server Profiler, but it runs without Profiler, uses fewer resources, and can be persisted across reboots.  They’re a really handy tool.

I started using these when I built some AS monitoring tools based on the “Solution for Collecting Analysis Services Performance Data for Performance Analysis”  sample on CodePlex.  Seriously, totally revolutionized my life (at least the part related to administering complex AS installations and projects).  After installing, adapting, and enhancing the functionality there I found I wanted more and easier ways to control AS traces, so I built some PowerShell functions to help manage them.  These functions basically just wrap XMLA commands to make them easier to use.

Here are some sample files I’ll be talking about in this post:  Download Sample Files

Don’t worry about copying the sample code out of the post… it’s all included as part of the sample.

Creating SSAS Traces

You use an XMLA command to create a trace.  The Create command looks something like this:

<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine" xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <Create xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
    <ObjectDefinition>
      <Trace>
        <ID>My Trace</ID>
        <Name>My Trace</Name>
        <Events>
          <Event>
            <EventID>15</EventID>
            <Columns>
              <ColumnID>28</ColumnID>
              <!-- ... More Columns ... -->
              <ColumnID>3</ColumnID>
            </Columns>
          </Event>
          <Event>
            <EventID>16</EventID>
            <Columns>
              <ColumnID>24</ColumnID>
              <!-- ... More Columns ... -->
              <ColumnID>36</ColumnID>
            </Columns>
          </Event>
          <!-- ... More events ... -->
        </Events>
        <Filter>
          <NotLike>
            <ColumnID>37</ColumnID>
            <Value>Application I don't care about events from</Value>
          </NotLike>
        </Filter>
      </Trace>
    </ObjectDefinition>
  </Create>
</Batch>

Not the most fun to create by hand, but you could make it happen.  However, there is an easier way to come up with the CREATE statement for your trace.  Just do the following:

  1. Start up a SQL Server Profiler session and monitor the AS instance you’re working on.  You only need to capture the Command Begin event.
  2. Start up a 2nd instance of SQL Server Profiler.  Use the GUI to create the trace you’re actually interested in, with all the events, columns, and filters.  Then start the trace.
  3. Snag the CREATE XMLA from the 1st Profiler session and save it off.

Now you have XMLA you can use as the base for the trace you want.  You’ll want to add a few more elements to the XMLA to make the server trace work though.  It will look something like this:

<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine" xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <Create xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
    <ObjectDefinition>
      <Trace>
        <ID>My Trace</ID>
        <Name>My Trace</Name>
        <LogFileName>\\MyServer\TraceFiles\MyTrace.trc</LogFileName>
        <LogFileAppend>0</LogFileAppend>
        <AutoRestart>1</AutoRestart>
        <LogFileSize>100</LogFileSize>
        <LogFileRollover>1</LogFileRollover>
        <Events>
          <!-- ... The rest of the Create statement you just generated ... -->

There are just a few extra fields there.  Here’s what they’re used for:

LogFileName: Name of the log file.  Must end in .trc.  The AS Service Account must have permission to write to the directory.
LogFileAppend: 0 to overwrite an existing file, 1 to append.
AutoRestart: 0 for no, 1 to restart the trace when the server restarts.
LogFileSize: Size in MB.  The log will roll over when it reaches this size.
LogFileRollover: 1 means a new log file is created (1, 2, 3, etc. is appended to the name for each new file) when LogFileSize is reached.

Deleting SSAS Traces

So, we’ve created a trace that auto restarts.  How do you get rid of it?

<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine" xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <Delete xmlns="http://schemas.microsoft.com/analysisservices/2003/engine" xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
    <Object>
      <TraceID>My Trace</TraceID>
    </Object>
  </Delete>
</Batch>

SSAS Trace PowerShell Library

I found that I wanted a convenient way to see what traces were running on a server, create them, delete them, and flush them (i.e., close out the current trace and create a new one, so you can process or otherwise work with events that were just logged).  I have included two versions of my library in this sample.  The first (SsasTraceLibrary.ps1) runs with PowerShell V1.  The second (SsasTraceV2Library.ps1) is basically identical, but uses function header and parameter functionality from PowerShell V2 CTP3.  I keep the V1 version around to deploy to servers (more on this later), but load the V2 version in my own environment to take advantage of the examples, help, and all of the other V2 goodness.  I would encourage you to go with the V2 version, as it includes easy-to-use descriptions, examples, and better parameter help.
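To make the V1/V2 distinction concrete: the V2 features referred to here are things like comment-based help and advanced parameter attributes.  A header in that style for one of the functions listed below (Start-SsasTrace) might look roughly like this; it’s an illustrative sketch using the parameter names from the batch file later in the post, not the actual library code:

[sourcecode language="powershell"]
function Start-SsasTrace {
    <#
    .SYNOPSIS
        Starts a new SSAS server trace based on a stored template.
    .EXAMPLE
        Start-SsasTrace -ServerName "LocalHost" -TraceID "My Trace" -TraceName "My Trace" `
            -UncFileName "\\MyServer\TraceFiles\MyTrace.trc" -FileSizeInMB 100
    #>
    [CmdletBinding()]
    param(
        [Parameter(Mandatory = $true)] [string]$ServerName,
        [Parameter(Mandatory = $true)] [string]$TraceID,
        [Parameter(Mandatory = $true)] [string]$TraceName,
        [Parameter(Mandatory = $true)] [string]$UncFileName,
        [int]$FileSizeInMB = 100
    )
    # ... body builds the Create XMLA shown above and executes it against the server ...
}
[/sourcecode]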

I created the following functions as part of this library:

Get-SsasTrace: Get details of a specific trace
Get-SsasTraceExists: Check if a specific trace exists
Get-SsasTraces: Get all traces running on a server
Start-SsasTrace: Start a new trace based on a stored template
Delete-SsasTrace: Delete an existing trace
Flush-SsasTrace: Stop/restart an existing trace
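To give a feel for how the functions fit together, a typical sequence might look like the following.  Start-SsasTrace’s parameters match the batch file shown later in this post; the parameter names for Get-SsasTraceExists and Flush-SsasTrace are assumptions for illustration:

[sourcecode language="powershell"]
# List every trace currently defined on the instance
Get-SsasTraces "LocalHost"

# Start the standard trace if it isn't already there
# (parameter names for Get-SsasTraceExists are assumed here)
if (-not (Get-SsasTraceExists -ServerName "LocalHost" -TraceID "My Trace")) {
    Start-SsasTrace -ServerName "LocalHost" -TraceID "My Trace" -TraceName "My Trace" `
        -UncFileName "\\MyServer\TraceFiles\MyTrace.trc" -FileSizeInMB 100
}

# Later: close out the current trace file and start a new one
Flush-SsasTrace -ServerName "LocalHost" -TraceID "My Trace"
[/sourcecode]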

A Sample Function

Most of the functions in this library require the SSAS assemblies to be loaded:

# Load Required SSAS Assemblies
$asm = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices")
$asm = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices.Xmla")

You typically follow the pattern of connecting to a server, executing an XMLA command, outputting the results, and disconnecting from the server.

# Connect to the server
$xmlaClient = new-object Microsoft.AnalysisServices.Xmla.XmlaClient
$xmlaClient.Connect($serverName)

$xmlStringResult = "" # Initialize the variable so that it can be passed by [ref]

# Fire off the discover command to return all traces
$xmlaClient.Discover("DISCOVER_TRACES", "", "", [ref] $xmlStringResult, 0, 1, 1)

# Convert the result to XML to make it easier to deal with
[xml]$xmlResult = $xmlStringResult

# Disconnect the session before returning, so the connection isn't left open
$xmlaClient.Disconnect()

return $xmlResult.return.Root.row

Occasionally we want to work with the XML result set a little bit to verify the results, but usually nothing major.

# Create the trace
$xmlaClient.Execute($createTraceXmla, "", [ref] $xmlStringResult, 0, 1)

# Convert the result to XML to make it easier to deal with
[xml]$xmlResult = $xmlStringResult

$xmlResultException = $xmlResult.return.results.root | ? {$_.Exception -ne $null}

if ($xmlResultException -ne $null)
{
    throw $xmlResultException.Messages.Error.Description
}

The PowerShell is really just a wrapper around XMLA commands… it just makes them easier to use.
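Putting those snippets together, a minimal end-to-end function along the lines of Get-SsasTraces could look roughly like this.  This is a sketch assembled from the pieces above, not the exact code from SsasTraceLibrary.ps1:

[sourcecode language="powershell"]
# Load the required SSAS assemblies (as shown above)
$asm = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices")
$asm = [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices.Xmla")

function Get-SsasTraces {
    param([string]$serverName)

    # Connect to the server
    $xmlaClient = new-object Microsoft.AnalysisServices.Xmla.XmlaClient
    $xmlaClient.Connect($serverName)

    # Initialize the variable so that it can be passed by [ref]
    $xmlStringResult = ""

    # Fire off the discover command to return all traces
    $xmlaClient.Discover("DISCOVER_TRACES", "", "", [ref] $xmlStringResult, 0, 1, 1)

    # Convert the result to XML, disconnect, and return the trace rows
    [xml]$xmlResult = $xmlStringResult
    $xmlaClient.Disconnect()

    return $xmlResult.return.Root.row
}
[/sourcecode]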

Using the SsasTraceLibrary.ps1 in the Dev Environment

I’ve found I use these functions a decent bit as part of my day-to-day operations.  I have the following commands in my Profile.ps1 to load all the script files ending with “Library.ps1” in a given directory… I store command libraries like SsasTraceLibrary.ps1 in this folder, so they’re automatically loaded when PowerShell starts.

$powerShellScriptsDirectory = "c:\PowerShellScripts\"
if (!$powerShellScriptsDirectory.EndsWith("\")) { $powerShellScriptsDirectory += "\" }

Write-Host Welcome $Env:Username

foreach($filename in Get-ChildItem $powerShellScriptsDirectory* -Include "*Library.ps1")
{
    # Dot-source each library so its functions stay loaded in the current session
    . $filename
}

Now, you just have to start PowerShell and run a command like

Get-SsasTraces LocalHost

to return all the traces running on your local machine.
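Because the function returns the discover rows as objects, the output can be piped like anything else.  For example (the column names here come from the DISCOVER_TRACES rowset; verify them against the actual output on your server):

[sourcecode language="powershell"]
# Show a couple of columns for each running trace
Get-SsasTraces LocalHost | Select-Object TraceID, LogFileName
[/sourcecode]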

Using the SsasTraceLibrary.ps1 in the Server Environment

I mentioned earlier that I also deploy this script to my various AS instances.  I do this because various people need to work on the machine, and I want an easy (read: single click) way to do things like start/stop/flush the trace on the machine.  This also makes it easy to automate these actions as part of an ETL or job.

I use a batch file with the following commands:

ECHO Setting System Variables
SET DATA_COLLECTION_PATH=[INSTALLDIR]
SET SSAS_TRACE_UNC=\\[OLAPSERVER]\[TRACE_FILE_SHARE_NAME]\[OLAPSERVER]_SsasPerformanceErrorMonitoringTrace.trc
SET SSAS_SERVER=[OLAPSERVER]
SET SSAS_TRACE_FILE_SIZE_MB=100

ECHO Running Commands
REM Create the Data Collection Trace File
PowerShell -Command "& {. '%DATA_COLLECTION_PATH%\DataCollection\SsasTraceLibrary.ps1'; Start-SsasTrace -ServerName '%SSAS_SERVER%' -TraceID 'Analysis Services Performance and Error Trace' -TraceName 'Analysis Services Performance and Error Trace' -UncFileName '%SSAS_TRACE_UNC%' -FileSizeInMB '%SSAS_TRACE_FILE_SIZE_MB%'}"
ECHO Script Complete!
pause

The parameters enclosed in ‘[‘ and ‘]’ are replaced with environment-specific values whenever the scripts are deployed to a server.  Someone can now just run one of the batch files to start, stop, or flush a trace on the server.  I also typically call the flush batch file as part of my processing job, so I can immediately load the results into a database for analysis.
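For the processing-job step, the call looks essentially the same as the Start step in the batch file, just invoking Flush-SsasTrace instead.  A sketch (the library path, server name, and Flush-SsasTrace’s parameter names are placeholders, since they aren’t spelled out in the post):

[sourcecode language="powershell"]
# Called from the processing job: close out the current trace file and start a new one,
# so the just-closed file can be loaded into a database for analysis.
# (Flush-SsasTrace's exact parameters aren't shown in the post; ServerName/TraceID are assumed.)
. 'C:\DataCollection\SsasTraceLibrary.ps1'   # hypothetical deployed location of the library
Flush-SsasTrace -ServerName 'MyOlapServer' -TraceID 'Analysis Services Performance and Error Trace'
[/sourcecode]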

Performance

A question that will always come up when running traces like this is the amount of overhead they require.  And of course they require some, both in terms of CPU to log the events and disk to write them.  I’ve typically seen the CPU cost stay in the single digits, and I always write to a location where there isn’t disk contention.  You’ll of course want to test in your environment, but I haven’t seen a performance hit large enough to outweigh the ROI of running these traces.  If you’re concerned, you could consider turning them on/off as part of a scheduled job, or just running them on an as-needed basis.  Personally, I’ve seen a huge benefit from running them 24/7, as I capture detailed processing information (how long each step takes), query information (who is doing what, how bad, and how often), and error information (some errors that aren’t caught in any other logs are captured via traces).

Next Steps

Take these libraries and modify them to your heart’s content.  I use a template in the scripts that is my standard, but you can replace it, add more, or do whatever you want.  You could also add better error handling if desired.

Conclusion

So, included here are some functions that will help you with some basic functionality around SSAS traces.  Feel free to post back if you have any ideas for improvements or things that would be cool to do.

Cheers,

David
