Posts tagged ‘PerformancePoint Server’

Repeated Dimension Names in PerformancePoint Server Monitor & Analyze

A colleague came to me recently with a problem.  She was implementing a Dashboard in PPS M&A, and she was seeing some unexpected dimension names.  She first noticed the behavior while creating a report using an Analytic Grid, but it was consistent everywhere I looked inside of PPS M&A, and the same thing happened with several other hierarchies.


Basically, her hierarchy/dimension names were not being presented to the user as expected.  For example, in her SSAS DB she had a dimension named ‘Revenue Indicator’ and a user hierarchy named ‘Revenue Indicators’.  In the Analytic Grid, it looked like the word ‘Indicator’ was repeating… the hierarchy was being displayed as ‘Revenue Indicator Indicators’.  This naming was confusing and a little unexpected, so I dug into it a bit.


Evaluate the Problem


My first step was to check the Analytic Grid in question, as well as the dimension in SSAS.  Not that I thought my coworker was lying to me, but I wanted to confirm that what was being displayed to the user really didn’t match what was defined in SSAS.  Nope, I was not being lied to and deceived… my faith in humanity can continue unabated.  Some transformation was definitely going on between SSAS and PPS.


Determine how PPS is getting the Hierarchy name


Next, I wanted to see how PPS retrieves the names of all the hierarchies and attributes to display to the user.  I fired up Profiler, and captured a trace when I loaded up the Analytic Grid.  I saw PPS using several Discover commands… so I fired off a Discover command based on what I saw in the trace.  It looked something like this:

<Discover xmlns="urn:schemas-microsoft-com:xml-analysis">
  <RequestType>MDSCHEMA_HIERARCHIES</RequestType>
  <Restrictions />
  <Properties>
    <PropertyList>
      <Catalog>MyCube</Catalog>
      <Content>SchemaData</Content>
      <Format>Tabular</Format>
    </PropertyList>
  </Properties>
</Discover>

That discover command gave me back the schema for the Cube.  I took a look at the XML… here is the relevant section:

  1: …
  2: <row>
  3:   <CATALOG_NAME>MyCube</CATALOG_NAME>
  4:   <CUBE_NAME>My Cube</CUBE_NAME>
  5:   <DIMENSION_UNIQUE_NAME>[Revenue Indicator]</DIMENSION_UNIQUE_NAME>
  6:   <HIERARCHY_NAME>Revenue Indicators</HIERARCHY_NAME>
  7:   <HIERARCHY_UNIQUE_NAME>[Revenue Indicator].[Revenue Indicators]</HIERARCHY_UNIQUE_NAME>
  8:   <HIERARCHY_CAPTION>Revenue Indicators</HIERARCHY_CAPTION>
  9:   <DIMENSION_TYPE>3</DIMENSION_TYPE>
 10:   <HIERARCHY_CARDINALITY>26</HIERARCHY_CARDINALITY>
 11:   <DEFAULT_MEMBER>[Revenue Indicator].[Revenue Indicators].[All]</DEFAULT_MEMBER>
 12:   <ALL_MEMBER>[Revenue Indicator].[Revenue Indicators].[All]</ALL_MEMBER>
 13:   <DESCRIPTION />
 14:   <STRUCTURE>1</STRUCTURE>
 15:   <IS_VIRTUAL>false</IS_VIRTUAL>
 16:   <IS_READWRITE>false</IS_READWRITE>
 17:   <DIMENSION_UNIQUE_SETTINGS>1</DIMENSION_UNIQUE_SETTINGS>
 18:   <DIMENSION_IS_VISIBLE>true</DIMENSION_IS_VISIBLE>
 19:   <HIERARCHY_ORDINAL>1</HIERARCHY_ORDINAL>
 20:   <DIMENSION_IS_SHARED>true</DIMENSION_IS_SHARED>
 21:   <HIERARCHY_IS_VISIBLE>true</HIERARCHY_IS_VISIBLE>
 22:   <HIERARCHY_ORIGIN>1</HIERARCHY_ORIGIN>
 23:   <HIERARCHY_DISPLAY_FOLDER />
 24:   <GROUPING_BEHAVIOR>1</GROUPING_BEHAVIOR>
 25: </row>
 26: …

So, precisely what I would expect… but not what is coming out of PPS. 
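

As a side note, if you’d rather pull this schema rowset from code than hand-craft XMLA, ADOMD.NET will return the same MDSCHEMA_HIERARCHIES rowset.  Here’s a minimal sketch (the server and catalog names are placeholders, not an actual environment):

using System;
using System.Data;
using Microsoft.AnalysisServices.AdomdClient;

class HierarchySchemaDump
{
    static void Main()
    {
        // Placeholder connection string; point it at your own SSAS instance and catalog
        using (AdomdConnection conn = new AdomdConnection("Data Source=MySsasServer;Catalog=MyCube"))
        {
            conn.Open();

            // Same MDSCHEMA_HIERARCHIES rowset that the Discover command returns
            DataSet ds = conn.GetSchemaDataSet(AdomdSchemaGuid.Hierarchies, null);

            foreach (DataRow row in ds.Tables[0].Rows)
            {
                Console.WriteLine("{0} (caption: {1})",
                    row["HIERARCHY_UNIQUE_NAME"], row["HIERARCHY_CAPTION"]);
            }
        }
    }
}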


Determining how PPS is transforming the Hierarchy name


I then tried a little experiment… I added a few more hierarchies, to see how PPS would present them:


[Image: the additional hierarchies added to the ‘Revenue Indicator’ dimension in SSAS]


After deploying the changes, I saw the following hierarchies in my Analytic Grid:


SSAS Hierarchy                                    PPS Name
[Revenue Indicator].[Revenue Indicators]          Revenue Indicator Indicators
[Revenue Indicator].[Revenue Indicator Types]     Revenue Indicator Types
[Revenue Indicator].[Types]                       Revenue Indicator Types

Result


PPS is retrieving the Dimension name and the Hierarchy name, then concatenating them together while removing any duplicated words from the Hierarchy name.  I checked this behavior out in several places inside of PPS (not just in the Analytic Grid)… it appears to be common functionality.
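

Just to illustrate, here’s a rough C# approximation of the naming behavior I observed.  This is a guess at equivalent logic based purely on the output above, not PPS’s actual code:

using System;
using System.Collections.Generic;

class PpsNameDemo
{
    // Append the hierarchy name to the dimension name, dropping any words
    // that already appear in the dimension name.
    static string BuildDisplayName(string dimensionName, string hierarchyName)
    {
        string[] dimWords = dimensionName.Split(' ');
        List<string> hierWords = new List<string>(hierarchyName.Split(' '));

        // Drop hierarchy words that duplicate words in the dimension name
        hierWords.RemoveAll(delegate(string w) { return Array.IndexOf(dimWords, w) >= 0; });

        return hierWords.Count == 0
            ? dimensionName
            : dimensionName + " " + string.Join(" ", hierWords.ToArray());
    }

    static void Main()
    {
        Console.WriteLine(BuildDisplayName("Revenue Indicator", "Revenue Indicators"));      // Revenue Indicator Indicators
        Console.WriteLine(BuildDisplayName("Revenue Indicator", "Revenue Indicator Types")); // Revenue Indicator Types
        Console.WriteLine(BuildDisplayName("Revenue Indicator", "Types"));                   // Revenue Indicator Types
    }
}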


This is actually pretty helpful.  Most of the time when we create SSAS DBs, they’re accessible from a variety of tools… PPS M&A for Dashboards, Excel for power users, etc.  The dimensions and hierarchies are rendered in different ways by each tool.  Some will concatenate the Dimension name with the Hierarchy name, and some won’t.  Some will give you a nice hierarchy; with some you just get an ugly list.  Naming your dimensions appropriately will give you the most user friendly names in each of the tools… in this case, using ‘Revenue Indicator Types’ gives you a user friendly name in both Excel and PPS, which is what we needed.


This is some pretty handy functionality, but I didn’t find it with a quick search of the ‘net and BOL (which doesn’t mean it’s not there).  If you’re surfacing your SSAS DB through PPS, I would definitely take this into account when determining your naming conventions.


Cheers,



David

Testing and Tuning an Internet Facing PerformancePoint M&A Dashboard

I’ve spent the past 15 months, off and on, working as the technical lead on a performance management project using PerformancePoint Server.  The project was to create a SharePoint Portal where users could view the performance of a school system.  The portal was initially internally facing, but the goal was to make it publicly accessible.  My last project with this particular client was to implement an Internet facing version of the portal. 


Since the school system in question has more than 120,000 students, scalability and performance were definitely concerns.  I spent a week at the Microsoft Technology Center in Atlanta, GA testing and tuning the solution.  We performed our testing in a lab with seven desktop machines (to generate load) and three servers to simulate our production environment.  I was joined for the week by one of the PPS M&A Architects, who helped me evaluate the solution.  Everyone at the MTC was super helpful and friendly, and we got a lot of really good advice.


I want to share some of the things I learned during the performance testing of the solution.


Recommended Reading


A few good things to read before getting started on a venture like this (among all the other white papers about scaling SQL Server, SharePoint, and the like).



Steps


To performance test and tune our SharePoint site, we needed to…



  1. Deploy the site in the test lab.

  2. Create load tests.

  3. Tune Windows Server, SharePoint, IIS, PerformancePoint Server, SQL Server, and Analysis Services as needed.

  4. Run and analyze the load tests.

  5. Modify the PPS design as required.

Just five little items on the list… how hard can that be, right?  Items #2 through #5 were done in iterations.


System Under Test


The application we created uses SharePoint Enterprise Server 2007 (MOSS), PerformancePoint Server Monitor & Analyze 2007 (SP1), and SQL Server 2005 (SP2 CU8).  Both SQL Server RDBMS and SSAS reside on the same server.  The RDBMS supports only SharePoint and PerformancePoint.  Analysis Services is dedicated to the Dashboards. 


Data Source Design


The data source is a SQL Server Analysis Services DB.  It uses a straightforward star schema design.  All of the KPIs that we’re using in the PPS Dashboards are surfaced using calculated measures in the cube.  This is definitely a good way to go.  This design makes developing the dashboards very easy, it encapsulates the logic of the KPIs, and it helps with efficient caching inside of SSAS.  The cube was less than 1 GB in size for this phase of the project.  Some default aggregations were built, and usage based aggregations were applied against the cube.  Query performance was baselined at a few dozen milliseconds for the largest/most complex queries once the cache was warmed.


Dashboard Design


Our portal has seven dashboards.  Each dashboard has several pages. 



  1. The Overall view contains a scorecard with a Year Member Selection filter.  Each KPI on the scorecard conditionally displays an Analytic Grid and a Web Page report (pointing to an HTML file in SharePoint).

  2. The Compare view contains two scorecards, with up to 14 filters (primarily Member Selection… more on this later).  Seven filters apply to each scorecard; users can make different selections, and compare the two views.

  3. The Look Inside the Data view contains a single scorecard with up to 12 filters (and a few more columns than the Compare view, to show historical values).

The Analytic Charts were all fairly straightforward, with a default view of a calculated measure against time.  The users can drill into the charts. 


The filters were primarily Member Selection, with at most a few dozen members.  We have one MDX filter (for Schools) that returns about 400 members (this will come into play later).


The Dashboard was implemented using Application Security.


SharePoint


I enabled Anonymous Authentication for SharePoint.  The site basically supports all of our Dashboards, as well as a number of Help files and some content to help people use the site and the data.  I customized some of the master pages (to remove features such as SharePoint Help, Search, the breadcrumb trail, and stuff like that), but the site design was fairly trivial for this version of the site.  Excel Services and other MOSS features were not used.


Deploying to a Test Environment


Our friendly neighborhood Microsoft rep arranged for us to have some time in the Microsoft Technology Center in Atlanta, GA.  They provided an environment for us to test in: three servers (to mimic our production hardware) and seven desktop machines to run load tests from.  It took about a day and a half to install/configure the software, install service packs, and deploy our solution (including a SharePoint site collection, the PPS M&A Dashboards, and the SSAS DB).


Hardware Under Test


We tested our solution on the following hardware.  Our production hardware was similar.  A hardware network load balancing appliance was used for the web front ends.


Web Front End (1)



  • DL380 G5

  • Dual Core x 2 – 2.66 GHz – 64 bit

  • 8 GB RAM

  • Windows Server 2003 (SP2)

  • IIS 6.0

  • SharePoint Enterprise Server 2007 (SP1+Infrastructure update)

  • PerformancePoint Server Monitor & Analyze (SP1)

Web Front End (2)



  • DL 382 G5

  • Dual Core x 2 – 2.6 GHz – 64 bit

  • 4 GB RAM

  • Windows Server 2003 (SP2)

  • IIS 6.0

  • SharePoint Enterprise Server 2007 (SP1+Infrastructure update)

  • PerformancePoint Server Monitor & Analyze (SP1)

SQL Server (RDBMS and SSAS)



  • DL 585 G2

  • Dual Core x 4 – 2.66 GHz – 64 bit

  • 16 GB RAM

  • Windows Server 2003 (SP2)

  • SQL Server (SP2 CU8)

  • RAID10 disk array

Test Rig


We used Visual Studio Team System 2008 Test Edition (SP1), a.k.a. VSTT, to perform our testing.  We used one machine as a Controller and four machines as Agents.  We saved our test results to SQL Server on a separate machine.


Test Machines (x5)



  • Dual core 64 bit

  • 2 GB RAM

  • Vista

This configuration let us scale to 4,000+ users without severely taxing any of the machines.  Due to the network appliance being used for our load balancing, IP Spoofing was not required for our test machines.  We had some issues with using Vista for the agents… it was just a little bit trickier to configure than XP.  We were never able to get the counter collection working for those machines (annoying, but not really an issue since we had so much capacity and all the machines were in the same room).  We didn’t run into any major issues getting our test rig up and running, but it definitely took some time.


Creating the Load Tests


In order to do our tuning and performance testing, we first had to build everything needed for a load test.  VSTT is a good tool for load testing, but be aware that there is a learning curve.  If you don’t have any experience performing web based load testing, this could be a significant hurdle.  Luckily, in a previous life I was a Load Test Architect, and I had experience with the tool.


Test Design


I first identified several use cases that I thought would be common for users of our portal.  These consisted of actions such as viewing the top level summary Dashboards, using filters to change the view of the scorecards, clicking on analytic grids, and things like that.  I created a mixture of coded and graphical Web Tests for each small workflow (such as changing a filter four times on a certain Dashboard, or clicking through eight charts on a Dashboard).  I then used Web Tests that call other Web Tests to build up my use cases.  The use cases were then combined in different ratios in each of my load tests so I could simulate a variety of user populations.


Designing your tests correctly is crucial.  If you’re way off on what the users will be doing, all of your performance tuning could be for naught.  I would recommend using a pilot group, or at least some test subjects, to figure out how people will use your portal.


Web Test Implementation


Microsoft Consulting Services provided a test harness for PPS.  The test harness is a coded web test that can parse the XML snippets used by PPS, and perform actions such as changing filter values, activating charts, etc.  It’s a pretty cool tool, and was very helpful.  Unfortunately, it is proprietary, so I can’t share it… you’ll have to engage MCS if you want to use it.  I did make some modifications to it to change the way it uses filters, to make it work when Web Tests call other Web Tests, etc.  If anyone is using this test harness (or if it does become publicly available), I can share the (minor) modifications that I made.


I constructed all of the visual web tests using Fiddler 2 (in conjunction with a home grown helper application to insert comments, and do a few other useful tasks).  Thanks to the MCS test harness, a few extra tools, and the way the web tests were constructed, the actual test generation was very straightforward.


Testing Considerations


One of the key things to consider when load testing PPS is that, due to the architecture, loading a single page may involve dozens of calls.  For example, a dashboard that has 15 filters, two scorecards, and two charts actually makes around 20 HTTP requests… so if each request takes 1 second, it takes 20 seconds to load the page.  I was primarily interested in the time to last byte for each page, so I used comments while recording my tests to mark where to place transactions, and then analyzed the transactions instead of the individual calls.
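

Here’s a stripped-down sketch of that pattern in a coded web test.  The class name and URLs are placeholders (not our actual site), and our real tests were mostly recorded and composed rather than hand-coded like this:

using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Wrap all of the requests that make up one dashboard page view in a single
// transaction, so the load test reports time to last byte for the whole page
// rather than for each individual PPS call.
public class OverallDashboardPageView : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        this.BeginTransaction("Overall Dashboard - Page Load");

        // The SharePoint page hosting the dashboard (placeholder URL)
        yield return new WebTestRequest("http://portal.example.com/Dashboards/Overall.aspx");

        // In the real tests, one request follows for each PPS web part on the
        // page (filters, scorecards, grids, charts); two placeholders are shown
        yield return new WebTestRequest("http://portal.example.com/placeholder/scorecard-call");
        yield return new WebTestRequest("http://portal.example.com/placeholder/chart-call");

        this.EndTransaction("Overall Dashboard - Page Load");
    }
}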


I implemented reasonable timing and pacing for the individual steps in each of the tests, and made sure to gradually ramp up my users when applying the load.  Testing was slightly weighted towards the Dashboards that were the largest and would have the heaviest usage, though another use case was used to randomly hit every part of the site to bust the cache if possible.


System Tuning


We tuned a few things, or at least reviewed the settings, out of the gate.  During our testing we continuously monitored the servers (and test machines), identified bottlenecks, tweaked settings, then did it all again.


During our initial testing, we found SSAS to be the bottleneck for our solution.  I suspect that our web front ends (SharePoint and/or IIS) could be tweaked a little more, but it wasn’t really necessary in our scenario.


ASP.NET Tuning


We started tuning ASP.NET with some of the suggestions in the article Contention, poor performance, and deadlocks when you make Web service requests from ASP.NET applications.  In the end, I found performance to be good using autoConfig="true" (the formulas listed in the article are automagically applied) with a few tweaks.  I increased the number of threads available significantly… that was a major bottleneck for a while.  I also increased minWorkerThreads to 50 (otherwise SharePoint stalled while spinning up more threads after a few hundred users were added, even when ramping users up fairly slowly).  I also increased minFreeThreads and minLocalRequestFreeThreads, and increased the number of connections to the DB server.  Don’t just blindly apply any of these settings… your mileage will vary.  I went through a number of test iterations, carefully monitoring each of the servers while testing these settings.


I made the following change to the machine.config (on all web front ends):

...
  <system.web>
    <processModel autoConfig="true" maxWorkerThreads="400" maxIoThreads="400" minWorkerThreads="50" />
    <httpRuntime minFreeThreads="352" minLocalRequestFreeThreads="304" />
    ...
  </system.web>
...
  <system.net>
    <connectionManagement>
      <!-- Database server IP address -->
      <add address="10.1.11.11" maxconnection="48" />
    </connectionManagement>
  </system.net>
...
</configuration>

IIS Tuning


I adjusted the application pool (for both the Performance Management site as well as the Central Administration page). Pinging and Rapid Failure were turned off on the Health tab. I did NOT use web gardens in this configuration, even though I have seen it recommended.  I did not see a difference in performance between using them and not, and was advised by one of the local IIS gurus not to use them.


SQL Server 2005 Tuning


We updated SQL Server to Service Pack 2 – Cumulative Update 8 (build 3257). Before applying the CU, we ran into some significant performance issues.  SSAS did not like some of our very small dimensions, and performance was severely degraded.  The CU resolved the issues. 


Data files, transaction logs, and the Temp DB were spread out over multiple disks.  Our usage of the RDBMS was very light, so we needed minimal tuning here.


SQL Server Analysis Services 2005 Tuning


I updated the Query \ Max Threads element of Analysis Services to 20 to provide additional threads for querying. Adjusting this element needs to be done carefully, as it can improve the performance of the web front ends at the expense of maxing out SQL Server.  I actually ramped this number up quite a bit during testing (with good effect), but found 20 to be adequate after resolving some other issues.  I suggest monitoring the queued requests in IIS along with the queued and active threads in SSAS to determine a good balance here.  I initially found a lot of queuing in the Long Parsing Jobs and Short Parsing Jobs as well, due to some filters.


PerformancePoint Server M&A Tuning


OK, this is important.  Since we’re Internet facing, we’re supporting anonymous users… a decent number of them, in fact.  I updated the stored procedure [PPSMonitoring].[dbo].[ParameterValuesCreate] to comment out a section that is not needed for an anonymous site, but that has an extremely negative impact on performance.  This is the piece that stores the current filter selections.  In addition to not being necessary, it was totally thrashing our transaction log.


I made the following change to the [PPSMonitoring].[dbo].[ParameterValuesCreate] sproc:

-- init
    SET @TransactionIsOurs = 0
    IF @@TRANCOUNT = 0
    BEGIN -- only if @@TRANCOUNT is 0, we do BEGIN TRAN
        BEGIN TRANSACTION
        SET @TransactionIsOurs = 1
    END
-- DD - 20080821 - Commented out the following section to increase performance
--    --If this parameter value already exists (for this login), update. Otherwise, insert.
--    IF(EXISTS(SELECT * FROM [ParameterValues] WHERE Login = @Login AND [ParameterUniqueName] = @ParameterUniqueName))
--    BEGIN
--        SELECT 1
--        UPDATE [ParameterValues]
--        SET 
--            [LastUpdated] = GETDATE(),
--            [SerializedXml] = @SerializedXml
--        WHERE [Login] = @Login AND [ParameterUniqueName] = @ParameterUniqueName
--        IF @@ERROR <> 0 
--        BEGIN
--           RAISERROR (5580001, 16, 1, @tErrMsg, 7, N'ParameterValues') WITH LOG
--           SET @ReturnCode = -1
--           GOTO exit_label
--        END
--    END
--    ELSE
--    BEGIN
--        --Insert record
--        INSERT INTO [ParameterValues]
--            ([Login], [ParameterUniqueName],[LastUpdated],[SerializedXml])
--        VALUES     
--            (@Login, @ParameterUniqueName, GETDATE(), @SerializedXml)
--        IF @@ERROR <> 0 
--        BEGIN
--           RAISERROR (5580002, 16, 1, @tErrMsg, 8, N'ParameterValues') WITH LOG
--           SET @ReturnCode = -1
--           GOTO exit_label
--        END
--    END

    IF @TransactionIsOurs = 1
    BEGIN
        COMMIT TRANSACTION
    END
    RETURN 0


SharePoint Tuning


There is a lot of tuning that can be done to SharePoint, but tuning SharePoint was not a focus for us.  We did make a few changes.


Allowing the SharePoint Server worker process to consume large amounts of memory can decrease performance. For computers that have more than 4 GB of RAM, the ASP.NET cache size can be constrained with the privateBytesLimit attribute, set on the cache element of the Web.config file. By setting privateBytesLimit="2576980378" (that is, 60% of 4 GB), you can avoid a scenario in which a server that has more than 4 GB of memory creates an oversized cache.


For a machine with 8 GB of RAM and a limit of 60% of the memory, we would add the following to the web.config. The private bytes limit is calculated via the formula (<gigabytes of RAM available> * <percentage of RAM to use> * 1,024^3).
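

To make the arithmetic concrete for our 8 GB web front ends: 8 * 1,073,741,824 bytes * 0.6 = 5,153,960,755 bytes (rounding down the fraction), which is the value used in the snippet below.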


I made the following change to my web.config file for the SharePoint site:

...
<system.web>
...
  <caching>
    <cache privateBytesLimit="5153960755" />
  </caching>
...
</system.web>
...

Tweaking the Dashboard Design


Overall, I was expecting to be able to support about 500 concurrent users on each of the front end servers.  I expected the WFEs to be the bottleneck.  With the original implementation of the Dashboard, I found we could support up to about 2,000 users… as long as they weren’t using our Compare and Look Inside the Data pages.


When testing those pages, SSAS was getting slammed.  I was able to support 1,000 users… but SSAS was at 80% utilization constantly, with occasional spikes and severe performance degradation.  Not a really good situation.


Through testing and profiling, I identified the culprit… it was the School filter.


The School filter is an MDX filter, and it looks like this:

  1: FILTER(
  2:   DESCENDANTS(
  3:     [Entity].[Entity]
  4:     ,[Entity].[Entity]
  5:     ,SELF_AND_BEFORE
  6:   )
  7:   ,[Entity].[Entity Type] = [Entity].[Entity Type].&[SCHOOL]
  8: )

No big deal, right?  Except… each time it is loaded (i.e., a lot) it fires off a few other MDX statements.  They look like this:

  1: SELECT
  2:   {   [Entity].[Entity].[All]
  3:     ,[Entity].[Entity].&[School A]
  4:     ,[Entity].[Entity].&[School B]
  5:     //…about 400 more
  6:   } DIMENSION PROPERTIES MEMBER_TYPE on 0
  7: FROM [MetricsDatamart]
  8: 

and this:
  1: SELECT
  2:   {
  3:    IIF([Entity].[Entity].[All].Parent IS NULL,[Entity].[Entity].[All],[Entity].[Entity].[All].Parent)
  4:   ,IIF([Entity].[Entity].&[School A].Parent IS NULL,[Entity].[Entity].&[School A],[Entity].[Entity].&[School A].Parent)
  5:     // … about 400 more
  6:  } DIMENSION PROPERTIES MEMBER_TYPE on 0
  7: FROM [MetricsDatamart]
  8: 

These are fairly long statements, and they are expensive to run.  Under load, SSAS was having to queue dozens of these statements just to parse them.  They are fired on every page load, even though our data is static (i.e., no caching is occurring).  All those inline IIFs did horrible, terrible things to SSAS.  SSAS was getting very backed up, which caused IIS to queue, which caused extremely long waits on the front end.


To fix the problem, I switched that filter to use a SQL Server RDBMS source.  The data was static, so no big deal there.  I did find that using more than one tabular source gave me an error in PPS… I wasn’t able to resolve that on our timeline, but it didn’t stop me from fixing the performance issue.


Final Results


After that one change, the performance of the whole system skyrocketed.  With our heaviest test case, I was able to run 1,500 concurrent users. The WFE processors were at ~70% utilization, averaging 300 requests per second with no contention or queuing. The database server was at ~20% utilization, with no issues. Responses were under 1 second, with an average response time of ~0.3 seconds. The user experience was excellent. Even when response times increased, page load times were still reasonable (under 10 seconds for pages with more than 15 objects).


Next Steps


We got everything pretty much humming for this phase of the project… but we’re building a much larger version for our client in the very near future.  Here are some thoughts on making our solution bigger, faster, better.


SharePoint


We can tune SharePoint quite a bit to increase performance.  This includes lightening some of the templates and base pages, optimizing images and web pages, and making sure all the user supplied content is fairly lean (i.e., not using a 300 KB file produced by Word when a 20 KB HTML file could be used).  We can also use some of the caching features available in MOSS to improve performance on the web servers.


PerformancePoint Filters


OK, I saw our filters as a bit of a bottleneck.  For a large installation, I think performance could be significantly improved by building a custom filter using the Monitoring SDK.  Some of the key features would be enabling caching, and perhaps using XMLA Discover queries instead of some of the MDX.  I’d also make the ordering of the elements a little more configurable, and perhaps try to implement a way to create a ‘global filter’ (I find I use a single filter on many different Dashboards, and I’d like to reduce the maintenance and development time and improve consistency).


Servers


Scaling out the web servers and scaling up the SQL Servers looks like a fairly effective strategy here.  Coupled with some custom components and a well thought out design, it should give us fairly predictable performance characteristics.  We could easily double our capacity right now by snagging two extra WFEs.  To scale by a factor of 10, we’d probably need an extra SQL Server and some minor changes to our strategy.


Key Takeaways


Our testing went well, and we were able to exceed the performance we were expecting.  Here are a few key takeaways from this exercise.



  1. Plan on hunting down some performance bottlenecks.  In a PPS deployment, they can be located on any tier, and it may take a little time to find and fix them.  We had to make adjustments to basically every technology in this instance.

  2. You’re going to want to tune your servers.  It took a little over a week for me to go through that exercise, but I had help, a lab, and experience with all the various components. Your mileage may vary.  We identified and corrected a number of significant performance issues… everything from tweaking IIS and SQL Server to ensuring the correct Cumulative Update was applied.  Make only small changes for each test iteration, and analyze carefully.  And document the changes you make.

  3. The components you use, and your design, will have significant ramifications for performance.  If your dashboards heavily use PAS views vs. Excel Services vs. Analytic Charts, your performance characteristics will change dramatically.  Carefully consider the ramifications of the different technologies, and look at where the work is being done in your configuration.

  4. Overall, scalability and performance were good, and everything was pretty much in line with what I saw in Performance Tuning and Capacity Planning for PerformancePoint Monitoring Server.  Keep in mind that your implementation will probably be somewhat different than the PPS Team’s test installation, so factor that into your planning.

Happy PerformancePointing…


David

MDX Filters in PPS Monitor

I had a question on how to implement MDX Filters in PerformancePoint Monitor a while back.  Nick Barclay provided an answer here.

Some samples Nick provided were:

   1: [Time].[Fiscal Year].Children

and

   1: DESCENDANTS(
   2:   [Geography].[Geography].[All Geographies]
   3:  ,[Geography].[Geography].[Country]
   4:  ,SELF_AND_BEFORE) 

Depending on how you have implemented your cube, you can also do some other clever things.  On a recent project, I had a Date dimension with a hierarchy that presents the current month as ‘Current Month’, so MDX such as the following:

   1: {[Dim Date].[Calendar].[Month Name].&[Current Month].Lag(6):[Dim Date].[Calendar].[Month Name].&[Current Month]}

could be used to provide the last 6 months contained in the cube as a drop down list:

   1: Nov 2007
   2: Dec 2007
   3: Jan 2008
   4: Feb 2008
   5: Mar 2008
   6: Current Month

Another useful possibility is to provide a filter based on a hierarchy where an individual member (in this case the ‘Unknown’ member) should not be displayed in the pick list:

   1: FILTER(
   2:     DESCENDANTS(
   3:          [Entity].[Entity]
   4:         ,[Entity].[Entity]
   5:         ,SELF_AND_BEFORE
   6:     )
   7:     ,NOT [Entity].[Entity].CURRENTMEMBER IS [Entity].[Entity].&[Unknown]
   8: )

Creating a Scorecard Transform in PerformancePoint Server Monitor


Overview

I’ve spent a couple of days doing some prototyping around PerformancePoint Monitor… there have been a few things that I’ve wanted to accomplish, so when the PerformancePoint Monitoring SDK was released, it seemed like the way to do it.  However, there aren’t a lot of resources around Scorecard Transforms out there, so I thought I’d document some of the things I found.  This documentation won’t be exhaustive, but it will hopefully point everyone in the right direction.  Keep in mind this is just prototype code… a number of enhancements should be made before putting this into production.

I’ve created a sample project (and a simple dashboard) to do the following:

  1. Perform a Magic Number transformation – I’ve had clients want a message to be returned to the user instead of a value… such as having conditional logic in a calculated member that returns percentages between 0 and 1, then -1 if no data exists, -2 if the KPI is invalid in the chosen scenario, etc.
  2. Display text on the scorecard that is database driven – I have a request to provide database driven, text metadata for KPIs on each scorecard.

The dashboard contains two scorecards (‘Sample Scorecard’, which the transforms are applied to, and ‘Sample Scorecard 2’, which the transforms are not applied to), with two KPIs.  You will need to publish the objects in the .bswx file (included in the Visual Studio solution) in order to see the results of the transforms.

You can download the sample project here.

Where to Start

The first thing to do is to create a DLL containing your transform.  I started with a C# class library.  Remember that your DLL will have to be strongly named.  The basic steps are to run SN.exe to create a key file (for example, sn -k MyTransforms.snk), then use the resulting .snk file to sign your project.

Once you’ve created your project, you’ll need to implement a class that implements the IGridViewTransform interface, and implement three methods in your class (see the skeleton after this list):

  1. public GridViewTransformType GetTransformType()
  2. public string GetId()
  3. public void Execute(GridViewData viewData, PropertyBag parameters, IGlobalCache cache)
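
Putting those pieces together, the skeleton of the class looks something like this.  This is a sketch based on my prototype; I’ve left out the using directives for the Monitoring SDK assemblies, so check the SDK documentation for the exact namespaces of IGridViewTransform and the related types:

// Add using directives for the PerformancePoint Monitoring SDK assemblies
// (Microsoft.PerformancePoint.Scorecards.Client / .Server); see the SDK docs
// for the namespaces of IGridViewTransform, GridViewData, PropertyBag,
// GridViewTransformType, and IGlobalCache.

namespace ScorecardTransformPrototypeLibrary
{
    public class TransformMagicNumberToText : IGridViewTransform
    {
        // When in the scorecard lifecycle the transform runs
        public GridViewTransformType GetTransformType()
        {
            return GridViewTransformType.PreRender;
        }

        // The name the transform is registered under
        public string GetId()
        {
            return "TransformMagicNumberToText";
        }

        // Called each time the scorecard view is built
        public void Execute(GridViewData viewData, PropertyBag parameters, IGlobalCache cache)
        {
            // The interesting work happens here... walked through piece by piece below
        }
    }
}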

The first method is used to determine when in the lifecycle of a scorecard the transform is run.  Your options are PreRender, PreQuery, PostQuery, and PerUser.  Here’s a sample:

   1: /// <summary>
   2: /// Returns the type of transform that will be applied
   3: /// </summary>
   4: /// <returns>Grid view transform type</returns>
   5: public GridViewTransformType GetTransformType()
   6: {
   7:     return GridViewTransformType.PreRender;
   8: }

The second method simply returns the name of the transform:

   1: /// <summary>
   2: /// Returns the name of the transform
   3: /// </summary>
   4: /// <returns>Name of transform</returns>
   5: public string GetId()
   6: {
   7:     return "TransformMagicNumberToText";
   8: }

The Execute method is what we’re really interested in… this is run every time a scorecard is displayed.  OK, that’s not quite true… every time a scorecard is run for an individual for a particular set of parameters, it gets cached for a while… keep that in mind when debugging.  An IISRESET will make sure the transform is run again.

   1: /// <summary>
   2: /// This method is called each time a Scorecard is rendered
   3: /// </summary>
   4: /// <param name="viewData">Scorecard view that the transform acts on</param>
   5: /// <param name="parameters">Parameters passed to the method</param>
   6: /// <param name="cache">Global cache of objects</param>
   7: public void Execute(GridViewData viewData, PropertyBag parameters, IGlobalCache cache)
   8: {
   9:  
  10: }

First, we’ll set up some variables that we’ll use later.  The idea is that some of these transforms should only be run on certain scorecards, columns, KPIs, etc.  I’m prototyping these for a publicly accessible dashboard… so I want to keep performance in mind.  The final version will probably utilize caching in order to increase performance.  Right now, I just threw in some basic functionality to make the transform a little more ‘selective’.

   1: // These will be read from a config file or a database in production
   2: // Use to limit the columns in a Scorecard that this transform will run on
   3: string columnNames = "|Magic Column|Another Column|";  
   4: // Use to limit what scorecards this transform will run on
   5: string scorecardNames = "|Sample Scorecard|Another Scorecard|";
   6: // Value to replace, and text to replace it with; this would probably really be a list
   7: decimal magicValue = Convert.ToDecimal(-1); 
   8: string replacementText = "Invalid Data";
   9:  
  10: // Columns we're interested in that are on the scorecard
  11: List<GridHeaderItem> columnsInScorecard = new List<GridHeaderItem>();

Next, we’re going to get the name of the scorecard, so we will only run the transform on scorecards we want.

   1: // Get the Scorecard that is currently being transformed
   2: Scorecard scorecard = cache.GetScorecard(viewData.ScorecardId);
   3: // Get the name of the scorecard
   4: string scorecardName = scorecard.Name.Text;
   5:  
   6: // Validate that this scorecard should have the transform applied
   7: // If not, short-circuit
   8: if (scorecardNames.IndexOf("|" + scorecardName + "|") < 0)
   9: {
  10:     return;
  11: }

Next, we’ll run through the list of GridHeaderItems on the Scorecard.  We’ll build a list of the GridHeaderItems whose display text matches the columns we want to be able to transform.

   1: // Get the row and column headers
   2: List<GridHeaderItem> rowHeaders = viewData.RootRowHeader.GetAllHeadersInTree();
   3: List<GridHeaderItem> columnHeaders = viewData.RootColumnHeader.GetAllHeadersInTree();
   4:  
   5: // Iterate through the Column headers to get a list
   6: // containing the columns we're interested in
   7: foreach (GridHeaderItem ghi in columnHeaders)
   8: {
   9:     // See if the column exists in our list
  10:     if (columnNames.IndexOf("|" + ghi.DisplayText + "|") >= 0)
  11:     {
  12:         // Add it to the list if so
  13:         columnsInScorecard.Add(ghi);
  14:     }
  15: }

Now we’re going to look through all the columns and rows on the scorecard.  We’ll look at each display element, see if it matches our criteria, then replace it if it does.  We accomplish changing a number to text by removing the initial Display Element and replacing it with a new one.

The code below gives us a basic transform that, for a particular scorecard and column, will replace specific numbers with text.

   1: // Iterate through the list of columns we're interested in
   2: foreach (GridHeaderItem columnHeader in columnsInScorecard)
   3: {
   4:    // Look at each row on the scorecard
   5:    foreach (GridHeaderItem rowHeader in rowHeaders)
   6:    {
   7:        // Variable to hold the index that we want to remove
   8:        // This may need to become a list, or another type of structure
   9:        int indexToRemove = -1;
  10:  
  11:        for (int i = 0; i < viewData.Cells[rowHeader, columnHeader].DisplayElements.Count; ++i)
  12:        {
  13:            GridDisplayElement gde = viewData.Cells[rowHeader, columnHeader].DisplayElements[i];
  14:            
  15:            // Look for number in the display properties
  16:            // In this case, look for the Value, not the formatted display
  17:            if (gde.DisplayElementType == DisplayElementTypes.Number)
  18:            {
  19:                // Find our magic value that we use as a key
  20:                // to know when to replace the value 
  21:                if (gde.Value == magicValue)
  22:                {
  23:                    indexToRemove = i;
  24:                }
  25:            }
  26:        }
  27:  
  28:        if (indexToRemove > -1)
  29:        {
  30:            viewData.Cells[rowHeader, columnHeader].DisplayElements.RemoveAt(indexToRemove);
  31:  
  32:            GridDisplayElement newGde = new GridDisplayElement();
  33:            newGde.DisplayElementType = DisplayElementTypes.Text;
  34:            newGde.Text = replacementText;
  35:  
  36:            viewData.Cells[rowHeader, columnHeader].DisplayElements.Add(newGde);
  37:        }
  38:    }
  39: }

 

Deploying the Transform

Alright, so we’ve written and compiled our first transform.  Now, all we need to do is actually install it.  There isn’t any documentation (at the time of writing this blog) in the PerformancePoint Monitoring SDK on how to install a Scorecard Transform… but there is documentation on how to Install Report Viewer Extensions.  I took a guess that the process would be pretty much the same (albeit using the <CustomViewTransforms> section in the config files instead of the <CustomReportViews> section), and it appears so.  You can just follow those steps with that minor change.

Here is an example of what your new <CustomViewTransforms> section will look like:

   1: ...
   2: <CustomViewTransforms>
   3:   <add key="ExpandNamedSets" value="Microsoft.PerformancePoint.Scorecards.GridViewTransforms.ExpandNamedSets, Microsoft.PerformancePoint.Scorecards.Server, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
   4:   <add key="RowsColumnsFilterTransform" value="Microsoft.PerformancePoint.Scorecards.GridViewTransforms.RowsColumnsFilterTransform, Microsoft.PerformancePoint.Scorecards.Server, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
   5:   <add key="AnnotationTransform" value="Microsoft.PerformancePoint.Scorecards.GridViewTransforms.AnnotationTransform, Microsoft.PerformancePoint.Scorecards.Server, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
   6:   <add key="UpdateDisplayText" value="Microsoft.PerformancePoint.Scorecards.GridViewTransforms.UpdateDisplayText, Microsoft.PerformancePoint.Scorecards.Client, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
   7:   <add key="ComputeRollups" value="Microsoft.PerformancePoint.Scorecards.GridViewTransforms.ComputeRollups, Microsoft.PerformancePoint.Scorecards.Client, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
   8:   <add key="ComputeAggregations" value="Microsoft.PerformancePoint.Scorecards.GridViewTransforms.ComputeAggregations, Microsoft.PerformancePoint.Scorecards.Client, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
   9:   <!-- add key="ApplyDefaultFormatInfo" value="Microsoft.PerformancePoint.Scorecards.Client.ApplyDefaultFormatInfo, Microsoft.PerformancePoint.Scorecards.Client, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"/-->
  10:   <!-- New Scorecard Transforms -->
  11:   <add key="TransformReplacePropertyText" value="ScorecardTransformPrototypeLibrary.TransformReplacePropertyText, ScorecardTransformPrototypeLibrary, Version=1.0.0.0, Culture=neutral, PublicKeyToken=dc084a0be77df9c1" />
  12:   <add key="TransformMagicNumberToText" value="ScorecardTransformPrototypeLibrary.TransformMagicNumberToText, ScorecardTransformPrototypeLibrary, Version=1.0.0.0, Culture=neutral, PublicKeyToken=dc084a0be77df9c1" />
  13:   <!-- End New Scorecard Transforms -->
  14: </CustomViewTransforms>
  15: ...

After modifying your three web.config files and GACing your assembly, your scorecard should be ready to run!

Debugging your Assembly

OK, so your brand new assembly is running… and not exactly doing what you’re expecting it to.  Or perhaps you just want to see what’s going on in all those objects that you’re using in your code.  Unless you’re already developing on your SharePoint box, you’ve now got a few hoops to jump through in order to be able to debug.  The configuration I use is to develop locally, then deploy to a server… so I’ll run through how to get that working.

First, you need to set up remote debugging on your server.

Next, you’ll need to put your debug symbols into the GAC with your assembly.

Then, you’ll need to restart IIS so your new assembly gets loaded (otherwise, you’ll probably get some funky results).

I went ahead and created a batch script to do this (included in the Visual Studio solution… I created a folder on the SharePoint server for this to live in)… your filenames and paths will be different, but this is the basic concept:

   1: @ECHO OFF
   2: REM: This batch file will pull the assembly and debug symbols
   3: REM: from the development machine, GAC the assembly, and 
   4: REM: put the debug symbols into the GAC folder.
   5:  
   6: REM: Copy the scorecard files to this server
   7: COPY "\\DDARDEN\C$\Users\ddarden\Documents\Visual Studio 2005\Projects\ScorecardTransformPrototype\ScorecardTransformPrototypeLibrary\bin\Debug\Score*.*" .
   8: ECHO Copied the scorecard files to the server
   9:  
  10: REM: Add the DLL to the GAC
  11: "C:\Program Files\Microsoft Visual Studio 8\SDK\v2.0\Bin\gacutil.exe" /i ScorecardTransformPrototypeLibrary.dll
  12: ECHO GAC'd the DLL
  13:  
  14: REM: Copy the debug symbols to the GAC directory to enable remote debugging
  15: COPY ScorecardTransformPrototypeLibrary.pdb C:\WINDOWS\assembly\GAC_MSIL\ScorecardTransformPrototypeLibrary\1.0.0.0__dc084a0be77df9c1
  16: ECHO Loaded the symbols
  17:  
  18: REM: Recycle IIS to reload the assembly
  19: IISRESET
  20: ECHO Reset IIS

Now you’re going to need to attach to the correct process on the remote machine.  You want the w3wp process… there may be a few, so you might have to experiment a little to find the right one (in my configuration, I want the one running under the network service – NOT the SharePoint domain service account).  When you attach to a process, look at the output to see if your assembly got loaded with symbols.  The process you’re looking for will have the other PerformancePoint Server Monitor assemblies loaded, such as:

   1: 'w3wp.exe' (Managed): Loaded 'C:\WINDOWS\assembly\GAC_MSIL\Microsoft.PerformancePoint.Scorecards.WebParts\3.0.0.0__31bf3856ad364e35\Microsoft.PerformancePoint.Scorecards.WebParts.dll', Skipped loading symbols. Module is optimized and the debugger option 'Just My Code' is enabled.
   2: 'w3wp.exe' (Managed): Loaded 'C:\WINDOWS\assembly\GAC_MSIL\Microsoft.PerformancePoint.Scorecards.Client\3.0.0.0__31bf3856ad364e35\Microsoft.PerformancePoint.Scorecards.Client.dll', Skipped loading symbols. Module is optimized and the debugger option 'Just My Code' is enabled.
   3: 'w3wp.exe' (Managed): Loaded 'C:\WINDOWS\assembly\GAC_MSIL\Microsoft.PerformancePoint.Scorecards.Server\3.0.0.0__31bf3856ad364e35\Microsoft.PerformancePoint.Scorecards.Server.dll', Skipped loading symbols. Module is optimized and the debugger option 'Just My Code' is enabled.
   4: 'w3wp.exe' (Managed): Loaded 'C:\WINDOWS\assembly\GAC_MSIL\Microsoft.PerformancePoint.Scorecards.DataSourceProviders.Standard\3.0.0.0__31bf3856ad364e35\Microsoft.PerformancePoint.Scorecards.DataSourceProviders.Standard.dll', Skipped loading symbols. Module is optimized and the debugger option 'Just My Code' is enabled.
   5: 'w3wp.exe' (Managed): Loaded 'C:\WINDOWS\assembly\GAC_MSIL\Microsoft.PerformancePoint.Scorecards.Client.resources\3.0.0.0_en_31bf3856ad364e35\Microsoft.PerformancePoint.Scorecards.Client.resources.dll', Skipped loading symbols. Module is optimized and the debugger option 'Just My Code' is enabled.
   6: 'w3wp.exe' (Managed): Loaded 'C:\WINDOWS\assembly\GAC_MSIL\Microsoft.PerformancePoint.Scorecards.WebControls\3.0.0.0__31bf3856ad364e35\Microsoft.PerformancePoint.Scorecards.WebControls.dll', Skipped loading symbols. Module is optimized and the debugger option 'Just My Code' is enabled.
   7: 'w3wp.exe' (Managed): Loaded 'C:\WINDOWS\assembly\GAC_MSIL\Microsoft.PerformancePoint.Scorecards.Script\3.0.0.0__31bf3856ad364e35\Microsoft.PerformancePoint.Scorecards.Script.dll', Skipped loading symbols. Module is optimized and the debugger option 'Just My Code' is enabled.
   8: 'w3wp.exe' (Managed): Loaded 'C:\WINDOWS\assembly\GAC_MSIL\Microsoft.PerformancePoint.Scorecards.Common\3.0.0.0__31bf3856ad364e35\Microsoft.PerformancePoint.Scorecards.Common.dll', Skipped loading symbols. Module is optimized and the debugger option 'Just My Code' is enabled.
   9: 'w3wp.exe' (Managed): Loaded 'C:\WINDOWS\assembly\GAC_MSIL\Microsoft.PerformancePoint.Scorecards.WebParts.resources\3.0.0.0_en_31bf3856ad364e35\Microsoft.PerformancePoint.Scorecards.WebParts.resources.dll', Skipped loading symbols. Module is optimized and the debugger option 'Just My Code' is enabled.
  10: 'w3wp.exe' (Managed): Loaded 'C:\WINDOWS\assembly\GAC_MSIL\Microsoft.PerformancePoint.Scorecards.Script.resources\3.0.0.0_en_31bf3856ad364e35\Microsoft.PerformancePoint.Scorecards.Script.resources.dll', Skipped loading symbols. Module is optimized and the debugger option 'Just My Code' is enabled.
  11: 'w3wp.exe' (Managed): Loaded 'C:\WINDOWS\assembly\GAC_MSIL\Microsoft.PerformancePoint.Scorecards.WebControls.resources\3.0.0.0_en_31bf3856ad364e35\Microsoft.PerformancePoint.Scorecards.WebControls.resources.dll', Skipped loading symbols. Module is optimized and the debugger option 'Just My Code' is enabled.

If you see them being loaded, but you don’t see yours, then you probably messed up registering your assembly.

Now, you should be able to set break points, and debug your assembly!

Conclusion

OK… those are the steps to create a new Scorecard Transform, deploy it to the server, and to attach to the process and debug it.  Happy Monitoring!