Archive for the ‘Performance Point Services’ Category

Dynamic Rows, Ordering by Score and multiple use of a filter Member UniqueName


A recent requirement that I encountered on a customer site was that the rows on a PerformancePoint scorecard be ordered by traffic light status: all rows with a red status appear at the top of the scorecard, followed by the amber coloured members, and then the green lights. Now, this is very easy to do once the scorecard is displayed, by clicking on filter mode and sorting by a column, but that approach has a couple of issues. 1) You have to do it every time, as you can’t set the default sort order. 2) It assumes that you have a column that corresponds to your score value.


In this blog I touch upon a few areas:

1) How to create dynamic rows in a scorecard. When you design a scorecard, the rows are static; there is a technique that can be applied to make them dynamic.

2) How to use a Filter Link “Member UniqueName” source value twice in a single scorecard, once for filtering the KPI values and once for the dynamic creation of rows.

3) How to extend the dynamic rows concept to include ordering.


As has been blogged about elsewhere, the designed behaviour of dimension members on a PerformancePoint scorecard is that once you put a dimension member on a row, it becomes a static row member, and new members are not added (or removed) based upon any filter you select. However, as indicated here, there is a way to solve this. I used this technique to list members on my PerformancePoint rows; a small extension to this same technique allowed me to order by the indicator status as well. I have used Adventure Works as a source for my new dashboard. The basic dashboard looks like this:



I have my indicator set on the Quota column, have selected Band by Stated Score, and supplied the MDX for the score. See the MDX sample below.


([Measures].[Reseller Sales Amount] - [Measures].[Sales Amount Quota])
/
[Measures].[Sales Amount Quota]


When creating the scorecard, I placed my KPI onto columns and then dragged the [Employees].[Employees].[Employees] attribute onto rows. This is just so I can render a scorecard. Next I create a dashboard, and on this dashboard I create a filter based upon the Calendar Year. The next step is to drag both the scorecard and the filter onto the dashboard page.



Now the fun starts to happen. You will notice that I have the Select Year filter on the Employee Performance scorecard twice. This is because I want to do two things with the filter. 1) I want it to actually work as a filter for my KPI so that we can see an employee’s performance by year. 2) I want to re-use that filter in an MDX statement that will determine which rows to display on the scorecard and how to order them. The first filter is straightforward: I connect the Top Row Filter to the Filters Dashboard Item Endpoint and use the Member UniqueName as the Source Value.


Now onto the second requirement, which is to generate the rows of the scorecard based upon the filter. I create a new filter link, and this time I connect the Dashboard Item Endpoint to the DisplayValue. I then configure the Filter Link Formula as shown below. My MDX formula consists of an Exists function wrapped inside an Order function. The Exists function gives me the set of employees that matches the selected year in the filter; the Order function then sorts this list according to a value. The value that I am supplying is the same formula that I use to calculate the score. The sharp-eyed observer will spot a problem here: I am using SourceValue when the Dashboard Item Endpoint refers to DisplayValue. Shouldn’t I refer to DisplayValue instead? I was hoping to be able to do this, and everything appeared to work great, except the sort: the employees came out in the same order every time. The syntax to work with the display value is StrToMember('[Date].[Calendar Year].[DisplayValue]'), but this did me no good. Instead, I need to use the SourceValue and the corresponding Member UniqueName a second time. Unfortunately the PPS scorecard builder does not allow you to do this; to get around it you have to edit the .bswx XML file. More on this later on.




My MDX used above:

Order(
  Exists(
    [Employees].[Employees].[Employees].Members,
    SourceValue,
    "Sales Targets"
  ),
  (([Measures].[Reseller Sales Amount], SourceValue)
   - ([Measures].[Sales Amount Quota], SourceValue))
  / ([Measures].[Sales Amount Quota], SourceValue),
  BASC
)






I put in the Filter Link Formula that refers to SourceValue and leave the Source Value entry in the Filter Link Editor set to Display Value.

I now publish and have a look at my new scorecard.






The scorecard displays, but you can see that the sort order is not working. (I have put a BudVar column onto the scorecard to make it very obvious what value the score is being generated from.)

At this stage I have to close my scorecard in the scorecard builder app. Then I browse to the Windows folder where I saved the .bswx file, take a copy, and edit the file in Notepad. I found the fastest way to find the correct section in this file was to search for order(exists(, which is the first bit of MDX of my Filter Link Formula.



The first bit of the highlighted section reads SourceColumnName=”DisplayValue”. Change this to SourceColumnName=”MemberUniqueName”. Save the file, open it again in Dashboard Designer, and publish again. This time your scorecard should run as you want.




Finally! Now, this solution is not ideal, as you have to hand-edit a .bswx file, which raises all sorts of questions about future maintenance of the change. It is annoying that the system supports this functionality but you can’t edit it directly. I would advise caution to anyone who needs this: is it really necessary in the solution? If it is, document the steps and put a procedure in place so that future edits to the .bswx file don’t lose this change.


A General Caching Overview for Performance Point Services for SharePoint 2010

The purpose of this article is to briefly describe the caching structure of Microsoft PerformancePoint Services (PPS) and to help give a better understanding of that structure and its function. We’ll discuss what information gets cached, where caching takes place, and runtime considerations that affect operations.

PPS’s main functions revolve around BI dashboards: collecting and assembling data from various sources during the creation and servicing of those dashboards. In many real-world scenarios, dashboards are created and updated infrequently but viewed often. This lends itself to caching being a useful and important part of PPS. Let’s explore.

So what information gets cached?

Here is a quick list.

  • Query results:   Recently executed queries to the data sources.
  • Rendered views:   HTML pages and graphics.
  • Filter data:   Choices available on dashboard filters.
  • Query metadata:   The information about dimensions, levels, hierarchies, and members, which is required to support navigation and rendering.
  • Application objects:   The definitions of views, dashboards, data sources, and other application data.

In general, it’s what one might expect. Basically information on the Dashboard, Dashboard objects, and the data collected that is needed to populate the Dashboard.

PPS Caching Structure

PPS (PerformancePoint Services for SharePoint) comprises web parts, web services, and a SharePoint service application. It is commonly a multiple-server configuration, with various servers assigned to varying tasks. As such, this complicates the caching model and often makes it difficult to get a good understanding of the whole PPS caching model. Let’s look at a diagram.

Here is a basic caching structure overview of a two server configuration.

(1 Application Server / 1 Web Front End (WFE))


This diagram breaks the caching down into three areas. There is nothing fundamental about this grouping, but it does help us here as conversation points. Here is some more detail.

Data: In general, this is the data gathered and queried to populate a dashboard. Data caching can take place on the raw query results themselves, but often covers further-processed data sets as well; a good example of this would be scorecard grid data. Caching takes place for many different data sources, not just Analysis Services data. Because all data collection is done on the Application Server, this is where this caching takes place. We can adjust the duration of this caching using the Cache Lifetime and Refresh Interval settings on PPS data sources.

Application Objects: This caching area contains PPS objects such as KPIs, scorecards, data sources, indicators, etc., which are stored in SharePoint lists and document libraries and which are needed to create dashboards. Caching for these objects exists on both the Application Server and the WFE, but there is only one setting for both. Cache settings can be adjusted through PowerShell cmdlets, but this is normally not needed. The Application Server and the WFE manage their own caches separately, and each cache entry gets set during first object retrieval, often referred to as lazy loading.

Rendered Views: This cache consists of the rendered HTML, JavaScript, and graphic images generated by PPS. This is the information that the web browser uses to create the dashboard. One point I’d like to make here is not to confuse this with any caching in IIS or the browser; this rendered view cache is completely within PPS and is in addition to any IIS and browser static HTML caching. All of these caches affect the system as a whole, but for the purpose of this article, we’ll concentrate on the PPS side of things.

For all but the rendered view cache, the cache is set on first use. The rendered view cache is different in that it is set after a certain number of requests. The idea is that PPS doesn’t want to cache every single view, as this would be very space intensive, but rather cache HTML that is requested often. The default setting is 2 requests before the rendered cache is set. This means that the HTML for a view is cached on the second request and used from the third request onward.

The rendered view cache on either the Application Server or the WFE is always checked for validity through the Application Server. Because of this behavior, a cached render on the WFE should always have a matching cache entry on an Application Server as well. The only exception is if the rendered view cache is disabled for the Application Server, but this is not common.

Setting this cache is normally not needed, but it can be done through PowerShell cmdlets. This includes an option for disabling the cache for the WFE, the Application Server, or both. More on setting the cache later.

Other notes/thoughts:

  • Another thing to note is that caches are not shared across servers; each server contains and services its own cache. This can have an effect on multiple-server deployments of PPS. Specifically, changes to Excel Services, SharePoint list, and SQL data sources do not invalidate cache entries when the underlying data changes, so cached data may vary between Application Servers; the data source’s refresh interval will have to expire to resynchronize everything. Remember, this is only true for these data sources: changes to Analysis Services data sources will be detected by the changed timestamp and the current cache entries will be invalidated (e.g., after reprocessing an AS cube).
  • Excel data source data is stored internally in PerformancePoint Services and is not cached.

Checking Cache Validity

Deciding whether a cache entry is valid or missed is done internally through the use of cache keys. These are set up to incorporate all the data needed to determine the uniqueness of the cache entries for a given request or use.

Here is a quick list of some of the information used to determine whether a PPS cache entry is valid.

  • User identity and privileges
  • Data Source timestamp. (AS data source only)
  • PPS Data Source Cache Lifetime or Refresh Interval expiration
  • Application Object version
  • Dashboard Filter change
  • User locale setting

Some important points:

  • As stated before, data and schema updates will create cache misses; for example, when an Analysis Services cube gets reprocessed. This is nice behavior, as it means we don’t have to use the data source timeout as a way of monitoring for changing data.
  • Even though the rendered view cache is stored on the WFE, it is still affected by data source changes on the Application Server. More on this magic later.
  • Analysis Services data sources can utilize user roles from the Analysis Services database for determining privilege-based effects on cache validity. This is important, as it may allow different users with the same Analysis Services database role to share cache entries. Thus, the database role settings can have a big effect on caching performance if user-based security is used. If the security model uses the Unattended Service Account, the user and credentials accessing the database are always the same, so this isn’t as important.

Note on Caching Effectiveness

A very important thing to keep in mind is that user privileges and credentials affect caching a great deal. The choice of data source authentication can have a much bigger effect on PPS caching performance than adjusting the cache settings. In general, creating PPS data sources configured to use the Unattended Service Account will allow for much more effective caching, because data source authentication will always come from the same user. When data sources are configured as per-user, the credentials of the user requesting the dashboard may be used; each user may receive different results from the data sources, so the cache system may not be able to share results between users and is much less effective. More on this in a later article, but it is such an important point that it needed to be made here.


Configuring cache

As a general rule of thumb, all PPS caching parameters default to useful values and should rarely, if ever, need to be changed. As such, there are only two places in which cache timeouts can be set, and the options are rather limited.

Note: Once again, please keep in mind the statement in the section above. Cache timeout settings will likely not affect PPS performance as much as other factors, including system configuration and user environment.

Data cache settings:

The data cache timeout is the one setting that a dashboard author might interact with. It is configured when creating or editing a PPS data source in Dashboard Designer. This setting affects most areas of data caching, including query results, filter data, data source metadata, and rendered views; in practice these are separate caches, but they are all set from this value. It defaults to 10 minutes, which is generally considered a good value. Setting it to 0 has the effect of disabling data-level caching, but this is not recommended.

As of the time of this writing, changes to values in SharePoint list, Excel Services, and SQL tabular data sources do not invalidate the data cache. Changed values will only be picked up by PerformancePoint after the data cache expires, hence the name Refresh Interval. This is an important thing to remember if one is using one of these data sources.

As mentioned earlier, the data cache is invalidated when an Analysis Services data source changes. Because of this, it might be useful to set the Cache Lifetime to a greater value for Analysis Services data sources than for the other data source types. The trade-off is greater memory usage on the Application Server. This value defaults to 10 minutes, but an administrator should feel free to change it for performance reasons; just remember to track memory usage.

Here is a screen shot of PPS Dashboard Designer where the data caching is set.




Rendered View cache setting:

The Application Object and HTML rendered view caches are set through PowerShell cmdlets. The rendered view cache value takes the form of the number of hits before caching begins; currently this defaults to two, which is considered a good value. Through this mechanism, the cache can also be completely disabled. A user or administrator should rarely, if ever, change either of these settings, as the effect on performance can be large. For instructions on setting them, please contact Microsoft customer service for more information and guidelines.

Application Object cache setting:

The Application Object cache is also set through PowerShell cmdlets, in this case in the form of a timeout, and defaults to 15 seconds. As with the rendered view cache, this value should rarely, if ever, be changed, due to its large effect on performance and perceived system reliability. For instructions on setting it, please contact Microsoft customer service for more information and guidelines.

Next Article

This completes the very brief whirlwind tour of PPS caching. We have briefly talked about a few runtime considerations that affect caching. In the next article we will delve further into user security and other factors that contribute to the caching performance. We will also take a look at how we might monitor caching runtime performance and take a closer look at how rendered view cache is implemented.


  • How Security affects cache. User Vs. Application
  • User persona, Observer Vs. Explorer
  • Effectiveness in authoring Dashboards.
  • SQL roles, how they play a part

Monitoring Cache performance

  • Windows performance counters.

V1 vs. V2

  • Usage of SQL roles
  • User security per data source vs. Application security.

Further Description of Render View Cache

  • How it’s implemented. (Always checked through the App. Server etc.)

Frank Geczi II   SDET, SharePoint BI


PerformancePoint – Using SSAS Actions


The latest project I worked on had a heavy dose of PerformancePoint, and one of the requirements was to launch Analysis Services actions from PerformancePoint dashboard objects. I knew this was possible but did not know all the limitations that exist when you want to do this. Most of the available posts reference PerformancePoint 2007, so I thought it was time to update the topic.

I mentioned there are a lot of caveats to using actions in PerformancePoint, so I came up with this quick flowchart to first see if you can even consider them.

Yes, at first it was confusing enough to me that I needed this!


So, assuming you meet the conditions in the diagram above, let’s show you how to enable action usage.

First we’ll look at the scorecard, which can only use Drillthrough actions from Analysis Services. Drillthroughs are simple enough to create, so I won’t focus on that, but there are a couple of things you’ll need to make sure are set here: the Measure Group Members property cannot be set to <All> for PerformancePoint to recognize the action, and you need to change the Default property to true. The Default property is a little hidden because it is in the Additional Properties section at the bottom of the action.


Once the Drillthrough is set up properly in SSAS, you must build your PerformancePoint KPI so it will pick up the action. This means that in Dashboard Designer, the Calculation property on each KPI field from which you want users to be able to launch the action must be set to Data value. This is a change in the property value name, which was called Source data in PPS 2007.


When you save the KPI, your scorecard should automatically be updated. You can now right-click any field you changed and select Show Details, which will launch your SSAS action.


Analytic grids and charts do not require this KPI change. They can use URL, Drillthrough, and Reporting Services actions: simply right-click the measure you want details on, go to Additional Actions, and select the action name.


So, just to summarize, if you’re still having problems using actions in PerformancePoint, here are your requirements:

  • You can only use actions on regular measures (not on a dimension attribute or a calculation).
  • Scorecards can only use Drillthrough actions, and the action you create must be the default.
  • Analytic Grids and Charts can use URL, Drillthrough, and SSRS actions; the Drillthrough action does not need to be the default here.
  • The SSAS action cannot be a part of the <All> Measure Group Member.
  • If using an unattended service account for PPS, make sure it has Read Definition permission on the cube.
  • Make sure you are in the Visitors permission group on the SharePoint site.

Apply Custom Scorecard Transforms to a PerformancePoint Dashboard

By using custom extensions, you can extend the capabilities of PerformancePoint Services in Microsoft SharePoint Server 2010. This scenario shows you how to apply a scorecard transform that changes the way a scorecard is displayed without actually changing the scorecard itself. Before applying the transform, a scorecard might resemble the top chart in the image on the right.

After applying the scorecard transform, the scorecard might resemble the bottom chart in the image on the right.

The scorecard transform that is described in this scenario causes a scorecard to be displayed in a more compact format, which makes it easier for you to fit more items on a dashboard page.

To get started, see Apply custom scorecard transforms to a PerformancePoint dashboard




Cascading Filters – Sp1 Feature


I downloaded and installed SharePoint 2010 SP1 so I could check out the new additions to PerformancePoint.  Typically you would only see hotfixes included in a service pack, but in this one we get a few new features, like cascading filters and a new Apply Filters button (with an option to save default filter selections).  Let’s take a quick tour of these new features to see how you use them and what they do to your dashboard pages.

Cascading Filters

This is not a new concept, but for PerformancePoint it is.  In Reporting Services you have always had the ability to set up parameters so that the selection in one parameter list filters the available values in another parameter list.  Well, now this has been added to PerformancePoint, and it is available with the multidimensional filter types – Member Selection, MDX Query, and Named Set.  When you go to create a new filter of one of these types, you will see a new setting: a measure (metric) that will be used, in combination with the values passed from another filter, to query for the list of available values this filter displays.

Member Selection


The new selection is the ‘Filter measure:’ option and the informational dialog box states the following:

Select the measure used to determine which values to display when this filter is driven by another filter.

This is the measure (metric) that will be used, in combination with the filter member values passed to this filter, to display the list of available values for the end user to select from. So if I had a filter for Product Category and passed that to another filter for Product Subcategory, and the Product Subcategory filter was configured with the ‘Sales Amount’ measure, then the Product Subcategory filter would display the list of items that had ‘Sales Amount’ for the selected Product Category items.  A tad bit confusing perhaps, but this is how it works.
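Conceptually, the driven filter ends up showing something like the result of the query below. This is only a sketch to illustrate the behaviour, using Adventure Works-style hierarchy names and an example selected category; it is not the exact statement PPS generates.

```mdx
-- Sketch: subcategories that have Sales Amount
-- for the category selected in the driving filter.
NonEmpty(
  [Product].[Subcategory].[Subcategory].Members,
  { [Product].[Category].&[4] } * { [Measures].[Sales Amount] }
)
```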

MDX Query


Named Set


This option is not available with the other filter types, just the ones displayed above – Member Selection, MDX Query, and the Named Set.

Ok, so now that you have a tour of that new option lets setup a dashboard with a couple of filters and a report.

Create Filters

Member Selection – Product Category


Member Selection – Product Subcategory


Create Analytical Grid Report


In this report I used the Product hierarchy and chose only the Product Name descendants of All, picked the Calendar Year hierarchy, and placed the Sales Amount measure in the background. I also used the filter option to remove blank rows and columns.
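The row set of that report corresponds roughly to the expression below. This is a sketch; the exact hierarchy and level names depend on your cube (these follow the Adventure Works Product Categories hierarchy).

```mdx
-- Product-name-level descendants of the All member
Descendants(
  [Product].[Product Categories].[All Products],
  [Product].[Product Categories].[Product]
)
```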

Create Dashboard and Connect the Items


For the connections I connected the two filters together and then connected the Product Subcategory to the Product Sales report.

Connection to the Product Subcategory filter

image  image

Connection to the Product Sales Analytical Grid Report – uses a connection formula as well

image  image  image

In this example I am leveraging a Connection Formula.  The reason I am doing this is because of the hierarchies involved in this example: I am not referencing the same hierarchy in each item, and I want to display the product names in the report instead of the subcategory values.  So I am taking the display name from the subcategory filter and using it in a formula to return the children (product names) in the report.
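The connection formula looks something like this. Treat it as a sketch: <<SourceValue>> is the token PerformancePoint substitutes with the value passed from the filter, and the hierarchy names are assumptions based on Adventure Works.

```mdx
-- Build a member from the display value passed by the
-- subcategory filter, then return its product-name children.
StrToMember("[Product].[Product Categories].[Subcategory].[<<SourceValue>>]").Children
```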

Deployed Dashboard in SharePoint


You can see that the subcategory filter is filtered by the category filter, and only the ‘TV and Video’ subcategory members are being listed.  The subcategory filter selection is also filtering the report, which is displaying all of the product names associated with the ‘Television’ subcategory.

If we make another selection in the category list we will see everything get updated again.


Pretty slick.

Ok, now on to the other new feature that was added in Service Pack 1 – the Apply Filters button.

Apply Filters Button

When you set up a dashboard now, you will see a new selection in the Details pane, in the Filters section, called ‘Apply Filters Button’.


So what is this for?  Hmmm, is this something similar to Reporting Services perhaps?  Answer – Yes, with an added bonus.


If you drag and drop this onto the dashboard page and go into the edit settings for this new item, you will get some options you can configure.  The first one is the text that you would like displayed on the button on the dashboard page.  The next one is whether or not you would like to provide a checkbox for end users to be able to save their selections for this dashboard page – these will be stored as their default values for these filters.  This means that when they come back to this page at a later time, these filter selections will automatically be selected for them.  In the past, the last selection of items from the filters was always saved and stored for the users, but now they have the control to determine which values get saved (if you want them to – it is optional).


The other thing about this new feature is that when you make selections in the filters, the items on the dashboard (with the exception of linked filters) will not be filtered.  In order to get the other items to filter, you need to click the button.  Once you do this, your scorecards and reports will refresh and display the data based on the selections in the filters (assuming they are connected, of course).

I was hoping that this feature might somehow allow you to retain your default member selection settings from the initial filter setup, but that does not appear to be the case.  The application still retains the last selection by the user unless you provide them the ability to save their own defaults with the new ‘Apply Filters Button’ option.

Anyway, these are just a couple of the new features, along with hotfixes, that are available in Service Pack 1.

Check out more information here:


By the way, after I upgraded to SP1 the build version of Dashboard Designer was the following:



Prior to the upgrade it was:



I had installed a hotfix prior to doing the service pack 1 install.  I wanted to check out some other fixes before this release – PerformancePoint Services 2010 (PPS) Hotfixes.