Archive for the ‘Reporting Services’ Category

Enabling Kerberos Authentication for Reporting Services/SSAS


Nice Article from Rob_Carrol at

By default, Reporting Services uses Windows Integrated Authentication, which includes the Kerberos and NTLM protocols for network authentication. Additionally, Windows Integrated Authentication includes the Negotiate security header, which prompts the client to select Kerberos or NTLM for authentication. A client with the appropriate permissions can access reports by using Kerberos for authentication. Servers that use Kerberos authentication can impersonate those clients and use their security context to access network resources.

You can configure Reporting Services to use both Kerberos and NTLM authentication; however, this may lead to authentication failures. With Negotiate, if Kerberos cannot be used, the authentication method falls back to NTLM. When Negotiate is enabled, the Kerberos protocol is always used except when:

  • Clients/servers that are involved in the authentication process cannot use Kerberos.
  • The client does not provide the information necessary to use Kerberos.

An in-depth discussion of Kerberos authentication is beyond the scope of this post; however, when users execute reports that are configured to use Windows Integrated Authentication, their logon credentials are passed from the report server to the server hosting the data source. Delegation needs to be set on the report server and Service Principal Names (SPNs) set for the relevant services. When a user processes a report, the request must go through a Web server on its way to a database server for processing. Kerberos authentication enables the Web server to request a service ticket from the domain controller, impersonate the client when passing the request to the database server, and then restrict the request based on the user’s permissions. Each time a server is required to pass the request to another server, the same process must be used.

Kerberos authentication is supported in both native and SharePoint integrated mode, but I’ll focus on native mode for the purpose of this post (I’ll explain configuring SharePoint integrated mode and Kerberos authentication in a future post). Configuring Kerberos avoids the authentication failures due to double-hop issues. These double-hop errors occur when a user’s Windows domain credentials can’t be passed to another server to complete the user’s request. In the case of my customers, users were executing Reporting Services reports that were configured to query Analysis Services cubes on a separate machine using Windows Integrated security. The double-hop issue occurs because NTLM credentials are valid for only one network hop; subsequent hops result in anonymous authentication.


The client attempts to connect to the report server by making a request from a browser (or some other application), and the connection process begins with authentication. With NTLM authentication, client credentials are presented to Computer 2. However, Computer 2 can’t use the same credentials to access Computer 3 (so we get the Anonymous login error). To access Computer 3 it is necessary to configure the connection string with stored credentials, which is what a number of customers I have worked with have done to work around the double-hop authentication error.

However, to get the benefits of Windows Integrated security, a better solution is to enable Kerberos authentication. Again, the connection process begins with authentication. With Kerberos authentication, the client and the server must demonstrate to one another that they are genuine, at which point authentication is successful and a secure client/server session is established.


In the illustration above, the tiers represent the following:

  • Client tier (computer 1): The client computer from which an application makes a request.
  • Middle tier (computer 2): The Web server or farm where the client’s request is directed. Both the SharePoint and Reporting Services server(s) comprise the middle tier (but we’re only concentrating on native deployments just now).
  • Back end tier (computer 3): The Database/Analysis Services server/Cluster where the requested data is stored.

In order to enable Kerberos authentication for Reporting Services it’s necessary to configure the relevant SPNs, configure trust for delegation for server accounts, configure Kerberos with full delegation and configure the authentication types for Reporting Services. These steps are outlined in greater detail in the “Manage Kerberos Authentication Issues in a Reporting Services Environment” whitepaper in the resources section at the end of this article.

Service Principal Names (SPNs) are unique identifiers for services and identify the account’s type of service. If an SPN is not configured for a service, a client account will be unable to authenticate to the servers using Kerberos. You need to be a domain administrator to add an SPN, which can be added using the SetSPN utility. For Reporting Services in native mode, the following SPNs need to be registered:

–SQL Server service:

    SETSPN -S MSSQLSvc/servername:1433 Domain\SQL

For named instances, or if the default instance is running under a different port, then the specific port number should be used.
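For example, a named instance listening on port 2433 might be registered like this (hypothetical host name, port and account; adjust to your environment):

```
SETSPN -S MSSQLSvc/servername.domain.local:2433 Domain\SQL
```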

–Reporting Services service:

    SETSPN -S http/servername Domain\SSRS
    SETSPN -S http/ Domain\SSRS

The SPN should be set for the NETBIOS name of the server and the FQDN. If you access the reports using a host header or DNS alias, then that should also be registered:

    SETSPN -S http/ Domain\SSRS

–Analysis Services service:

    SETSPN -S MSOLAPSvc.3/servername Domain\SSAS
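Once registered, the SPNs can be verified by listing them for each service account (assuming the account names above):

```
SETSPN -L Domain\SQL
SETSPN -L Domain\SSRS
SETSPN -L Domain\SSAS
```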

Next, you need to configure trust for delegation, which refers to enabling a computer to impersonate an authenticated user to services on another computer:

Client:

1. The requesting application must support the Kerberos authentication protocol.

2. The user account making the request must be configured on the domain controller. Confirm that the following option is not selected: Account is sensitive and cannot be delegated.

Servers:

1. The service accounts must be trusted for delegation on the domain controller.

2. The service accounts must have SPNs registered on the domain controller. If the service account is a domain user account, the domain administrator must register the SPNs.

In Active Directory Users and Computers, verify that the domain user accounts used to access reports have been configured for delegation (the ‘Account is sensitive and cannot be delegated’ option should not be selected):


We then need to configure the Reporting Services service account and computer to use Kerberos with full delegation:




We also need to do the same for the SQL Server or Analysis Services service accounts and computers (depending on what type of data source you are connecting to in your reports).

Finally, and this is the part that sometimes gets overlooked, we need to configure the authentication type correctly for Reporting Services to use Kerberos authentication. This is configured in the Authentication section of the RSReportServer.config file on the report server.
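As a sketch of what this section typically looks like once Kerberos is enabled (RSWindowsNegotiate lets the client negotiate Kerberos, falling back to RSWindowsNTLM when Kerberos can’t be used):

```
<Authentication>
    <AuthenticationTypes>
        <RSWindowsNegotiate/>
        <RSWindowsNTLM/>
    </AuthenticationTypes>
    <EnableAuthPersistence>true</EnableAuthPersistence>
</Authentication>
```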







This will enable Kerberos authentication for Internet Explorer. For other browsers, see the link below. The report server instance must be restarted for these changes to take effect.

Once these changes have been made, all that’s left to do is test to make sure Kerberos authentication is working properly by running a report from report manager that is configured to use Windows Integrated authentication (either connecting to Analysis Services or SQL Server back-end).


Manage Kerberos Authentication Issues in a Reporting Services Environment

Configuring Kerberos Authentication for Microsoft SharePoint 2010 Products

How to: Configure Windows Authentication in Reporting Services

RSReportServer Configuration File

Planning for Browser Support


Connecting SSRS Report to Bing Maps

Hi there,

Today I read an interesting article talking about drill-through actions in order to connect and map your report data to external sites such as Bing or Google Maps.

Here is the link :
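As a rough sketch of the idea (field names here are hypothetical; the linked article has the details), a textbox’s Go to URL action expression could build a Bing Maps link from the report data:

```
="https://www.bing.com/maps/?where1=" & Fields!City.Value & ", " & Fields!Country.Value
```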

Categories: Geographic Mapping

RsScripter – Nice tool to Help you Migrate Reporting Services

PingBack from:

If you are using SSRS then chances are high that you will come across a scenario where you want to deploy reports developed on a development machine to the production server. There are various ways to do this, and one of them is to use the RS Scripter tool. Here are the steps to do this –

1. Creating Report Deployment Script

  1. Download the Reporting Services Scripter –
  2. Update the servers.xml; change the reportservice value if your ReportServer name is different.

    <server label="SQL2000" reportservice="http://localhost/ReportServer/ReportService.asmx" />

    <server label="SQL2005" reportservice="http://localhost/ReportServer/ReportService2005.asmx" />

  3. Run the RSScripter.exe.
  4. Click on Options, and select the options shown in the images below (for more information about the scripting options, check the readme file).



  5. Change the SQL 2005 RS EXE Location in the Global tab as per the location of SQL Server and click Apply.
  6. Select SQL 2005 from the Report Server dropdown and click on Get Catalog; all the reports, data sources, schedules, roles etc. present in the report server will be displayed.


  7. Select all the reports, data source, schedules and roles you want to transfer.


Change the output directory path (it should not be machine/user specific, e.g. the Desktop) to something like C:\AClickReports and click on Script. The scripts for loading the specified items will be generated in the specified folder.

2. Deploying the Reports

  1. Ensure that IIS and .NET Framework 2.0 are installed on the report server.
  2. While installing SQL server 2005 select reporting services option.
  3. Follow the report server configuration steps and make sure the Report Server Configuration Manager shows all green ticks.
  4. Follow these steps to publish reports –
    1. Extract the AClickReports folder to the report server, C: Drive.
    2. Open the "RS Scripter Load All Items.cmd" file located in the AClickReports folder for editing.
    3. Change the SCRIPTLOCATION script variable value to the location where the AClickReports folder is copied (it should be the same as the output directory path selected earlier).

        SET SCRIPTLOCATION=C:\AClickReports\


    4. Change the RS Script variable value to the location where Microsoft SQL Server is installed (if it is different from the default C drive).

      SET RS="C:\Program Files\Microsoft SQL Server\90\Tools\Bin\RS.EXE"

    5. Run the RS Scripter Load All Items.cmd batch file; it will publish the reports to the report server.
  5. Follow these steps to verify that reports are deployed correctly
    1. Go to http://localhost/Reports/Pages/Folder.aspx.
    2. Open the Data Sources folder; you should see the data sources selected in step 1.7.
    3. Open the data source, update the data source details as per the configuration of the report server (if they are different), and apply.
    4. Go back to the Home page and click on the report folder selected in step 1.7, select one of the reports and click on the Edit button, then click on the Data Sources link. Make sure that the data source is configured properly.
    5. Repeat steps 2.5.2 to 2.5.4 for other data sources and report folders.

Synchronize Reports across SSRS Servers

SSRS 2008 R2 OLAP reports and overwriting MDX parameters

Pingback from:

SSRS 2008 (and SSRS 2008R2) have some tremendous reporting enhancements, but unfortunately introduced a feature that caught some people by surprise. The issue is a rather intricate one, and goes as follows:

Suppose you create a report against an OLAP cube. You start by creating a report dataset, using the OLAP query designer. You drag some columns from the cube metadata into the result set area, and you also define one of the OLAP dimensions as a parameter. SSRS creates a second dataset with MDX code that retrieves member values from the dimension attribute you defined as a parameter in the primary dataset. (This allows SSRS to display the parameter values in a report dropdown).

With me so far?

Now suppose you want to modify the MDX code for the 2nd dataset (which SSRS builds as a hidden dataset), to implement some custom behavior for the dropdown. For example, instead of showing all cities in a dropdown, you only want to show those cities with sales (or sales in the last 6 months, etc.) So you retrieve the MDX code for the 2nd (hidden) dataset, and you modify the MDX code. You run the report again, and the dropdown shows values the way you want to show them.

OK, so life is good…but this story isn’t over.

Now go back to the original (primary) dataset and add a column/measure. (For example, maybe initially the report had 4 measures, and you have to add a fifth). So you add a measure to the primary dataset, you save it, you modify the report layout to add the new measure/column, and then you run the report…

And then you discover something – the dropdown parameter isn’t showing the values in the dropdown according to the logic you implemented when you manually modified the MDX code. As a matter of fact, the dropdown is showing the available parameter values as if you’d never modified the MDX code at all!!!

Here’s what happened – when we changed the primary dataset to add a new column, SSRS (well, specifically, Business Intelligence Development Studio) overwrote our custom MDX code with the original MDX code for the dimension parameter!!!

OUCH! This is actual behavior in SSRS 2008 and 2008 R2. As for a workaround, there’s good news and bad news. The good news is that there’s a workaround in the form of a setting. The bad news (well, more like “not great news”) is that BIDS doesn’t expose the setting – we need to modify the actual XML code in the RDL file.

So let’s look at an actual example:

Suppose we create a simple OLAP report against AdventureWorks 2008. We want to show Internet Sales Amount by Product Category, and we want the user to select a country from the Customer Dimension/Country Attribute (Figure 1)

Figure 1:

After we save this dataset, SSRS creates a 2nd dataset that retrieves the countries from the Customer dimension. By default this 2nd dataset doesn’t show in the DataSet window pane. However, we can right-click and show Hidden Datasets, which will wind up displaying the 2nd dataset (which SSRS called CustomerCountry, in Figure 2):

Figure 2:

If we right-click on this 2nd dataset and modify the query, SSRS loads the query designer with the generated MDX code (Figure 3). The code retrieves all countries in alphabetic order, plus the All Country (Customer) total.

Figure 3:

Now suppose we want to modify the query (and therefore the contents of the dropdown) to show the countries in order of sales. So we modify the MDX query to use the MDX ORDER function to sort the countries based on sales, as shown in Figure 4 (i.e. the country with the highest sales will be first in the dropdown).

Figure 4:
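A sketch of such a modified dropdown query (written against the Adventure Works cube; the exact member names SSRS generates for the hidden dataset will differ):

```
WITH
    MEMBER [Measures].[ParameterValue]
        AS [Customer].[Country].CURRENTMEMBER.UNIQUENAME
SELECT
    {[Measures].[ParameterValue]} ON COLUMNS,
    ORDER(
        [Customer].[Country].[Country].MEMBERS,
        [Measures].[Internet Sales Amount],
        BDESC
    ) ON ROWS
FROM
    [Adventure Works]
```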

Now we run the report (which shows Product Category and Internet Sales amount, and the prompt for Country with the correct sort order), in Figure 5:

Figure 5:

OK, now let’s go back to the original main dataset, and add a new column (Internet Gross Profit), in Figure 6:

Figure 6:

We add the column to the designer as well, and then go run the report (Figure 7). Notice how the dropdown appears to have “lost” the behavior we implemented with our custom MDX code. This is the root of the behavior in SSRS 2008. What happened was that when we modified the dataset in the previous step, SSRS overwrote our custom MDX changes (without warning us!) To prevent this behavior, we need to modify the RDL file manually, to add a new setting (next step).

Figure 7:

We need to open the RDL file manually, which you can do by right-clicking on the report in Solution Explorer and taking the option “View Code”. At the end of the XML content for the hidden dataset, add the following line: <rd:SuppressAutoUpdate>true</rd:SuppressAutoUpdate> (Figure 8).

Figure 8:
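In the RDL, the setting sits inside the hidden dataset's Query element; roughly like this (dataset name from this example; other elements abbreviated):

```
<DataSet Name="CustomerCountry">
  <Query>
    <DataSourceName>AdventureWorks</DataSourceName>
    <CommandText>...custom MDX...</CommandText>
    <rd:SuppressAutoUpdate>true</rd:SuppressAutoUpdate>
  </Query>
</DataSet>
```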

Now to test, go back and repeat step 4 (to put the custom MDX code back into the hidden dataset), and then pull up the main dataset and add another column, such as Internet Freight Cost (Figure 9). Previously, this would overwrite our custom MDX code. However, because of the SuppressAutoUpdate setting, SSRS will not update the MDX code.

Figure 9:

So if we go run the report after making the dataset change, our MDX code now stays!

Figure 10:

Posted at 10:00 PM in SetFocus BI/SQL | Permalink

SSAS Dynamic Parameters and SSRS/DMV Queries


Posted on February 1, 2012 by devinknight

Dynamic Management Views (DMVs) are a very powerful tool that is part of Analysis Services 2008 and 2008 R2.  They give you the ability to monitor Analysis Services usage and metadata.

MSDN articles on SSAS DMVs –

Good post on using SSAS DMVs –

A report developer could leverage the capability to read Analysis Services metadata to build a report that can dynamically change the type of information that is displayed on a report.

Let’s walk through a basic example of how we can use Dynamic Management Views to build a dynamic Reporting Services report.  I’m starting with a basic report that has already been built off of an Analysis Services cube that looks like this:


As you can see it shows Internet Sales by category over several calendar years.  The MDX query used for this is pretty standard as you can see.

Select
    [Measures].[Internet Sales Amount] ON Columns
    ,NON EMPTY ([Date].[Calendar].[Calendar Year]
    ,[Product].[Product Categories].[Category]) ON Rows
From
    [Adventure Works]

I start by changing the query to use an in-session calculation with the WITH clause so I can basically alias it to [Measures].[Generic Measure].  By aliasing the measure, things will make more sense later because the field name doesn’t identify with any specific measure.

With
    Member [Measures].[Generic Measure]
        AS STRTOVALUE("[Measures].[Internet Sales Amount]")

Select
    [Measures].[Generic Measure] ON Columns
    ,NON EMPTY ([Date].[Calendar].[Calendar Year]
    ,[Product].[Product Categories].[Category]) ON Rows
From
    [Adventure Works]

The next step is to create a parameter in the report so I can pass in the measure name to the query dynamically.  Eventually I will populate this parameter using a DMV but for now I’ll just have it as a text field that I can type in the measure name.

Now to make this query dynamic with my newly created @MeasureName parameter I replace the dataset query with an expression that looks like this:

="With "
+"    Member [Measures].[Generic Measure] "
+"        AS STRTOVALUE(""" + Parameters!MeasureName.Value + """) "
+"Select "
+"    [Measures].[Generic Measure] ON Columns "
+"    ,NON EMPTY ([Date].[Calendar].[Calendar Year] "
+"    ,[Product].[Product Categories].[Category]) ON Rows "
+"From "
+"    [Adventure Works] "


You’ll notice that I’ve basically changed the query to a string in the expression, except for the measure name, which I’ve changed to use my new parameter.  This report will now work dynamically, but the user would have to type in the measure they would like to see in the report.  Not only would they have to type it, but they must know the MDX for it.  For example, users would have to type [Measures].[Internet Tax Amount].  Obviously, that’s not going to happen, so we need to make this simpler for the user.
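The string-building idea above can be sketched in Python (purely illustrative, to show how the parameter value gets spliced into the generic query; the real thing is an SSRS expression):

```python
def build_dynamic_mdx(measure_unique_name):
    """Build the dynamic MDX string, splicing the chosen measure's
    unique name into the STRTOVALUE alias for [Generic Measure]."""
    return (
        "WITH "
        "    MEMBER [Measures].[Generic Measure] "
        '        AS STRTOVALUE("' + measure_unique_name + '") '
        "SELECT "
        "    [Measures].[Generic Measure] ON Columns "
        "    ,NON EMPTY ([Date].[Calendar].[Calendar Year] "
        "    ,[Product].[Product Categories].[Category]) ON Rows "
        "FROM "
        "    [Adventure Works]"
    )

# Whatever the user picks from @MeasureName lands inside STRTOVALUE.
query = build_dynamic_mdx("[Measures].[Internet Tax Amount]")
```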

This is where Dynamic Management Views help.  We can use the mdschema_measures DMV to return back a list of measures from our cube.  The following query will return back all my measures names, MDX, and format:

SELECT Measure_Name, Measure_Unique_Name, Default_Format_String
FROM $System.mdschema_measures
WHERE Cube_Name = 'Adventure Works'
ORDER BY Measure_Name

Unfortunately, not all format strings translate to Reporting Services, but this could be handled using expressions.  Using this query we can create a new dataset and have that dataset populate our @MeasureName parameter.  The query cannot be entered through the Query Builder because it’s not really MDX.  So you must choose to build an expression on the new dataset and paste in the query.  It is not an actual expression, so remove the equals sign before the query.


With the dataset completed go back to the @MeasureName properties and change the available values to use the Measure_Unique_Name and Measure_Name fields to populate the parameter dropdown box.


Now that the parameter dropdown box is populated by the DMV query, go ahead and preview the report.  You’ll see you can select any measure from the dropdown box and it will change the information shown on the report, as shown below:



SSRS Data Driven Subscriptions to Support Alerting

Here’s the scenario…  My company is still working with a SQL Server 2000 based process. While in 2011 there will be major changes to the processes, involving Oracle and newer MS SQL Server technologies, the fact of the matter is that, today, there exists a database structure and poor application logic that allow bad data to occur.  Here is my example in general terms, plus the use of Data Driven Subscriptions to automate monitoring of these issues.

Daily (well, actually nightly), we have new items to bill, moving from database A to database B.  The original process is simply several T-SQL Stored Procedure (SP) calls to move data from one database to staging tables on another database, and then perform further SP calls to manipulate data, and then finally move from staging tables into target tables, again via SP calls.  (Yes, SSIS should have been used, but this existed way before I arrived on the scene.)  Well, there is a date stamp associated with the records being moved reflecting the date of the move.  The problem is, for some reason yet-to-be-determined, the date stamp sometimes comes over as NULL.  There are defaults in the table to add the GETDATE() value, but still, the date stamp disappears!  The disappearance happens maybe once every month or so across thousands of records, so the occurrence is rare to say the least.  The thing is, downstream processes will break if the date stamp field is not populated.  So to manage this, we actually had an IT resource each morning run a SELECT script to identify any NULL date values.  Can you imagine, coming into the office, opening SQL Server Management Studio (SSMS), and running this script daily, only to hope no records are returned?  Wow, right?  If there are records found, then appropriate action is then taken.

For a Data Driven Subscription in SSRS (2005, specifically), I simply created a nice report in SSRS to monitor what records match the NULL date state.  Mind you, running the report daily in SSRS to hope nothing appears is as bad as running a SQL script every day in SSMS.  Also, scheduling a daily report via normal Subscriptions is similarly bad.  So instead, I created a Data Driven Subscription.  The key to the Data Driven Subscription is how I populated the recipients of the subscription.  I used a script similar to the following:


SELECT TOP 1 [Recipient1] = ''
  FROM TargetTable WITH (NOLOCK)
  WHERE TargetDate IS NULL



If there is even one record pulled from this query, then the Data Driven Subscription fires an email to the recipient (or more, if you choose to add email addresses delimited by semicolons, although we send the email to an Outlook email group).  This way, instead of running a daily query or report, the report is generated and sent only when the scenario occurs.
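The pattern — query for bad rows, notify only when any exist — can be sketched like this (SQLite stands in for the real database; the table layout matches the query above, but the recipient address is hypothetical):

```python
import sqlite3

def rows_needing_alert(conn):
    """Return recipient rows only when bad data (NULL date stamps) exists,
    mirroring the data driven subscription's recipient query."""
    cur = conn.execute(
        "SELECT 'billing-alerts@example.com' AS Recipient1 "
        "FROM TargetTable WHERE TargetDate IS NULL LIMIT 1"
    )
    return cur.fetchall()

# Demo with an in-memory database standing in for the billing tables.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE TargetTable (Id INTEGER, TargetDate TEXT)")
conn.execute("INSERT INTO TargetTable VALUES (1, '2011-01-05'), (2, NULL)")

# One NULL date stamp exists, so exactly one recipient row comes back
# and the subscription would fire an email.
recipients = rows_needing_alert(conn)
```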

Do you find yourself in this scenario?  I know in my company, I have employed this a few more times to watch and monitor data.  I am sure there are more robust ways to monitor data. Heck, we should have solved the issue outright by figuring out what went wrong.  But, we do know our processes are changing, so this problem will go away sooner than later, and creating a Data Driven Subscription is quick and easy!  Plus, I got kudos for removing daily tasks from an IT resource, so yay for me!!!  heheheh… Please keep in mind, though, that not all editions of SQL Server 2005 and up have Data Driven Subscriptions, so please check your documentation.  More information on Data Driven Subscriptions can be found here on MSDN.

So what do you think?  Automation, right?  Anything where a user, developer, IT resource or such does a regular task should be automated, right?  That was the opportunity I saw, and with SSRS, I was able to solve and implement with ease!

Please let me know if you have any questions.  Thanks for reading!  And see you next month for T-SQL Tuesday!