Agile BI
http://agilebi.com
A community for sharing ideas about business intelligence development using agile methods.

Presentations at Atlanta’s BI Edition SQL Saturday
http://agilebi.com/blog/2016/01/11/presentations-at-atlantas-bi-edition-sql-saturday/
Mon, 11 Jan 2016 20:55:23 +0000

I presented this weekend at SQL Saturday #477 in Atlanta. It was a great event, very well organized. I appreciate all the attendees at my sessions – there were some great questions and comments. I promised that I’d publish my slides and sample code, so here it is.

Getting Started with SSIS Script Tasks and Components

This session was an introduction to the scripting objects in SSIS, and how they can be used to extend the built-in functionality. Download the files here.
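To give a flavor of what the session covered, here’s a minimal Script Task sketch of the kind we walked through – it reads a package variable and raises an informational event. This drops into the Main method of the ScriptMain class that SSIS generates for you; the "User::RowCount" variable is just a placeholder for illustration, and it would need to be listed in the task’s ReadOnlyVariables.

// Minimal Script Task sketch: read a package variable and log a message.
// "User::RowCount" is a placeholder - use a variable from your own package.
public void Main()
{
    int rowCount = (int)Dts.Variables["User::RowCount"].Value;

    bool fireAgain = true;
    Dts.Events.FireInformation(0, "Script Task",
        string.Format("Processed {0} rows.", rowCount),
        string.Empty, 0, ref fireAgain);

    Dts.TaskResult = (int)ScriptResults.Success;
}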

Testing Data and Data-Centric Applications

This session was on testing data-centric applications, both during development and after deployment, so you can continue validating your data in production. Download the files here.
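To give a rough idea of the production side of that story, here’s a simple C# sketch of the kind of reconciliation check the session discusses – comparing a source row count to a target row count. The connection strings and table names are placeholders for illustration.

// Compare source and target row counts and report any mismatch.
// Connection strings and table names below are illustrative placeholders.
using System;
using System.Data.SqlClient;

class RowCountCheck
{
    static int CountRows(string connectionString, string query)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(query, conn))
        {
            conn.Open();
            return (int)cmd.ExecuteScalar();
        }
    }

    static void Main()
    {
        int source = CountRows("Data Source=SrcServer;Initial Catalog=SourceDb;Integrated Security=SSPI;",
            "SELECT COUNT(*) FROM dbo.Orders");
        int target = CountRows("Data Source=DwServer;Initial Catalog=WarehouseDb;Integrated Security=SSPI;",
            "SELECT COUNT(*) FROM dbo.FactOrders");

        Console.WriteLine(source == target
            ? "Row counts match."
            : string.Format("Mismatch: source={0}, target={1}", source, target));
    }
}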

Thanks again to everyone who attended!

You Don’t Have Time for Testing?!
http://agilebi.com/blog/2015/09/03/you-dont-have-time-for-testing/
Thu, 03 Sep 2015 16:51:22 +0000

I wrote a post for the Pragmatic Works blog that I thought would be interesting for my readers, so I’m posting a teaser here. If you want to read the whole post, go here.

I’ve been a big advocate of testing for applications, databases, data warehouses, BI and analytics for a while now. Not just any testing, but real tests that help you truly verify the state of your code, applications and data. I like Test Driven Development, but really I find any approach that focuses on automated, repeatable tests that verify meaningful functionality to be hugely beneficial. And almost no one I’ve ever talked to about this topic has disagreed with me. (There was that one guy, but he was a FoxPro developer, so…) But there’s often a point where the conversation goes sideways.

Continue reading….

Advanced Scripting in SSIS
http://agilebi.com/blog/2015/08/17/advanced-scripting-in-ssis/
Mon, 17 Aug 2015 15:51:15 +0000

Last week I presented a session on Advanced Scripting for SSIS for Pragmatic Works. Thanks to everyone who attended, and all the great questions. I’ve had a few requests for the samples I used, so I wanted to make those available. You can download them from my OneDrive.

Follow Up for Continuous Delivery Presentation at CBIG
http://agilebi.com/blog/2015/05/07/cbig-continuous-delivery/
Thu, 07 May 2015 19:43:45 +0000

I presented Continuous Delivery for Data Warehouses and Marts at the Charlotte BI Group Tuesday night. They have a great group there and I look forward to going back.

This is one of my favorite topics, and I always get good questions. CBIG was no exception, with some great questions on managing database schema changes when using continuous delivery, how continuous delivery and continuous deployment differ, and how to manage this in a full BI environment.

One question came up that I needed to verify – “Can you call an executable from a post-deployment script in SSDT?” The scenario for this was running a third-party utility to handle some data updates. I have confirmed that the post-deployment scripts for SSDT can only execute SQL commands, so you can’t run executables directly from them. However, as we discussed at the meeting, you can add additional executable calls into the MSBuild scripts I demonstrated to manage that part of your deployment process.

I promised to make my presentation and demos available, so here they are. Please let me know if you have any questions.

Going to the PASS Business Analytics Conference
http://agilebi.com/blog/2014/04/07/going-to-the-pass-business-analytics-conference/
Mon, 07 Apr 2014 13:42:39 +0000

I found out recently that I’ll be able to attend the PASS Business Analytics Conference this year, which I’m pretty excited about. Also, I’m not presenting at this conference, so I will actually get to relax and enjoy the sessions by other speakers. If you haven’t registered yet, now’s a good time*.

There’s a lot of great content at this conference, and it’s a bit challenging in some time slots to decide on exactly what I want to see most. However, there are 3 sessions that I will definitely be attending:

AV-400-M – Deep Dive into Power Query Formula Language

Power Query is a cool technology for data transformation – one that I believe (hope) will continue to evolve and become relevant outside of just Excel. And its usefulness in quickly mashing up some data inside Excel is outstanding. This is a focused session on the formula language, which I’m interested in, and it’s being delivered by Matt Masson and Theresa Palmer-Boroski. Matt does a great job on these types of presentations. I haven’t seen Theresa present yet, but I’m confident she’ll do a great job, and this will be a good session.

ID-100 – Creating an End-To-End Power View Reporting Solution

Devin Knight (a co-worker at Pragmatic Works) is delivering this, and he puts together great sessions. Power View is one of those technologies that I don’t spend a lot of time with, but I know I need to know it better, and this session should help with that. Devin has a lot of practical experience with Power View, so this will be a great opportunity to get a real world look at what’s involved.

SA-102 – Analytics and OLTP Systems: Is Hekaton A Game-Changer?

Hekaton is the new in-memory technology in SQL Server 2014. Its primary focus is on improving the performance of OLTP applications, but Todd McDermid will be looking at it from the perspective of delivering analytics. He’ll be answering the question of whether it can be used to deliver a single database that is suited for both transactional processing and analytics, and I’m very interested to see the results. I feel like the Hekaton technologies could have a place in the BI world, but I haven’t had a chance to go out and really investigate it myself. Thanks to Todd’s efforts, I won’t have to.

There are a lot of great sessions, and those are just 3 of the ones that appealed to me.  I’m really looking forward to attending, and I hope to see you there.

*If you aren’t already registered, you can use the discount code BABQ9B to get $150 off your registration.

Demo Materials for PASS Session BIA-302 – Building a Supportable ETL Framework
http://agilebi.com/blog/2013/10/20/demo-materials-for-pass-session-bia-302-building-a-supportable-etl-framework/
Sun, 20 Oct 2013 19:01:01 +0000

Last week I presented the session “Building a Supportable ETL Framework” at the PASS 2013 Summit.

Here’s a link to the demo that I went through.

BIA-302 – Demo Materials

Q & A From “Unit Tests for SSIS Packages”
http://agilebi.com/blog/2013/04/08/qa-from-unit-tests-for-ssis-packages/
Mon, 08 Apr 2013 20:26:14 +0000

I did a webinar this week for Pragmatic Works’ “Free Training on the T’s”. The topic was “Unit Tests for SSIS Packages”. If you attended, thanks for taking the time! If not, the recording is now available at the link above. You can also download the slides from my SkyDrive, and ssisUnit can be downloaded from CodePlex.

We had great turnout for the session, and a lot of great questions. I didn’t have time to address all of them during the webinar, and I had a number of requests to share my answers with all the attendees, so I thought I’d write up a blog post on the questions that I didn’t get a chance to answer.

Testing Practices

“For test-driven development, should a “final test” (of sorts) be performed after the “refactor” phase to ensure that the act of refactoring didn’t negatively alter the code?”

Yes, absolutely. You want to run your tests after any code changes. This helps ensure that code continues working as expected, and that you don’t have hidden “side effects” associated with your changes.

“How would you unit test a package that is performing transforms in the source query?”

I would create a known set of data in a test version of the source database. I would then create a unit test for the data flow task that checks that the output includes the data with the proper transformations. This will get much easier in a future release, when ssisUnit will support data flow component testing in the unit test harness, allowing you to test the results of a source component directly.
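If it helps to see that in code, here’s a rough C# sketch of the approach using the SSIS runtime API – execute the package against the test database, then query the destination to confirm the transformation happened. The package path, connection string, and expected value are all placeholders.

// Run a package against known test data, then verify the transformed output.
// The package path, connection string, and expected row are placeholders.
using System;
using System.Data.SqlClient;
using Microsoft.SqlServer.Dts.Runtime;

class DataFlowTransformTest
{
    static void Main()
    {
        var app = new Application();
        Package package = app.LoadPackage(@"C:\Tests\LoadCustomers.dtsx", null);

        if (package.Execute() != DTSExecResult.Success)
            throw new Exception("Package execution failed.");

        using (var conn = new SqlConnection("Data Source=TestServer;Initial Catalog=TestDb;Integrated Security=SSPI;"))
        using (var cmd = new SqlCommand(
            "SELECT COUNT(*) FROM dbo.Customer WHERE FullName = 'DOE, JANE'", conn))
        {
            conn.Open();
            int matches = (int)cmd.ExecuteScalar();
            Console.WriteLine(matches == 1
                ? "Transformation verified."
                : "Expected transformed row not found.");
        }
    }
}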

“What is the best way to incorporate SSIS testing on existing packages to automate..as this also needs requirement breakdown?”

I approach this in the same way that I approach adding tests to an existing .NET application. Start by identifying a subset of existing packages to work with, and add tests to those. Once those have adequate coverage, move on to the next set. As far as requirements breakdown – yes, you do need to understand what the package is supposed to do in order to test it properly. You can add some simple test cases without a great deal of analysis (for example, did the task execute successfully?) but to get real value out of the tests, they do need to check that the task carries out the requirements as intended.

There is the option of generating unit tests automatically for existing packages via the ssisUnit API. While this can improve code coverage, I would caution you not to rely on it to verify real functionality.

“Sorry I missed the first part of the talk. so not sure if you already talked about this. But how do you recommend creating test data for dimensions if you need to use synthetic data? Are there any tools you recommend using?”

Remember that unit and integration testing is more about verifying functionality, and less about performance. So I like to create a small amount of handcrafted test data that hits the specific scenarios that I need to validate. I find that most of the tools out there for generating data tend to work well if I want large volumes of test data, but not so well for concrete examples. For example, when I need to validate that the package handles a Type 2 SCD update that affects multiple rows correctly, I need 3 to 5 rows of test data that are very specific. Data generators don’t do that very well.
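As an illustration, here’s a hedged C# sketch of seeding that kind of handcrafted data – a few very specific rows that set up a multi-row Type 2 scenario. The table and column names are hypothetical; the point is that each row exists to exercise one case.

// Seed a handful of handcrafted rows for a Type 2 SCD test.
// Table and column names are hypothetical - each row exercises one case.
using System.Data.SqlClient;

class ScdTestData
{
    static void Main()
    {
        string[] inserts =
        {
            // Current row that the incoming feed should expire.
            "INSERT dbo.DimCustomer (CustomerKey, CustomerId, City, EffectiveDate, EndDate, IsCurrent) " +
            "VALUES (1, 'C001', 'Atlanta', '2012-01-01', '9999-12-31', 1)",
            // Already-expired history row; the package must leave it untouched.
            "INSERT dbo.DimCustomer (CustomerKey, CustomerId, City, EffectiveDate, EndDate, IsCurrent) " +
            "VALUES (2, 'C002', 'Charlotte', '2011-01-01', '2011-12-31', 0)",
            // Current row for the same customer - this is the multi-row case.
            "INSERT dbo.DimCustomer (CustomerKey, CustomerId, City, EffectiveDate, EndDate, IsCurrent) " +
            "VALUES (3, 'C002', 'Raleigh', '2012-01-01', '9999-12-31', 1)"
        };

        using (var conn = new SqlConnection("Data Source=TestServer;Initial Catalog=TestDb;Integrated Security=SSPI;"))
        {
            conn.Open();
            foreach (string sql in inserts)
                using (var cmd = new SqlCommand(sql, conn))
                    cmd.ExecuteNonQuery();
        }
    }
}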

Automating ssisUnit in Builds

“Can we automate the ssisUnit with build & deployments in TFS?” and “How would you enable automated testing with ssisUnit for Continuous Integration?”

The simplest way to incorporate ssisUnit into your builds is by calling the command line test execution tool. It’s called ssisUnitTestRunner2008.exe (substitute the appropriate version number for your version of SSIS) and you can find it in the folder where you installed ssisUnit.

Another approach is to use the ssisUnit API, but this requires some level of .NET or PowerShell coding.
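For a build server that can’t easily host .NET test code, even a thin wrapper around the command line runner works. Here’s a hedged C# sketch – note that the install path and exact argument syntax for ssisUnitTestRunner2008.exe are assumptions on my part, so check the tool’s own help output before relying on them.

// Invoke the ssisUnit command line runner from a build step and fail the
// build on a non-zero exit code. The install path and argument syntax
// (passing the test file) are assumptions - verify against the tool's help.
using System;
using System.Diagnostics;

class RunSsisUnitTests
{
    static int Main()
    {
        var psi = new ProcessStartInfo
        {
            FileName = @"C:\Program Files\ssisUnit\ssisUnitTestRunner2008.exe",
            Arguments = "\"C:\\Tests\\PackageTests.ssisUnit\"",
            UseShellExecute = false
        };

        using (Process proc = Process.Start(psi))
        {
            proc.WaitForExit();
            Console.WriteLine("ssisUnit exited with code {0}", proc.ExitCode);
            return proc.ExitCode; // non-zero fails the build
        }
    }
}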

Compatibility

“Is ssisUnit backward compatible with VS2008?” and “Is this framework compatible with MS SQL Server 2012?”

It does work against 2008 and 2012. You can download the 2008 version directly from the CodePlex site. For 2012, you currently need to download and compile the source, which can be done with the free Express version of Visual Studio. The next release will have separate setups for each version.

“Does the SSIS unit testing work with evaluation edition of BIDS 2008 R2?”

Yes, it does.

“Will this connect with the Microsoft Parallel Data Warehouse?”

I have not tested this myself. However, because it uses standard OLEDB and ADO.NET connection technology for database access, I don’t see any reason why it wouldn’t work.

“I am a SQL 2008 R2 user.  What version of VS is John running?  What were those testing menu options?”

I was running SQL 2012 for the demo. The menu option for running ssisUnit was created using the External Tools menu option under the Tools menu in VS.

Setting Up and Using Tests

“I’m lost.  How did you set up the tests using the GUI and then link it to the SSIS package?  Did i miss that?”

During the demo, I showed a prebuilt ssisUnit test, which I opened in the ssisUnit UI tool. The unit tests are linked to packages through Package Reference objects, which basically refer to a package by its file path or location in SQL Server.

“Can you use a different operator than equal to? Like greater than, less than, different than?”

Yes, the Asserts in ssisUnit can use expressions. These are C#-based expressions that evaluate to True or False. You can use an expression like “((DateTime)result).Date==DateTime.Now.Date” to check that the result of the Assert command is equal to today’s date. “result”, in the expression, represents the object returned by the command associated with the Assert. You can apply pretty much any C# / .NET operation to the result object. For more examples, check out this page.
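A few more illustrative expressions, in the same vein – these are my own examples rather than anything from the documentation, and they assume the command returns a value of the indicated type:

// "result" is the object returned by the command associated with the Assert.
((DateTime)result).Date == DateTime.Now.Date        // value is today's date
(int)result > 0                                     // count is greater than zero
((string)result).StartsWith("CUST")                 // string has the expected prefix
(decimal)result >= 100m && (decimal)result < 200m   // value falls within a range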

“When performing SCD operations, the details of the row need to be inspected rather than just count of rows. How would you test this? Within ssisUnit or another tool?”

Well, I’d use ssisUnit, but I’m not exactly unbiased. Currently, you can do this using the SQL Command, which enables you to retrieve specific row details from the database. In the future, you will be able to do this more directly by testing it in the data flow.

“Did you say the file command can accommodate various file types such as xml, csv, pipe delimited, etc.?”

I didn’t say that (I don’t think), but it will work with pretty much any file. Internally, it uses standard .NET file operations, so it’s very similar to the capabilities of the File System task in SSIS. It doesn’t actually care what format the file is in; it can copy, move, and delete it regardless. For line counts, it counts the number of carriage return / line feeds in the file.
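If you’re curious what that counting approach looks like, here’s a quick sketch in plain .NET – illustrative only, not ssisUnit’s actual implementation.

// Count carriage return / line feed pairs in a file, regardless of format.
// Illustrative sketch only - not ssisUnit's actual implementation.
using System;
using System.IO;
using System.Text.RegularExpressions;

class FileLineCounter
{
    static int CountCrlfLines(string path)
    {
        return Regex.Matches(File.ReadAllText(path), "\r\n").Count;
    }

    static void Main(string[] args)
    {
        Console.WriteLine(CountCrlfLines(args[0]));
    }
}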

“Do you have a guide on how to setup ssisUnit with SSIS? We have tried using this before, but we couldn’t successfully run the test.”

I’m sorry to hear you had problems with it. The Getting Started guide is admittedly light on content, and I really need to provide a more detailed walkthrough. Please vote for this here if you would like this prioritized over the other changes in the queue.

“I noticed in the code that ssisUnit doesn’t handle password protected packages. When will this be supported?”

I’ve added an issue to track this here. Please vote for it if you use password protected packages, as it will help me prioritize how quickly to address it.

“If possible, can you demo if a container can be executed? Especially a For loop or For Each loop?”

I didn’t have time to demo this during the presentation. Good thing too, because there was an error around handling of containers. This has now been fixed in the source code version of the project.

“When testing in a 64bit environment, is there a specific way to execute ssisUnit when data sources are 32bit?”

Currently, this requires that you compile a 32bit version of the source. In the next release, I will provide both 32-bit and 64-bit versions of the test execution utility.

“Will the new version of ssisUnit for 2012 actually include some bug fixes? We’ve tried the current ssisUnit and it’s pretty buggy.” and “Not a question, just feedback: So there are over 30 bug reports and enhancement requests on CodePlex (including UI bugs, missing parts like handling complex Data Flow logic unit testing, etc.) posted since the posting date in 2008. If you address some of these (particularly the UI bugs) in a new release we might try it again. The biggest lack my organization found is that there’s no support for Data Flow tasks, which are 80% or more of our ETL testing. Just some feedback to keep in mind if you make any updates in the future. I’ll gather any additional bugs that we found in our evaluation back in 2011 and add them to the site when I have time.”

Again, sorry that you have had difficulties with it, and I’d definitely appreciate any feedback you can add to the site. For data flow testing, I do use that fairly successfully today, and a number of other users do as well. Admittedly, component level testing for the data flow would be nice (and it is being worked on) but I’m curious about what is blocking you from using it today. If you can submit your issues to the site, I will look at how to address them.

As far as the number of outstanding requests and UI bugs go: unfortunately, I don’t get paid for working on open source, so focus on this project tends to take a backseat to demands from paying work. That being said, I do want to address as many of the issues as possible, but I have to prioritize my time pretty heavily these days. If there are items that particularly annoy you, please vote for them, as I do use the votes to determine what I should work on. For the UI, I really don’t enjoy UI work (nor, as evidenced by the current GUI, is it my strongest skill as a developer), so you are unlikely to see any significant updates on that front on the open source project. However, Pragmatic Works has taken an interest in unit testing as of late, and we are investigating offering an enhanced UI that’s integrated with Visual Studio as part of our BI xPress product.

“This makes sense to me, but I don’t understand how I setup a package to match up to the unit test.”

Each test case can reference tasks by name or GUID. GUID is more accurate, as names can conflict if you have multiple tasks with the same name in different containers.

“In package 1, you were getting a table count as a task and then writing a unit test to check the results of that task.  Are you returning the result of the SELECT to a variable and then checking the value stored in the variable or are you directly testing the return value from the execute SQL task?”

The Execute SQL task in that test was designed to store the results of the SQL statement in a variable, so the test is written to check the value of the variable (using a VariableCommand) after the task executes. If you need to get a value directly from the database as part of the test, you can use a SqlCommand.

“Do unit tests always fall under Miscellaneous?”

Yes – the SSIS project structure doesn’t allow for custom folders.

“Would you have a more complex example for the test than the result of a table count?”

The ProductSample test on the website illustrates a few other test scenarios. If there are additional scenarios that you’d like to see examples for, please add an issue or discussion item on the site.

Miscellaneous

“Where I can get some xml to parse my .dtsx packages and only extract all the SQL code and associated SSIS task name?”

I’m not really sure how this relates, honestly. However, to do this, you’d need to either write a fairly complex XSLT transform, or use the SSIS API. Unfortunately, I don’t know of any public example code that illustrates this.

Sharing the Presentation

Many, many variations of “Can I get the slides?”

I mentioned this at the top of the post, but just in case: The recording of the webinar is available here: “Unit Tests for SSIS Packages”. You can download the slides from my SkyDrive, and ssisUnit can be downloaded from CodePlex.

Thanks again to everyone who attended, and thanks for all the great questions.

Cannot Create a BI Semantic Model Connection to Tabular Cube
http://agilebi.com/blog/2012/11/03/cannot-create-a-bi-semantic-model-connection-to-tabular-cube/
Sat, 03 Nov 2012 04:00:32 +0000

Here’s the scenario:

Within the PowerPivot Gallery in SharePoint 2010 you create a new “BI Semantic Model Connection”.  In the “New BI Semantic Model” page you specify the name of the connection, the Workbook URL, and Database.  When you click OK the following error is displayed along with a checkbox to “Save the link file without connection validation”.  If you tick that checkbox and click OK then the BISM connection is created and works fine.

There were errors found while validating the page:

Cannot connect to the server or database.

The documentation from Microsoft does a really good job of explaining what is going on and what to do:

http://msdn.microsoft.com/en-us/library/hh230972.aspx#bkmk_ssas

Here’s the text:

Grant Analysis Services Administrative Permissions to Shared Service Applications


Connections that originate from SharePoint to a tabular model database on an Analysis Services server are sometimes made by a shared service on behalf of the user requesting the data. The service making the request might be a PowerPivot service application, a Reporting Services service application, or a PerformancePoint service application. In order for the connection to succeed, the service must have administrative permissions on the Analysis Services server. In Analysis Services, only an administrator is allowed to make an impersonated connection on behalf of another user.

Administrative permissions are necessary when the connection is used under these conditions:

  • When verifying the connection information during the configuration of a BI semantic model connection file.
  • When starting a Power View report using a BI semantic model connection.
  • When populating a PerformancePoint web part using a BI semantic model connection.

To ensure these behaviors perform as expected, grant to each service identity administrative permissions on the Analysis Services instance. Use the following instructions to grant the necessary permission.

Add service identities to the Server Administrator role

  1. In SQL Server Management Studio, connect to the Analysis Services instance.
  2. Right-click the server name and select Properties.
  3. Click Security, and then click Add. Enter the Windows user account that is used to run the service application.

    You can use Central Administration to determine the identity. In the Security section, open the Configure service accounts to view which Windows account is associated with the service application pool used for each application, then follow the instructions provided in this topic to grant the account administrative permissions.

Go ahead and follow the directions to check Central Administration to determine the identity of the SharePoint service account, just make sure to select the correct application on the Credential Management page.  In this case, “Service Application Pool – SharePoint Web Services System” should be the correct application.

Make note of the service account and add it to the Analysis Services server admin group.

Also, make sure you’re adding the service account as a server admin, not a database admin.

If that doesn’t work it could be that you didn’t add the right service account.  A good way to find out exactly what service account is being used, without having to fumble around with Central Administration, is to use the SQL Server Profiler.  Start a new Profiler session on the Analysis Services server.  While the Profiler is running, attempt to create another BISM connection.  This is the result:

Look for the “Error” event class.  The service account listed under NTUserName is the account that needs to be added as a server admin for Analysis Services.


BISM Connection to Tabular Cube and Data Link Properties
http://agilebi.com/blog/2012/11/03/bism-connection-to-tabular-cube-and-data-link-properties/
Sat, 03 Nov 2012 03:04:57 +0000

Here’s the scenario:

You have a PowerPivot Gallery within SharePoint 2010.  Within the gallery is a BISM connection that points to a Tabular cube.  You attempt to use the BISM connection to open a new Excel workbook by clicking on the Excel icon.

After Excel opens you get a Data Link Properties dialog box.

Trying to modify the Data Link Properties in any way is futile.  Entering a value for “Location:” won’t do anything, Analysis Services only runs with Windows NT Integrated security (so entering a specific username and password is useless), and clicking the down arrow to select the initial catalog will yield a Data Link Error…

The following system error occurred: The application has failed to start because its side-by-side configuration is incorrect. Please see the application event log or use the command-line sxstrace.exe tool for more detail.

And once you click OK you get another lovely error…

Login failed. Catalog information cannot be retrieved.

Chances are you need to install the “Microsoft Analysis Services OLE DB Provider for Microsoft SQL Server 2012” from the Microsoft Download Center, which is located here:

http://www.microsoft.com/en-us/download/details.aspx?id=29065

Microsoft® Analysis Services OLE DB Provider for Microsoft® SQL Server® 2012

      The Analysis Services OLE DB Provider is a COM component that software developers can use to create client-side applications that browse metadata and query data stored in Microsoft SQL Server 2012 Analysis Services. This provider implements both the OLE DB specification and the specification’s extensions for online analytical processing (OLAP) and data mining.
How to Open Dashboard Designer in SharePoint 2010
http://agilebi.com/blog/2012/11/03/how-to-open-dashboard-designer-in-sharepoint-2010/
Sat, 03 Nov 2012 02:23:12 +0000

So you’ve got a new SharePoint 2010 server configured and now you want to create a PerformancePoint dashboard.  The first thing you need to do is create a new “Business Intelligence Center” site.  You can find it under the Data category in the Create window.

Once the site is created you’ll see a page that looks like this:

Ok.  Now we’re ready for business.  In order to create a PerformancePoint dashboard you’ll need to open Dashboard Designer and there are several ways to do this.

Method #1

If this is your very first time opening Dashboard Designer you’ll need to open it from the Business Intelligence Center.  On the right hand side of the page you see three orange headings: Monitor Key Performance, Build and Share Reports, and Create Dashboards.  Hover over the first heading, Monitor Key Performance, and click on the third link in the center of the page called “Start using PerformancePoint Services“.  The next page that opens will have a large button called “Run Dashboard Designer”.  Click it.  You may be prompted to run an application; don’t be alarmed, this is actually the install for Dashboard Designer.  It takes just a minute or so to complete, and once finished Dashboard Designer will open.  Of course it goes without saying that the install for Dashboard Designer only happens the very first time you click the “Run Dashboard Designer” button.  After that, any time you click the button Dashboard Designer will open right up.

Method #2

Now that Dashboard Designer has been installed you can open it from the Start menu.  Click on All Programs, expand the SharePoint folder, and click on PerformancePoint Dashboard Designer.  Make sure you expand the “SharePoint” folder and not the “Microsoft SharePoint 2010 Products” folder. 

Method #3

Once you have some PerformancePoint objects deployed to SharePoint you can open Dashboard Designer from the PerformancePoint Content list.  By default you can access this list from the left-hand nav bar, but if it’s not there you can get to it by clicking “All Site Content”.  To open Dashboard Designer from the PerformancePoint Content list, click the down arrow for one of the items and select “Edit in Dashboard Designer”.

Method #4

The final, and in my opinion, best way to open Dashboard Designer is by using a URL.  Not a URL to the “Run Dashboard Designer” page from Method #1, but a URL that opens Dashboard Designer directly.  To do this you’re going to have to do a little sleuthing.  First thing to do is navigate to the “Run Dashboard Designer” page from Method #1.  Assuming you’re using Internet Explorer, make sure the Menu bar is displayed in your browser.  Click on View and select “Source”.  This will open up the HTML source code behind the webpage.  Since we know that clicking the button automagically opens Dashboard Designer, do a search in the HTML source code for the text on the button, “Run Dashboard Designer”.  Notice the OnClick method of the button is calling a Javascript function called “OpenDD”. 

Now search for the OpenDD function name, it will probably be defined towards the top of the document.

Once it has been found, copy the function code and paste it into a Management Studio query window and transform it into working SQL code.  You don’t have to do this, but I recommend it because the final expression is a bit tricky to do in your head.  At least, it’s tricky for me to do in my head…

Here’s the actual SQL code…

-- Recreates the URL logic from the OpenDD JavaScript function in T-SQL.
-- Update the server name and site paths to reflect your environment.
DECLARE @designerRedirect AS VARCHAR(MAX)
SET @designerRedirect = '_layouts/ppswebparts/designerredirect.aspx'

DECLARE @siteCollection AS VARCHAR(MAX)
SET @siteCollection = '/sites/BI/'

-- Note the trailing slash; without it the redirect path is appended to "Finance" incorrectly.
DECLARE @siteLocation AS VARCHAR(MAX)
SET @siteLocation = '/sites/BI/Finance/'

DECLARE @siteCollectionUrl AS VARCHAR(MAX)
SET @siteCollectionUrl = 'http://SQL2012Dev' + @siteCollection --= location.protocol + "//" + location.host + siteCollection

DECLARE @siteLocationUrl AS VARCHAR(MAX)
SET @siteLocationUrl = REPLACE(@siteLocation, @siteCollection, '') --= siteLocation.replace(siteCollection, "")

DECLARE @URL AS VARCHAR(MAX)
SET @URL = 'http://SQL2012Dev' + @siteLocation + @designerRedirect + '?SiteCollection=' + @siteCollectionUrl + '&SiteLocation=' + @siteLocationUrl

SELECT @URL

Make sure the values reflect your environment.  When you run the query you should end up with something like this:

http://SQL2012Dev/sites/BI/Finance/_layouts/ppswebparts/designerredirect.aspx?SiteCollection=http://SQL2012Dev/sites/BI/&SiteLocation=Finance/

To test that it works, copy the URL into your browser and verify that Dashboard Designer opens with your content.
