Going to the PASS Business Analytics Conference http://agilebi.com/blog/2014/04/07/going-to-the-pass-business-analytics-conference/ Mon, 07 Apr 2014 13:42:39 +0000 I found out recently that I’ll be able to attend the PASS Business Analytics Conference this year, which I’m pretty excited about. Also, I’m not presenting at this conference, so I will actually get to relax and enjoy the sessions by other speakers. If you haven’t registered yet, now’s a good time*.

There’s a lot of great content at this conference, and it’s a bit challenging in some time slots to decide on exactly what I want to see most. However, there are 3 sessions that I will definitely be attending:

AV-400-M – Deep Dive into Power Query Formula Language

Power Query is a cool technology for data transformation – one that I believe (hope) will continue to evolve and become relevant outside of just Excel. And its usefulness in quickly mashing up data inside Excel is outstanding. This is a focused session on the formula language, which I’m interested in, and it’s being delivered by Matt Masson and Theresa Palmer-Boroski. Matt does a great job on these types of presentations. I haven’t seen Theresa present yet, but I’m confident she’ll do a great job, and this will be a good session.

ID-100 – Creating an End-To-End Power View Reporting Solution

Devin Knight (a co-worker at Pragmatic Works) is delivering this, and he puts together great sessions. Power View is one of those technologies that I don’t spend a lot of time with, but I know I need to know it better, and this session should help with that. Devin has a lot of practical experience with Power View, so this will be a great opportunity to get a real world look at what’s involved.

SA-102 – Analytics and OLTP Systems: Is Hekaton A Game-Changer?

Hekaton is the new in-memory OLTP technology in SQL Server 2014. Its primary focus is on improving the performance of OLTP applications, but Todd McDermid will be looking at it from the perspective of delivering analytics. He’ll be answering the question of whether it can be used to deliver a single database that is suited for both transactional processing and analytics, and I’m very interested to see the results. I feel like the Hekaton technologies could have a place in the BI world, but I haven’t had a chance to go out and really investigate it myself. Thanks to Todd’s efforts, I won’t have to.

There are a lot of great sessions, and those are just 3 of the ones that appealed to me.  I’m really looking forward to attending, and I hope to see you there.

*If you aren’t already registered, you can use the discount code BABQ9B to get $150 off your registration.

Demo Materials for PASS Session BIA-302 – Building a Supportable ETL Framework http://agilebi.com/blog/2013/10/20/demo-materials-for-pass-session-bia-302-building-a-supportable-etl-framework/ Sun, 20 Oct 2013 19:01:01 +0000 Last week I presented the session “Building a Supportable ETL Framework” at the PASS 2013 Summit.

Here’s a link to the demo that I went through.

BIA-302 – Demo Materials

Cannot Create a BI Semantic Model Connection to Tabular Cube http://agilebi.com/blog/2012/11/03/cannot-create-a-bi-semantic-model-connection-to-tabular-cube/ Sat, 03 Nov 2012 04:00:32 +0000 Here’s the scenario:

Within the PowerPivot Gallery in SharePoint 2010 you create a new “BI Semantic Model Connection”.  In the “New BI Semantic Model” page you specify the name of the connection, the Workbook URL, and Database.  When you click OK the following error is displayed along with a checkbox to “Save the link file without connection validation”.  If you tick that checkbox and click OK then the BISM connection is created and works fine.

There were errors found while validating the page:

Cannot connect to the server or database.

The documentation from Microsoft does a really good job of explaining what is going on and what to do:

http://msdn.microsoft.com/en-us/library/hh230972.aspx#bkmk_ssas

Here’s the text:

Grant Analysis Services Administrative Permissions to Shared Service Applications


Connections that originate from SharePoint to a tabular model database on an Analysis Services server are sometimes made by a shared service on behalf of the user requesting the data. The service making the request might be a PowerPivot service application, a Reporting Services service application, or a PerformancePoint service application. In order for the connection to succeed, the service must have administrative permissions on the Analysis Services server. In Analysis Services, only an administrator is allowed to make an impersonated connection on behalf of another user.

Administrative permissions are necessary when the connection is used under these conditions:

  • When verifying the connection information during the configuration of a BI semantic model connection file.
  • When starting a Power View report using a BI semantic model connection.
  • When populating a PerformancePoint web part using a BI semantic model connection.

To ensure these behaviors perform as expected, grant to each service identity administrative permissions on the Analysis Services instance. Use the following instructions to grant the necessary permission.

Add service identities to the Server Administrator role

  1. In SQL Server Management Studio, connect to the Analysis Services instance.
  2. Right-click the server name and select Properties.
  3. Click Security, and then click Add. Enter the Windows user account that is used to run the service application.

    You can use Central Administration to determine the identity. In the Security section, open the Configure service accounts to view which Windows account is associated with the service application pool used for each application, then follow the instructions provided in this topic to grant the account administrative permissions.

Go ahead and follow the directions to check Central Administration to determine the identity of the SharePoint service account; just make sure to select the correct application on the Credential Management page.  In this case, “Service Application Pool – SharePoint Web Services System” should be the correct application.

Make note of the service account and add it to the Analysis Services server admin group.

Also, make sure you’re adding the service account as a server admin, not a database admin.
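
If you’d rather script this change than click through Management Studio, here’s a minimal sketch using AMO (Microsoft.AnalysisServices). The instance name and service account below are placeholders, so substitute your own:

[sourcecode language="csharp"]
// Minimal AMO sketch for adding a service account to the SSAS server administrator
// role. "SERVERNAME\POWERPIVOT" and "DOMAIN\svc-PowerPivotApp" are placeholders.
using Microsoft.AnalysisServices;

class AddSsasServerAdmin
{
    static void Main()
    {
        Server server = new Server();
        server.Connect(@"SERVERNAME\POWERPIVOT");

        // The fixed server-level administrator role is named "Administrators"
        Role admins = server.Roles["Administrators"];
        admins.Members.Add(new RoleMember(@"DOMAIN\svc-PowerPivotApp"));
        admins.Update();

        server.Disconnect();
    }
}
[/sourcecode]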

If that doesn’t work it could be that you didn’t add the right service account.  A good way to find out exactly which service account is being used, without having to fumble around with Central Administration, is to use SQL Server Profiler.  Start a new Profiler session on the Analysis Services server.  While the Profiler is running, attempt to create another BISM connection and examine the resulting trace events.

Look for the “Error” event class.  The service account listed under NTUserName is the account that needs to be added as a server admin for Analysis Services.

BISM Connection to Tabular Cube and Data Link Properties http://agilebi.com/blog/2012/11/03/bism-connection-to-tabular-cube-and-data-link-properties/ Sat, 03 Nov 2012 03:04:57 +0000 Here’s the scenario:

You have a PowerPivot Gallery within SharePoint 2010.  Within the gallery is a BISM connection that points to a Tabular cube.  You attempt to use the BISM connection to open a new Excel workbook by clicking on the Excel icon.

After Excel opens you get a Data Link Properties dialog box.

Trying to modify the Data Link Properties in any way is futile.  Entering a value for “Location:” won’t do anything, Analysis Services only runs with Windows NT Integrated security (so entering a specific username and password is useless), and clicking the down arrow to select the initial catalog will yield a Data Link Error…

The following system error occurred: The application has failed to start because its side-by-side configuration is incorrect. Please see the application event log or use the command-line sxstrace.exe tool for more detail.

And once you click OK you get another lovely error…

Login failed. Catalog information cannot be retrieved.

Chances are you need to install the “Microsoft Analysis Services OLE DB Provider for Microsoft SQL Server 2012” from the Microsoft Download Center, which is located here:

http://www.microsoft.com/en-us/download/details.aspx?id=29065

Microsoft® Analysis Services OLE DB Provider for Microsoft® SQL Server® 2012

      The Analysis Services OLE DB Provider is a COM component that software developers can use to create client-side applications that browse metadata and query data stored in Microsoft SQL Server 2012 Analysis Services. This provider implements both the OLE DB specification and the specification’s extensions for online analytical processing (OLAP) and data mining.
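
Once the provider is installed, a quick sanity check is to open a connection with it yourself before going back to Excel. Here’s a minimal sketch using System.Data.OleDb; the server and database names are placeholders:

[sourcecode language="csharp"]
// Connectivity smoke test for the MSOLAP provider. "TABSERVER\TABULAR" and
// "MyTabularModel" are placeholder names; substitute your own.
using System;
using System.Data.OleDb;

class MsolapSmokeTest
{
    static void Main()
    {
        string connStr = @"Provider=MSOLAP.5;Data Source=TABSERVER\TABULAR;" +
                         "Initial Catalog=MyTabularModel;Integrated Security=SSPI;";

        using (OleDbConnection conn = new OleDbConnection(connStr))
        {
            conn.Open();  // Throws if the provider is missing or the server is unreachable
            Console.WriteLine("Connected to: " + conn.DataSource);
        }
    }
}
[/sourcecode]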
How to Export BLOB data to Files in SSIS http://agilebi.com/blog/2011/12/09/how-to-export-blob-data-to-files-in-ssis/ Fri, 09 Dec 2011 19:03:43 +0000 Overview

This topic will cover how to export data from BLOB fields in SQL Server and save those files out to the filesystem. This is accomplished with VB.Net code in an SSIS Script Task, but could just as easily be ported over to C#.

DBServer: SQL Server 2008 SP2
Development Environment: Visual Studio 2008

Package Setup

  1. Add an Execute SQL Task that retrieves the BLOB information from the Database
    • ResultSet: Full result set
    • Parameter Mapping: Add any necessary parameters for your stored procedure
    • Result Set
      • Result Name: 0 (zero)
      • Variable Name: Doesn’t matter, so long as the variable is of type System.Object
  2. Connect the Execute SQL Task to a For Each Loop Container (FELC)
    1. Collection: Foreach ADO Enumerator
    2. ADO object source variable: Result Set Variable Name from Step 1
    3. Enumeration mode: Rows in the first table
    4. Variable Mapping: Map out the columns from the Result Set to Package Variables (Index is 0-based)
      1. NOTE: Be sure to setup the BLOB output variable as a System.Object
  3. Add a Script Task to output the BLOB information to the file system

Script Code (VB.Net)

[code language="vb"]

Imports System.IO

Public Sub SaveMemoryStream(ByVal buffer2 As Byte(), ByVal FileName As String, ByVal ms As MemoryStream)
    ' Write the byte buffer to disk, up to the current position of the memory stream
    Dim outStream As FileStream

    outStream = File.OpenWrite(FileName)
    outStream.Write(buffer2, 0, CInt(ms.Position))
    outStream.Flush()
    outStream.Close()
End Sub

Public Sub Main()
    Dim Folder_Path As String
    Dim File_Name As String
    Dim s_File_Name As String
    Dim buffer() As Byte
    Dim oStream As System.IO.MemoryStream = Nothing
    Dim oFileStream As System.IO.FileStream = Nothing
    Dim buffer2() As Byte

    Folder_Path = Dts.Variables("Working_Directory").Value.ToString() + "\"
    File_Name = Dts.Variables("File_Path").Value.ToString()

    s_File_Name = Folder_Path & File_Name

    Try
        ' The "Blob" variable is populated by the For Each Loop Container's variable mapping
        buffer = CType(Dts.Variables("Blob").Value, Byte())
        oStream = New System.IO.MemoryStream(buffer)
        oStream.Write(buffer, 0, buffer.Length)

        buffer2 = oStream.ToArray()
        SaveMemoryStream(buffer2, s_File_Name, oStream)

        ' Close the stream
        oStream.Flush()

        Dts.TaskResult = ScriptResults.Success
    Catch ex As Exception
        Dts.Events.FireError(0, "Create Temp Image", ex.Message, "", 0)
        Dts.TaskResult = ScriptResults.Failure

    Finally
        If Not oStream Is Nothing Then
            oStream.Close()
        End If

        oStream = Nothing

        If Not oFileStream Is Nothing Then
            oFileStream.Close()
        End If

        oFileStream = Nothing
    End Try
End Sub
[/code]
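
And since this ports easily to C#, here’s a sketch of the equivalent Script Task body, assuming the same package variables (Working_Directory, File_Path, and Blob) are mapped into the task:

[sourcecode language="csharp"]
// C# port of the VB.Net logic above. A sketch, assuming the same package
// variables ("Working_Directory", "File_Path", "Blob") are mapped into the task.
using System;
using System.IO;

public void Main()
{
    string folderPath = Dts.Variables["Working_Directory"].Value.ToString() + "\\";
    string fileName = Dts.Variables["File_Path"].Value.ToString();
    string fullPath = folderPath + fileName;

    try
    {
        byte[] buffer = (byte[])Dts.Variables["Blob"].Value;
        File.WriteAllBytes(fullPath, buffer);   // One call replaces the manual stream plumbing

        Dts.TaskResult = (int)ScriptResults.Success;
    }
    catch (Exception ex)
    {
        Dts.Events.FireError(0, "Create Temp Image", ex.Message, "", 0);
        Dts.TaskResult = (int)ScriptResults.Failure;
    }
}
[/sourcecode]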

Demo Materials for PASS Session BIA-304–Building a Reporting Services Framework http://agilebi.com/blog/2011/10/16/demo-materials-for-pass-session-bia-304building-a-reporting-services-framework/ Sun, 16 Oct 2011 16:45:51 +0000 I presented my session on ‘Building a Reporting Services Framework’ at the PASS Summit 2011 on Friday.  I had a great time at the summit, both presenting and attending all the great sessions.  A wonderful time was had by all.

Here’s a link to the demo materials I went through.  You probably don’t want to just push any of this out to production as is, but it should be a good start.  I also used the Adventure Works 2008 R2 database, but it isn’t included in this package.

]]>
Automatically Using the Latest Version of an SSRS Template http://agilebi.com/blog/2011/07/12/automatically-using-the-latest-version-of-an-ssrs-template/ Wed, 13 Jul 2011 03:46:33 +0000 The ability to add Templates to SSRS (such as documented here and here and here) is a neat feature.  Basically, you just create an RDL and put it in the “C:\Program Files (x86)\Microsoft Visual Studio 9.0\Common7\IDE\PrivateAssemblies\ProjectItems\ReportProject” directory… now you can create a new report starting from the template by selecting it from the “New Items” dialog.  This is great when you don’t want to start from scratch on every report.

The only thing I’ve found to be a friction point is that every time someone updates the Template (which you’re storing somewhere in source control) you need to copy it over to that folder.  That’s no big deal, but when you have a lot of developers that all have to do that frequently, it’s a pain.

The easy way to solve this is with symlinks.  You can’t use a shortcut, but you can create a soft symlink that points back at your template in source control.  That way you will always create a report off of the latest version in your repository.  (Yeah, you still have to keep that up to date… no getting around that.)  Note that mklink must be run from an elevated command prompt.

[sourcecode language="bash" padlinenumbers="true" wraplines="true"]
mklink "C:Program Files (x86)Microsoft Visual Studio 9.0Common7IDEPrivateAssembliesProjectItemsReportProjectTemplatePortrait.rdl"  "C:SourceControlframeworksSsrsFrameworkReportingTemplatesTemplatePortrait.rdl"

mklink "C:Program Files (x86)Microsoft Visual Studio 9.0Common7IDEPrivateAssembliesProjectItemsReportProjectTemplateLandscape.rdl" "C:SourceControlframeworksSsrsFrameworkReportingTemplatesTemplateLandscape.rdl"
[/sourcecode]

That’s all there is to it!

]]>
Using SSRS queries with parameters with Netezza http://agilebi.com/blog/2011/07/05/using-ssrs-queries-with-parameters-with-netezza/ Wed, 06 Jul 2011 06:00:52 +0000 Our company has started using SQL Server Reporting Services 2008 R2 to handle most of our reporting against Netezza.  This works pretty well overall using the Netezza 6.0 OLE DB driver.  There are a few things to be aware of though.

I found a number of people were using expressions to construct the SQL statement for the data set.  This works well, but it’s a little bit of a pain to both write and maintain.  Luckily, we can use parameters (with just a few things to be aware of).  That means we can write SQL that looks like this:

[sourcecode language="sql" padlinenumbers="true"]
SELECT
     date_key
    ,date
    ,good_day
    ,day_of_week
FROM
    date_sample
WHERE
    1=1
    AND day_of_week IN (@day_of_week_multi)
    AND date IN (@date)
    AND date_key =  @date_key
    AND good_day = @good_day 
ORDER BY
    date_key
;
[/sourcecode]

You can use parameters in the normal way with Netezza, but there are a few details to be aware of.  You just embed the parameter in the query like

[sourcecode language="sql" padlinenumbers="true"]
WHERE cola = @param
[/sourcecode]

and create a corresponding parameter in your data sets.  The OLE DB driver handles surrounding the parameter with quotes when required (strings and datetimes), and it automatically escapes any single quotes in the string (for example, O'Brien is sent as O''Brien).

Parameters are ordinal, so you need to create a parameter (in order) for each value you want to send.  The parameters in your query will be replaced in order based on the parameters you specified in the dataset.  No big deal, but something to be aware of.  You can use either a normal parameter (such as ‘@myparam’) or a question mark (‘?’).

Here’s the rub… the OLE DB driver also automatically escapes single quotes so you don’t have to worry about that.  This is great… but there is a little bit of an issue when you want to use multi-value parameters.  Basically, the normal method of using SQL like

[sourcecode language="sql"]
WHERE col_a IN (@multi_value_param)
[/sourcecode]

and an expression like this for a parameter

[sourcecode language="vb"]
=JOIN(@multi_param, "','")
[/sourcecode]

doesn’t work because the single ticks will always be escaped (so you’ll see a value like 'val1'',''val2'',''val3' instead of 'val1','val2','val3' in the resulting SQL).  No problem… we just need a different approach.  We’ll use something in the WHERE clause like

[sourcecode language="sql"]
STRPOS(@multi_param, '|' || col_a || '|') > 0
[/sourcecode]

with an expression like

[sourcecode language="vb"]
="|" + JOIN(@multi_param, "|") + "|"
[/sourcecode]

This will bracket each value in a special “end of column” character such as “|”, and thus look for an exact match.  You may need to use a different character, or handle escaping, depending on your data.  You can always use multiple characters if you’d like (such as “~~~” as a delimiter).  This is a little hacky, but not too bad.  Remember to add the delimiter to the start and end of the string as well, so the first and last values can be matched too.

I tested this on a 1.5 billion row table against a similar query that uses an IN clause.  I found the two methods were essentially equal in performance, though that might degrade for an extremely large number of parameter values.  These both provide exact matches against the value, and they provide equivalent results.

But… what about using a multi-select on a date?  Unfortunately, that gets slightly more complicated, but not too bad.  All you need to do is write a little code such as

[sourcecode language="vb" padlinenumbers="true"]
    ''' <summary>
    ''' Convert an array of Dates into a string.
    ''' </summary>
    ''' <param name="inputArray">Array of dates to convert.</param>
    ''' <param name="delimiter">Delimiter to place between each item in the array.</param>
    ''' <param name="formatString">Format string for the date.</param>
    ''' <returns>String of formatted datetimes separated by a delimiter.</returns>
    ''' <remarks></remarks>
    Public Function JoinDate(ByVal inputArray() As Object, ByVal delimiter As String, ByVal formatString As String) As String
        Dim output As String = String.Empty

        Dim i As Integer
        For i = 0 To inputArray.Length - 1
            output += (CDate(inputArray(i))).ToString(formatString) + delimiter
        Next i

        ' Trim the trailing delimiter off the string
        If output.Length > 0 Then
            output = output.Substring(0, output.Length - delimiter.Length)
        End If

        Return output

    End Function
[/sourcecode]

And use an expression like

[sourcecode language="vb"]
="|" + Code.JoinDate(Parameters!date.Value, "|", "yyyy-MM-dd") + "|"
[/sourcecode]

And you’re good to go!  This will convert a multi-select DateTime variable (which is an array of DateTime objects) into a string containing dates formatted as you request.  For example, selecting July 1 and July 4 of 2011 yields |2011-07-01|2011-07-04|.

So for this example where we are constraining on a multi-value string parameter, a multi-value date parameter, plus a few other standard guys, we end up with

[sourcecode language="sql"]
SELECT
     date_key
    ,date
    ,good_day
    ,day_of_week
FROM
    date_sample
WHERE
    1=1
    AND strpos(@day_of_week_multi, '|' || day_of_week || '|') > 0
    AND strpos(@date, '|' || date || '|') > 0  
    AND date_key =  @date_key
    AND good_day = @good_day 
ORDER BY
    date_key
;
[/sourcecode]

Here’s a sample that shows how to use integer, datetime, and string values (with some examples of multi-value parameters thrown in).  Just use the SQL script to build and populate the test table, set the data source appropriately, and play around with it.

]]>
How to Extract SAP Data in SSIS http://agilebi.com/blog/2011/05/25/how-to-extract-sap-data-in-ssis/ Wed, 25 May 2011 19:12:53 +0000 This tutorial will step you through installing the Microsoft SAP ADO.Net connector and extracting data from SAP, all via SSIS. But first, some background information.

Overview

This week brought me the unique challenge of extracting data from SAP within SSIS. My boss had already done some work on researching components, and the official word was that we were going to use a product from Theobald Software — Extract IS. After downloading a trial version and working on getting connected to SAP (which took waaaay longer than it should have, namely because of sparse documentation), I was disappointed to realize that the component was not returning all of the records within my dataset. No matter what date range I specified for my table extract, I still received only 2,476 records — FAR fewer than the multi-millions in the table.

That brings us to where we are today. While troubleshooting my problems with the Theobald solution, I ran across the Microsoft SAP ADO.Net connector. I figured it wouldn’t hurt to see if I had the same issues with this connector; that way, I could rule out if it was a Theobald issue or not. Turns out that I was able to successfully connect to SAP and extract my data!!

Now, onto the important stuff: How to make it work!

Making It Work

There were many moving parts involved in getting this working, from installing SAP Adapters on my local machine to installing custom RFC Functions on the SAP box. We will go over all of that in detail in this post.

Configuring Local Environment

We need to start with installing components on the local machine first, since we will use some of the output later on for installing necessary items in the SAP environment.

Installing Necessary Software

  1. Visual C++ 2005 SP1 Redistributable: If you do not already have it installed, grab the download from here. Accept defaults for installation
  2. WCF LOB Adapter SDK Install: Grab the install files from here, picking the appropriate download file for your system architecture (x86 vs x64). Under Choose Setup Type, select Custom. In the resulting window, ensure that Runtime and Samples are both selected to install to local disk. Accept remaining defaults.
  3. BizTalk Adapter Pack install: Grab the install files from here, picking the appropriate download file for your system architecture (x86 vs x64). Under Choose Setup Type, select Custom. In the resulting window navigate to Base Adapters –> Microsoft BizTalk Adapter 3.0 for mySAP Business Suite, and ensure it is selected to install to local disk. Also, navigate to ADO Providers –> .Net Framework Data Provider for mySAP Business Suite, and ensure it is selected to install to local disk. Accept remaining defaults.

You should now have an ADO.NET SAP connector available for your use in SSIS:

You will need to secure the following information from your SAP Administrator in order to connect to SAP from SSIS:

  • Application Server Host
  • System Number
  • Client
  • Language
  • Username
  • Password

All the options under Messaging Server and Destination can be left blank in the SAP ADO.Net connection manager configuration page. Finally, click Test Connection to ensure that everything is good to go.

Configuring SAP Environment

The details of the SAP Environment configuration are beyond the scope of this blog post [especially since someone else installed the custom RFC function and set permissions, not me =)]. However, in the attached PPT Presentation, you can find detailed instructions for SAP configuration, from slides 28-39. Also, reference the attached help file for more detailed information regarding SAP configuration and setup.

Usage

Now that we have both pieces of the puzzle working together, it’s time to get some data! Fire up BIDS, and create a new Integration Services Project. In that new project, add a Data Flow Task to the Control Flow. Within the Data Flow Task, drag over an ADO NET Source from your toolbox. Having already configured the SAP Connection Manager above, select it as the ADO.NET connection manager from the drop down box. Under Data access mode, select SQL command. Then, proceed to type in your SQL Command.

NOTE: SQL Syntax for SAP is ever so slightly different from normal T-SQL syntax. Have a look here for some important information. Pay particular attention to dates, if you are using those as part of your WHERE clause.
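
To make the date caveat concrete, here’s a sketch of the kind of command text you might use in the ADO NET Source. The table and column names (KNA1, KUNNR, NAME1, ERDAT) are just common SAP examples, and the YYYYMMDD literal format is an assumption based on how SAP typically stores DATS fields, so verify both against the linked syntax documentation:

[sourcecode language="csharp"]
// Sketch only: KNA1 (customer master) and its columns are illustrative SAP names.
// Confirm the date-literal format for your system before relying on it.
string sapQuery =
    "SELECT KUNNR, NAME1, ERDAT " +
    "FROM KNA1 " +
    "WHERE ERDAT >= '20110101' AND ERDAT <= '20110525'";
[/sourcecode]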

Click OK to exit the ADO.NET Source Editor. Next, connect up an OLE DB Destination to dump your data; open the destination and map the columns appropriately. If you have not already done so, create the table into which you will dump your data.

Hit F5 on the keyboard and let the good times roll!!

Conclusion

In this post, you learned how to install the SAP ADO.Net Adapters, configure the necessary RFC Functions on the SAP side, and extract data from a SAP System. I’ve uploaded the files referenced in this post to my SkyDrive. If you have any questions, please feel free to leave a comment!

XML as Source in SSIS — Part 2 http://agilebi.com/blog/2011/05/09/xml-as-source-in-ssis-part-2/ Mon, 09 May 2011 19:54:11 +0000 Now that our data is cleansed, onto loading the data into our table! The cleanest, most sane approach that I have found to parsing through XML within an SSIS script component is by utilizing Linq to XML. More information can be found here. In order to use Linq to XML within your script, you need to add the following references:

[sourcecode language="csharp"]
using System.Xml;
using System.Linq;
using System.Xml.Linq;
[/sourcecode]

The flow of the package is basically as follows: for each file in the input folder, read the contents via the StreamReader helper, pick out the pieces of XML that we care about, and repeat.

The XML files that I’m working with have three levels. For illustration’s sake, we will call them Grandfather, Father, and Child. The Grandfather (Route) has multiple children (Trips), which in turn have multiple children (Stops).

Assign Values to Variables

Now that we have added the necessary references to our script component, the next step is to write the code.
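
One note before the code: the query below calls a StreamReader(outfilename, "ExtRoute") helper that was introduced in Part 1 of this series. In case you don’t have it handy, here’s a minimal sketch of a streaming helper along those lines (an assumption on my part, not necessarily the exact Part 1 code):

[sourcecode language="csharp"]
// Stream XElements with a given name out of a large file without loading the
// whole document into memory. (Also needs: using System.Collections.Generic;)
private static IEnumerable<XElement> StreamReader(string fileName, string elementName)
{
    using (XmlReader reader = XmlReader.Create(fileName))
    {
        reader.MoveToContent();
        while (!reader.EOF)
        {
            if (reader.NodeType == XmlNodeType.Element && reader.Name == elementName)
            {
                // ReadFrom materializes the element and advances the reader past it
                yield return (XElement)XNode.ReadFrom(reader);
            }
            else
            {
                reader.Read();
            }
        }
    }
}
[/sourcecode]

With that helper in place, here’s the main query: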

[sourcecode language="csharp"]
foreach (var routes in (from route in StreamReader(outfilename, "ExtRoute")
                        select new
                        {
                            // Route-level attributes
                            RouteKey = (string)route.Attributes("RouteKey").FirstOrDefault(),
                            ActualStartTime = (string)route.Attributes("start").FirstOrDefault(),
                            ActualStopTime = (string)route.Attributes("end").FirstOrDefault(),
                            PlannedStartTime = (string)route.Attributes("DepartDate").FirstOrDefault(),
                            PlannedRunTime = (string)route.Attributes("PlannedElapsedTime").FirstOrDefault(),
                            ActualDriverName = (string)route.Attributes("driver").FirstOrDefault(),
                            ActualDriverID = (string)route.Attributes("driverid").FirstOrDefault(),
                            ActualTruckID = (string)route.Attributes("truck").FirstOrDefault(),
                            RouteUnPlanned = (string)route.Attributes("unplanned").FirstOrDefault(),
                            RoutingDate = (string)route.Attributes("RoutingDate").FirstOrDefault(),
                            Route = (string)route.Attributes("id").FirstOrDefault(),
                            Branch = (string)route.Attributes("group").FirstOrDefault(),
                            Trips = (from trip in route.Elements()
                                     select new
                                     {
                                         // Trip-level attributes
                                         TripKey = (string)trip.Attributes("TripKey").FirstOrDefault(),

                                         // Stop-level attributes
                                         Stops = (from stop in trip.Elements()
                                                  select new
                                                  {
                                                      LocationType = (string)stop.Attributes("LocationType").FirstOrDefault(),
                                                      Longitude = (string)stop.Attributes("Longitude").FirstOrDefault(),
                                                      Latitude = (string)stop.Attributes("Latitude").FirstOrDefault(),
                                                      CustomerName = (string)stop.Attributes("locationName").FirstOrDefault(),
                                                      CustomerNumber = (string)stop.Attributes("LocationRefNumber").FirstOrDefault(),
                                                      ActualSequence = (string)stop.Attributes("actualSequence").FirstOrDefault(),
                                                      GeoStop = (string)stop.Attributes("geoStop").FirstOrDefault(),
                                                      ArrivalTime = (string)stop.Attributes("arrivalTime").FirstOrDefault(),
                                                      DepartureTime = (string)stop.Attributes("departureTime").FirstOrDefault(),
                                                      StopDuration = (string)stop.Attributes("stopDuration").FirstOrDefault(),
                                                      ActualDistance = (string)stop.Attributes("ActualDistance").FirstOrDefault(),
                                                      DeltaDistance = (string)stop.Attributes("DeltaDistance").FirstOrDefault(),
                                                      DeltaTimeDriven = (string)stop.Attributes("DeltaTimeDriven").FirstOrDefault(),
                                                      WorkTime = (string)stop.Attributes("WorkTime").FirstOrDefault()
                                                  })
                                     })
                        }))
[/sourcecode]

The code above uses a LINQ to XML query, which returns its results as an IEnumerable, to quickly and efficiently query only the XML nodes that we care about. The following code then iterates those results and adds a row to the output buffer for each stop:

[sourcecode language="csharp"]
{
    foreach (var trip in routes.Trips)
        foreach (var stop in trip.Stops)
        {
            DefaultBuffer.AddRow();
            DefaultBuffer.TripKey = trip.TripKey;
            DefaultBuffer.RouteKey = routes.RouteKey;
            DefaultBuffer.CustomerName = stop.CustomerName;
            DefaultBuffer.CustomerNumber = stop.CustomerNumber;
            // etc…
        }
}
[/sourcecode]

Be sure to notice the beginning and ending curly brackets. Don’t forget them — they’re important!!

Conclusion

I hope that these two articles have given you some insight into a quick and efficient way to import XML data into a SQL database. Any thoughts, suggestions, or criticisms are welcomed!
