SQL Azure, CREATE TABLE, and USE

I’ve been playing around with the SQL Azure CTP for a little bit, and generally, it’s going well. However, as with any new technology, there are plenty of things to learn.  I’m planning a series of posts around SQL Azure to share what I’m learning about it. And yes, there will be some SSIS thrown in there, too – what good is a database in the cloud if you can’t get your data in and out?

One of the first things I did was create a new database and some tables (rather obvious, I suppose – you can’t really do much in SQL without them). Something you will likely encounter immediately when creating tables is the difference between what you can do in SQL Server and what SQL Azure supports. Primarily, the differences involve options that affect physical storage. As a comparison, here’s the script that SQL Server Management Studio generates if you right-click on a table and choose Script Table As…Create To.

USE [MyDatabase]
GO

SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

SET ANSI_PADDING ON
GO

CREATE TABLE [dbo].[MyTable](
    [MyKey] [int] IDENTITY(1,1) NOT NULL,
    [MyString] [varchar](30) NOT NULL,
    [UpdateID] [int] NOT NULL,
    [UpdateDate] [datetime] NOT NULL,
 CONSTRAINT [pkMyTable] PRIMARY KEY CLUSTERED (
    [MyKey] ASC) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY],
 CONSTRAINT [akMyTable] UNIQUE NONCLUSTERED (
    [MyString] ASC) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY])
 ON [PRIMARY]
GO

SET ANSI_PADDING OFF
GO

ALTER TABLE [dbo].[MyTable] ADD
  CONSTRAINT [dfUpdateDate] DEFAULT (getdate()) FOR [UpdateDate]
GO

Here’s the same CREATE TABLE script, but trimmed down to just the items SQL Azure supports:

SET QUOTED_IDENTIFIER ON
GO

CREATE TABLE [dbo].[MyTable](
    [MyKey] [int] IDENTITY(1,1) NOT NULL,
    [MyString] [varchar](30) NOT NULL,
    [UpdateID] [int] NOT NULL,
    [UpdateDate] [datetime] NOT NULL,
 CONSTRAINT [pkMyTable] PRIMARY KEY CLUSTERED (
    [MyKey] ASC) WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF),
 CONSTRAINT [akMyTable] UNIQUE NONCLUSTERED (
    [MyString] ASC) WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF))
GO

ALTER TABLE [dbo].[MyTable] ADD
  CONSTRAINT [dfUpdateDate] DEFAULT (getdate()) FOR [UpdateDate]
GO

This is all documented in the SQL Azure documentation on MSDN, under the Transact-SQL Reference. And, as expected, most of the options that aren’t supported are related to physical storage.

One item that does stand out a bit, though, is USE. The USE statement is supported, but only if it references the current database, as in USE MyDatabase when you are connected to MyDatabase. Executing USE MyOtherDatabase when you are connected to MyDatabase will result in an error. Instead, you have to disconnect from MyDatabase and connect to MyOtherDatabase. It does make some sense not to allow users to switch databases in a multi-tenant model (I can picture all sorts of interesting hacks being created if that were possible). I do wonder, though, why it was included at all, as it is fairly useless in its current form. Maybe a future enhancement?
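To illustrate (the database names here are placeholders), assume you are connected to MyDatabase:

```sql
-- Succeeds: references the database you are already connected to
USE MyDatabase;
GO

-- Fails in SQL Azure with an error: you cannot switch databases
-- within a connection; open a new connection to MyOtherDatabase instead
USE MyOtherDatabase;
GO
```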

Anyway, if you are interested in SQL Azure, watch this space for more updates as I continue working with it.

Posted in Uncategorized | Comments Off on SQL Azure, CREATE TABLE, and USE

Adding Custom Components to SSIS

Below are the basic steps for adding custom components to SSIS; however, for more in-depth information I highly recommend reading Matthew Roche’s post about deploying and testing custom SSIS components.

1. Register the custom assembly in the Global Assembly Cache (GAC)

Copy the DLL to the C:\Windows\assembly directory.  The easiest way is to drag and drop the DLL using Windows Explorer (right-clicking on the DLL and doing a copy/paste will not work).  Alternatively, you can use the command line utility GACUTIL.EXE (for more info about GACUTIL, see the link above to Matthew Roche’s post).  Either way, you will need Admin rights. 

2. Copy the custom assembly to Visual Studio’s “special folder” 

Copy the DLL to the appropriate subfolder in the C:\Program Files\Microsoft SQL Server\90\DTS directory.  For Control Flow Tasks, put the custom assembly in the Tasks folder; for Data Flow Tasks, use the PipelineComponents folder.  Remember that Visual Studio is a 32-bit application.  If you’re running on a 64-bit machine, make sure to use the Program Files (x86) path; otherwise, you won’t see your component listed in the Choose Toolbox Items window of Step 3.

3. Add the component to the SSIS toolbox 

Open an Integration Services project in Visual Studio, right click anywhere in the Toolbox, and select Choose Items.  Go to the appropriate tab in the Choose Toolbox Items window and select the appropriate components.  
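Steps 1 and 2 above can be sketched from an elevated command prompt. The DLL name here is a placeholder, the (x86) path applies to 64-bit machines, and this variant assumes a Data Flow component (for a Control Flow task, copy to the Tasks folder instead):

```bat
gacutil.exe /i MyCustomComponent.dll
copy MyCustomComponent.dll "C:\Program Files (x86)\Microsoft SQL Server\90\DTS\PipelineComponents"
```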

Posted in SSIS | Tagged , , , | Comments Off on Adding Custom Components to SSIS

Presenting at the SQL Server Best Practices Conference

I’m going to be presenting two sessions at the SQL Server Best Practices Conference on August 24-26 in Washington, DC. This conference has a different focus than many of the technical conferences that I’ve presented at in the past. Instead of technical deep dives, the focus is on providing attendees with information on the decision points they are likely to encounter in their projects, and guidance on how to choose the best course of action at those points. It promises to be a very interesting conference.

I’ll be presenting on two topics. The first, Unit Testing SSIS Packages, is a topic I’ve presented on before, but not with this focus. I’m a big fan of unit testing and the test driven development (TDD) model, but I’ve found that there are numerous challenges in applying this approach to SSIS. So my session will focus on the best practices you can use to make this successful.

My other topic, SQL Server BI in the Cloud, is a new one for me (and for pretty much everyone, considering that SQL Azure is still pre-CTP). However, I’ve been doing some work with it and other cloud-based models for BI, and there are some key decision points that you need to consider when embarking on this type of project. I’m really looking forward to this one, as it’s a new area, and that always prompts good discussions.

If you are attending, please consider dropping by for my sessions. If you’re not registered for the conference, there’s still time, and there’s a great lineup of speakers.

 


Posted in Uncategorized | Comments Off on Presenting at the SQL Server Best Practices Conference

The Path Specified Cannot Be Used At This Time

This morning I needed to check the properties for one of our SharePoint application pools, so I logged onto the server and opened Internet Information Services (IIS) Manager.  To my surprise, the root was empty; it should have automatically connected to the localhost.  To my even greater surprise, all attempts to manually get IIS to connect to the server yielded the message “The path specified cannot be used at this time”. 


After a quick search I found a post by Old Nick stating that this problem is easily resolved by restarting the Windows SharePoint Services Timer service.  That’s pretty obvious, right?  Sure enough, restarting that service solved the problem.  The next time I opened IIS it connected to the localhost. 

Posted in SharePoint | Tagged , , | Comments Off on The Path Specified Cannot Be Used At This Time

PPS Filters: Beware of Changing Keys

The other day I thought it would be fun to arbitrarily change the keys for all the members in one of my dimensions.  Actually, there was a legitimate reason but after the fact I realized there was a much better way to solve my problem that didn’t involve changing the keys.

Nevertheless, the point is this: If you have a PerformancePoint (PPS) filter linked to an attribute in a dimension, and the keys change, your filter will break. 

Let’s say you have a PPS dashboard containing a filter and a scorecard.  The filter is linked to the Category attribute in the Product dimension.  Suppose the keys in the Product dimension change, with Bikes going from a key of 3 to a key of 4, and Accessories going from a key of 4 to a key of 5.  Now, whenever you select Bikes in the filter, the scorecard will display the results for Accessories.  If these were the only two members in your dimension, I’m not sure what would happen if you selected Accessories, but I assume the scorecard would just show empty cells.  Or spontaneously combust.

To fix this, all you have to do is edit the filter by clicking on Member Selection, going through the wizard, and publishing/deploying your dashboard.

Posted in MDX, PPS | Tagged , , | Comments Off on PPS Filters: Beware of Changing Keys

Boost Performance in PerformancePoint Server

I found this and thought I’d pass it on.

http://performancepointblog.com/2009/03/performancepoint-fix-slow-rendering-dashboard-issues-with-internet-explorer-8/

When implementing a PPS dashboard, the browser of choice should be Internet Explorer 8.

Whereas IE7 only supports up to 2 concurrent connections to the server, IE8 supports up to 6.  This matters because each object on the dashboard (each filter, scorecard, and chart) requires a connection to the server.  Instead of rendering only 1 or 2 objects at a time, IE8 can render up to 6, so dashboards with a lot of objects should see a significant increase in performance.

Posted in PPS | Tagged , | Comments Off on Boost Performance in PerformancePoint Server

View Permissions for Reporting Services in SharePoint Integrated Mode

Setting up security for SSRS in SharePoint integrated mode can be a bit tricky, particularly if you want to set up some of your users to only be able to run reports, but not to be able to modify or change them. If you give the users the standard Contribute permission level in SharePoint, they have the ability to view and execute reports, but they can also delete existing reports or add new ones. If you assign only the Read permission level to the users, they won’t even be able to see the reports in the document library.

Based on my experience, there seems to be a requirement that users have the Edit Items permission, even if you want them to only view reports.  Until that’s resolved, the following will get you to the minimal set of permissions needed. To enable the desired behavior, you have to create a new permission level that grants a specific set of privileges, including the View Items and Edit Items permissions. Note that not all of the permissions listed below are 100% required, but they are recommended for correct SharePoint operation.

To set it up so users can see and run reports, but not add or delete them, do the following:

  1. Go to the site that contains the Reports Library that you want to secure.
  2. Select Site Actions..Site Settings..Modify All Site Settings.
  3. Under Users and Permissions, select Advanced Permissions.
  4. Choose Settings..Permission Levels.
  5. Choose Add a Permission Level.
  6. Give the new permission level a name like "Report Reader", a description like "Minimal permissions to view reports.", and select the following permissions:
    1. List Permissions:
      1. Edit Items (hopefully, the requirement for this will be removed soon, as it gives the user the ability to edit the name and description of the report)
      2. View Items
      3. Open Items
      4. View Versions
      5. View Application Pages
    2. Site Permissions:
      1. View Pages
      2. Browse User Information
      3. Use Remote Interfaces
      4. Open
  7. Click Create to create the new permission level.

Now, you can create a new SharePoint group that uses the Report Reader permission level, or assign it directly to a user.

This seems to be either a bug in the product or a configuration problem on my part. However, if it’s a configuration problem, it’s occurring on all the SharePoint boxes I have access to, and I can’t find any information on how to fix it. If you do have any information on how to avoid giving the Edit Items permission, please post it here in the comments.

Posted in Uncategorized | Comments Off on View Permissions for Reporting Services in SharePoint Integrated Mode

[Double Clicking Bracketed Text in SSMS]

I’m a big fan of SQL Server Management Studio; it’s a really great tool with a tremendous amount of functionality.  Lately I’ve been working in the new 2008 version of SSMS, and a few days ago I noticed some peculiar behavior: when you double-click on text that is surrounded by brackets in an MDX query editor, only the text is selected, not the brackets; however, when you double-click on the same kind of text in a regular SQL query editor, both the text and the brackets are selected – even if the text within the brackets contains spaces.  How interesting is that?! Below are a couple of screenshots to illustrate what I’m talking about:

Selected text in an MDX editor:

Here you can see that I double-clicked on the word “Adventure” and only the word “Adventure” is selected.

Selected text in a regular SQL editor:

Here you can see that I double-clicked on the word “Test” (you’ll just have to trust me on this!) and the entire text within the brackets, including the brackets, is selected.

Posted in SSMS | Tagged , , , | Comments Off on [Double Clicking Bracketed Text in SSMS]

Quick Tip for Specifying Columns in Source Components

It’s generally accepted* that when you are using an OLE DB Source or an ADO.NET Source, you should specify all the columns, rather than selecting the table from the dropdown, or using a SELECT * SQL statement.

If you’re like me, then you probably don’t have the columns in all your tables memorized, and you don’t like typing, so getting a complete list of columns is a bit painful. You either have to use the query designer to create the whole query, or go to SSMS and create the SELECT statement from there. However, there is a simpler way. Just type “SELECT * FROM [your table here]” into the source component’s SQL Command box, and then click the Build Query button. The query designer will automatically expand the * into the specific column names. Just click OK as soon as the designer opens, and you end up with a fully populated column list.
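As a sketch of the trick above, using a hypothetical table: type the wildcard query into the SQL Command box

```sql
SELECT * FROM [dbo].[MyTable]
```

then click Build Query and immediately click OK. The query designer rewrites the statement with the columns expanded, something like:

```sql
SELECT MyKey, MyString, UpdateID, UpdateDate
FROM   dbo.MyTable
```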

 

*For some of the reasons why this is generally accepted from an SSIS perspective, see these posts from Jamie Thomson: http://blogs.conchango.com/jamiethomson/archive/2006/02/21/SSIS_3A00_-SELECT-_2A002E002E002E00_-or-select-from-a-dropdown-in-an-OLE-DB-Source-component_3F00_.aspx and https://blogs.conchango.com/jamiethomson/archive/2006/01/05/2554.aspx (point #4). If you want reasons from a general SQL standpoint, search for “never use SELECT * in SQL” in your favorite search engine.

Posted in Uncategorized | Comments Off on Quick Tip for Specifying Columns in Source Components

Presenting at Midlands SQL PASS Chapter

I’ll be presenting at the Columbia, SC PASS user group on July 7. I’ll be speaking on “Getting Started with Analysis Services 2008”, which is a demo-heavy introduction to using SQL Server Analysis Services 2008. If you want to use SSAS, but aren’t sure quite where to start, this session should help.


If you are in the Columbia area on July 7, please drop by. You can find more information here:

http://midlands.sqlpass.org/

Posted in Uncategorized | Comments Off on Presenting at Midlands SQL PASS Chapter