
Scale Up or Scale Out for SQL Server Data Warehouses

Historically, scale up has been the model for Microsoft data warehouses. Running a large, multi-terabyte data warehouse meant buying a lot of hardware for a single server and hoping it would be enough once the warehouse was fully loaded and in use. If the hardware wasn’t sized properly, you could be looking at a big outlay for a new server with more memory, disk, and CPU capacity.

Over the past several months, though, there have been a number of announcements in the SQL Server space that change that. We now have the option of scaling our warehouses up or out. Project “Madison”, the integration of the massively parallel processing (MPP) technology from the DATAllegro acquisition, promises to let SQL Server 2008 scale out to hundreds of terabytes by distributing processing across multiple commodity servers. Even though it hasn’t been officially released yet, I’ve seen several demos of the functionality, and it looks promising. The advantage of this approach is that when you need additional capacity, you simply add more servers.
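
To make the scale-out idea a little more concrete, here’s a minimal, purely conceptual sketch (in Python) of the scatter/gather pattern that MPP systems follow: each node computes a partial aggregate over its own slice of the data, and a control node combines the partial results. The node partitions and the aggregate below are invented for illustration – this is not how “Madison” is actually implemented, just the general idea.

from concurrent.futures import ThreadPoolExecutor

# Illustration only: pretend each list is the slice of a fact table
# held by one commodity node in the appliance.
node_partitions = [
    [120.00, 75.50, 310.00],   # node 1's rows (sales amounts)
    [98.25, 410.10],           # node 2's rows
    [55.00, 67.50, 89.00],     # node 3's rows
]

def partial_aggregate(rows):
    # Each "node" computes its local SUM and COUNT independently.
    return sum(rows), len(rows)

# Scatter the work to all nodes in parallel, then gather the partial results.
with ThreadPoolExecutor() as pool:
    partials = list(pool.map(partial_aggregate, node_partitions))

# The control node combines the partial results into the final answer.
total = sum(s for s, _ in partials)
count = sum(c for _, c in partials)
print(f"SUM = {total:.2f}, AVG = {total / count:.2f}")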

On the scale up front, last week Microsoft announced “SQL Server Fast Track Data Warehouse”, a set of reference architectures for symmetric multiprocessing (SMP) data warehousing. These are single-server configurations that are optimized for data warehousing workloads and have been tested and validated, which takes much of the guesswork out of sizing your data warehouse server. However, you still have to provide good estimates of query volume and data size to use the reference architectures effectively.

So now the question becomes, should you target a scale up or scale out approach for your data warehouse? One of the deciding factors is going to be your data volume. The Fast Track reference architectures are currently targeted towards 4 to 32 terabyte warehouses. Given current hardware restrictions, that’s the practical limit for a single server. However, as the hardware continues to get better, that number is expected to go up. “Madison”, on the other hand, can scale well past 32 terabytes. So if your current data needs are greater than 32 terabytes, I’d be looking closely at “Madison”.

What if your current needs are less than 32 terabytes, but you expect to grow past that point over the next couple of years? Well, fortunately, the Fast Track reference architectures are designed to offer an easy transition to “Madison”, when your needs grow to that point. And if you expect your data volumes to stay below the 32 terabyte mark, then the Fast Track reference architectures certainly offer a greater degree of confidence that you are getting the appropriate configuration for your warehouse.
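
To pull the guidance above together, here’s a toy sketch (again in Python) of the decision logic as I understand it today. The 32 terabyte figure comes from the current Fast Track range discussed above; the function and its parameters are purely my own illustration, not anything Microsoft publishes.

def recommend_architecture(current_tb, projected_tb):
    # 32 TB is the upper end of the current Fast Track (SMP) range;
    # the rest of this function is just an illustration of the decision logic.
    FAST_TRACK_MAX_TB = 32

    if current_tb > FAST_TRACK_MAX_TB:
        return 'Scale out: look closely at "Madison" (MPP) now.'
    if projected_tb > FAST_TRACK_MAX_TB:
        return 'Start with a Fast Track (SMP) configuration and plan a transition to "Madison".'
    return 'Stay with a Fast Track (SMP) reference architecture.'

print(recommend_architecture(current_tb=10, projected_tb=50))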

It’s always nice to have options, and improving the scaling capabilities of SQL Server should certainly help Microsoft in the large data warehouse marketplace. However, the roadmap for how this might apply to the Analysis Services component of SQL Server hasn’t been directly addressed yet. It would seem logical to offer the same sort of solutions in that space, and it will be interesting to see which direction Microsoft takes.

Impact of the PerformancePoint Server Changes

If you follow business intelligence news at all, then you probably saw Microsoft’s announcement last week that PerformancePoint is becoming a component of SharePoint. However, it won’t be all of PerformancePoint – the Plan portion will see one additional service pack (SP3), after which development will cease. The Monitor and Analyze portions of the product will become part of the SharePoint Enterprise license.

Reaction has been mixed. Generally, people see the advantage in including the Monitor and Analyze functionality in SharePoint, as it will open that functionality to a much broader audience. This lines up nicely with the “BI for the masses” vision that Microsoft has been working toward for several years, and with the more recent marketing message, “People-ready BI”. Since SharePoint is becoming the place where many users go to do their work, it makes sense to incorporate their BI tools in the same location. I think that offering PerformancePoint Services (the new name for the Monitor and Analyze functionality under SharePoint) as part of SharePoint will make it easier to include BI functionality in new applications and will lower the barrier to adopting this functionality in organizations of all sizes.

The negative reactions center on two things: discontinuing Plan, and not having a full-client story (besides Excel). I understand the reactions around discontinuing Plan. Version 1 had some rough edges (OK, a lot of rough edges), but Microsoft has a history of quickly releasing subsequent versions with much better functionality, usually arriving at a very good product by version 3. Breaking that pattern caught a lot of people by surprise. Version 1, while lacking in a few key areas, was definitely usable: some of Microsoft’s customers are using it in production, and even more partners made significant investments in it. Fortunately, while Mariner had done some work with Plan, we had not invested heavily in it. We were more focused on the Monitor and Analyze portions of the product, in part because we recognized that performance management is a specialized discipline requiring specific skill sets. Just because you can deliver successful solutions on Microsoft technology doesn’t necessarily mean you can deliver successful performance management solutions. I think that was a point of confusion for many partners (the “one stop shop” approach is very popular in the partner community), and it led to Microsoft not having as strong a partner base to support the product as they had hoped. On the other hand, there were some really strong partners in the performance management space who did great things with Plan, and I can certainly empathize with those who made big investments and are now disappointed by the change in direction.

Mauro Cardarelli, a SharePoint MVP, had an interesting post arguing that making PerformancePoint available as part of SharePoint raises the same concern: competent delivery of SharePoint solutions doesn’t necessarily correlate to competent delivery of BI functionality, and successful delivery of BI solutions doesn’t mean that you can deliver good SharePoint solutions. Since this was one of the challenges for Plan, it will be interesting to see how it plays out going forward. In the short term, I’d encourage companies to make sure their vendors either have both sets of skills (and can demonstrate that they’ve used them in the same project) or look for best-of-breed partners who are willing to work together.

The full-client story is a concern. The current direction seems to be for Excel to become the full client for consuming Analysis Services data, and for SharePoint to become the thin-client interface. I’m definitely in favor of SharePoint as the thin-client interface, but using Excel as the full client leaves a pretty big gap in the story. It used to be that you could recommend the ProClarity desktop client to fill that gap, but since ProClarity is in support-only mode now, that’s no longer a good option. In time, more of ProClarity’s functionality should surface in Excel and SharePoint, but that’s still some way off. And Excel, while improving as an Analysis Services client, is still not on par with a dedicated desktop client built to expose the full functionality of Analysis Services. Hopefully that will improve over the next couple of releases of Excel, but in the meantime it creates opportunities for third parties to fill the gap.

Overall, I think this move will promote broader adoption of the Monitor and Analyze functionality in Microsoft’s customer base, and will strengthen the value proposition for moving to SharePoint Enterprise licenses. It’s a good thing for Microsoft, and good for customers who have already invested in SharePoint. However, it remains to be seen what impact the lack of a planning component and a strong full-client application in the BI stack will have.

Some other reactions from around the web:

Chris Webb (he also has a set of links featuring other reactions)

Nigel Pendse

Cindi Howson