tim laqua dot com - Thoughts and Code from Tim Laqua

23Jan/13

Hiding SSAS Attributes With Powershell

The challenge here was that we have a cube with some great data as well as some low-level detail (customer detail, think PII) that we want to expose to a larger audience. The problem is that the low-level detail is not something the new audience needs (or, in some cases, is allowed) to have access to. The obvious answer here is dimension security - why not use that? Two reasons. First, dimension security is slow. Second, even if it weren't slow, hiding all the data in many (30+) attributes is tedious to set up, and whenever new attributes were added we would have to remember to disable access to those as well. To be clear, we're not just hiding attributes here - we're creating an entire copy of the existing SSAS database and hiding attributes in the copy.

# Import Required Libraries
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") >$NULL
 
# Connect to server
$server = New-Object Microsoft.AnalysisServices.Server
$server.Connect('localhost')
 
# Make a metadata copy of the existing cube
$newdb = $server.Databases.GetByName('POS Data').Clone()
 
# Change ID and Name of SSAS DB
$newdb.Name = 'POS Data - Limited'
$newdb.ID = 'POS Data - Limited'
 
# Drop the existing copy if it's already on the server (Drop() on a missing DB would throw)
$oldDb = $server.Databases.FindByName('POS Data - Limited')
if ($oldDb -ne $null) { $oldDb.Drop() }
 
# Add the new copy to the server
$server.Databases.Add($newdb)
 
# Sync our copy of the new database with the server's copy
$newdb.Update([Microsoft.AnalysisServices.UpdateOptions]::ExpandFull)
 
# Grab the cube we want to work with from the new database
$cube = $newdb.Cubes.GetByName('POS Data')
 
# Hide the Location Dimension
$cube.Dimensions.GetByName('Location').Visible = $false
 
# Hide all attributes and hierarchies in the Customer dimension
$cube.Dimensions.GetByName('Customer').Attributes |  %{$_.AttributeHierarchyEnabled=$false; $_.AttributeHierarchyVisible=$false;}
$cube.Dimensions.GetByName('Customer').Hierarchies | %{$_.Visible=$false; $_.Enabled=$false;}
 
# Enable the key attribute in the customer dimension - it won't work if the key isn't enabled
$cube.Dimensions.GetByName('Customer').Attributes['Dim Customer'].AttributeHierarchyEnabled=$true
 
# Enable the Market attribute in the customer dimension
$cube.Dimensions.GetByName('Customer').Attributes['Market'] | %{$_.AttributeHierarchyEnabled=$true; $_.AttributeHierarchyVisible=$true;}
 
# Hide the Location Database attribute in the Product dimension
$cube.Dimensions.GetByName('Product').Attributes['Location Database'] | %{$_.AttributeHierarchyEnabled=$false; $_.AttributeHierarchyVisible=$false;}
 
# Add a new member to the role granting Read Only permissions in the cube
$newMember = new-object Microsoft.AnalysisServices.RoleMember("domain\tlaqua")
$newdb.Roles['Role 1'].Members.Add($newMember)
 
# Push our updates to the server
$newdb.Update([Microsoft.AnalysisServices.UpdateOptions]::ExpandFull)
 
# Process the new database
$newdb.Process([Microsoft.AnalysisServices.ProcessType]::ProcessFull)
 
# Disconnect from the server
$server.Disconnect()

This approach has two notable downfalls. First, you have to think up a different database name because the original database with the low-level detail still exists on the server. Second, you have to ProcessFull the clone you made - Clone() copies the metadata, not the data. All in all, it works great for us; this particular cube is only processed once a week, and the 20 minutes we lose processing data into the clone is more than acceptable (it looks really clean to the users as well).

3Feb/12

SSAS Cache Isn’t Making Cents

I stole the pun from my Business Analyst, Mr. John Seiler 😉 Now on to my issue - when SSAS caches the value for my [Actual] measure, it seems to do so based on the results of the first query that requests that coordinate. In this particular cube, there are bunches of tiny fractions, and depending on how you slice it, it aggregates a little differently. This is a fun problem in itself, but the part that drives me (and the Finance department) crazy is that if you slice on something OTHER than whatever that first query sliced on, the values they see don't always add up to the "Grand Total" in Excel - aka "All".

These are the queries used for this test:

Query A

SELECT 
	{Actual} ON 0
FROM [GL] 
CELL PROPERTIES VALUE

Query B

SELECT 
	{Actual} ON 0,
	NON EMPTY Hierarchize({DrilldownLevel({[Account].[Accounts].[All]},,,INCLUDE_CALC_MEMBERS)}) ON 1
FROM [GL] 
CELL PROPERTIES VALUE

Query C

SELECT 
	{Actual} ON 0,
	NON EMPTY Hierarchize({DrilldownLevel({[Date].[Fiscal].[All]},,,INCLUDE_CALC_MEMBERS)}) ON 1
FROM [GL] 
CELL PROPERTIES VALUE
GO

Results (Cache was cleared with ClearCache on the entire Database before each Series)

[Results table: Series 1, Series 2, Series 3]

So basically the Grand Total of this GL cube is a random number based on whoever sneaks the first query in after the cache is cleared (processing, sync, etc).

And for all of you who think the MDX script breaks everything everywhere - I did comment out my entire MDX script before running these tests.
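For reference, here's one way to run that ClearCache against the entire database - a minimal sketch via AMO (the server name and DatabaseID are placeholders; point them at your own GL database):

# Import Required Libraries
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") >$NULL

# Connect to server
$server = New-Object Microsoft.AnalysisServices.Server
$server.Connect('localhost')

# Execute a ClearCache XMLA command scoped to the whole database
$server.Execute(@"
<ClearCache xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>GL</DatabaseID>
  </Object>
</ClearCache>
"@) >$NULL

$server.Disconnect()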

6Nov/10

Analysis Services Writeback – Working with Weight Expressions

While writeback has been around for a while, it didn't really get easy to use until Excel 2010 was released, because before that, wrapping a UI around it was cumbersome. Now that we have a simple writeback UI via the What-If Analysis dialogs in Excel, it's probably worth poking around at how to get things to allocate the way you want.

Here's our starting point in the Adventure Works cube (note, I commented out all the scope statements in the MDX script that distribute quarterly quotas/targets to months - in AW, quotas only exist in the data warehouse at the calendar quarter level):

Let's start with something simple - Equal Allocation:
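Under the covers, Excel's What-If dialogs generate UPDATE CUBE statements. Here's a minimal sketch of an equal allocation done by hand through ADOMD.NET - the connection string and member keys are assumptions, so substitute a real fiscal quarter from your copy of Adventure Works:

# Load the ADOMD.NET client
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices.AdomdClient") >$NULL

$conn = New-Object Microsoft.AnalysisServices.AdomdClient.AdomdConnection("Data Source=localhost;Catalog=Adventure Works DW 2008")
$conn.Open()
$cmd = $conn.CreateCommand()

# Writeback happens inside a session transaction
$cmd.CommandText = "BEGIN TRAN"
$cmd.ExecuteNonQuery() >$NULL

# Spread 1,000,000 evenly across the writeable leaf cells under this quarter
# (the measure and member keys are assumptions - adjust for your database)
$cmd.CommandText = @"
UPDATE CUBE [Adventure Works]
SET ([Measures].[Sales Amount Quota],
     [Date].[Fiscal].[Fiscal Quarter].&[2008]&[1]) = 1000000
USE_EQUAL_ALLOCATION
"@
$cmd.ExecuteNonQuery() >$NULL

# The change lives in the session's writeback cache until you commit it
$cmd.CommandText = "COMMIT TRAN"
$cmd.ExecuteNonQuery() >$NULL

$conn.Close()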

6Apr/10

Excel 2007 Hangs When Trying to Edit an OLAP Cube Filter

Ok, it doesn't hang every time you try to edit an OLAP cube filter, but sometimes - it appears to. In reality, I've never seen it permanently hang - just kind of go away for a while. Here's the basic symptom that the business will report to you:

"Excel freezes (or hangs) when I try to change this filter"

Most of us have seen this at one point or another and shrugged it off as a busy time, or processing going on, or cats clogging up the tubes, etc. Tonight, I finally decided to figure out what's causing it.

15Jan/10

Clone Analysis Services Partitions with PowerShell

Most of us with large Analysis Services cubes partition them by month, year, or some other time-based slice, and we have all, at one point or another, developed some way to create partitions for new months on demand. Often, the solution seems to be a C# console application or SSIS package that uses AMO to create a new partition based on an existing one. The problem I see with this is that maintaining it requires opening up the project or package, making changes, re-compiling, deploying, testing, deploying to production, verifying, etc. It also requires that whoever maintains it is comfortable with C#.

To simplify the maintenance and get rid of the "black box" factor that utility apps like this tend to have, I put together a PowerShell script to do the same thing and a stored procedure to call the script. Really, it doesn't matter what you use as you're most likely using an almost identical chunk of code to get your new partition created - my argument is that using PowerShell instead of C# or SSIS reduces the cost of maintenance, improves readability, and facilitates better understanding throughout your team.
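The core of any version of this is just a handful of AMO calls. Here's a minimal sketch of the clone-and-repoint step - the database, cube, measure group, and partition names are assumptions, as is the source query, and it assumes the template partition is query-bound:

# Import Required Libraries
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") >$NULL

# Connect and grab the measure group that holds the monthly partitions
$server = New-Object Microsoft.AnalysisServices.Server
$server.Connect('localhost')
$mg = $server.Databases.GetByName('POS Data').Cubes.GetByName('POS Data').MeasureGroups.GetByName('POS Facts')

# Clone the most recent partition and repoint the copy at the new month
$template = $mg.Partitions.GetByName('POS Facts 201001')
$new = $template.Clone()
$new.ID = 'POS Facts 201002'
$new.Name = 'POS Facts 201002'
$new.Source = New-Object Microsoft.AnalysisServices.QueryBinding($template.Source.DataSourceID, 'SELECT * FROM dbo.FactPOS WHERE MonthKey = 201002')

# Add the copy to the collection and push just the new partition to the server
$mg.Partitions.Add($new) >$NULL
$new.Update()

$server.Disconnect()

How you call it from the stored procedure side (xp_cmdshell, a SQL Agent PowerShell step, etc.) is up to your shop.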

19Nov/09

Wouldn’t it be fun if Cubes could talk?

I didn't say "wouldn't it be useful" because after putting a test together, asking a cube questions with no context tends to return answers that it probably shouldn't have returned. In BI, it is incredibly important to understand what exactly it is you're asking for - if we just say we want "sales" and return an answer, nobody really knows what we meant by "sales." Sure, in various circles, "sales" means the same thing - but once you start talking to different areas, departments, etc - the meaning of the word starts to shift.

But I digress - asking cubes questions is still pretty fun and some of the random things it returns when you point it at your own cubes can be flat out hilarious.

Here are a few questions thrown at the Adventure Works cube in the Adventure Works DW 2008 Analysis Services database:

24Oct/09

Charting Analysis Services 2008 Trace Events

The other day I was running some Analysis Services traces for some reason or another and ran across Chris Webb's post on Visualising Analysis Services Trace Information in Reporting Services. After looking over that post, I thought it'd be interesting to visualize various types of processing, queries, etc. - basically, take a trace and then graph out what happened and when. Here are a few samples of what the project yielded:

  • Red: Cube Processing on a 2 Minute schedule
  • Green: Cube Processing on a 1 hour schedule
  • Blue: Cube Processing on a 1 hour schedule
  • Black: Query Activity

Most of the activity here is from SSRS subscriptions firing around 8AM:

[Chart: 8AM MSRS Subscription Processing]
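If you want to try something similar, the trace data first has to land somewhere you can query from Reporting Services. One way - a sketch, assuming you saved the Profiler trace to a .trc file (the path and table name are made up) - is fn_trace_gettable:

-- Load the saved trace file into a table SSRS can query
SELECT *
INTO dbo.TraceResults
FROM fn_trace_gettable('C:\Traces\ssas_trace.trc', DEFAULT)

-- Event counts by minute - the kind of series the charts above are built from
SELECT EventClass,
       DATEADD(MINUTE, DATEDIFF(MINUTE, 0, StartTime), 0) AS EventMinute,
       COUNT(*) AS Events
FROM dbo.TraceResults
GROUP BY EventClass, DATEADD(MINUTE, DATEDIFF(MINUTE, 0, StartTime), 0)
ORDER BY EventClass, EventMinute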

7Mar/09

SQL Server Analysis Services Command (XMLA) Agent Job Step Reports Success On Command Failure

I ran into this the other day, did some googling, and really did not like what I saw for workarounds. In SQL Server 2005, when an XMLA job step fails (returns an Exception node in the XML response), the job step still reports success (because it defines success as "did I get a response?") - this has been fixed in SQL Server 2008. Common workarounds use ascmd.exe or SSIS to handle the XMLA commands (both of which add a lot of complexity for a simple problem). So I came up with a workaround that checks the text of the previous job step for the substring "<Exception ". It's been working thus far with no issues.

After each XMLA command step, insert a T-SQL step to verify that the XMLA command step succeeded:

DECLARE @JobName VARCHAR(64)

SET @JobName = 'Name Of Job This Step Belongs To'

DECLARE @Message VARCHAR(1024)

-- Grab the message from the most recent step of this job's history
SELECT TOP 1 @Message = CAST([message] AS VARCHAR(1024))
FROM msdb.dbo.sysjobhistory a
INNER JOIN msdb.dbo.sysjobs b
ON a.job_id = b.job_id AND b.[NAME] = @JobName
ORDER BY run_date DESC, run_time DESC, step_id DESC

-- Fail this step (and the job) if the XMLA response contained an Exception node
IF @Message LIKE '%<Exception %' RAISERROR (@Message, 17, 1)

UPDATE (2009-04-03): Added step_id DESC to the ORDER BY clause - when the XMLA step fails instantly (say you tried to process a nonexistent partition), run_time doesn't have enough granularity to sort properly.

Once you're done, your job steps will look something like this:

[Screenshot: each XMLA command step followed by its T-SQL verification step]

21Jul/08

Dynamic MDX queries in TSQL

Say you want to run the same MDX query for each row in a given rowset. I needed to do this for alerting purposes, where there were different alert thresholds for different attribute values in a given dimension attribute. After struggling with passing a variable as the query argument of the OPENROWSET command, I finally found the documentation that clearly states the query argument CAN'T be a variable - or even a concatenation of a string and a variable. I still don't understand why... but the suggested workaround is to construct a giant TSQL string and run it using the EXEC command.

Ok - but how do we get the results of the query? Basically, the only way is to create a temporary table in the current scope and do an INSERT INTO that temp table inside your giant TSQL string. It all ends up looking something like this:
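A minimal sketch of the pattern - the MSOLAP connection string, the [GL] cube, and the dbo.AlertThresholds table are all assumptions; the parts that matter are the string building, the EXEC, and the temp table that stays in scope:

-- A temp table created here is still visible inside EXEC'd dynamic SQL
CREATE TABLE #Results ([Market] VARCHAR(128), [Actual] FLOAT)

DECLARE @Market VARCHAR(128), @SQL VARCHAR(MAX)

DECLARE MarketCursor CURSOR FOR
SELECT MarketName FROM dbo.AlertThresholds

OPEN MarketCursor
FETCH NEXT FROM MarketCursor INTO @Market

WHILE @@FETCH_STATUS = 0
BEGIN
	-- OPENROWSET's query argument must be a literal, so the whole statement
	-- gets built as one big string and EXEC'd
	SET @SQL = '
	INSERT INTO #Results ([Market], [Actual])
	SELECT ''' + @Market + ''',
		CONVERT(FLOAT, CONVERT(NVARCHAR(50), "[Measures].[Actual]"))
	FROM OPENROWSET(''MSOLAP'',
		''Data Source=localhost;Initial Catalog=GL'',
		''SELECT {[Measures].[Actual]} ON 0 FROM [GL]
		  WHERE ([Market].[Market].&[' + @Market + '])'')'
	EXEC (@SQL)

	FETCH NEXT FROM MarketCursor INTO @Market
END

CLOSE MarketCursor
DEALLOCATE MarketCursor

SELECT * FROM #Results

Ugly, but since OPENROWSET refuses variables, building the whole INSERT ... SELECT as a string is about the only game in town.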