Reporting Services (SSRS/MSRS) 2008 Error: Set used with the complement operator must have all members from the same level
When you use the Not In operator in an SSRS 2008 MDX query filter to exclude a named set, SSRS uses the complement operator in the constructed MDX. That's fine as long as "all members [are] from the same level." Since you got this error, they're not 😉 You can work around it by using the Except() MDX function instead of letting SSRS use the complement operator.
In the ReportServerService log, you'll see something like this:
Microsoft.AnalysisServices.AdomdClient.AdomdErrorResponseException: Query (..., ...) Set used with the complement operator must have all members from the same level.
The failing filter, as configured in the SSRS query designer:

Hierarchy: Calendar Date
Operator: Not In
Filter Expression: [Today]

And the working version, using Except() in the filter expression instead:

Hierarchy: Calendar Date
Filter Expression: Except([Time].[Calendar Date].[Calendar Date].MEMBERS, [Today])
UPDATE 2012-03-16: Please also take a look at Slowly Changing Dimensions with MD5 Hashes in SSIS, which we've determined to be the fastest, most efficient approach to maintaining Type 1 dimensions.
A few days ago, one of our SSIS packages that maintained a Type 1 Slowly Changing Dimension (SCD) of about 1 million rows crept up to 15 minutes of runtime. That doesn't sound too bad, but this is part of our hourly batches, so 15 minutes is 25% of our entire processing window. The package was using the Slowly Changing Dimension Wizard transformation - the standard OLEDB Source (which basically represented how the SCD "should" look) feeding the SCD transform and letting it figure out what needed to be inserted and updated. One option was to switch to lookups instead of the SCD Wizard to speed things up, maybe even some fancy checksum voodoo for the updates (see http://blog.stevienova.com/2008/11/22/ssis-slowly-changing-dimensions-with-checksum/ for an example). Then, after thinking about it a little more - why are we sending a million rows down the pipeline every hour? We know only a small percentage of them are new, and another small percentage needs to be updated. We could just write a quick SQL query to get exactly those sets, and the package would be much more efficient!
Wait a tick - why would we hand the rows to SSIS if all it's going to do is insert one set and update the other? Let's just do it all in T-SQL:
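A minimal sketch of what that looks like - the staging and dimension table names (stg.DimCustomer, dbo.DimCustomer) and columns are illustrative assumptions, not the actual tables from our warehouse:

```sql
-- Type 1 SCD maintenance in plain T-SQL (illustrative names).
-- 1) Update existing rows whose attributes have drifted from the source.
UPDATE d
SET    d.CustomerName = s.CustomerName,
       d.City         = s.City
FROM   dbo.DimCustomer d
INNER JOIN stg.DimCustomer s
       ON d.CustomerKey = s.CustomerKey
WHERE  d.CustomerName <> s.CustomerName
   OR  d.City <> s.City;

-- 2) Insert rows that don't exist in the dimension yet.
INSERT INTO dbo.DimCustomer (CustomerKey, CustomerName, City)
SELECT s.CustomerKey, s.CustomerName, s.City
FROM   stg.DimCustomer s
WHERE  NOT EXISTS (SELECT 1
                   FROM dbo.DimCustomer d
                   WHERE d.CustomerKey = s.CustomerKey);
```

On SQL Server 2008+ a single MERGE statement can do both in one pass, but the two-statement version above runs fine on 2005 as well.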
I run this one pretty frequently when we need to figure out which procs are killing a complex ETL process and what exactly about them is making the server cry. On a development server, I'll run DBCC FREEPROCCACHE and DBCC DROPCLEANBUFFERS, run the entire set of ETLs, then run this query and dig deeper into the query plans that look suspect (high *Scan counts usually, sometimes lots of Hash Matches or Merge Joins). On a production server, clearing the proc cache and dropping clean buffers can be problematic, so I'll often just run the query after a scheduled ETL run. If you want to see the query plans mapped out visually, click on the query_plan value and SSMS will open up the XML. Save that XML as a .sqlplan file, close the XML, and then open the .sqlplan file.
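A sketch of the kind of plan-cache query described above - the TOP 25 cutoff and the ordering by total worker time are assumptions; sort by reads or elapsed time as your situation demands:

```sql
-- Top cached statements by CPU, with the plan XML for drill-down.
SELECT TOP 25
       DB_NAME(st.dbid)                  AS database_name,
       OBJECT_NAME(st.objectid, st.dbid) AS object_name,
       qs.execution_count,
       qs.total_worker_time,
       qs.total_logical_reads,
       qs.total_elapsed_time,
       st.[text]                         AS statement_text,
       qp.query_plan                     -- click this in SSMS to open the XML
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle)    st
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) qp
ORDER BY qs.total_worker_time DESC;
```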
While working with some SSIS logging, it occurred to me that there's really no good way to globally apply the same logging to all packages on your server. For our production SSIS server, it would make sense to log all OnError events to the Application event log. Then, whatever event log monitoring app you use can notify you of the package error (with details!). Thus, this app was born - an app that removes pre-existing LogProviders and adds in the intended LogProvider and options.
This is really just an example of the implementation that worked for my purposes - I'm sure some of you would prefer to log elsewhere. This should be enough to get you going. Please drop a line in the Comments if you post alternative approaches.
SSISForcedLogging.Console.exe "Z:\SSIS Packages" or SSISForcedLogging.Console.exe "Z:\SSIS Packages\Package1.dtsx"
Originally, we were charged with figuring out how to display SSAS cube measure descriptions via ToolTip in Excel 2007. If that's your plan, forget it - after some reading up on the interwebs, it appears that Excel doesn't even request the Description property. Additionally, if you want to add a description to Calculated Members, you have to hack it in (yuck).
So we went with a simple, albeit relatively crude (but effective), alternative - implementing a URL action for Cells so users can easily link out to a definition of the measure they're looking at.
Create a new action in your cube (Open up the cube definition, Actions tab) and configure similar to this:
Name: View Member Definition
Action Target
    Target Type: Cells
    Target object: All cells
Action Content
    Type: URL
    Action expression: "http://i.domain.com/doc/Defs.aspx#" + [Measures].CurrentMember.Name
Additional Properties
    Invocation: Interactive
    Description: View Member Definition
    Caption: "View Definition Of " + [Measures].CurrentMember.Name + "..."
    Caption is MDX: True
I ran into this the other day, did some googling, and really did not like what I saw for workarounds. In SQL Server 2005, when an XMLA job step fails (returns an Exception node in the XML response), the job step still reports success (because it defines success as "did I get a response?") - this has been fixed in SQL Server 2008. Common workarounds use ascmd.exe or SSIS to handle the XMLA commands - both of which add a lot of complexity for a simple problem. So, I came up with a workaround that checks the text of the previous job step for the substring "<Exception ". It's been working thus far, with no issues.
After each XMLA command step, insert a T-SQL step to verify that the XMLA command step succeeded:
DECLARE @JobName VARCHAR(64)
SET @JobName = 'Name Of Job This Step Belongs To'

DECLARE @Message VARCHAR(1024)
SELECT TOP 1 @Message = CAST([message] AS VARCHAR(1024))
FROM msdb.dbo.sysjobhistory a
INNER JOIN msdb.dbo.sysjobs b
    ON a.job_id = b.job_id
    AND b.[name] = @JobName
ORDER BY run_date DESC, run_time DESC, step_id DESC

IF @Message LIKE '%<Exception %'
    RAISERROR (@Message, 17, 1)
UPDATE (2009-04-03): Added ", step_id DESC" to the ORDER BY clause - when the XMLA job fails instantly (say you tried to process a nonexistent partition), run_time doesn't have enough granularity to sort properly.
Once you're done, your job steps will look something like this:
During impact analysis for any changes to existing database tables, cube dimensions, cube measures, etc., it's nice to know which reports are going to horribly break before your end users let you know about it 😉 All of the RDL content for reports uploaded to SSRS is stored in the ReportServer database, in the Content column of the Catalog table (in binary, of course), so here's what I came up with to get the list of soon-to-be-broken reports:
SELECT [Path], ContentText
FROM (
    SELECT [Path],
           CAST(CAST([Content] AS VARBINARY(8000)) AS VARCHAR(8000)) AS ContentText
    FROM [Catalog] cat WITH (NOLOCK)
    WHERE [Type] = 2
) a
WHERE ContentText LIKE '%ColumnName%'
   OR ContentText LIKE '%columnname%'
   OR ContentText LIKE '%MeasureName%'
   OR ContentText LIKE '%AttributeName%'
   OR ContentText LIKE '%etc...%'
Say you want to run the same MDX query for each row in a given rowset. I needed to do this for alerting purposes, where there were different alert thresholds for different attribute values in a given dimension attribute. After struggling with passing a variable to the query argument of the OPENROWSET command, I finally found the documentation that clearly states that the query argument CAN'T be a variable. Or a concatenation of a string and a variable. I still don't understand why... but the suggested workaround is to construct a giant T-SQL string and run it using the EXEC command.
Ok - but how do we get the results of the query? Basically, the only way to do this is to create a temporary table in the current scope and INSERT INTO that temp table from inside your giant T-SQL string. It all ends up looking something like this:
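A sketch of that pattern - the #AlertThresholds driver table, the MSOLAP connection string, and the cube, dimension, and measure names are all illustrative stand-ins; only the EXEC/temp-table mechanics come from the approach described above:

```sql
-- Run the same MDX for each row of a driver table and collect the
-- results in a temp table created in the current scope.
CREATE TABLE #Results (MemberKey VARCHAR(50), MeasureValue FLOAT);

DECLARE @MemberKey VARCHAR(50), @SQL VARCHAR(MAX);

DECLARE c CURSOR LOCAL FAST_FORWARD FOR
    SELECT MemberKey FROM #AlertThresholds;
OPEN c;
FETCH NEXT FROM c INTO @MemberKey;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- OPENROWSET's query argument must be a literal, so the entire
    -- statement is built as one big string and EXECed.
    SET @SQL = '
        INSERT INTO #Results (MemberKey, MeasureValue)
        SELECT ''' + @MemberKey + ''',
               CONVERT(FLOAT, "[Measures].[Some Measure]")
        FROM OPENROWSET(''MSOLAP'',
            ''Data Source=SSASServer;Initial Catalog=CubeDB'',
            ''SELECT [Measures].[Some Measure] ON COLUMNS
              FROM [Cube]
              WHERE ([Dim].[Attribute].&[' + @MemberKey + '])'')';
    EXEC (@SQL);
    FETCH NEXT FROM c INTO @MemberKey;
END
CLOSE c;
DEALLOCATE c;

SELECT * FROM #Results;
```

Because #Results is created in the current scope, the EXECed string can see it and insert into it, which is what makes the whole workaround hang together.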
We wanted to export an SSIS package that was stored on the server in the msdb.dbo.sysdtspackages90 table to a .dtsx file so we could poke at it.
Here's what we came up with:
DECLARE @SQLcommand VARCHAR(MAX)
SET @SQLcommand = 'bcp "SELECT CAST(CAST(packagedata AS VARBINARY(MAX)) AS VARCHAR(MAX)) FROM msdb.dbo.sysdtspackages90 WHERE name = ''PackageName''" queryout "c:\output.dtsx" -T -c'
EXEC xp_cmdshell @SQLcommand
Alternatively, you can just Export the package via SSMS 😉