tim laqua dot com Thoughts and Code from Tim Laqua


SSIS, BIDS and 64bit System DSNs: Who’s on first?

After an hour or so of changing settings, package configurations, protection levels, passwords, connection strings, etc - you might just still have yourself one of those dreaded red boxes (and a headache). And you thought inserting data using ODBC was easy...

You might even have yourself some logging that indicates an error like this:
[AS400 [19]] Error: ADO NET Destination has failed to acquire the connection {43E6AE37-24E8-46F6-8AB0-689DB6531167}. The connection may have been corrupted.

This article assumes you are using a System DSN for your connection. It's based on an ODBC DSN using the iSeries Access ODBC Driver to write to DB2, but it applies to any ODBC driver with 32-bit and 64-bit versions where a System DSN is used for the connection.

Before going on, you should understand and have checked the PackageProtectionLevel property in your package. To get any connection that requires a password to run in BIDS, you can't use DontSaveSensitive (unless you're using package configurations to enter the password or an explicit connection string). This is by far the most common reason connections fail to validate - but there are many resources out there already on that situation, so I won't go into it.

Now that we know it SHOULD work - why doesn't it? In short, because you don't have a 64-bit version of the System DSN you selected. What sense does that make? Let me show you.
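The short version of where those DSNs live: on 64-bit Windows, the 32-bit and 64-bit ODBC stacks keep System DSNs in two separate registry hives, so a DSN created in the 32-bit ODBC Administrator simply doesn't exist as far as a 64-bit process is concerned. A minimal sketch of the mapping (Python here just to illustrate the registry paths):

```python
def system_dsn_hive(process_is_64bit: bool) -> str:
    """Registry path a process searches for System DSNs on 64-bit Windows."""
    if process_is_64bit:
        # Native hive: only DSNs created with the 64-bit ODBC Administrator
        return r"HKEY_LOCAL_MACHINE\SOFTWARE\ODBC\ODBC.INI"
    # WOW64 registry redirection: 32-bit processes (like BIDS) are silently
    # pointed at a separate hive, holding only 32-bit DSNs
    return r"HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\ODBC\ODBC.INI"
```

So a package that validates fine in the 32-bit designer can still fail the moment a 64-bit runtime goes looking for the same DSN name.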


Monitoring and Starting Data Collector Sets with Powershell

I figure it's about time I posted an update to my previous post on Monitoring, Starting, and Stopping Perfmon Performance Counter Logs using VBScript. In the years since that post, a few operating systems have come out and many more interesting languages have come along. So this update is in Powershell. You could do it in pretty much any language, but Powershell seems to be my weapon of choice lately.

The basic premise here is that we want our Data Collector Sets to always be running. Seems like a reasonable request, but for various reasons, Data Collector Sets just... stop. This script should be run regularly (every 5 minutes?) to ensure that the sets keep collecting data. In the event that the script finds some work to do, it will send you an email.

#======= START Config ========
$emailServer = "";
$fromAddress = "you@yourcompany.com";
$toAddress = "you@yourcompany.com";
$ignorePattern = '^(RandomCollectorSet|OSTest)$';
#======== END Config =========
$ErrorActionPreference = "SilentlyContinue"
$serverName = Get-Content env:computername;
$emailBody = ""
$collectors = New-Object -COM Pla.DataCollectorSetCollection;
$collectors.GetDataCollectorSets($null, "Service\*");
$collectors._NewEnum | ?{$_.Status -ne 1 -and $_.Name -notmatch $ignorePattern} | %{
	# Try to start the stopped collector set, then re-check its status
	$_.Start($false);
	Start-Sleep -Seconds 5;
	if($_.Status -eq 1) {
		$emailBody += "Started $($_.Name)`n"
	} else {
		$emailBody += "Failed to start $($_.Name)`n"
	}
}
if($emailBody -ne "") {
	$smtp = New-Object Net.Mail.SmtpClient($emailServer);
	$smtp.Send(
		"$serverName <$fromAddress>",
		$toAddress,
		"Started Data Collector Set(s)",
		$emailBody);
}

If you want the script to keep any of the non-user created collectors running, modify the namespace parameter of the GetDataCollectorSets method call.


Pattern for Conditionally Sending SSRS Subscriptions

This is a pretty common request - send the report if there's an issue, don't send it if there's not an issue. This is a simple pattern for doing that using Data Driven Subscriptions. Here, we're not using the data driven subscription to dynamically populate recipient and parameter information, we're just using it to suppress the sending of the report if the report has no value.

To do this, we only need one thing - a query that returns one row if the report should be sent and zero rows if the report should not be sent. Here is an example: let's say I want to send a report out if a JDE Address Book record with an AT1 of 'DC' has been updated after a certain Julian date.

First, write a query to return rows meeting that criteria:


Script for Ad-Hoc SSRS TimedSubscription Scheduling

Quick script to locate and execute SSRS timed subscriptions (useful for testing).

1. Execute the top (uncommented) part and identify which subscription you want to schedule.
2. Paste the SubscriptionId from that row into the @id variable in the lower section
3. Run the entire lower (commented) section

DECLARE @ReportName VARCHAR(255) = 'Report Name'

SELECT a.[Name]
      ,b.SubscriptionID
      ,CAST(b.Parameters AS XML) AS [Parameters]
      ,CAST(b.ExtensionSettings AS XML) AS [ExtensionSettings]
FROM ReportServer.dbo.[Catalog] a
      INNER JOIN ReportServer.dbo.[Subscriptions] b
            ON b.Report_OID = a.ItemID
WHERE a.Name = @ReportName

/*
DECLARE @id VARCHAR(260) = ''
       ,@Type VARCHAR(32) = 'TimedSubscription'

EXEC [ReportServer].dbo.AddEvent @EventType=@Type, @EventData=@id
*/

Consuming an Authenticated JSON Feed with SSIS

For this post, we'll look into one option for consuming a JSON data feed from a service provider's API. In my case, the provider is AtTask - a web-based project management service. This sort of assignment/request is pretty common as various corners of your organization simply want to use the best tool for the job. Far too infrequently does anyone really look at things like how it connects to your current systems. They say things like "we have a well documented API and a strong user community" - yeah, I've heard that a few thousand times. JSON is neat if you're using jQuery. It's horrible if you're trying to suck data down into a SQL Server table. Which is what we're going to do.

First, we wander over to the AtTask API Documentation and figure out how to authenticate. Notice that every request needs a sessionID - and that sessionID is created by a request to the login url. This means we'll have to make at least two requests - one to nab a sessionID and another to actually get the data that we want.
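In code, the two-request dance looks roughly like this (a sketch only - the host name, endpoint paths, parameter names, and response shape below are my assumptions for illustration, not the documented AtTask contract):

```python
import json
import urllib.parse
import urllib.request

BASE_URL = "https://yourcompany.attask-ondemand.com"  # hypothetical host

def login_url(username: str, password: str) -> str:
    # Request 1: authenticate against the login URL to nab a sessionID
    qs = urllib.parse.urlencode({"username": username, "password": password})
    return f"{BASE_URL}/attask/api/login?{qs}"

def search_url(session_id: str, object_type: str) -> str:
    # Request 2: ask for the data we actually want, passing the sessionID
    qs = urllib.parse.urlencode({"sessionID": session_id})
    return f"{BASE_URL}/attask/api/{object_type}/search?{qs}"

def fetch(object_type: str, username: str, password: str):
    # Two round trips: log in first, then make the real data request
    with urllib.request.urlopen(login_url(username, password)) as resp:
        session_id = json.load(resp)["data"]["sessionID"]
    with urllib.request.urlopen(search_url(session_id, object_type)) as resp:
        return json.load(resp)["data"]
```

From there it's a matter of flattening the returned JSON into rows you can land in a staging table.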


Revisiting Embedded Text Qualifiers and the SSIS Flat File Connection Manager

To quickly summarize the problem at hand, the Flat File Connection Manager doesn't support embedded text qualifiers. The reason this is a big issue is that the most commonly used text qualifier is a double quote (") and that's also a commonly used punctuation mark. That means if you have a CSV to import with lines that look like this:


You would expect to import

one 1 two 2 th"r"ee 3 asdf
"left right" mid"dle w",tf w,"tf ,"omg omg",

That's not what you get; you get an error. I've seen a few different approaches to working around this known issue.

What I'd like to explore in this post is using regular expressions to transform the entire file into something that can be natively consumed by SSIS. There are two options for dealing with this:

  • Tokenizing the quotation marks found within the cells
  • Changing the text qualifier

I chose to change the text qualifier because SSIS natively supports multiple-character text qualifiers. This will allow me to pick something ridiculous that is incredibly unlikely to appear in the data naturally. Changing the text qualifier also means we won't have to add any extra transformations, as we would if we tokenized the embedded quotation marks (to turn them back into quotation marks).
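As a rough sketch of the qualifier-change transform (this is a deliberately simplified pattern of my own for illustration - it assumes a quote is a qualifier when it starts or ends a line or sits against a comma, so embedded quotes that happen to touch a comma would still need a stricter pattern):

```python
import re

QUALIFIER = "|~|"  # something ridiculous that won't appear in the data

def requalify(line: str) -> str:
    # Treat a double quote as a text qualifier only at a field boundary:
    # start of line, end of line, or immediately beside a comma. Quotes
    # embedded in the middle of a value are left alone.
    line = re.sub(r'^"', QUALIFIER, line)
    line = re.sub(r'"$', QUALIFIER, line)
    line = re.sub(r'",', QUALIFIER + ",", line)
    line = re.sub(r',"', "," + QUALIFIER, line)
    return line
```

Run every line of the file through something like this before handing it to the Flat File Connection Manager, and set the connection manager's Text qualifier property to the same multi-character string.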


Troubleshooting Kerberos Configuration for SSRS

I recently worked with Dave DuVarney while he was configuring one of our Reporting Services instances for Kerberos authentication. While working through it, he suggested we take a look at network traffic to see what's actually happening. Turns out, that DuVarney fellow has some clever ideas now and then 😉

We used Network Monitor 3.4 for our tests and really weren't 100% sure what we were looking for. Examining traffic over the KerberosV5 protocol, you'll always see some chatter from the krbtgt as it does whatever it does, but other than that, we just kind of captured and sifted. Here are a few examples of what we saw:


SQL Schema Source Control 2.0

SQL Schema Source Control is, at its core, just another database scripting app. Where this app differs from others is that the sole purpose of scripting out the database is to check it into source control (sorry, it still only supports SVN).

I think this may have once gone under the name "SQL2SVN," but was rebranded before being checked into Codeplex for the first time. This is a rewrite of most of the application to be more modular and extensible in the future. Additional features found in this release are:


Reporting Services Source Control 1.0

Reporting Services Source Control is an application to create copies of all the reports stored in your Report Manager and commit them to source control (SVN is the only supported source control in the 1.0 release). The need for this arises for two reasons:

  • First, we empower our users to create reports using Report Builder, and those reports usually only exist in the ReportServer database.
  • Second, Report Builder has matured enough that there is little benefit to creating a Report Server project using Business Intelligence Development Studio (BIDS).

So now we have intellectual property generated by both end users and developers sitting in Report Manager with no version control whatsoever. Reporting Services Source Control addresses this by scripting out reports, subscriptions, and other properties and committing them to source control.


Monitoring Failed Report Server Subscriptions

While we love to empower our users as much as possible, we still need to pay attention to them and what they're up to. The first place I usually see users running amok as Microsoft BI adoption grows in an organization is the Report Subscriptions in SSRS. They go nuts with them. I was at one company where 50% of the report subscriptions went to users outside of the company - that means our customers were depending on these subscriptions! Which brings up an interesting licensing debacle... but that's another story. Needless to say, when people set up a subscription, they expect it to work. If it doesn't, we REALLY need to let someone know.

Here, we have a procedure to monitor for Reporting Services subscription failures. Specifically, email subscription failures. When a failed subscription is detected, an email is sent to the subscription owner, and a digest of failures is sent to the specified admin group.
