tim laqua dot com Thoughts and Code from Tim Laqua

24 Jul 2011

Troubleshooting: 1998 Jeep Grand Cherokee (ZJ) Doesn’t Start – Cranks, No Fuel, No Spark

This has been one of the most frustrating repairs we have come across. One day the Jeep just plain refused to start anymore. We had been seeing prolonged start times for a while, letting the fuel pump prime at least twice before it would start, but it would always start. Nothing really foreshadowed the no-start condition; we just turned it off one day and it didn't want to start again. The symptoms don't really point to any particular component (other than the PCM), and without some fancy tools you'll do exactly what we did - replace the PCM... and it won't fix anything. So we took out our fancy tool, looked up some diagnostic diagrams on alldatadiy.com, and got to work.

A few things to check for right away to see if you're having the same issue we were:

  • Check to see if the fuel pump runs during cranking - ours didn't
  • Check for spark during cranking - we didn't have any
  • Check to see if your check engine light is on (ours WASN'T)
  • Check your voltage and fuel gauges with the ignition on - ours didn't register anything
  • Check to see if your low fuel light is on - ours was
3 Jul 2011

Consuming an Authenticated JSON Feed with SSIS

For this post, we'll look into one option for consuming a JSON data feed from a service provider's API. In my case, the provider is AtTask - a web-based project management service. This sort of assignment/request is pretty common, as various corners of your organization simply want to use the best tool for the job. Far too infrequently does anyone really look at things like how it connects to your current systems. They say things like "we have a well-documented API and a strong user community" - yeah, I've heard that a few thousand times. JSON is neat if you're using jQuery. It's horrible if you're trying to suck data down into a SQL Server table. Which is what we're going to do.

First, we wander over to the AtTask API documentation and figure out how to authenticate. Notice that every request needs a sessionID - and that sessionID is created by a request to the login URL. This means we'll have to make at least two requests: one to nab a sessionID and another to actually get the data that we want.
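To make that two-step flow concrete, here's a minimal Python sketch. The host name and endpoint paths are placeholders based on the REST pattern described in the AtTask docs - check your tenant's documentation for the real ones. Inside an SSIS package this logic would typically live in a Script Task, but the request flow is the same.

```python
import json
from urllib import request, parse

# Hypothetical base URL - substitute your own AtTask tenant.
BASE = "https://yourcompany.attask-ondemand.com/attask/api"

def extract_session_id(login_body):
    """Pull the sessionID out of the JSON body returned by the login call."""
    return json.loads(login_body)["data"]["sessionID"]

def api_get(path, params):
    """GET BASE/path with the given query-string parameters and decode JSON."""
    url = "%s/%s?%s" % (BASE, path, parse.urlencode(params))
    with request.urlopen(url) as resp:
        return json.loads(resp.read().decode("utf-8"))

def fetch_projects(username, password):
    # Request 1: authenticate and nab a sessionID.
    login = api_get("login", {"username": username, "password": password})
    session_id = login["data"]["sessionID"]
    # Request 2: use the sessionID to get the data we actually want.
    return api_get("project/search", {"sessionID": session_id})
```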

8 Jun 2011

Revisiting Embedded Text Qualifiers and the SSIS Flat File Connection Manager

To quickly summarize the problem at hand: the Flat File Connection Manager doesn't support embedded text qualifiers. The reason this is a big issue is that the most commonly used text qualifier is a double quote (") and that's also a commonly used punctuation mark. That means if you have a CSV to import with lines that look like this:

"one","1","two","2","th""r""ee","3","asdf"
"""left","right""","mid""dle","w"",tf","w,""tf",",""omg","omg"","

You would expect to import

one 1 two 2 th"r"ee 3 asdf
"left right" mid"dle w",tf w,"tf ,"omg omg",

That's not what you get - you get an error. I've seen a few different approaches to working around this known issue.

What I'd like to explore in this post is using regular expressions to transform the entire file into something that can be natively consumed by SSIS. There are two options for dealing with this:

  • Tokenizing the quotation marks found within the cells
  • Changing the text qualifier

I chose to change the text qualifier because SSIS natively supports multiple-character text qualifiers. This will allow me to pick something ridiculous that is incredibly unlikely to appear in the data naturally. Changing the text qualifier also means we won't have to add any extra transformations, as we would if we tokenized the embedded quotation marks (to turn them back into quotation marks).
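As a sketch of the transformation (in Python rather than the Script Task you'd likely use inside an SSIS package, and with `|~|` as an arbitrary multi-character qualifier), a single regex can match each qualified field, swap the outer quotes for the new qualifier, and collapse the doubled quotes back into literal quotation marks:

```python
import re

NEW_Q = "|~|"  # arbitrary multi-character qualifier, unlikely to occur in data

# A qualified field: opening quote, then any run of non-quotes or escaped
# quote pairs (""), then the closing quote.
FIELD = re.compile(r'"((?:[^"]|"")*)"')

def requalify_line(line):
    """Rewrite one CSV line so fields are wrapped in NEW_Q instead of double
    quotes, with embedded "" pairs un-escaped to a single quote."""
    return FIELD.sub(
        lambda m: NEW_Q + m.group(1).replace('""', '"') + NEW_Q,
        line)
```

Running the sample lines above through `requalify_line` yields fields the Flat File Connection Manager can parse natively once its text qualifier is set to `|~|`.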

22 Apr 2011

Troubleshooting Kerberos Configuration for SSRS

I recently worked with Dave DuVarney while he was configuring one of our Reporting Services instances for Kerberos authentication. While working through it, he suggested we take a look at network traffic to see what's actually happening. Turns out, that DuVarney fellow has some clever ideas now and then 😉

We used Network Monitor 3.4 for our tests and really weren't 100% sure what we were looking for. Examining traffic over the KerberosV5 protocol, you'll always see some chatter from the krbtgt as it does whatever it does, but other than that, we just kind of captured and sifted. Here are a few examples of what we saw:

21 Apr 2011

SQL Schema Source Control 2.0

SQL Schema Source Control is, at its core, just another database scripting app. Where this app differs from others is that the sole purpose of scripting out the database is to check it into source control (sorry, it still only supports SVN).

I think this may have once gone under the name "SQL2SVN," but was rebranded before being checked into CodePlex for the first time. This is a rewrite of most of the application, making it more modular and extensible in the future. Additional features found in this release are:

20 Apr 2011

Reporting Services Source Control 1.0

Reporting Services Source Control is an application that creates copies of all the reports stored in your Report Manager and commits them to source control (SVN is the only source control supported in the 1.0 release). The need for this arises for two reasons:

  • First, we empower our users to create reports using Report Builder, and those reports usually exist only in the ReportServer database.
  • Second, Report Builder has matured to the point that there is little benefit in creating a Report Server project in Business Intelligence Development Studio (BIDS).

So now we have intellectual property generated by both end users and developers sitting in Report Manager with no version control whatsoever. Reporting Services Source Control addresses this by scripting out reports, subscriptions, and other properties and committing them to source control.

19 Apr 2011

Monitoring Failed Report Server Subscriptions

While we love to empower our users as much as possible, we still need to pay attention to them and what they're up to. The first place I usually see users running amok as Microsoft BI adoption grows in an organization is Report Subscriptions in SSRS. They go nuts with them. I was at one company where 50% of the report subscriptions went to users outside of the company - that means our customers were depending on these subscriptions! Which brings up an interesting licensing debacle... but that's another story. Needless to say, when people set up a subscription, they expect it to work. If it doesn't, we REALLY need to let someone know.

Here, we have a procedure to monitor for Reporting Services subscription failures - specifically, email subscription failures. When a failed subscription is detected, an email is sent to the subscription owner, and a digest of all failures is sent to the specified admin group.
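The shape of that notification logic can be sketched as follows. This is Python standing in for the real procedure (which would be T-SQL against the ReportServer catalog plus Database Mail); the row layout and the "Failure" status prefix are assumptions for the sketch.

```python
def build_notifications(rows, admin_address):
    """Partition subscription rows into per-owner failure notices plus one
    digest for the admin group.  Each row is assumed to be
    (owner_email, report_name, last_status), and any status beginning
    with 'Failure' counts as failed."""
    failures = [(owner, report, status) for owner, report, status in rows
                if status.startswith("Failure")]
    # One notice per failed subscription, addressed to its owner.
    notices = [(owner, "Subscription for '%s' failed: %s" % (report, status))
               for owner, report, status in failures]
    # One digest listing every failure, addressed to the admin group.
    digest = (admin_address,
              "\n".join(msg for _, msg in notices)) if failures else None
    return notices, digest
```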

18 Apr 2011

Parallel Cube Processor 1.0

Parallel Cube Processor (PCP) is an application designed to process multiple Analysis Services databases at the same time. I originally wrote this application with the assumption that it would be blazing fast and essentially warp space and time. As it turns out, SSAS appears to be going as fast as possible at all times, so giving it more stuff to do doesn't really make the whole shebang faster - just different. Play around with it, watch some performance counters - it's an interesting conundrum. The application can handle parallelism using two methods.

The first method is XMLAParallel - in this mode, the application constructs an XMLA command to process the specified number of databases inside of a Parallel element. Only one XMLA command is issued at a time, so the application will not submit another command until the longest-running processing job in each batch completes.
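The command it builds looks roughly like this - a hedged sketch, where the database IDs are placeholders and the real command may set additional processing options:

```xml
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Parallel>
    <!-- One Process element per database in the batch -->
    <Process>
      <Object>
        <DatabaseID>SalesCube</DatabaseID>
      </Object>
      <Type>ProcessFull</Type>
    </Process>
    <Process>
      <Object>
        <DatabaseID>FinanceCube</DatabaseID>
      </Object>
      <Type>ProcessFull</Type>
    </Process>
  </Parallel>
</Batch>
```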

15 Apr 2011

Restarting Automatic Services using Powershell

What? Why would you need to restart an automatic service? Because they die sometimes - there are probably some very good reasons for this, but I don't know them. What I do know is that they die from time to time and don't come back. Sometimes they never get around to starting after a reboot.

I run the following Powershell script every 5 minutes to restart the ones that got tired:

#======= START Config ========
$emailServer = "1.2.3.4";
$fromAddress = "you@yourcompany.com";
$toAddress = "you@yourcompany.com";
$ignorePattern = '^(ShellHWDetection|MMCSS|sppsvc|clr_optimization.*)$';
#======== END Config =========
 
# Find automatic services that aren't currently running (skipping anything
# on the ignore list), try to start each one, and email a notice for each.
Get-WmiObject Win32_Service | 
? { 
  $_.StartMode -eq 'Auto' `
  -and $_.State -ne 'Running' `
  -and $_.Name -notmatch $ignorePattern
} | 
% { 
  $svcName = $_.Name; 
  $serverName = Get-Content env:computername;
  # Errors are suppressed here, so the email below goes out for every
  # attempt, whether or not the start actually succeeded.
  Start-Service -Name $_.Name -EA SilentlyContinue; 
  $smtp = new-object Net.Mail.SmtpClient($emailServer);
  $smtp.Send(
    "$serverName <$fromAddress>", 
    $toAddress, 
    "Started service ""$svcName""", 
    "Started service ""$svcName"" on $serverName.")
}

$ignorePattern is a regular expression identifying which services to ignore (services you don't want to try to restart).

15 Apr 2011

Dealing With Long Running SSAS Queries using Powershell

So, your Analysis Services instance keeps eating all the memory on your server despite proper memory configuration? This happens when queries get too big for their britches - SSAS memory limits are basically soft limits. Whatever an active query requires in terms of resources, SSAS will attempt to provide, even if it has to page the entire operating system out to disk to do it. You can mitigate this with good cube design, better aggregations, user training, etc. - but it's bound to happen. When your cube gets big enough, one of your users will blow the entire server up.