tim laqua dot com Thoughts and Code from Tim Laqua


Locating Rampant Database Growth

Every now and then you get a helpful alert from one of your database servers about disk space being low (you're monitoring that, right?), or a bunch of autogrowth alerts (you're monitoring that too, right?) - but what happens when you get these for a database that you don't expect growth in? Further, what happens when that database is growing rampantly (about 1 GB/hr in my case) and it's a canned database from a 3rd party product? This time it was the database that SolarWinds uses for collection - and apparently it was collecting a lot of something that it wasn't collecting before. The usual response goes something like this:

  1. You send out an email asking the end users of the system (IT in this case) if anything changed
  2. You adjust said system so it stops trying to fill up your drives

Unfortunately, there are often quite a few possible reasons why a database is growing - and when it's a canned product, you don't always have the best understanding of why it does what it does when it does it. As a Database Admin, you can help diagnose the problem by letting everyone know exactly what is growing:
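A sketch of what that diagnosis can look like, using sys.dm_db_partition_stats to rank tables by reserved space (run it in the growing database; snapshot the output periodically and compare deltas to see which tables are actually moving):

```sql
-- List tables by reserved space so you can see what's actually growing.
-- Row counts are taken only from the heap/clustered index (index_id 0 or 1)
-- so nonclustered indexes don't double-count rows.
SELECT
    s.name + '.' + t.name AS TableName,
    SUM(ps.reserved_page_count) * 8 / 1024.0 AS ReservedMB,
    SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END) AS [RowCount]
FROM sys.dm_db_partition_stats ps
JOIN sys.tables t ON t.object_id = ps.object_id
JOIN sys.schemas s ON s.schema_id = t.schema_id
GROUP BY s.name, t.name
ORDER BY ReservedMB DESC;
```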


Script sp_configure Output To Migrate Settings Between SQL Servers

Sometimes when setting up a new server, all you want is for it to work and be configured exactly like the old server. In some cases, the new server is almost identical in every way already. If you've worked much with SQL Server, you know very well that there are a billion switches and knobs, and every time you do this, you forget at least one. I had a similar situation recently and figured I'd be lazy about it and just turn the output of sp_configure into a script I could execute on the new server:
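One way to build such a script (a sketch that reads from sys.configurations, which backs sp_configure, rather than capturing sp_configure's output directly):

```sql
-- Generate an EXEC sp_configure statement for every current setting.
-- Enable 'show advanced options' on the target first so all options can be set.
SELECT 'EXEC sp_configure ''' + name + ''', '
       + CAST(CAST(value AS int) AS varchar(12)) + ';'
FROM sys.configurations
ORDER BY name;
-- Paste the generated statements onto the new server, then run RECONFIGURE;
```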



I think most people set up the drives on their Development servers to match their Production servers - this is so that restores go smoothly and files go where they're supposed to, things match up nicely, etc. The backup records the physical paths of the files as they were on the source server, so if the database doesn't exist on the destination server yet, a plain RESTORE will try to recreate the files at those original paths - you need MOVE clauses to control where each file goes and what it's named.

The following script can be executed in the context of the source (original) database that was backed up to script out the MOVE statements if your plan is to put the files in the same place on the destination server.
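A minimal sketch of that idea - run in the source database, it emits one MOVE clause per file from sys.database_files, keeping the original paths:

```sql
-- Build the WITH MOVE clauses for a RESTORE, one per database file,
-- reusing each file's logical name and original physical path.
SELECT 'MOVE ''' + name + ''' TO ''' + physical_name + ''','
FROM sys.database_files;
```

Paste the output into your RESTORE DATABASE ... WITH MOVE statement on the destination server (and drop the trailing comma from the last line).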


Charting Analysis Services 2008 Trace Events

The other day I was running some Analysis Services traces for some reason or another and ran across Chris Webb's post, Visualising Analysis Services Trace Information in Reporting Services. After looking over that post, I thought it'd be interesting to visualize various types of processing, queries, etc. - basically take a trace and then graph out what happened and when. Here are a few samples of what the project yielded:

  • Red: Cube Processing on a 2 Minute schedule
  • Green: Cube Processing on a 1 hour schedule
  • Blue: Cube Processing on a 1 hour schedule
  • Black: Query Activity

Most of the activity here is from SSRS subscriptions firing around 8 AM:
8AM MSRS Subscription Processing
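If the trace is saved to a table (Profiler can do this for you), the raw data behind a chart like this can be pulled with something simple; the table name here is just an assumption:

```sql
-- Pull processing and query events with start times and durations,
-- ready to feed into an SSRS chart.
SELECT EventClass, TextData, StartTime, Duration
FROM dbo.SSASTrace  -- assumed name of the table the trace was saved to
WHERE StartTime IS NOT NULL
ORDER BY StartTime;
```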


Estimating the Size of a Table in SQL Server 2008

I have read this (http://msdn.microsoft.com/en-us/library/ms178085.aspx) article at least 6, maybe 7 times in the past - and every time I say to myself "this is ridiculous - someone has to have written a script to do this by now," and every time, I google for hours and fail to find anything. So I finally gave up and wrote something to do it. Note: I've only verified it on 100% fixed-width tables. I compared its output against a 600+ million row table and it came out somewhere around 3% high - fine with me, as I'd rather over-estimate space requirements than under-estimate.
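As a rough illustration of the fixed-width case (not the full script - the article's method also accounts for rows per page, fill factor, and index overhead), a per-row byte estimate can be summed straight from sys.columns; the table name is a placeholder:

```sql
-- Rough per-row byte estimate for a 100% fixed-width table, following the
-- article's formula: fixed data size + 4-byte row header + null bitmap.
DECLARE @TableName sysname = 'dbo.YourTable';  -- placeholder table name
SELECT SUM(c.max_length)            -- fixed-width data bytes
       + 4                          -- row header
       + 2 + ((COUNT(*) + 7) / 8)   -- null bitmap: 2 + (num_cols + 7) / 8
       AS EstimatedBytesPerRow
FROM sys.columns c
WHERE c.object_id = OBJECT_ID(@TableName);
```

Multiply by the expected row count and divide by the ~8,060 usable bytes per page to get a ballpark page count.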


Searching for keywords in all Reporting Services (SSRS) reports

During impact analysis for any changes to existing database tables, cube dimensions, cube measures, etc., it's nice to know which reports are going to horribly break before your end users let you know about it 😉 All of the RDL content for reports uploaded to SSRS is stored (in binary, of course) in the Content column of the Catalog table in the ReportServer database, so here's what I came up with to get the list of soon-to-be-broken reports:

select [Path], ContentText
from (
	select [Path], cast(cast([Content] as varbinary(max)) as varchar(max)) as [ContentText]
	from [Catalog] cat with(nolock)
	where [Type] = 2
) a
where ContentText LIKE '%ColumnName%' 
	OR ContentText LIKE '%MeasureName%' 
	OR ContentText LIKE '%AttributeName%' 
	OR ContentText LIKE '%etc...%'