tim laqua dot com - Thoughts and Code from Tim Laqua

8 Apr 2014

jderef.com is dead – Create your own using SSRS and JDE metadata tables

[Image: ssrs-jderef-1]

Anyone who has used jderef.com is certainly mourning its demise, as JDE, like most ERP systems, is a bear to understand without a really big instruction book. Luckily, you can chat with your JDE people and, given the correct libraries, modify the following query to get the same information:

SELECT
       RTRIM(TDOBNM)   AS TABLE_NAME,
       RTRIM(d.SIMD)   AS Table_Description,
       RTRIM(d.SIPFX)  AS Table_Prefix,
       RTRIM(TDOBND)   AS Data_Dictionary_Object_Name,
       RTRIM(TDSQLC)   AS SQL_Column_Name,
       RTRIM(b.FRDSCR) AS Column_Description,
       RTRIM(e.DRDL01) AS Field_Type,
       RTRIM(c.FRDTAS) AS Field_Length,
       RTRIM(c.FRDTAD) AS Field_Decimals,
       --RTRIM(c.FROWDI) as DD_Item,
       CASE
              WHEN RTRIM(c.FROWER) = 'UDC' THEN 'UDC: RTRIM(DRSY)=''' || RTRIM(c.FROER1) || ''' AND RTRIM(DRRT)=''' || RTRIM(c.FROER2) || ''''
              ELSE 'Generic Edit'
       END AS Edit_Type,
       --RTRIM(c.FROWER) as Edit_Rule,
       --RTRIM(c.FROER1) as Edit_Rule_Parm1,
       --RTRIM(c.FROER2) as Edit_Rule_Parm2,
       RTRIM(TDPSEQ)   AS Program_Sequence_Number
FROM
        COPD910.F98711 a
        LEFT OUTER JOIN DD910.F9202 b
                ON a.TDOBND = b.FRDTAI
                   AND FRLNGP = '  '
                   AND FRSYR = '    '
        LEFT OUTER JOIN DD910.F9210 c
                ON a.TDOBND = c.FRDTAI
        LEFT OUTER JOIN OL910.F9860 d
                ON a.TDOBNM = d.SIOBNM
                   AND d.SIFUNO = 'TBLE'
        LEFT OUTER JOIN PRODCTL.F0005 e
                ON LTRIM(RTRIM(c.FROWTP)) = LTRIM(RTRIM(e.DRKY))
                   AND e.DRSY = 'H98'
                   AND e.DRRT = 'DT'
WHERE
        RTRIM(TDOBNM) LIKE ?
ORDER BY
        TDOBNM,
        TDPSEQ

[Image: ssrs-jderef-2]

A sample RDL can be found here: http://timlaqua.com/jderef.rdl. You will need to change the data source to work in your environment and update the following library references to match your installation (the ? placeholder in the WHERE clause is bound to the report's table-name parameter, so you can search with LIKE patterns such as F42%):

  • COPD910.F98711 - F98711 varies per environment, so make sure you get the right library
  • DD910.F9202 - all the F92 files are in the same library
  • PRODCTL.F0005 - whatever library your UDC table is in
  • OL910.F9860 - the Object Librarian library
10 Feb 2013

Hanging The TV

One morning as we were eating breakfast, Lindy (my wife) looked over to the other room where we had a TV sitting on a sofa table and commented that it looked "terrible" because of all the wires all over the place. This is basically license to knock down the entire wall and do whatever you like as long as she can't see those wires when you're done. I stopped short of that and just pulled a little part of the wall off.

Here's what it looked like originally (the terrible wires photo):
[Image: 1-Original]

3 Feb 2013

The Garage Heater: Natural Gas

Now after all this running around with sensors and propane heaters - there's really no possible way we could NOT install a natural gas heater. Browsing the Menards ad for the week yielded a 75,000 BTU Beacon Morris unit on sale for about $450. Figuring that's a pretty good deal, we headed over there and gathered the supplies we needed. Oh the supplies... B-vent to get out through the roof, flashing, crazy angle pipe things, a cap, storm collar, black iron pipe for a drip leg on the gas line, 60' of 1/2" copper flex, some flare nuts, some 14/2 wire, a thermostat, some 18/5 thermostat wire, hangers, and various other random things - probably some beef jerky since you always buy beef jerky at Menards. All in - a whopping $750 bill at the checkout. Seems like a lot, but it's much less than the $2,200 quote I received from a local HVAC shop for a 45,000 BTU heater. Fast forward a few hours and we've got ourselves cheap(er) heat!

Scenario 1: 20 degree day, maintain 50 degrees

[Image: NaturalGas-50degrees]

On for about 8 minutes, off for about 56 - that's a total cycle of 64 minutes and it's on for 12.5% of the time. It's a 75K BTU heater, meaning it will consume 75K * 12.5% or 9,375 btu per hour. 24 hours * 9,375 btu / 100,000 * $0.67 = $1.50. That's $1.50 per day for a 30 degree temp differential.

Scenario 2: 20 degree day, maintain 70 degrees

[Image: NaturalGas-70degrees]

On for about 7 minutes, off for 11. Cycle of 18 minutes, on for 39% of the time. 75K * 39% = 29,250 btu. 24 hours * 29,250 / 100,000 * $0.67 = $4.70. $4.70 per day for a 50 degree temp differential.
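
If you want to play with the same arithmetic, here's a quick PowerShell sketch of it (the function name and its defaults are just for illustration - plug in your own heater size and gas rate):

# Daily heating cost from an observed on/off cycle, the heater's rated input, and the gas price
function Get-DailyHeatingCost {
    param(
        [double]$OnMinutes,
        [double]$OffMinutes,
        [double]$RatedBtuPerHour = 75000,   # 75K BTU unit from above
        [double]$PricePerTherm   = 0.67     # my natural gas rate
    )
    $dutyCycle    = $OnMinutes / ($OnMinutes + $OffMinutes)   # fraction of each cycle the burner runs
    $btuPerDay    = $RatedBtuPerHour * $dutyCycle * 24
    $thermsPerDay = $btuPerDay / 100000                       # 1 therm = 100,000 BTU
    [math]::Round($thermsPerDay * $PricePerTherm, 2)
}

Get-DailyHeatingCost -OnMinutes 8 -OffMinutes 56    # Scenario 1: roughly $1.50/day
Get-DailyHeatingCost -OnMinutes 7 -OffMinutes 11    # Scenario 2: roughly $4.70/day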

Scenario 3: Back to 50 degrees

[Image: NaturalGas-70to50]

I had some crazy idea that I could figure out the heat loss of the garage with data like this - but there are really just a ton of variables in addition to the heat loss. The biggest issue I see is the heat that moves from the objects in the room (which have been heated up to 70) into the air around them as the air cools. Look at the difference in how long it takes to drop from 59 to 49 the first time versus the second time, after the air has been heated again.

Here's what it looked like installed:
[Image: Installed]

And the thermostat:
[Image: Thermostat]

And my car in the middle of being waxed - since I had to keep the garage at 70 for a while to gather data, I figured I'd make use of the warm weather. That laptop in the background is gathering the data:
[Image: WaxingTheCar]

23 Jan 2013

Hiding SSAS Attributes With Powershell

The challenge here was that we have a cube with some great data as well as some low-level detail (customer detail, think PII) that we want to expose to a larger audience. The problem is that the low-level detail is not something the new audience needs (or in some cases, is allowed) to have access to. The obvious answer here is dimension security - why not use that? Two reasons. First, dimension security is slow. Second, even if it weren't slow, hiding all the data in many (30+) attributes is tedious to set up, and when new attributes are added we would have to remember to disable access to those as well. To be clear, we're not just hiding attributes here - we're creating an entire copy of the existing SSAS database and hiding attributes in the copy.

# Import Required Libraries
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") >$NULL
 
# Connect to server
$server = New-Object Microsoft.AnalysisServices.Server
$server.Connect('localhost')
 
# Make a metadata copy of the existing cube
$newdb = $server.Databases.GetByName('POS Data').Clone()
 
# Change ID and Name of SSAS DB
$newdb.Name = 'POS Data - Limited'
$newdb.ID = 'POS Data - Limited'
 
# Drop the existing copy if it's already on the server
# (calling Drop() unconditionally errors on the first run, before the copy exists)
if ($server.Databases.ContainsName('POS Data - Limited')) {
    $server.Databases.GetByName('POS Data - Limited').Drop()
}
 
# Add the new copy to the server
$server.Databases.Add($newdb)
 
# Sync our copy of the new database with the server's copy
$newdb.Update([Microsoft.AnalysisServices.UpdateOptions]::ExpandFull)
 
# Grab the cube we want to work with from the new database
$cube = $newdb.Cubes.GetByName('POS Data')
 
# Hide the Location Dimension
$cube.Dimensions.GetByName('Location').Visible = $false
 
# Hide all attributes and hierarchies in the Customer dimension
$cube.Dimensions.GetByName('Customer').Attributes |  %{$_.AttributeHierarchyEnabled=$false; $_.AttributeHierarchyVisible=$false;}
$cube.Dimensions.GetByName('Customer').Hierarchies | %{$_.Visible=$false; $_.Enabled=$false;}
 
# Enable the key attribute in the customer dimension - it won't work if the key isn't enabled
$cube.Dimensions.GetByName('Customer').Attributes['Dim Customer'].AttributeHierarchyEnabled=$true
 
# Enable the Market attribute in the customer dimension
$cube.Dimensions.GetByName('Customer').Attributes['Market'] | %{$_.AttributeHierarchyEnabled=$true; $_.AttributeHierarchyVisible=$true;}
 
# Hide the Location Database attribute in the Product dimension
$cube.Dimensions.GetByName('Product').Attributes['Location Database'] | %{$_.AttributeHierarchyEnabled=$false; $_.AttributeHierarchyVisible=$false;}
 
# Add a new member to the role granting Read Only permissions in the cube
$newMember = new-object Microsoft.AnalysisServices.RoleMember("domain\tlaqua")
$newdb.Roles['Role 1'].Members.Add($newMember)
 
# Push our updates to the server
$newdb.Update([Microsoft.AnalysisServices.UpdateOptions]::ExpandFull)
 
# Process the new database
$newdb.Process([Microsoft.AnalysisServices.ProcessType]::ProcessFull)
 
# Disconnect from the server
$server.Disconnect()

This approach has two notable drawbacks. First, you have to think up a different database name, because the original DB with the low-level detail still exists on the server. Second, you have to ProcessFull the clone that you made - it doesn't clone the data, it clones the metadata. All in all, it works great for us; this particular cube is only processed once a week, and the 20 minutes we lose processing data into the clone is more than acceptable (it looks really clean to the users as well).

20 Jan 2013

The Garage Heater: Propane Torpedo

As a first follow-up to my article that looked into the economics of heating my garage (The Garage Heater), I ordered myself a TEMPer USB temperature sensor to do some fact checking. Note - I use the term fact pretty loosely... more like "numbers I made up using formulas I found on the internet."

The software I'm using to record the data is called ThermoHID (http://www.thermohid.co.uk) - a freeware app which is far superior to the terrible software that comes with the sensor. For this test, I went out on a 5 degree day, bought a bunch of propane, and did my best to keep the garage between 60 and 85 degrees for three hours or so:
[Image: PropaneTorpedoHeater]

The first thing I would like to point out in the data is the pause in the initial heat up. It's darn cold out - the garage started around 38, which it seems able to maintain with just heat leaking in from the house (there's one finished wall with the house on the other side) - so when we tried to heat it up to 85 the first time, the propane tank froze up and we had to switch to a warm one. That's just what happens when you use a 106,000 BTU propane heater with a 20 lb tank; it simply can't support that rate (something about vaporization rates).
[Image: PropaneTorpedoHeater-TankChange]

There's also an interesting drop in there after a bit - that's when we opened the garage to vent the stinky air out (propane stinks).
[Image: PropaneTorpedoHeater-OpenedGarage]

And finally - it's interesting that as time goes by, it takes less time to heat the garage back up and more time to lose the heat. I'm pretty sure this is due to all the "stuff" in the garage (cars, garage slab, some engines, refrigerators, tools, workbenches, etc.). Heating up air is one thing, but heating up stuff takes more energy, and that stuff has its own rate of heat loss. When your stuff loses heat because the air around it is cooler, the heat goes into the air - the air we're trying to keep hot. Just an interesting observation. I'm also fairly sure that the 106K BTU heater is putting out nowhere near that when hooked up to a 20 lb tank; based on how fast that much energy should have heated the garage, it simply couldn't have been. Now to move on to the inevitable - the natural gas heater.

13 Jan 2013

The Garage Heater

Being a lifelong Minnesota resident, I can tell you that running propane heaters in the winter to keep the garage warm is incredibly common... and incredibly smelly. Eventually your eyes start burning from all the stuff in the air. Add in some degreaser and it gets really enjoyable in there. This age-old problem led me to try to figure out how to keep it warm in there without the trusty propane heater (or kerosene - that's equally terrible). The only two viable options are electric and natural gas. I don't really care which - I just want to pick the most cost-effective one. If you ask anyone, the answer is always natural gas. But why? How much more efficient is natural gas? I care because natural gas requires a gas line, while electric would just require another (big) circuit (the natural gas heaters require a circuit as well, just a way more reasonably sized one). Ever wonder how those electric heaters rated in watts translate to the BTU-rated natural gas heaters? I did. I also wondered how much all this propane is costing... and why I have to run the heater so frequently.

First things first, I looked at my gas bill and it's billed in therms, so that's the standard unit I'll use - a therm is 100,000 BTU (and wherever I say BTU, I really mean BTU per hour). How does that translate to electricity? Well, 1 watt = 3.414 BTU per hour, so one therm works out to 29,291 watt-hours (about 29.3 kWh). Then I went and looked at my electric bill and it's billed in kilowatt hours, so now I can just multiply by my per-kWh cost ($0.10).

Electric costs roughly $2.92 per therm
Natural gas costs roughly $0.67 per therm

That's a difference of $2.25 per therm consumed. Let's say I think I need a 75K BTU heater. That's 75% of a therm per hour, so for every hour I run it, it would cost about $1.69 more with electric. Now let's assume the natural gas setup costs $1,000 more to install - it would take 591 hours of run time to eat up the cost difference.

Ok, got that sorted out - now what about this propane stuff, what does that cost? First we have to sort out how many BTU are in the tank (I'm just using 20 lb tanks) and weigh that against the fill cost. Turns out one gallon of propane has 91,600 BTU, but I buy propane in pounds, and one gallon is 4.24 lbs - meaning it takes about 4.6 pounds of propane per therm. On the high side, I'm paying about a buck a pound (if they actually filled the tanks up to 20 lbs).

Propane costs roughly $4.60 per therm

That's a whole bunch - definitely need to stop using propane. Now let's try to figure out how much heat I actually need rather than just using the ol' "bigger is better" logic. It takes 0.018 BTU per cubic foot to raise the air 1 degree Fahrenheit, and my garage is 34' x 22' x 9' = 6,732 cubic feet. If I did the math right, it would take 121.176 BTU per degree we want to heat it.
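
For anyone who wants to check my work, here's the same napkin math collected into one PowerShell sketch (the prices are my local rates, not constants, and any small differences from the figures above are just rounding):

# My local rates - swap in your own
$kwhPrice     = 0.10     # $ per kWh from the electric bill
$thermPrice   = 0.67     # $ per therm from the gas bill
$propanePerLb = 1.00     # $ per lb for 20 lb tank fills

# Electric: 1 watt = 3.414 BTU/hr, so a therm is about 29.3 kWh
$kwhPerTherm = 100000 / 3.414 / 1000
"Electric:    {0:C2} per therm" -f ($kwhPerTherm * $kwhPrice)       # ~ $2.93
"Natural gas: {0:C2} per therm" -f $thermPrice                      # $0.67

# Propane: 91,600 BTU per gallon, 4.24 lb per gallon -> about 4.6 lb per therm
$lbPerTherm = 100000 / (91600 / 4.24)
"Propane:     {0:C2} per therm" -f ($lbPerTherm * $propanePerLb)    # ~ $4.63

# Break-even on a natural gas install that costs $1,000 more (75K BTU heater = 0.75 therm/hr)
$extraPerHour = 0.75 * (($kwhPerTherm * $kwhPrice) - $thermPrice)
"Break-even:  {0:N0} hours of run time" -f (1000 / $extraPerHour)   # ~ 590 hours

# Heat needed just to warm the air in the garage
$btuPerDegree = (34 * 22 * 9) * 0.018                               # 6,732 ft3 * 0.018 = 121.176 BTU per degree F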

Now for the heat loss part...

6 Jan 2013

Our Life Fitness X5 Elliptical: Part 1

I want to start off by saying that this is the greatest home elliptical I've ever used. I'm 6' and my wife is 5' 4" - the adjustable stride really goes a long ways (pun intended) towards allowing both of us to enjoy this piece of equipment. I'm not familiar with the Basic console, but I can attest that the Advanced console's auto-adjusting workouts are very handy at 6AM when you're really not interested in pressing buttons - or really thinking at all. This unit gets used almost daily (by Lindy - I fell off the wagon some years ago) and we've owned it for about 6 years now. Here are the stats on this mamma jamma:

[Image: DSCN0069]

[Image: DSCN0068]

We haven't had a SINGLE problem with it... until now. After our most recent move, to be exact. Now - we've been in our new place for about 4 months and this is the first time I noticed it was having an issue. Wait - didn't I say it was used daily? Oh, it is. But Lindy is the type that doesn't really care about the flashing lights on the screen; she actually finds them distracting. So what she does is turn the unit on, hit all the buttons to get it into manual mode, and lay a DVD case across the entire screen, 'cause she really doesn't care what it says. Well, I decided I should jump back on the wagon for some strange reason earlier today, and my little auto-adjusting program wasn't working. The Level area of the display simply says "OFF" and no amount of button pressing will wake it up.

[Image: DSCN0070]

So the only good news here is I get to give Lindy crap for always using the elliptical on Level 1 and being a slacker. Bad news, I gotta fix it since the warranty ran out like 3 years ago.

5 Jul 2012

Consuming Sharepoint Lists via OData with SSIS

We recently needed to bring data from a SharePoint list into SSIS for use in an attribute lookup for a cube dimension. Seems like this should be pretty straightforward since SSRS does it natively now, but no - that wizardry hasn't made its way over to the SSIS team yet. As stated before, we don't care for third-party dependencies or external non-standard assemblies of any sort in our packages. That means, as usual, we'll be writing a script component to take care of getting our data out of SharePoint.

First, navigate to the list you are interested in and export it as a data feed.
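
If you just want to see what that feed looks like before writing any SSIS code, you can poke at it from PowerShell 3.0. This isn't the script component from this post - just a quick sanity check, and the site and list names here are made up:

# SharePoint 2010 exposes lists as OData through ListData.svc - the URL below is hypothetical
$feedUrl = 'http://sharepoint/sites/sales/_vti_bin/ListData.svc/Customers'

# Invoke-RestMethod returns the ATOM entries; each entry carries the list columns under content.properties
$entries = Invoke-RestMethod -Uri $feedUrl -UseDefaultCredentials
$entries | Select-Object -First 5 | ForEach-Object { $_.content.properties }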

18 Jun 2012

SQL 2012 Upgrade: Lessons Learned

Monday of last week (2012-06-11) we started our SQL 2012 upgrades by first upgrading our development and staging environments. We encountered very few issues with this phase and forged ahead with upgrading one of our monitoring servers - this is the one that hung us up for an hour or two. We decided it didn't need to run Enterprise Edition and could get by with the Business Intelligence edition. What we didn't notice was that at some point in the past we had set up MDW on that server for evaluation and, after abandoning it, never cleaned it up. The MDW database uses both partitioning and compression - Enterprise-only features.

After sorting that out, we moved on to our SSAS presentation server, which also received the Business Intelligence edition since there are no core limits on SSAS in that edition - that box has 48 cores and the engine licensing scared us a little - then we remembered we don't even run an instance of the engine on that server. Finally, we upgraded our production ETL/DW/SSRS server, and that's when we found out that someone declared war on commas in email addresses (I have no idea what standard led them to that decision - I assume they forgot that commas commonly exist in titled email strings). Then pass-through authentication in SSRS tanked, but our DBA quickly caught that one, noticing that the SSRS config file had been completely overwritten and was only allowing NTLM (switching back to Negotiate fixed the issue).

For now, one week later, things appear to be stable and we believe we've found most of the issues. This upgrade certainly didn't yield the magic performance improvements we saw in the 2005 and 2008 upgrades, and we have yet to use any of the new features - but we're ready. Someone keeps talking about Tabular instances 😉

As the saga continues and we move on to upgrading our remaining environments, I'll keep the following list updated:

Phase 1: Core ETL, DW, SSRS, and SSAS Servers

  • Context menus are a little different in the new VS environment. Had to manually add the Change Connection command to the context menu that's used when editing XMLA files (SQL 2012: XMLA Change Connection Command Missing)
  • You can't add DATETIME and TIME anymore; you have to explicitly CAST the TIME part to DATETIME first
  • Don't worry about the Project Deployment Model in SSIS yet. The Package Deployment Model works fine for now
  • When compiling SSIS packages, they don't sit in the ./bin root by default anymore. This behavior can be changed by editing the DeploymentOutputPath in the Properties of the solution file
  • Do NOT downgrade a server running Enterprise Edition-only features to SQL 2012 Business Intelligence Edition (Upgrading from SQL 2008 R2 Enterprise to SQL 2012 Business Intelligence)
  • The SSRS config file (RSReportServer.config) gets overwritten. Back yours up before doing the upgrade so you can put the settings back. The one we noticed was that authentication changed to NTLM only rather than Negotiate (we use Kerberos primarily), so auth started failing for many data sources
  • The SQLPS (PowerShell) shell behaves a little differently. Working with files (copies, moves, etc.) will not work correctly unless you break out of the default provider context (it's in the context of SQL's provider when the shell starts). At the top of all of our PowerShell-type SQL Agent job steps we just added a "C:" line to break out of the SQL provider and things worked as expected again (see the sketch after this list)
  • In SSAS, go through the data sources in your AS databases and switch any references to SQLNCLI10/10.1 to SQLNCLI11. We've seen SQLNCLI10.1 continue to work fine after the upgrade in some scenarios and fail in others - safest to just update them.
  • msdb.dbo.sp_send_dbmail now throws "Msg 14613, Level 16, State 1, Procedure sysmail_verify_addressparams_sp, Line 13. The specified '@recipients' cannot use commas (,) to separate addresses: 'Tim_Laqua@trekbikes.com;Laqua, Tim <tim_laqua@trekbikes.com>;'. To continue, use semicolons (;) to separate addresses." I looked at the validation proc - it tosses out ANY string with a comma in it. Simply remove the commas (so instead of "LastName, FirstName" use "FirstName LastName" for titled email addresses)
  • SSRS also declared war on email recipient strings - some subscriptions will fail with the error "specified string is not in the form required for an e-mail address." Fiddle with the email address line; we've seen this be an issue with white space, blanks, etc. Basically, they made the email address validation ridiculously rigid.
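
For the SQLPS item above, the workaround is literally just a drive change at the top of the job step - a minimal sketch (the paths are placeholders):

# PowerShell-type SQL Agent job step: SQLPS starts in the SQLSERVER: provider,
# so switch to a filesystem drive before doing any file work
C:

# File operations behave as expected again
Copy-Item -Path 'D:\Exports\nightly_extract.csv' -Destination '\\fileserver\drop\'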

Don't try to perfectly plan your upgrade - just be ready to zig when it zags. As a team, we've been through at least two of these major SQL Server upgrades before, and this one was more about being prepared for possible issues than spending ridiculous amounts of time trying to perfectly reproduce production load to simulate the production upgrades. We did spend some time tidying up our house first, as over the years we've learned that the more you stay away from third-party components and assemblies, the smoother these upgrades go. That observation led us to rip all of our third-party SSIS components (components from CodePlex, purchased components, and ones that we wrote) out of our packages and use only built-in components. Every single package we've upgraded so far has come through without issues.

12 Jun 2012

Upgrading from SQL 2008 R2 Enterprise to SQL 2012 Business Intelligence

With all the new licensing considerations in SQL 2012, everyone's busy trying to figure out the cheapest license they can get by with. For one of our instances, we decided that rather than Enterprise, we could run the Business Intelligence edition to save some cash. That upgrade path is explicitly listed in the Supported Version and Edition Upgrades documentation.

What happened?

We clicked through the upgrade wizard as usual; it got through about 85% and then threw an error:

Wait on the Database Engine recovery handle failed. Check the SQL Server error log for potential causes.

We clicked OK, it finished, and the results showed that the SQL engine and a few other features failed the upgrade. Great.