Sunday, January 23, 2011

Where Are The New M8 Cars?

Almost daily, on train platforms and in town, a commuter stops me to ask, “So, where are the new M8 rail cars?”  I wish I knew!
It has been six years since then-Governor Jodi Rell announced that the state would finally be replacing its broken-down rail fleet… six years!  It’s taken that long for their design, bidding, construction and delivery.  The first car arrived Christmas Eve 2009, already a year late due to builder Kawasaki’s construction problems.
For 13 months those cars have undergone testing. But today we seem no closer to riding the M8s despite promises that they would be in service by now, and the testing process has been cloaked in mystery. (Ironically, there are dozens of videos of the M8s undergoing testing on YouTube, but that’s the closest I’ve been to seeing them running.)
Throughout 2010, we were told that prototype testing was going well.  But by November, we wondered why a date hadn’t been announced for the trains to go into service.  After all, even Governor Rell had been promised that the M8s would run before she left office at the end of December 2010.
So, in November the Commuter Council asked CDOT to bring us someone from Kawasaki to talk about the testing.  They refused.
Then, six weeks ago, there was a glitch:  an electromagnetic pulse from the cars was affecting the signal system.  This was a deal-breaker.  Testing was stopped.
But rather than advise legislators or the Commuter Council about this problem, CDOT and Metro-North gave the bad news "exclusively" to two reporters, who had to pledge not to speak with any stakeholders with oversight. 
Those are questionable journalistic ethics and hardly “transparent”.  Since when do government agencies get away with spin-control on such bad news?
This past week, the Commuter Council asked CDOT for updates on the testing.  We received the same vague generalities as we’d been given for a year: “The testing is going along as planned.”  But this time, something new and disconcerting was added.
A senior CDOT official told us “We take out the M8s every night and run them, and every night a new issue comes up.”  A new issue?
Yup… every night of testing a new problem is found.  Among them, problems with the auxiliary power system, the automatic train control and the diagnostic computer monitoring.  And until they are all fixed, the final crucial test, 4,000 error-free miles, can’t begin.
And testing of the M8s on Shore Line East, under Amtrak’s power system and signaling, hasn’t even begun.  Until the M8s can run on Shore Line East there will be no equipment to run on the long-promised New Haven to Springfield commuter rail line.
But wait… there’s more.  It seems that Metro-North itself hasn’t been overseeing Kawasaki’s testing of the M8s; a consultant has.  LTK Assoc. of Pennsylvania has been paid $15 million to monitor the tests.  And this week their contract will be extended seven months for another $12 million.
None of these details were shared with the legislature’s Transportation Committee or the Commuter Council, despite our interest in this crucial testing stage.  They came out in a newspaper article.
If Metro-North feels it needs multi-million dollar consultants for another seven months after we’ve already had a year of testing, that sure sounds like they don’t believe the M8s will be in service anytime soon.
Nobody wants to rush these cars onto the tracks, however badly they are needed.  But given the $866 million cost of the project and the six years we’ve already waited, why can’t Metro-North and CDOT be open and honest about what’s going on?
The Commuter Council has been asking the questions but the answers have been curt and condescending.   Perhaps it’s time for the legislature’s Transportation Committee to get to the bottom of this story.  Commuters (and taxpayers) deserve an answer.

Wednesday, January 19, 2011

Introducing Data Detective

Introduction

Sometimes working on this blog is a nightmare, other times it’s a joy, and when I’m really lucky, someone else does all the work; this is one of those times. 

Several (well, like two) people have told me that they can't find the link to this tool.  Never let it be said that I stand in the way of disseminating cool stuff.  Download Data Detective RIGHT HERE and enjoy Dave’s generosity.

No test like production

What do you do when you change a formula, calculation, data source, etc. in your superduper Essbase database?  Skip checking for correctness and pray for the best?  Or create a copy of the app (there is this concept called development, quality, and production), make the changes there, and then spend endless hours regression testing?

I’m guessing you do the latter, as my readers are smart, intelligent, and continuously employed.  And when you do that testing, I’m going to bet that you compare the existing database against the new one.  I do this all the time, setting up retrieve sheets for the original and the new databases and then creating a variance sheet.  This is an acceptable approach, but every time I find a difference, I invariably drill into it, and then have to replicate the modified retrieve in the other database and manually rewrite the variance formulas, ad infinitum.  Heretofore, this has been the only way to do this kind of compare, but it is an inefficient and error-prone process.
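For the curious, the core of that manual compare boils down to something like this Python sketch (the members and values are hypothetical stand-ins; the real work happens in Excel against live retrieves):

```python
# Two retrieves of the same grid, one per database, keyed by (row, column).
# These cells and numbers are made up purely for illustration.
original = {("Sales", "Jan"): 100.0, ("COGS", "Jan"): 40.0}
modified = {("Sales", "Jan"): 100.0, ("COGS", "Jan"): 42.0}

# The variance sheet: cell-by-cell deltas between the two databases.
variance = {cell: modified[cell] - original[cell] for cell in original}

# The cells a tool like Data Detective would colorize.
mismatches = [cell for cell, delta in variance.items() if delta != 0]

print(variance)    # {('Sales', 'Jan'): 0.0, ('COGS', 'Jan'): 2.0}
print(mismatches)  # [('COGS', 'Jan')]
```

The pain is not the arithmetic, of course; it is rebuilding this by hand every time the retrieve changes shape.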

Dave Farnsworth has a solution to this problem with one of his great utilities – this man knows Excel.

In his own words

Data Detective is one of those programming ideas that hang around in my head. I intended to write the utility if I ever had the time and motivation. Finally after many years I’ve put pen to paper and this utility is the result.

Commonly, consultants and Essbase administrators engage in the tedious task of comparing two Essbase databases, checking that the values are equivalent. After too many years of retrieving from two databases, creating a delta sheet, and then pivoting the original report and doing it all again, I said enough! Delta Detective is my new favorite tool.

Very simply, Delta Detective is an Excel-based macro that uses the classic Essbase add-in tool kit. As the user, you open Delta Detective.xlsm, connect to Essbase using the normal add-in dialog, and format your first query. Next, start the macro with Ctrl+Q, which brings up a dialog where you enter the connection information for both databases that you want to compare. Hit the Run button and the macro creates exact duplicates of your original sheet, retrieves data from the second database, computes the deltas, and colorizes the mismatching cells. Returning to the original report sheet, you can drill, pivot, or whatever using the classic menu and then rerun the delta report. Excel formats from the original report carry over to the others, and you can always be sure that the sheets are connected to the proper databases.

Please look at the ReadMe particularly if you are using an Office version earlier than 2007.

A couple of points

The classic add-in, not SmartView

Another quote from Dave, this time from his documentation (yes, he’s that good), “Since this utility is freeware, the idea was to make it simple, so that it didn’t consume Excel resources when not in use. Additionally it needed to be relatively version independent with regards to Essbase and Excel. For these reasons it requires the traditional Essbase add-in and not SmartView.”

Why not an add-in?

Dave wrote this as an .xlsm file instead of an .xll because of the loss of classic menus in Excel 2007 and the lack of a Ribbon in Excel 2003.  One day we’ll all live in the Ribbon-driven world, but that is definitely not true today.  Until then, this approach removes the UI from the equation.

If you’re using Excel 2003

To open Excel 2007’s Macro-Enabled Workbook format in Excel 2003, you must first have Excel 2003 SP3 (I tried skipping SP3 – nope, it doesn’t work) and then install the Microsoft Office Compatibility Pack for Word, Excel, and PowerPoint 2007 and later File Formats (Compatibility Pack).  Read all about it here.

Macro security

This workbook isn’t signed, so in both Excel 2003 and 2007 (sorry, this hasn’t been tested on 2010, but go ahead and leave a comment telling us how it goes), you are going to need to allow Excel to run the macro. 

2003


Just follow the instructions and set the security level to medium.

2007


You’ll see the warning message right up front when you open the file.  Click on the Options button, and enable the content.

Tested variants

Essbase version    Excel version
9.3.1              2007
11.1.1.3           2003
11.1.1.3           2007
11.1.2             2003
Has every release under the moon been tested?  No, but I’m going to guess that since there were zero problems getting this to work across such a wide span of Essbase releases, there won’t be any problems in your world, either.

Documentation

In addition to the utility itself, Data Detective comes with its very own tutorial. 

Conclusion

I’ve used Data Detective in beta on a project as I tweaked and optimized complex financial allocations.  Data Detective is an easy-to-use tool that generates the deltas between the databases and helped me quickly home in on problematic calculations, all without laboriously creating dual retrieves and variance sheets.  Very, very nice.  Download Data Detective RIGHT HERE and enjoy Dave’s generosity.

Monday, January 3, 2011

Stupid Programming Tricks #6

Introduction

Number Six in my Stupid Tricks collection. 

Should I be worried that so much of what I know falls into this category?  I’ll Keep Calm, and Carry On and hope that one day I’ll get smart.  Ah, but a man’s reach should exceed his grasp, Or what’s a heaven for?

Here’s a block, there’s a block, where are MY blocks?

Block creation is sort of the Achilles heel of BSO calc script writing.  We have all written an allocation script (or two, or three, or four) that looks good but when we run it – nothing, naught, nada, zip, zero, zilch.  After a hopefully quick sanity check to ensure that the data assignment and the Excel retrieve are pointing to one and the same data set, the answer is almost always… block creation, the bane of calc scripts.

Happily, there are lots of ways to create blocks, including sparse assigns, cross-dims on the left-hand side of the equals sign within a calc block, SET CREATEBLOCKONEQ, the sometimes performance-sapping SET CREATENONMISSINGBLKS, and of course, the old standby DATACOPY.  11.1.2 has its new and nifty XWRITE function, which you can combine with a LOOPBACK function.  Oooh, tricky.

But it’s DATACOPY that is the surefire, never-fails, gold standard way to create blocks.  And it is.  Until you remove the blocks, especially when you think you didn’t.  Arrrgh.

That old devil CLEARDATA

Quick quiz -- what does CLEARDATA do?  Why, it clears data.  Duh.

And what doesn’t it do?  Remove blocks.

Sez who?  Our friend the TechRef:
CLEARDATA does not clear blocks, even if all data values in a block are #MISSING. Use CLEARBLOCK if you wish to clear blocks from the database, which can improve performance.

What does CLEARDATA have to do with DATACOPY?  Well, typically after copying data to create blocks, clearing out the data at the target is a good idea as the code that uses the blocks may or may not touch all of the DATACOPY targets.  This may seem inefficient, but given the speed of DATACOPY (usually), this isn’t an issue.

Watch out for that documentation

I was working on an allocation and used the never-fails, sure-shot, good-as-gold block creator, DATACOPY.

But something odd was happening – the script was running, but I wasn’t getting any results.  What oh what could it be?  Surely not a block creation issue as I was, after all, using DATACOPY to create the blocks and then CLEARDATA to remove the data values. 

Happily, my client, Dave Anderson, is smarter than I am (NB – Consultant trick #1 – somehow convince the client you’re not an idiot, make sure he’s smarter than you, have him solve the problems, and then blog about the solution.) and suggested that the blocks I thought I was creating in fact weren’t really there.

I will admit a certain degree of skepticism because I knew CLEARDATA does not clear blocks, even if all data values in a block are #MISSING.  (NB – Consultant trick #2 – when nothing works, and the client makes a suggestion, at least go through the motions of listening to him.  You might be surprised.)

I ran just that segment of the code and did a before and after block count.  Yes, I cleared out the entire database and ran everything from scratch and stopped right before this step; that was my initial block count.  Then I ran the code and checked the block count in EAS.  Uh-oh, same number of blocks.  Commenting out the CLEARDATA gave me…a higher block count.  Double uh-oh and a what the ? moment.

MVFEDITWWW, aka Sample.Basic

My Very Favorite Essbase Database In The Whole Wide World is the place I prove out BSO concepts. 

Sample.Basic’s density

Dimension    Type
Year         Dense
Measures     Dense
Product      Sparse
Market       Sparse
Scenario     Dense

Yeah, it’s weird, particularly the dense Scenario dimension at the end.  But the density setting of Scenario will be important as we’ll shortly see.

Are you sitting comfortably? Then I'll begin.

Lock and load

I created a copy of Sample.Basic, cleared it out, and populated the following spreadsheet.
If this is the only data in the database, how many level zero blocks do you think there are?  You get a gold star if you say, “One, and duh, what else could there be?”

Fire for effect

Now run this code:
SET UPDATECALC OFF ;
SET AGGMISSG ON ;
SET EMPTYMEMBERSETS ON ;
 
CALC ALL ;

How many upper level blocks?  Eight – take a look at the spreadsheet below and count the highlighted rows:

I know, I know, block creation 101. But this is important and there’s a test at the end of the semester, so let’s review:  You had one block from the lock and send and eight blocks from the CALC ALL. 
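As a sanity check, the arithmetic behind those eight upper-level blocks can be reproduced in a few lines of Python.  (The specific members here are my assumption – any single level-zero sparse combination with two ancestors per sparse dimension gives the same count.)  With Scenario dense, a block is a (Product, Market) combination, so the CALC ALL rolls one level-zero block up every combination of its ancestors:

```python
from itertools import product as cross

# Hypothetical single loaded block: Product member "100-10" (under "100",
# under "Product") crossed with Market member "New York" (under "East",
# under "Market").  Each axis lists the member plus its two ancestors.
product_axis = ["100-10", "100", "Product"]
market_axis = ["New York", "East", "Market"]

all_blocks = list(cross(product_axis, market_axis))
level0 = [("100-10", "New York")]
upper = [b for b in all_blocks if b not in level0]

print(len(all_blocks))  # 9 total sparse combinations
print(len(upper))       # 8 upper-level blocks created by the CALC ALL
```

Nine combinations minus the one level-zero block you sent equals the eight upper-level blocks from the aggregation.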

A change is as good as a rest

Bringing DATACOPY and CLEARDATA in, let’s create a block and then clear out the data:

SET UPDATECALC OFF ;
SET AGGMISSG ON ;
SET EMPTYMEMBERSETS ON ;

CALC ALL ;

FIX(@LEVMBRS("Market", 0))
  DATACOPY "Actual"->"Product" TO "Budget"->"100-10" ;
  FIX("Budget")
    CLEARDATA "100-10" ;
  ENDFIX
ENDFIX

Data is copied to Budget->100-10.  Does this increase the block count?  Nope, Budget is dense and data already exists at Actual->100-10, so the DATACOPY doesn’t add a block.

The number of blocks is now nine – one from the lock and send, eight from the aggregation in CALC ALL, and zero from the DATACOPY as the block at 100-10->Budget already exists.

CLEARDATA does just what it ought to – it removes the value copied from Actual->Product at Budget->100-10 and nothing more.  It sure doesn’t remove the block.

Stop making sense

So let’s make this all blow up/work just like most Essbase databases and make Scenario sparse, not dense.

In theory, the database will now have 10 blocks:
1 level 0 from a lock and send
8 upper from the CALC ALL
1 additional level 0 from the DATACOPY

Clear out the database, send that first sheet, and then run the code with the CLEARDATA.

What happens?  Do we in fact have 10 blocks?  Nope, because a CLEARDATA when dealing with purely sparse dimensions…removes blocks.  Well, it also clears out the data, which is after all sort of implied in its name.

Arrgh.

Fix it in one step

Get rid of the CLEARDATA and replace it with =#Missing.  Yes, it’s that easy.

SET UPDATECALC OFF ;
SET AGGMISSG ON ;
SET EMPTYMEMBERSETS ON ;

CALC ALL ;

FIX(@LEVMBRS("Market", 0))
  DATACOPY "Actual"->"Product" TO "Budget"->"100-10" ;
    FIX("Budget")
      "100-10" = #Missing ;
    ENDFIX
ENDFIX

Run it all again and ta-da, 10 blocks.  Whew. 

Yes, assigning a value of #Missing is slower than CLEARDATA.  But it works, and in the end that’s what counts.
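To convince yourself of the difference, the whole behavior can be sketched as a toy Python model.  This is purely illustrative – the member names, the single block, and the helper functions are my own stand-ins, not Essbase APIs – but it captures the rule: with Scenario sparse, CLEARDATA on a purely sparse slice drops the block itself, while an assignment of #Missing keeps the block and only empties its cells.

```python
MISSING = None

# Toy BSO cube with Scenario sparse: a block is keyed by the sparse
# combination (Product, Market, Scenario); its value holds dense cells.
blocks = {
    ("100-10", "New York", "Actual"): {"Sales": 100.0},
}

def datacopy(src_scen, dst_scen):
    """Copy every src_scen block to dst_scen, creating blocks as needed."""
    for (prod, mkt, scen), cells in list(blocks.items()):
        if scen == src_scen:
            blocks.setdefault((prod, mkt, dst_scen), dict(cells))

def cleardata(scen):
    """CLEARDATA on a purely sparse slice: the blocks themselves go away."""
    for key in [k for k in blocks if k[2] == scen]:
        del blocks[key]

def assign_missing(scen):
    """'member' = #Missing: the block stays; its cells become #Missing."""
    for key in blocks:
        if key[2] == scen:
            blocks[key] = {cell: MISSING for cell in blocks[key]}

datacopy("Actual", "Budget")
assert ("100-10", "New York", "Budget") in blocks       # block created

cleardata("Budget")
assert ("100-10", "New York", "Budget") not in blocks   # block gone!

datacopy("Actual", "Budget")
assign_missing("Budget")
assert ("100-10", "New York", "Budget") in blocks       # block survives
```

The last two assertions are the whole post in miniature: same intent, very different block counts.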

Conclusion

This was a real stinker to figure out.  CLEARDATA not removing blocks is (was) such a basic piece of my BSO knowledge that I refused to believe my eyes when my code didn’t work. 

So this was a salutary lesson in observing and questioning.  And oh yes, listening to your client as well. 
