Microsoft SSDT, Birst and Analytics Meetup

I recently attended a community Data Analytics event.  The first presenter demoed a tool from Microsoft called SQL Server Data Tools (SSDT):




It's a free add-in used to manage SQL Server databases.  You can connect to an existing SQL Server database and pull in the entire set of artifacts.  Make changes, redeploy, and it keeps track of the scripts.  You can run compares against projects, other databases, even dacpacs.

And the project gets saved to a source code repository like Team Foundation Server.

It allows "snapshots" of a database at points in time.  It's got tons of configuration settings.

If you work someplace where the production database is off limits, you can hand off the dacpacs to someone and they can install them without having to be a DBA.  Good for change management.
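For example, the person doing the install could deploy the handed-off dacpac with the SqlPackage command-line tool; a sketch (the file, server, and database names here are placeholders):

```
SqlPackage.exe /Action:Publish /SourceFile:"MyDatabase.dacpac" /TargetServerName:"ProdServer" /TargetDatabaseName:"MyDatabase"
```

The Publish action diffs the dacpac against the target database and applies only the changes needed, which is what makes the handoff workflow practical.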

For smaller teams working without a DBA, it's ideal for version control, backups, restores, applying changes to other databases, and compares.

It doesn't save the data, but they showed a feature to compare data, best suited to smaller data sets since it could cause a performance hit on larger tables.

The presenter mentioned Service Broker.  Its objects have dependencies, and making changes can be quite a challenge.  With this tool, you can make changes and they get deployed to the database easily.

I wouldn't necessarily do my development in this tool, but I'm going to start using it immediately.

The next presenter discussed the product Birst, a data tool in the cloud.


He demonstrated the ability to pull in data from Excel and the Amazon AWS cloud.  It builds the model for you, including the Fact table and Dim tables, on the fly, with no assistance.  That right there is quite awesome, although I'm not sure how one would apply business rules to the ETL and handle NULL values.  It even built the Time dimension without being prompted.

Once the model was created and the data was available, it let you do Data Discovery and poke around the data with different charts, graphs, and tables.  Those can be added to a Dashboard.  It also has a more granular feature to build other reports, a filter feature to connect to the Dashboard for user interaction, and KPIs.  There's also a client component installed against a local database which pushes the data up, although having a live connection could take a performance hit.

They said a new release is expected in 2 weeks so some of the features should get enhanced.  I thought it was a cool product and ideal for the Power User / Data Citizen, one who sits on both sides of the fence, having domain knowledge and technical skills and likes to dig into the data.

The last presenter discussed Analytics.  He gave the example of Target sending out coupons to people who were thought to be pregnant, including a teenage girl who hadn't revealed her pregnancy.  It helps to explain how much information is collected to get a good picture of a client base at the detail level.

He also explained how new algorithms are used to assess the optimal marketing campaigns, with a precision and accuracy that can't be duplicated by humans.  And he talked about Social Media analytics and Sentiment Analysis.  It's quite powerful.

It was a good event, met some nice people, learned some new things.  Good stuff~!

Pros and Cons of Technical Consultants

Consulting has many benefits and challenges.

There's flexibility.  Variety.  Constant challenges.  Problems to be solved.  Deliver results.  New people.  New places.  New technologies.


Expectations that you know everything.  Tight deadlines.  Time management.  Multiple concurrent clients.  Pressure.  Estimating Projects.

Typically, the role of a technology consultant is to deliver results.  On time.  Under budget.

What if the estimated hours were too low?  Work harder.  Smarter.  All night.  Whatever it takes.

What if you're expected to work in a technology you aren't an expert in?  Figure it out.  Fast.

What about the hours you have to travel to the client, whether drive time or flying time?  It's just part of the job.

What about learning new technologies to maintain expert level?  You have to carve out time, stay up late.

What about family time?  You have to become efficient in time management.

What about eating healthy, staying in shape, reading a non-technical book, doing laundry, or catching up on sleep?  Again, time management.

What about speaking or attending technical community events, or writing blogs or authoring a book or two?  Again, time management.

What about spending time with co-workers at your consulting firm?  Again, time management.  

What about filling out expense reports and keeping track of hours worked per task in time management system?  Again, time management.

What about laptop maintenance, installing new software and service packs and hot fixes?  Again, time management.

What about time to study for Certification exams?  Make time.

Consulting is a fast paced world.  You have to pace yourself.  And drink plenty of coffee.


Some Thoughts About SQL Server Reporting Services #SSRS #MSBI

SQL Server Reporting Services is a great product for creating reports.  What I like about it: the IDE, the tight integration with Team Foundation Server, the ability to customize every feature, and having multiple data sources within a single report.

It has a web server to distribute reports via the web, reports can run on schedules, and you can add them to on-premise SharePoint.

I've used 2005, 2008 and 2012.  It's got a lot of great features.

What are some of my frustrations?

The formula dialogue box is sometimes difficult to navigate.  The fields are available to choose from at the bottom, and the font could be adjustable for easier viewing.

I'm not aware of a formula editor that validates the syntax.  When a formula is incorrect, it displays a red underline under the entire formula.

It would be great if the exact error was displayed and perhaps a recommendation on how to fix it.  Sometimes the IIF statements get quite nested and trying to determine where the error is takes longer than it should.
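For example, once IIFs nest a few levels deep, a single misplaced parenthesis is hard to spot, and the all-red underline doesn't say where (the field names below are hypothetical):

```
=IIF(Fields!Region.Value = "East",
     IIF(Fields!SalesYTD.Value > 100000, "Bonus",
         IIF(Fields!SalesYTD.Value > 50000, "On Track", "Behind")),
     "N/A")
```

A related gotcha: SSRS's IIF evaluates both branches regardless of the condition, so a divide-by-zero can surface even when the "safe" branch is the one that's returned.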

And while the dialogue is open, it would be nice to switch to another formula expression without having to close and reopen.

And perhaps a "Validate Formula" button to test it, instead of closing out and running the report to see if it works.

Everything is customizable.  Some things are found by pressing F4 in the Properties window.  Others appear in a dialogue box.  There are so many customizable features, it's not always clear where to change a setting.

The Group Headers, Footers, and Details along the left-hand side are not easily identifiable, especially when adding many rows to a Group Header.  Perhaps a label or hover-over feature would help.

I've seen in other reporting tools the ability to export the entire report contents to a text document.  This documents the report, and it helps when searching through the doc to identify where a particular formula exists.  I like the ability of SSRS to view the source code, but beware: one slight change and the report breaks, and sometimes it's unrecoverable.

I did not see any easy way to add back a Group Header or Footer, other than to remove the Tablix and recreate it.  I posted my solution here:  http://www.bloomconsultingbi.com/2015/07/ssrs-add-group-header-back-to-tablix.html

I like SSRS's ability to call another report from within a report, adding hyperlinks to spawn new reports, and even writing custom JavaScript code to open a new window.  I like the ability to add multiple data sources within a single report, as well as Shared Data Sources.

When adding a field to a Group, it typically requires a "First" or "Sum".  And then you have to specify the Group name within quotes to let it know which group to total over.  This requires additional keystrokes, which slows down development, and if you cut and paste a formula you have to debug it.
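For example, a group-scoped aggregate looks like this (the field and group names are hypothetical):

```
=First(Fields!CustomerName.Value, "GroupByCustomer")
=Sum(Fields!LineTotal.Value, "GroupByCustomer")
```

The second argument is the scope, and it has to match the group name exactly, which is why pasted formulas break when the target report's groups are named differently.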

One of the best features of SSRS is the ability to export to Excel.  I've been using Crystal Reports since version 5 in 1996, and there was always an issue exporting to Excel.  SSRS has tight integration with Excel.  Great feature.

I like the way the parameters are laid out.  Pulling data from Data Sources, specifying the default value, can all be done in code.

When adding fields to a report, it typically displays the fields in the order they appear in the query.  The ability to sort the list would be nice, as sometimes there are a hundred fields or more to hunt and peck through, which takes time.

Overall the product is awesome.  Since I've been writing reports non stop this week, I just wanted to blog about some of the things I've noticed.

And there you have it~!


Intelligence is Artificial and Is Becoming Domesticated

Who are we as a species?  Where did we come from?  How did we get here? And why?

These questions have lingered since the birth of civilization.  How did humans suddenly appear, moving from life in the jungle to the prairies to agriculture to city complexes?

What triggered our ascent into intelligent beings?

Some people say the thumb is the key factor in our accelerated growth, allowing tools to be made, for better survival.

Perhaps the ability to walk upright as bipedal hominids.  Able to outrun wild game and travel long distances, in search of food.  Hunters and gatherers.

The leap to farming, culture, education, a monetary system, writing, and worshiping the Gods: how did that spring forth in such accelerated fashion?

Aside from cave drawings and megaliths sprinkled across the globe in non-random fashion, we have very little knowledge of our ancestors going back past 40,000 years, if that.

One day we were killing saber-toothed tigers with flaked stone spears, the next day we had organized society.  How did the leap occur?

Terence McKenna had a theory that ancient humans wandered into the prairies and began to eat the wild berries and mushrooms.  Turns out the mushrooms were hallucinogenic, expanding the minds of these primitive beings.  And that was the key to our advancement.

Shamans learned the plants for medicinal use, to steer the tribes towards food and shelter and to heal the sick.  Perhaps that opened some neural pathways which allowed cognitive thought to develop.  And with the advent of leisure time, people built civilization.

And we've been on an upward journey ever since.  Our animal instincts lie dormant, as we are able to find food and shelter rather easily, so our survival system isn't needed as much.

The internet has spawned a completely new society in that the transformation includes every aspect of our lives.  Similar to the Industrial Revolution, the Digital Revolution has sparked youthful hope for the future.

With increased storage capacity, decreasing costs, and a connected world, we are creating mounds of data from everything and everywhere.  And some businesses are capitalizing on this phenomenon by storing everything: collecting, mining, predicting, and finding patterns we've never been able to see before.

Intelligence has been captured and domesticated in the form of databases and querying tools.  There's a lot of possibility to advance mankind for the good.  Like finding cures to diseases, keeping the economy in balance, and especially sharing the vast amount of knowledge with the masses.  It's unprecedented.  It's the opposite of the "dark ages".  It's the age of digital enlightenment.  The information is just sitting there on the web, for anyone to access at any time, for as long as they wish.  The barriers to education have been removed.  This is good.

Technology allows us to do things faster, more efficiently, and more cost-effectively.  Streamlined processes, real-time businesses, and a sharing economy.  Intelligence has been decentralized.  To the masses.

And this intelligence is artificial in nature.  Not only is the information retained for long periods, the systems can learn from the data.  It can teach itself.  It's an oracle.  All knowing.  But it's still artificial.

It's not a living thing.  Like a tree.  Or a giraffe.  It's a bundle of circuits connected through solder.  It's not alive.  It doesn't self replicate.  It doesn't have survival instincts.  It doesn't have a thumb or walk upright.  Yet.

It doesn't have a soul.  In that it's not connected to the Universal Life Force that connects everything.  When unplugged, it goes away.  When plugged back in, it's awake.  I don't think mankind can create a living being out of circuits.  For now.

Augmenting the body with enhanced abilities is possible.  Perhaps by embedding chips into the human body, a hybrid could be formed, which could extend the life of the recipient, or give an advantage over non-hybrid humans.  Once the ethical boundaries are worked out, that could accelerate.

And we have advanced robotics that can travel difficult terrain.  And we have simulated reality through a digital portal where humans can interact and affect the inner worlds.  And we have algorithms that can identify patterns and sift through tons of data at a good speed.  We are very eager to tame this new intelligence to serve mankind.  There are in fact many benefits to all of this.

Strangely though, I'm reminded of a movie I saw many years ago.  "The Gods Must be Crazy".  Here's a link:


The Coke bottle could be viewed as Artificial Intelligence, and the Kung Bushmen could be viewed as us.  I don't think we are in a position to bury the bottle outside the village.  So as technology spins us into the future, let's hope for the best.  Society will look different in a few years.  That's for sure.


SSRS Add Group Header Back to Tablix

This week I got the opportunity to write some SSRS code for a client.  The task was to convert 6 Crystal Reports to SSRS (SQL Server Reporting Services) 2012.

So the SQL was replicated to the new server, indexes were added, and I began building the report.

The Tablix report consisted of 5 Groups and 10 Detail rows.  After creating the 5 Groups, at some point I removed a few of the Group Headers to clean up the report.  Finished, and moved it to Dev for testing.

Then I began the second report.  It was very similar to the first.  So I made a copy, renamed it, added it to the Project, and connected the SQL to the report via a Data Source.  And then realized the need to add back the missing Group Headers.

I scanned the internet, Google and Bing, and didn't find a workable solution.  So I pulled an older version of the Project, which I had saved to another location for just this purpose, loaded it, and began to compare the old code with the new code.

First thing, I saw the rows had to be added back in the Header.  Looking at the old report, Textbox76 was a field that had been removed.  

Then opened up the Report in Code mode:

Copied the entire "TablixRow" XML Code from old project to new project:

And did the same for the other 2 rows.  Sure enough, the rows re-appeared in the new SSRS report.

Next step, there was a difference in the XML around the Group field:

Here, you can see additional code in the old report, highlighted here:

Actual code here:


Then the Tablix report looked like this:

Then I added more code to the other Groups:

Now we have connecting lines indicating the newly added rows belong to the Groups as Header.  The desired outcome:

Group Headers have been added back to the SSRS report after being deleted.  The trick is to go into the XML code and add the missing TablixRow first, then add the Group XML code.
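For reference, the row being re-added lives under TablixBody/TablixRows, and looks roughly like this (the names, sizes, and contents here are illustrative, not the client's actual report):

```xml
<TablixRow>
  <Height>0.25in</Height>
  <TablixCells>
    <TablixCell>
      <CellContents>
        <Textbox Name="Textbox76">
          <!-- textbox properties go here -->
        </Textbox>
      </CellContents>
    </TablixCell>
  </TablixCells>
</TablixRow>
```

The matching Group XML goes under TablixRowHierarchy; that's the piece that tells the tablix the new row belongs to the group as a header, which is what makes the connecting lines appear.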

And there you have it. Hope this helps~!

Common Coding Techniques

Programming languages require code.  And code can be written many different ways.

Some people like to write code as simply as possible, using the fewest lines of code and the least character space.  While efficient, this can sometimes be difficult to maintain.  And understand.  Perhaps they combined 9 lines of code into a single line.  Maybe the variable names are not descriptive.  For example:

Declare @a nvarchar(6)
Declare @b nvarchar(25)

Uh, what's @a signify?  And @b? Who knows.  Job security.
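Compare that to the same declarations with descriptive names (hypothetical names, of course):

```sql
Declare @ProductCode nvarchar(6)
Declare @CustomerLastName nvarchar(25)
```

Same code, same performance, but the next person doesn't have to reverse-engineer the intent.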

Intuitive Code
There's lots of code out there where you know the programmer went out of their way to make the code complex, because when you are the person who has to maintain it, the code is not intuitive.  If you've spent time in the trenches working as a maintenance coder, you know the importance of writing code that people can understand.  Maybe back in 1968, when the programmer only had 2kb of RAM for the entire program, writing super-efficient code was valuable.  Nowadays, code should be written with the maintenance coder in mind.

And what about comments?  Most code I've seen has limited to no comments, or comments that made sense at one time but are no longer valid and only serve to confuse the next person.  There should be a flag in the IDE to mandate good commenting practices.  Perhaps set it to No Comments, Limited Comments, Required Comments, or Shrink Wrap Comments.  And each org can mandate the level required.

Camel Case
When writing code, it should be legible for the next person.  So when deciding on a naming preference, it's good to pick one and stick with it.  In T-SQL, there are so many ways to write the same column.  For example:

[Current Price] [bit] NULL,
[current price] [bit] NULL,
[current_price] [bit] NULL,

[CurrentPrice] [bit] NULL,
[currentprice] [bit] NULL, 
[current-price] [bit] NULL,

You get the point.  Many ways to confuse the next developer and make their lives more difficult.  If that's the intent.

Back in the 1990s when I was learning to code, the Senior Developer made it very clear on coding standards and sticking to them.  He said you should be able to read the code like a book, following the plot as you step through it.

At the time, I did another no-no.  I was learning to code in Visual Basic, so I tried different approaches to accomplishing the same thing.  For example, in one place I would use a Class.  In another, a User Defined Function.  In another place, something different.  He made sure I knew about the error and not to do it again.

Run On Code
You've heard of the run-on sentence.  It goes on forever.  It's also possible to jam too much code into a single block, like a Stored Procedure that grows to 4000+ lines.  Sometimes it's good practice to break the code into mini blocks.

Code that Calls Code that Calls Code
There are ways to make code extremely difficult to follow.  One technique is to create code that calls code that calls code.  A classic example is a View in SQL that calls a View that calls a View, which gets called by a Stored Procedure, on a Linked Server.  It's tough to track the code down and figure out what's happening where.  You need a Triple-A road map sometimes.
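A minimal sketch of the pattern, with hypothetical object names:

```sql
CREATE VIEW dbo.vwProducts AS
    SELECT ProductID, ProductName FROM dbo.Products;
GO
CREATE VIEW dbo.vwOrderLines AS        -- calls a view...
    SELECT ol.OrderID, p.ProductName
    FROM dbo.OrderLines ol
    JOIN dbo.vwProducts p ON p.ProductID = ol.ProductID;
GO
CREATE VIEW dbo.vwOrders AS            -- ...that calls a view...
    SELECT OrderID, ProductName FROM dbo.vwOrderLines;
GO
-- ...and somewhere a stored procedure on a linked server
-- selects from [RemoteServer].[SalesDb].dbo.vwOrders.
```

Three harmless-looking objects, but a change to dbo.Products now ripples through every layer, and the person debugging the stored procedure starts three hops away from the real table.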

Writing Bugs
There are probably a few undetected bugs in every piece of software.  The most bizarre bug I ever troubleshot was in a CASE statement that went on forever, checking endless combinations.  One branch was checking an "O", as in Oh.  Stepping through the code, it wasn't falling into the CASE statement as expected.  Looking more closely, it turned out the developer had used a zero 0 instead of an Oh O.  Obviously they look almost the same.  Gotcha!  That was a good undetected land mine.
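A stripped-down version of the trap (the column is hypothetical, the gotcha is real):

```sql
DECLARE @StatusCode char(1) = 'O';   -- letter Oh

SELECT CASE @StatusCode
           WHEN '0' THEN 'Matched zero'    -- digit zero: looks right, never matches
           WHEN 'O' THEN 'Matched letter'  -- letter Oh: the branch that actually fires
       END;
```

In most fonts the two branches are nearly indistinguishable on screen, which is exactly why it survives code review.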

Run on Formulas
This one is significant.  If you've ever had to translate spaghetti code from Excel formulas to T-SQL, you know what this one is all about.  I had one project where an Excel workbook had 10 tabs.  Each tab was considered one report.  Except each tab had 10 reports within.  So there were actually 100+ reports, billed as 10.  For example, one report had some fields.  One of those fields called another tab in the workbook.  That tab had some funky logic, which called another tab.  Which called another tab.  Which got produced from an Access database.  And the Access database had a 15-step process to prep the data, which it pulled from the source database, SQL Server.  This was an absolute nightmare.  Job security for the author, nonetheless.  I still wake up in the middle of the night sometimes dreaming I'm stuck somewhere along the chain of logic from this code.  Perhaps in time, it will get better.

So there you have it.  Some pieces of information on coding.  Perhaps someone can relate to one of these topics.

Thanks for reading~!


Data Blockages Can Give Your Org a Heart Attack

Around 2004, I joined a tennis club near the house.  The club was basically run down at the time.  But for some reason, this was the happening place amongst the tennis community.  People traveled from all over the county to get a good game of tennis.

Basically, people just showed up around 4pm, without a pre-set match.  People would team up as they arrived, find a court, and start playing.  It was free form.  After the set finished, players would change courts and find new people to play with.  Lots of good energy.

And there was one guy who orchestrated things behind the scenes.  He knew everybody.  And he was always willing to talk with people.  And his network comprised everyone on the courts.  He was essentially the go-to guy for information on the happenings of the tennis community.  People freely shared their info because they received information in return.

That guy knew the power of information, and how to get it and distribute it.  And that gave him insider information and an elevated status.

In the workforce, information typically flows downhill.  Those at the top know what's going on at most levels of the org.  And that knowledge allows them to run the business.  That top-down system is dismantling, because the information is now flowing in from the data: all the applications running the business, from lead generation, to sales, to processing and fulfillment, accounts receivable and payable, to renewal history and customer retention.

So if you want to keep a pulse on the business, you have to have access to the data, in the form of reports and dashboards and KPIs and mobile and self service.  And that pattern has been growing over the past few years.

Now we can access data from repositories on the web: public data, social data, scraped web sites, and non-traditional data forms such as log files, unstructured data, emails, semi-structured data, etc.  And there are tools to store this data for long periods for later retrieval, to look for patterns, knowledge, and insights.

So upper management is now placing emphasis on the data.  From a glimpse of the past (traditional), to Predictive Analytics, as well as Prescriptive Analytics, changing future events to conform with desired expectations.

So what does that mean?  Management is not only about managing people, products, and flow.  It's also about metrics and insights and trends and patterns.  Does the traditional MBA prepare people for this new work environment?  Can a CEO manage the business without access to data?  Can a business maintain its competitive edge without data or a social networking presence?

The world is changing.  Being disrupted.  Forcing every company, business, and government organization to adapt to survive.  How are we preparing the future generation to thrive in the future world when they're learning outdated skills?

The rules that dictate the world are crumbling all around us.  Those who have access to pertinent information will rise to the top.  Those who can interpret the data are even further ahead.  And those who can make changes to existing systems based on the data will be further ahead still.

What are you doing today to create a culture embedded in data?  Data needs to flow through the org and empower workers to take action, because the information is no longer siloed in the executive offices.  The data trickles up, instead of top-down.

Those who have access to the information are the go-to people in today's world.  Just like the guy at the tennis club.

The world is getting smaller.  Real time.  24x7.  Always connected.  Fewer barriers to information.

Data is not the new oil to be mined.  Data is the blood of every org, flowing through every appendage, like veins.  If you've got blockages, you're heading for a heart attack.  If the data is flowing smoothly, your org is in good health.  What is the health of your org?


Will robots make people obsolete? (1959) Parade Magazine Article by Sid Ross

Speaking of Robots, I happened to see a good article posted on Twitter last week.

Will robots make people obsolete? (1959)


It's an interesting article in that it was written in 1959.  After reading it, I noticed it was written for Parade Magazine.

Interestingly, my Grandmother's Sister's Husband wrote for Parade Magazine during that time.  But his name was (Uncle) Sidney Rosenblatt.  The author of this article is Sid Ross.  Similar, but different.

So then I did a quick search on the internet, and found this article:


Ben Ross, whose real name was Ben Rosenblatt, and his brother Sidney worked for Parade Magazine.  Apparently they used the pen name "Ross" because Jews weren't looked upon highly during that time.

So Sid Ross is actually (Uncle) Sidney Rosenblatt.  What a coincidence.

I grew up with family get-togethers and remember Uncle Sidney and Aunt Minna.  Very nice people, and they always had great stories.  In fact, when I was very young, he would send me and my brother letters, and instead of writing with a pen or pencil, he cut out letters from magazines and newspapers to form the words of the entire letter.  It must have taken a long time to do.

His wife, Aunt Minna, owned a Tiffany lamp store on 5th Avenue in Manhattan, NY.  Our family went there a few times, and I remember distinctly being told not to touch any of the lamps, as some of them cost more than I'd earn in my lifetime.

Uncle Sidney went back to school later in life and got his degree in his 80's, I think.  And he was always traveling to other countries to transport the lamps to customers, and he would stop in the smaller cities to look for books on the Jewish Holocaust.

And as this article describes, he donated a lot of money and books to form a Holocaust library at Fordham University:


What a coincidence!  Thought I'd write about it~!

SSAS Performance Metrics using Fiddler, SQL Server Profiler and Logging

Working in Business Intelligence, sometimes it's necessary to watch the HTTP traffic on the production server.  Perhaps to take metrics, before and after setting changes.  Like adding Cube Warming to an SSAS Cube, for example.

I like to use the Fiddler application.

When you start a new session, it captures all the network activity on the server:

Looking at the Fiddler website, it turns out logging to a database is an add-on feature.

The web sessions show all HTTP and HTTPS sessions captured by Fiddler, the messages include certain key information:
  • # – An ID# of the request generated by Fiddler for your convenience
  • Result – The Result code from the HTTP Response
  • Protocol – The Protocol (HTTP/HTTPS/FTP) used by this session
  • Host – The hostname of the server to which the request was sent
  • URL – The path and file requested from the server
  • Body – The number of bytes in the Response body
  • Caching – Values from the Response’s Expires or Cache-Control headers
  • Process – The local Windows Process from which the traffic originated
  • Content-Type – The Content-Type header from the Response
  • Custom – A text field you can set via scripting
  • Comments – A text field you can set from scripting or the session’s context menu
It has to be customized with script.  Instructions can be found here:

I created a table in the database:

USE [SSASQueryLog]
GO

DROP TABLE [dbo].[tblSessions]
GO

CREATE TABLE [dbo].[tblSessions](
       [RowNumber] [int] IDENTITY(0,1) NOT NULL,
       [ResponseCode] [nvarchar](4000) NULL,
       [URL] [nvarchar](4000) NULL,
       [ClientConnected] [nvarchar](4000) NULL,
       [ClientDoneRequest] [nvarchar](4000) NULL,
       [ServerConnected] [nvarchar](4000) NULL,
       [ServerGotRequest] [nvarchar](4000) NULL,
       [ServerBeginResponse] [nvarchar](4000) NULL,
       [ServerDoneResponse] [nvarchar](4000) NULL,
       [ClientBeginResponse] [nvarchar](4000) NULL,
       [ClientDoneResponse] [nvarchar](4000) NULL
)
GO


To add code to Customize Rules:

I added the code and was able to save the Fiddler HTTP session timings into the database.  From there, you can write SQL code to determine the time intervals between fields.  (The script below is reconstructed from Fiddler's "Log Selected Sessions" sample; the server name is a placeholder.)

    // Log the currently selected sessions in the list to a database.
    // Note: The DB must already exist and you must have permissions to write to it.
    public static ToolsAction("Log Selected Sessions")
    function DoLogSessions(oSessions: Fiddler.Session[]){
        if (null == oSessions || oSessions.Length < 1){
            MessageBox.Show("Please select some sessions first!");
            return;
        }
        var cnn = null;
        var cmd = null;
        try
        {
            // The original sample logs to Access:
            //cnn = new OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\\log.mdb");
            cnn = new OleDbConnection("Provider=SQLOLEDB;Data Source=yourServer;Initial Catalog=SSASQueryLog;Integrated Security=SSPI;");
            cnn.Open();
            cmd = new OleDbCommand();
            cmd.Connection = cnn;

            for (var x = 0; x < oSessions.Length; x++){
                // Caution: building SQL by string concatenation is fragile (quoting issues);
                // it's fine for a quick logging hack, but parameterized commands would be safer.
                var strSQL = "INSERT into tblSessions ([ResponseCode],[URL],[ClientConnected],[ClientDoneRequest],[ServerConnected],[ServerGotRequest],[ServerBeginResponse],[ServerDoneResponse],[ClientBeginResponse],[ClientDoneResponse]) Values (" +
                    oSessions[x].responseCode + ", '" + oSessions[x].url + "', '" + oSessions[x].Timers.ClientConnected + "', '" + oSessions[x].Timers.ClientDoneRequest + "', '" + oSessions[x].Timers.ServerConnected + "', '" + oSessions[x].Timers.ServerGotRequest + "', '" + oSessions[x].Timers.ServerBeginResponse + "', '" + oSessions[x].Timers.ServerDoneResponse + "', '" + oSessions[x].Timers.ClientBeginResponse + "', '" + oSessions[x].Timers.ClientDoneResponse + "')";
                cmd.CommandText = strSQL;
                cmd.ExecuteNonQuery();
            }
        }
        catch (ex){
            MessageBox.Show(ex.ToString());
        }
        finally
        {
            if (cnn != null){
                cnn.Close();
            }
        }
    }
Don't forget to add this line to the top of the page:
import System.Data.OleDb;

And query the data through SQL:

SELECT [RowNumber]
         ,convert(datetime, [ClientConnected]) as [ClientConnected]
         ,convert(datetime, [ClientDoneRequest]) as [ClientDoneRequest]
         ,case when IsDate([ServerConnected]) = 0 then '1900-01-01 00:00:00'
               else convert(datetime, IsNull([ServerConnected],'01-01-1900 00:00:00'))
          end as [ServerConnected]
         ,case when IsDate([ServerGotRequest]) = 0 then '1900-01-01 00:00:00'
               else convert(datetime, IsNull([ServerGotRequest],'01-01-1900 00:00:00'))
          end as [ServerGotRequest]
         ,case when IsDate([ServerBeginResponse]) = 0 then '1900-01-01 00:00:00'
               else convert(datetime, IsNull([ServerBeginResponse],'01-01-1900 00:00:00'))
          end as [ServerBeginResponse]
         ,case when IsDate([ServerDoneResponse]) = 0 then '1900-01-01 00:00:00'
               else convert(datetime, IsNull([ServerDoneResponse],'01-01-1900 00:00:00'))
          end as [ServerDoneResponse]
         ,case when IsDate([ClientBeginResponse]) = 0 then '1900-01-01 00:00:00'
               else convert(datetime, IsNull([ClientBeginResponse],'01-01-1900 00:00:00'))
          end as [ClientBeginResponse]
         ,case when IsDate([ClientDoneResponse]) = 0 then '1900-01-01 00:00:00'
               else convert(datetime, IsNull([ClientDoneResponse],'01-01-1900 00:00:00'))
          end as [ClientDoneResponse]

  FROM [SSASQueryLog].[dbo].[tblSessions]
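Once the timings are in the table, a query along these lines turns the timestamps into intervals (a sketch, reusing the same IsDate guards since the timer columns arrive as text):

```sql
SELECT [RowNumber],
       DATEDIFF(ms, convert(datetime, [ClientConnected]),
                    convert(datetime, [ClientDoneResponse])) AS [TotalElapsedMs],
       DATEDIFF(ms, convert(datetime, [ServerGotRequest]),
                    convert(datetime, [ServerBeginResponse])) AS [ServerProcessingMs]
FROM [SSASQueryLog].[dbo].[tblSessions]
WHERE IsDate([ClientConnected]) = 1
  AND IsDate([ClientDoneResponse]) = 1
  AND IsDate([ServerGotRequest]) = 1
  AND IsDate([ServerBeginResponse]) = 1;
```

Comparing these numbers before and after a change, such as enabling cube warming, is what turns the raw capture into a before-and-after metric.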


Another thing I did was to create metrics using SQL Server Profiler.  You basically start the application, then click on the website, which generates MDX queries from the PerformancePoint Dashboard, and Profiler captures the data to a custom table.  From there, we were able to extract the MDX queries and add them to an SSIS package to run after the ETL job and after the cube is processed.  Warming the cube is basically calling some queries to get them loaded into cache, so when the users click on the reports, the queries are "hot".
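A warming query captured this way is just an ordinary MDX SELECT, something along these lines (the cube, measure, and dimension names are made up for illustration):

```mdx
SELECT { [Measures].[Sales Amount] } ON COLUMNS,
       { [Date].[Calendar Month].Members } ON ROWS
FROM [SalesCube]
```

Running a handful of these from SSIS right after the cube processes means the first dashboard users of the day hit a warm cache instead of paying the cold-query cost themselves.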

How to implement a cube warming strategy: http://www.sqlis.com/post/SSAS-Cache-Warming-Using-SSIS.aspx
And as a reference, Chris Webb was kind enough to forward the SSIS project to me after I sent him a DM on Twitter.  Thanks Chris!

There’s a tool to assist in the Cube Warming process: