3/31/2013

Do Business Intelligence Experts That Know Everything Exist?

I'll be the first to say, I don't know everything about Business Intelligence.

There are lots of smarter people out there than me.

But why is it that nobody is really an expert on BI?

BI is such a complex subject.

So many angles to look at.

Mobile, Traditional, Self Service, Dashboard, Visualizations, Big Data, ETL, etc.

Some people are experts on specific features of BI.

Yet nobody I know is an expert at everything.

Because there's so many facets to it.

And because it's growing every day.

And so many vendors in the space have their distinct flavor.

It's like trying to describe the scenery at 100 miles an hour.

You get glimpses along the way, yet it's difficult to see and know everything.

The view from the high level down to the low level details.

So I'll just keep trying to learn as much as possible in the time available.

If you find someone that knows everything about everything on BI, I'd like to meet them.

3/29/2013

Day 2 Converting c# Web App to SQL

As a follow up to yesterday's post, I've been converting a web based c# application to SQL.

So there are 3 ways to view the data, A, B and C.

Each time any of those are called, it can be run for Weekly or Monthly.

And for Pending or Closed sales.

And each of those calls the database twice, once for the legacy data and once for the latest data.

And each one of those calls the database 5 times, once for every day of the week or each week of the month.

So as you can see, I estimate there's potential for 60 database calls.

On a single report, the database gets 20 calls per page refresh.

And there's a timer built into the app to refresh the report automatically every so often.

I was able to reduce all this down to 2 queries.

Which pull back all the possible combinations.

And the new code pulls every Sales Person, not just a specific group.

And it runs in about 4 seconds.
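To sketch the idea (hypothetical table and column names, SQLite standing in for SQL Server): one grouped query can return every salesperson / period / status combination in a single round trip, instead of a separate database call per slice.

```python
import sqlite3

# Hypothetical schema standing in for the real sales database.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE sales (
    sales_person TEXT,
    period       TEXT,   -- 'Weekly' or 'Monthly'
    status       TEXT,   -- 'Pending' or 'Closed'
    amount       REAL)""")
con.executemany("INSERT INTO sales VALUES (?,?,?,?)", [
    ("Alice", "Weekly",  "Pending", 100.0),
    ("Alice", "Weekly",  "Closed",  250.0),
    ("Alice", "Weekly",  "Closed",   50.0),
    ("Bob",   "Monthly", "Closed",  300.0),
])

# One query returns every combination at once, instead of one
# database call per view / period / status / day.
rows = con.execute("""
    SELECT sales_person, period, status, SUM(amount)
    FROM sales
    GROUP BY sales_person, period, status
    ORDER BY sales_person, period, status
""").fetchall()
print(rows)
```

The same set-based approach is what lets every Sales Person come back in one shot rather than a specific group at a time.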

However, I've only done half the job so far.

As there's also more calls to another database which I'll begin working on now.

Converting the business logic from c# to SQL is a tedious job.

However, I pulled both reports and matched every single total for every single sales person.

Now onto part 2.

Here's the post from yesterday in case you missed it:

http://www.bloomconsultingbi.com/2013/03/converting-c-web-app-to-sql-for.html

3/28/2013

Converting c# Web App to SQL for Centralized Reporting

I spent the day digging through the logic of a c# web application.

In an attempt to convert the business logic to SQL.

It's not as simple as it seems.

The SQL is built on the fly depending on rules.

And it loops and calls queries from time to time.

The goal is to get the Business Logic into a Stored Procedure.

Which will either go in an SSIS package.

Or a standard SQL Agent job.

To run every hour or so.

And pull the new data.

Add it to some local tables used for reporting.

To be queried from SSRS and the Dashboard.

Possibly blast out SSRS email subscriptions.

And perhaps create a Tabular Model from it.

For now just trying to understand the flow of the web app.

Tomorrow can dive into the logic a bit more.

Data Is Where It's At

When you think of IT of yesterday, it was all about getting data into the database.

Through a client-server front end or a web based application.

And most programmers wanted to be on the front line creating those pieces, whether it was the GUI, the middle tier or the data connector layer.

I believe that IT today is much different.

It's the data that's important.

Pure and simple.



Because the data has the insights.

And the data can help drive your business.

And the data can help manage your business.

By creating processes to measure results.

And there's no shortage of data.

It's only going to grow.

And those volumes of data need to be analyzed.

And you can siphon out the nuggets of gold.

And use that to your advantage.

Whether you sell that info as knowledge.

Or use it for internal purposes.

I say you can hire a web developer to build you a front end.

Contract them out if need be.

But if you want the most bang for your buck, hire some smart data people.

And you can't go wrong!

3/27/2013

Windows 7 Wouldn't Boot Up

So while on vacation my Dell laptop stopped working.

Except I was able to boot in safe mode.

Do some defrags, check for virus, etc.

And suddenly it began working.

Then got it home.

Same thing.

Doesn't work.

Except now it doesn't boot to safe mode.

So I created a repair disk off my other computer.

Booted from CD/DVD, tried the repair option about a thousand times.

Tested the memory, seemed fine.

Then went into DOS and applied the following commands.

BOOTREC /FIXMBR      (writes a fresh Master Boot Record)
BOOTREC /FIXBOOT     (writes a new boot sector to the system partition)
BOOTREC /REBUILDBCD  (scans all disks and rebuilds the boot configuration data)
BOOTREC /SCANOS      (scans all disks for Windows installations)

I have no restore points.
So I decided to use Dell BackUp Safe to reinstall Windows.

It stores off all your files on the C: drive.

Then overlays Windows 7.

Then when booting up, pressed the F8 button repeatedly.

Clicked on the Last Known Good Configuration option.

System booted up, except I had locked out the Administrator account with 3 incorrect password attempts.

So I loaded up again in Safe Mode with Networking, added a new user, rebooted and voila!

Success!!!!

Now restoring files from the Dell BackUpSafe!


With a brand new clean install of Windows 7!

If I had to guess what caused the Windows 7 crash, I think it was a bad install of Internet Explorer 10.  It got hosed on install, I tried to back it out, it wouldn't take, I tried to install IE9 again with no luck, then disabled IE10.  Just the other day I re-enabled it, and I believe that's what caused the meltdown.

Just a hunch though!

Moving Forwards with PerformancePoint Dashboards

We gave a presentation the other day to demonstrate the capabilities of Microsoft SharePoint PerformancePoint Dashboards.

To the CEO.

Personally, I thought I could have done a better job.

However, there were time constraints.

And there were infrastructure constraints.

As we are on a new domain now and I'm not able to develop on my local workstation.

I must work on a VM on the other Domain.

So the Dashboards weren't crisp enough for my standards.

And for some reason the numbers didn't appear on the Dashboard, perhaps because the demo was on a Mac.

And I was out on vacation last week so I didn't have time to prepare like I wanted.

However, net result is we are moving forward with development of our first real Dashboard with a real internal customer.

From end to end.

So now I'm in the spec gathering mode as we still don't have the infrastructure in place.

Basically doing the prep work on gathering the data.

Also, we are only going to expose the data internally, so viewing the Dashboards outside the network is not going to happen.  For security reasons.  Smart choice!

And that's all I've got to say about that!

3/25/2013

#AI Artificial Intelligence Should Mirror Mother Nature, Not Humanity

I was wondering about Artificial Intelligence.

Perhaps we already have a derivative of it today.

We can view the past for historical purposes.

Because many events, actions, things and places are being stored off into relational databases, for scientific purposes, business ventures and for Big Brother's all-seeing eye.

And now sensors are producing large sets of data for storage and future consumption.

So if you Google a question, you get answers.

Perhaps not in natural human language although very close.

So what would it take to make the leap into pure artificial intelligence?  Well first of all, are we attempting to duplicate the human mind?

I say in addition to retrieval of past information, facts and figures, we also need to include jealousy, hatred, revenge, gluttony, rivalry, competition, grudges, peer pressure as well as the 10 commandments.  You see, human mind capacity is not the upper pinnacle of intelligence.  It's somewhere between hell and survival.  We wouldn't want to create a computer that acts and thinks like humans.  Not by a long shot.

What you want to do is to simulate Mother Nature, the laws of Physics, and patterns of the Universe.  Though I say it's impossible to duplicate the complexity, mystery and beauty which they contain.

It's like my old joke: I would have gotten better grades in High School if I sat next to smarter people.  Sure, I copied off the dumb kids, but that's beside the point.  We should not be trying to simulate the human mind, that's too low of a bar to reach.

We need to strive for something higher, something more beautiful and pure.

Wake up people.  Nature is the highest form of intelligence, not mankind.  I'd say humans are at the 3rd grade level in the hierarchy of the Universe.  And sadly, proud of it.

3/18/2013

Public Speaking On Business Intelligence

I haven't done many public speaking events, basically ever.

I did one in 2008 while working at the County job, where we basically had to present to the top brass to meet our goals for our yearly review.

I presented on Crystal Reports at the time and blew them away.

They had no clue about BI, as they basically locked everything tight and never let their customers view the data.

After that, I presented at the Tampa SQL BI User Group.  For that one I was given no notice and did a pretty good job considering.

And this past weekend, I was asked to speak for an Architecture group which you can listen to here.

Just click on the Orange button to the right of the Architecture Concepts.

For this Podcast, we met on Skype, so I went out and purchased a headphone/microphone set at Best Buy for about $30.

We were scheduled to speak at 2pm and I had my day set around that.  At 1:50, the event got postponed until 4pm, so I went food shopping, did two reports on my side job and then spoke for the hour broadcast, which got edited down to about 43 minutes.

Then went out to eat with the wife and mother-in-law at Five Guys Burgers, mmm, greasy burgers and fries!

Anyway, I hope you get a chance to listen to the podcast and let me know what you thought of my presenting style and / or topics.

I've gotten some positive feedback thus far.

I enjoy speaking about BI and would welcome the opportunity again.

Cheers!

An Overview of Business Intelligence (BI) with Guest Jonathan Bloom

Over the weekend I was fortunate to speak in a Podcast.

The topic of conversation was Business Intelligence.

You can listen here.

New Employer Starting Today!

As of today I work for a new company.

As the Security Business Unit of GFI was spun off to form Threat Track Security, Inc.

The new website is:

http://www.threattracksecurity.com/

This should be a good thing, as many of the applications will need to be spun off and supported by the new company.

Which means I should be inheriting some new work.

In addition to the Business Intelligence work I've been doing up to now.

And my wife will stay working for the parent company GFI Software in the Accounting dept.

So hats off to a fresh new start!

Cheers!

3/16/2013

@Pluralsight Video on SQL Big Data Convergence by @andrewbrust

I just got through watching a new Pluralsight training video.

The course I watched was by Andrew Brust (Twitter: @andrewbrust) -

SQL Big Data Convergence - The Big Picture

It was a great course in that it discussed many of the topics I'm interested in.

Specifically the mash up between Business Intelligence and Big Data, BI-BD if you will.

Because that's really the holy grail at this point.

And Andrew gave a good background on Hadoop/NoSQL and Massively Parallel Processing engines.

There are companies today blending the two, so that a version of SQL will be able to bypass the HIVE connector, the current methodology for merging BI and Hadoop.  That also bypasses the need for a Map Reduce job, which significantly reduces the run time of queries.

Which will benefit everyone as we can now access this data in our pre-existing BI arsenal of vendor products.

He discussed different Vendor approaches and how each one tackles the same problem.

I like that fact as I've been working with both Cloudera and Microsoft's HDInsight.

The demo appeared to show a significant increase in speed using the beta Cloudera Insight, which has a SQL engine to parse/query Hadoop.

I also liked the demo using the HIVE connector in Microsoft Excel 2013 to show how easy it is to import data from HIVE - Hadoop, roll it up into PowerView and display the results in cool visualizations.

So I like both implementations Cloudera and Microsoft.

So check out Pluralsight for your latest training, and check out any of Andrew Brust's (Twitter: @andrewbrust) videos, as he's got a bunch.

And there you have it!

College Statistics Class

When I attended the University of Florida from 1987-1991, I started off with an unofficial major in Business.

Attended Macro and Micro Economics, Accounting, Calculus and Statistics.

Now the Statistics class interested me for some reason.

I really enjoyed learning about probability, means, averages, + or - 3%, variance.

The professor was really cool and I seemed to do well in the course.

However, when I got a D in my Management course and the teacher refused to let me retake an exam or do extra credit work, I was forced out of the Business major per my Guidance Counselor's recommendation.

I actually thought about becoming a PE teacher but was soon talked out of that.

My next choice was Liberal Arts, specifically Psychology.

I took some Psychology courses as I had some friends pursuing that degree.

I took Fortran and got an A in that, and then dropped the Pascal class because I never went to class.

Then I had a conversation with the Guidance Counselor.  After reviewing my transcripts thus far, he said that if I attended the Summer archaeological dig at the University of South Florida to receive 12 Upper Division credits, I'd have enough Anthropology courses to major in that and graduate on time.

So guess what, that's exactly what I did.

However, looking back, I really did enjoy the Statistics as one of my favorite classes.

It was interesting and I was good at it.

Maybe I could resurrect my 'Stats' skills into my Business Intelligence job.

Then my degree would have some value.

Public #BigData Services Exposed on the Web

The latest rage is all about Big Data.

And what do many people think of first, Hadoop.

Which is basically a distributed file system over multiple servers, either internally or in the Cloud.

With Map Reduce jobs to query your data sets, typically written in Java, however not always.

So you can query a bunch of files stored in the file system.
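As a rough illustration of the Map Reduce idea, here's a toy word count in plain Python (not actual Hadoop Java code): the map phase emits key/value pairs from each line, and the reduce phase aggregates them by key.  Real Hadoop distributes both phases across the cluster's files.

```python
from collections import defaultdict

# Toy input standing in for files on the distributed file system.
lines = ["big data is big", "data is growing"]

# Map phase: emit a (word, 1) pair for every word seen.
mapped = [(word, 1) for line in lines for word in line.split()]

# Reduce phase: sum the values for each key.
counts = defaultdict(int)
for word, n in mapped:
    counts[word] += n

print(dict(counts))  # {'big': 2, 'data': 2, 'is': 2, 'growing': 1}
```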

What if we took that concept another level.

What if the entire network at work was the file system.

And you could query every document on the network.

As in a live query run against the live network in real time.

Wouldn't that be powerful.

And if you wanted, you could expose certain folders to the outside world.

For world consumption over the internet.

Of course you would have to apply a valid namespace for identification and perhaps some security.

But then you could offer this up as a Service.

Big Data Network Distributed File System.

An example would be to expose Census data for the Federal Gov.  Or just about any data you could think of.

So you wouldn't have to spend a gazillion dollars on traditional databases.

You would simply add a bunch of files to your network or Cloud which you want to expose, apply your namespace and open it up to the outside world, similar to a Web Service - in this case, a Public Big Data Service.

Bam!

An entire new industry!

Built on current technology and methodology.

Could this concept have legs?

Job Security No Longer Exists

My first job was answering phones for Sears Credit in 1991, approving credit cards.

After transitioning to my first real job, there has been an unending round of acquisitions, layoffs, downsizings, rightsizings, mergers, you name it.

It's just part of the environment in today's workforce.

I thought working for the County would be safe, but we had rounds of layoffs in 4 years.

I have never been safe at any job ever.

And neither have you.

Because job security no longer exists, anywhere.

Now that you know, you can prepare.

Hang on loosely but don't let go.

Grow your network, learn new skills, you never know when the hatchet man cometh.

And so it goes!

Business Intelligence Architect

My first IT job was doing report writing with Crystal Reports, and Programming with Visual Basic and Database work with Oracle.

And I've worked with many technologies over the years.

So now after 18 years of IT experience, my role is changing.

Because now I've finally got the full life cycle development in the BI space.

Going forward I will be the main BI guy at my job, which means everything from gathering data into databases, to writing apps to get the data into the db, to report writing, to ETL, to getting the data into cubes, to exposing the data in SharePoint, PerformancePoint, Power Pivot, PowerView, KPIs and Dashboards, as well as Hadoop.

I'm now a "Business Intelligence Architect".

Ask Lots of Questions When Transitioning Apps

When an application is being transitioned to me, I ask as many questions as I can possibly ask.

And I hear it over and over again.

"Don't worry, we'll be here to support you if anything goes wrong."

That's a buzz-phrase.

Translation.

"We are going to fling this over the fence as fast as possible, and you'll never hear from us again."

So if you approach the situation from that point of view, you won't leave any stone unturned.

Because come the first production issue, you're the only resource to put out the fire, and nobody's going to want to hear that someone promised to help you out when they're nowhere in sight.

It's your responsibility to ensure a smooth transition.

That's my point of view.

3/15/2013

Data Isn't Getting Any Smaller

Data isn't getting smaller.

In fact, it's growing in volume.

And complexity.

However, it's also becoming less expensive to store and archive.

And the number of devices that produce data is increasing.

So what does that mean to us in IT?

An onslaught of more data and complex data.

We're going to have to run faster to keep up.

Which isn't necessarily a bad thing.

Because that will spawn greater innovation.

Because those who can tame data growth will be the winners.

And this is a huge market.

Because as some heavy hitters said recently, we have already entered the Information Revolution.

It's all about the data.

Are you ready?

3/14/2013

Standardize IT Programmer Job Descriptions Maybe

I may have posted about this before.

However it needs to be said again.

Who the heck decides what name to give the position which requires the skill of programming computer languages to solve problems in an elegant and efficient manner?

I see job postings for Senior Engineer 3, Data Specialist, Data Generalist, Programming Analyst, Solutions Developer, Software Developer, Code Monkey, Gunslinging Problem Solver and Fixxer of things Broken.

Well, maybe not the last few examples.

However, what the hell?

How can a system get so fragmented?

In Baseball, anywhere you go, if you say you're a 3rd baseman, I know what you do.  If you're a relief pitcher, I know that too.  A right fielder does not put on his resume, Catcher of Fly Balls, Thrower of Ball to Home Plate, Occasional Batter, you get the point.

What we need here is some standardization!

If you say you're a programmer engineer 2, maybe I'd have some clue as to what you do for a living.

Cause honestly, I have no freakin' clue what that position consists of.

That's what I'm talking about!

Continuing Saga of the All Inclusive Dashboard

If you've been following along, the PerformancePoint Dashboard is taking shape at work.

We've got 4 KPI reports in place, 2 Monthly/Yearly and 2 Quarterly/Yearly.

So at a single glance you can see if the indicators are Red, Yellow or Green.

So an Exec can see where to focus attention.

Along with 15 other reports on the Dashboard, it's a single location to find all the summary data.

What more can you ask for?

I'll get back to you on that one!

New Project to Create a Summary Dashboard

So I've been writing reports here since June of 2012.

And I've got exposure to a variety of data sources.

And I can see how the data overlaps, integrates, resides, etc.

And now I've been asked to compile some Summary Dashboards in PerformancePoint.

Luckily we have a Data Analyst who's been gathering data, creating reports and distributing them to the Execs.

So together, we are combining our knowledge of data, business and technology.

To gather all the important data sources into a single cohesive Dashboard.

With Key Performance Indicators.

It's going to be awesome!

High Level vs Low Level Thinkers

Some people say that there are three types of people.

Those who make things happen.

Those who watch things happen.

And those who wonder what happened.

That's probably true.

I believe that there are two types of people.

High level thinkers (HLT).

And low level thinkers (LLT).

HLT's view the world from 10k feet.

They don't get into the details.

They make decisions without looking into all the possibilities downstream.

LLT's on the other hand, can foresee every pitfall miles away.

They see all the details and swim in them daily.

And when the two groups, HLT and LLT, meet, they collide.

The LLT can't understand why the HLT summarize so much.

And the HLT don't understand why the LLT bring up all these issues and details.

The clash of these two breeds is probably the biggest cause of conflict in the business world and life.

Each group has to see the world through the other's lens.

And those who can dabble on each side have a leg up.

So what group are you in?


3/13/2013

#PerformancePoint Saga Continues - Out of Disk Space

So yesterday I got the PerformancePoint working just right.

Along with the Scorecards and KPIs.

When I got to work this am, the Dashboard worked fine in SharePoint, except, holy cow, everything else was broken.

Couldn't delete anything, couldn't save anything in designer, couldn't spawn a new designer, nothing!

So I notified the Data Analyst.  And our SharePoint Admin.

We set up a time to discuss as I had meetings from 10-11:30am.

When he got to my desk, I showed examples of all the error messages.

Including the recent one which said no disk space.

So he logged on to the site and viewed the Event logs.

Sure enough, out of disk space on the G: drive, completely maxed out.

Probably when I created the new cube the other day.

Except it didn't show up for a few days.

So we put a ticket in to have space added.

Once that was done, I logged back in to designer and sure enough everything was back to normal.

So I put it into high gear and added 34 more measures, 34 more KPIs and 4 more Scorecards, and added them to the Dashboard.

Wham!

Looking good!

And that's how we roll~!

Select * From BigData

Big data is Big.

And Complex.

And if it's stored in Hadoop, you may want to query that data.

Alongside your traditional data stored in Relational Databases.

This poses a problem.

For which Microsoft found a solution.

A Driver.

For Hive.

Which allows common ETL tools access to the data.

And copy it to a database.

This is just the beginning of a great flood gate about to open.

Because if you can expose Big Data with Traditional Data in a common Select statement, just think of the possibilities.

Everyone will become a data scientist perhaps.

Where they'll write some code such as "Select * From BigData Where EveryoneIsADataScientist = True".

Wouldn't that be nice!

3/12/2013

KPI in ScoreCard in DashBoard in PerformancePoint in SharePoint

So today I was able to understand how to create a KPI (Key Performance Indicator).

Basically I had to create a new "Measure" in the Tabular Model.

What this does is expose the data to the KPI, because you can only work with Measures.

In essence, I had to create 12 KPIs, one for each Month, then 4 for the Quarters and one for the Year.

Times 2, as we are tracking 2 different things.

Once the measures are in the Tabular Model, you create a new KPI in PerformancePoint and set the Actual, or Sum, of the item captured, in our case Sales.  Then I had to apply filters, for just Renewal Business, for just 2013, etc., etc.

Then to create the other value to compare against, I set it to the Measure created in Tabular.

Once the KPI is created, you then create a new Score Card and add the new KPIs.  I added them in the order they should appear.



Also I didn't figure out how to add or subtract KPIs once the Scorecard was created.

Once you have a Scorecard, you add it to the Dashboard in a designated Zone.  You could add Filters here if you wanted.

Now that you have a working Dashboard in PerformancePoint, you Deploy it to SharePoint, where it uploads all the artifacts and opens a new web page where the Dashboard is viewable.

So the first Dashboard is out the door, our Data Analyst will verify the numbers.

One thing she already pointed out was my KPIs show the % short of 100.  Instead they would rather have the total amount / percentage from 0.  So instead of -4% they'd prefer 96%.
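The change she asked for is just a different arithmetic on the same two numbers.  A minimal sketch, with hypothetical figures:

```python
# Two ways to present the same KPI value against quota.
renewed, quota = 240.0, 250.0   # hypothetical figures

# Shortfall relative to 100%: about -4, shown as "-4%".
shortfall = (renewed - quota) / quota * 100

# Attainment from 0: about 96, shown as "96%".
attainment = renewed / quota * 100

print(f"{shortfall:+.0f}% versus quota, or {attainment:.0f}% attained")
```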

And I asked for the metrics which determine the Red, Yellow or Green which we should have tomorrow.

All in all a great day!  Can now add this stuff to my resume!

Yesterday's Post here

3/11/2013

#PerformancePoint KPI Status

Was working on Sharepoint PerformancePoint today.

Basically had to figure everything out on the fly with no helpers.

Which was okay as the first report deployed on Friday.

However, I mentioned to the customer we could have KPIs (Key Performance Indicators).

So today they came back with some figures, one for each month, for 2 separate KPIs.

So I created 2 tables in SQL-Server, added data values for each of the months for 2013, added both tables to the Tabular Model and can see the values in PerformancePoint designer.



That's where I ran into a wall - trying to match up the summary value to the summed-up detail value:

Do I need to connect the new tables to the current time dimension where month = month and year = year?  Won't that limit the results for all the reports if done at the cube level?

And why don't the Quota dollar amounts show up as Measures on the PerformancePoint KPI side in the Measure drop down list?

I can sum the 3 values, $50 + $75 + $100, from the 3 rows of detail Renewed dollar amounts.

Jan 2013  $50
Jan 2013 $75
Jan 2013 $100

And then the Quota value is something like $250, a single value in a single row of a table for Month = Jan Year = 2013.



Did they renew the amount they were supposed to?  If not, how close were they?  Indicate with Red, Yellow or Green.  In this case they renewed $225 and their quota was $250, so show the appropriate indicator color.
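Here's a minimal sketch of that comparison (SQLite standing in for SQL Server, with the table names and the Red/Yellow/Green cutoffs made up for illustration, since the real metrics were still to come): sum the detail rows, join to the single quota row on month and year, then pick the indicator color.

```python
import sqlite3

# Hypothetical tables mirroring the detail rows and the quota row.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE renewals (month TEXT, year INT, amount REAL);
CREATE TABLE quota    (month TEXT, year INT, amount REAL);
INSERT INTO renewals VALUES ('Jan',2013,50),('Jan',2013,75),('Jan',2013,100);
INSERT INTO quota    VALUES ('Jan',2013,250);
""")

# Join the summed detail rows to the single quota row per month/year.
month, year, renewed, quota = con.execute("""
    SELECT q.month, q.year, SUM(r.amount), q.amount
    FROM quota q
    JOIN renewals r ON r.month = q.month AND r.year = q.year
    GROUP BY q.month, q.year, q.amount
""").fetchone()

def indicator(actual, target, yellow=0.90):
    # Made-up thresholds -- the real Red/Yellow/Green cutoffs
    # come from the business side.
    ratio = actual / target
    return "Green" if ratio >= 1 else "Yellow" if ratio >= yellow else "Red"

print(month, year, renewed, quota, indicator(renewed, quota))
```

With these numbers the $225 renewed against a $250 quota lands at 90%, so the sketch shows Yellow.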

Then do the same for specific Quarter or YTD.

So tomorrow I'll do some investigation and maybe ask one of the architects.

I think the dashboard I'm building has 9 distinct components / zones so there's plenty of work that needs to be done.

Good stuff!

Link to Friday's Post on PerformancePoint

3/09/2013

Week In Review

It must have been a busy week.  Because I conked out early after getting home on Friday.

This week I dabbled in the new Microsoft HDInsight big data Hadoop.

Downloaded a trial version, got it installed, running, even uploaded some files, processed them in Hive, although I could not get the HIVE odbc driver to work.  I think the trick might be to download the driver from the Microsoft site, not the Hortonworks site.

http://hortonworks.com/thankyou-hdp12/#addon-table

However, if it works, next step is to import big data into SQL-Server via the HIVE Driver using SSIS and / or Excel.  Think of the ramifications, mashing Big Data with traditional Business Intelligence with SQL.

I also worked with some ETL by migrating the Fuzzy ROI logic to Microsoft SSIS.  It was actually a larger project than expected and still need to finish it up.

http://www.bloomconsultingbi.com/2013/03/ssis-union-all.html

Then I got pulled into Dashboards - Microsoft Performance Point style.  It's been on the back burner for a while, except it's become a top priority now.  So I worked with the business / data analyst and we are spec'ing out what it should look like.  What metrics are we trying to show, where does the source data reside, etc.  It's due shortly.

http://www.bloomconsultingbi.com/2013/03/my-first-performancepoint-dashboard-msbi.html

My part time job went well - I was asked to search 5 SSRS reports to see if a change would break the reports.  Then did the same with an SSIS package where it turned out a change was necessary to prevent the code from breaking.

Also I managed to catch two colds in one week so all in all it was a busy week.

Lastly, our CTO saw me in the hallway and said we have Cloudera visiting in 5 weeks for Big Data training, said I was welcome to sit in.  Whoopie!

So there you have it.  My latest projects.

3/08/2013

My First PerformancePoint Dashboard #MSBI

So today I created my first PerformancePoint Dashboard.

Basically log onto SharePoint to instantiate the PerformancePoint designer.

From there, you can create multiple connections; I chose to point to a Tabular Model cube.

From there, created a Dashboard, which contains multiple Zones.


Then went to Create --> Analytic Chart, chose some measures and saved the report(s).

Then placed each report in their respective zone.

Then created a filter using the built in wizard.

Then dragged the new filter to the report Top Row along with the Apply Filters Button.

The tricky part was deploying the report to the SharePoint site.

Yes, there's a button to deploy from the Home dropdown menu.

However, the Ok button was dimmed out until I managed to find a site URL path to point to.

And next thing you know the 4 new reports, new dashboard and filter all appeared on the Dashboard site within SharePoint.

Then double clicked the Dashboard and we got working data, working filter and valid data.

I've been wanting to create a PerformancePoint dashboard for 2 years now.

So when they came to me today asking for some new Dashboards, I was very excited.

Now to bring in more data sources and spruce these suckers up.

3/07/2013

SSIS Union All

Writing code in Microsoft SSIS today for a complex project.

The package has one step where I need to pull 14 queries as the Source location and move the data into a single table.

The table had to be created first.

And permissions needed to be added up front.

In order to truncate the table with the User running the package in SSIS.

Once permissions were granted, next step is to create all 14 Data Sources within the Data Flow task.

The next step is to add a Union All to the package and point all 14 Data Sources to this component.

After that, you add an Ole DB Destination to move the data to final destination.
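In plain SQL terms, the Union All component behaves like a UNION ALL query: stack the rows of every source query, duplicates included, then land them in the destination table.  A minimal sketch with made-up table names (SQLite standing in for SQL Server):

```python
import sqlite3

# Hypothetical source and destination tables.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE source_a (id INT, val TEXT);
CREATE TABLE source_b (id INT, val TEXT);
CREATE TABLE dest     (id INT, val TEXT);
INSERT INTO source_a VALUES (1,'a1'),(2,'a2');
INSERT INTO source_b VALUES (3,'b1');
""")

# UNION ALL keeps every row from every source (no de-duplication),
# which is what the SSIS component does before the OLE DB Destination.
con.execute("""
    INSERT INTO dest
    SELECT id, val FROM source_a
    UNION ALL
    SELECT id, val FROM source_b
""")

count = con.execute("SELECT COUNT(*) FROM dest").fetchone()[0]
print(count)  # 3
```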

Here's what mine looked like:


Now when this runs, it appears nothing happens.

The output screen shows: SSIS.Pipeline: Pre-Execute phase is beginning.

Some research online shows this problem is not unknown in the SSIS community:

Long Duration Pre-Execute Phase

One suggestion looked promising, which was to set ValidateExternalMetadata to false.

Another suggestion was that there is no issue, the 14 queries are running at the same time, which could possibly take 10+ minutes, and the lack of Yellow/Green indicators means the system is pegged memory-wise, so you just need to be patient.

So I moved 2 of the 14 components to a separate Data Flow task to look under the hood, got the following error:


[OLE DB Destination [2]] Error: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 11.0"  Hresult: 0x80004005  Description: "The INSERT permission was denied on the object 'FuzzySFSC', database 'Report', schema 'dbo'.".

Added permissions and voila!  We got a successful package!



So next step, I split the 14 Data Source connections into 4 Data Source steps (a+b+c+d) to produce a daisy chain of events to reduce server costs.

And that's my intro to using SSIS Union All~!

Install Packages

Nowadays, a lot of programming development is done for the web.

However, back in the day, it was all Client Server apps.

Which meant after you developed your application, you had to deploy it to many users' machines.

Which was not always easy.

What that meant is you had to create an install package.

And the built-in tools back then were good.

They had wizards to assist you to bundle up your executables, your DLLs, add your registry entries, all nice and tight into an install package.

One way to deploy was to place the install package on the server and have your users install.

Another way was to physically go to every machine and do the install yourself.

There was also a way to push your app to the users.

Much of this has gone away now that everyone develops for the web.

However, deployment packages are still in use today.

So not only does the developer need to program the app and all the business logic, he / she must also get the app to the user.

And then the process of updating those client installs can begin.

The jump from VB5 to VB6 required all new DLLs to be installed.

And that posed a big issue for us developers at the time.

We couldn't push an update, we had to uninstall and re-install every machine.

However, the server people helped us if I remember correctly.

Anyhoo, that's the life of a developer, nothing's easy!

And so it goes!

3/01/2013

#TampaBay Is Primed for Huge Growth

I've been working in Tampa Bay since Summer 1995.

And there are quite a few businesses located here.

Most have IT departments.

And in those departments reside many smart individuals.

So there are good opportunities to work in IT here in Tampa Bay.

And we have a round-robin effect.

Programmer 1 leaves Company 1 to work for Company 2.

That opens up a spot which gets filled by Programmer 2.

Who just left Company 3, which opens up a spot for Programmer 3.

And the cycle continues.

Musical chairs if you will.

Each programmer writes some code, which will eventually get supported by another programmer.

And each programmer is trying to learn the new skills so he or she is marketable.

In order to jump ship, to get higher salary, better perks, etc.

Except they're all really the same job, just different locations.

Because the people don't change, they just change the company they work for.

And so it goes.

You try a company, stay there for a while, then move on.

And the place you go to isn't much better than where you came from.

Thus we have a small pond in which to reside.

It looks brighter on the other side, except it's just a different view from the same lake.

What we need are more lakes in which to move about.

Stir things up a bit, add more companies into the mix.

Which will attract new talent, Venture Capital money and bright minds.

Which will open up opportunities we've never had before.

Which will cause a cascade effect of people finding better jobs.

Which will cause the existing companies to care a little more about their employees.

In order to retain them.  As the workers will then have options.

Which raises the bar for everyone involved.

Which attracts better talent.

It's a ripple effect.

We just need to get the ball rolling in a positive direction.

Sure better transit would help, but who cares.

We can improve things with what we have.

And anybody telling you otherwise has ulterior motives.

We've got the infrastructure in place.

This area is just as good as any other.

We've got many assets here.

We don't have any limitations to transform this into reality.

We have higher education systems, international airports, plenty of office space, a legacy of hard workers and the desire to build a better place.

Let it begin!

From A-Ha Moment to Uh-Oh

Business Intelligence can bring you closer to finding moments of insights.

And everyone assumes these are light bulbs going off above your head.

New discoveries to better your business.

However, not all A-Ha moments are good.

Sometimes they are Uh-Oh moments.

When you come to the realization that your discovery is not a good one.

And we should not be afraid of this happening.

Because problems can only be corrected once they are discovered.

And if you don't take the time to search, then you're in double bad shape.

So be prepared to say Uh-Oh occasionally.

It's definitely possible in the field of Business Intelligence.

And so it goes!

Mountain Living