11/26/2014

Latest Power Query Connects to Analysis Services

Power Query for Microsoft Excel now supports connections to SQL Server Analysis Services cubes.

You can download the latest release here:
http://www.microsoft.com/en-us/download/details.aspx?id=39379

The installer will prompt you to close Microsoft Excel and other programs while it installs the new features.

Once installed, you'll see the Power Query tab across the top of Excel.  I'm using Excel 2013, so the tab was already there; it just got updated to a newer version.


We'd like to get some external data from our SSAS Multidimensional cube, so we click the "From Database" button in the Get External Data group:

Then we select "From SQL Server Analysis Services".


We locate the name of our Analysis Services Server.

Next we're prompted to use our Windows credentials to access Analysis Services; click Save:


On the right-hand side of the Excel screen, you'll see the Navigator pane containing our SSAS Multidimensional cube:

We select some Measures and a Dimension and click the Load button; it starts chugging away, pulling the data into our Excel workbook:

After a brief moment, voila, we have data from our cube:


Now you can edit the query by clicking the Edit Query button; on the right-hand side, you'll see the list of steps used to pull in the data:


We then click the Settings button to the right of the Added Items step and a window pops up:


We add another field called Orders Count and it gets added to the list of data:


By clicking the down arrow on any of the fields, a window pops up to sort the data, or remove specific entries:


Click the Close and Load button:


Excel re-loads the data containing the new field:

And the field now displays in our Workbook:


No MDX required!


We select "Load to Data Model"... click Load... and we get a warning box:

Click Continue and it loads the data into the Data Model:

Then I hovered over the query in the Workbook Queries pane, right-clicked, and chose "Send to Data Catalog":


After supplying credentials, a screen pops up:

And the Sharing screen:


Notice the box in yellow: "This query uses a local data source and may not be available to others".

I publish and log into PowerBI, navigate to My Queries, and sure enough, there it is...


I click the ellipsis...

See the Analytics...


I don't have a Data Management Gateway established at this time, so that's as far as I'm going to take it at this point.

Hope you enjoyed and learned a bit.

Happy coding~!

11/25/2014

MCSE

Passed the 5th test today... Microsoft® Certified Solutions Expert: Business Intelligence


How About Cross Platform SQL Server?

In the world of Microsoft, big news recently:

Opening up Visual Studio and .NET to Every Developer, Any Application: .NET Server Core open source and cross platform, Visual Studio Community 2013 and preview of Visual Studio 2015 and .NET 2015

For those programmers working with .Net, this is a huge new way of doing things.  Open Source.  Cross Platform.  This should really open avenues to bring in more developers from some of the 'free' languages.  Write once, deploy anywhere.

That got me to thinking: why not do the same with data?  How about SQL Server running cross-platform, on Linux and Unix operating systems?

It only makes sense; MySQL is already cross-platform.

What do you think?  Possibility?

11/24/2014

Next Project in the Queue

Service Broker for SQL Server.

I saw a demo recently at the SQL BI User group.

Now I get to work with the technology on my job.  I got a new project assigned this evening.  It involves taking a proof of concept and getting it to work in production, and also adding a new endpoint.

It's basically a queue / messaging system built into all versions of SQL Server, including Azure.

You send a message to one or more endpoints by establishing a conversation; each message conforms to a specific message type, and after the conversation ends, it shuts down.

And you can send multiple messages, in any order, and the receiving machine treats the entire batch as a transaction, which can roll back.

If the receiving machine goes away, the messages stack up in the sending machine.

That's the basics anyway.  I'm excited to have a project to bill hours against and we need to do a good job on this because we now own the process.  Which means I'll have work to do while in Germany the next two weeks.

And I get to work with people from other countries, like Germany and Australia.

Should be fun~!

11/22/2014

Why Change Careers Every Few Years

I am not the greatest business intelligence person out there.  Nor the greatest programmer.  Nor the smartest.  I have trouble remembering people's names and passwords, and memorizing things.  Chances are I have a mild form of dyslexia.

So it would have made sense to learn something, get good at it, and stick with it.

I could have stuck with approving loans, as a credit analyst, branch manager, or a mortgage broker.

I could have stuck with digging holes as an Archaeologist.

I could have stuck with teaching tennis at a club.

I could have stuck with .net, had plenty of years with Visual Basic, ASP and .net.

I could have stuck with Crystal Reports, SAP Universes, Xcelsius and Business Objects.

I could have stuck with Java, JSP and Web Services.

I could have stuck with Oracle and PL/SQL.

I could have stuck with Supervising SSRS developers.

I could have stuck with SQL Server Reporting Services.

So why didn't I?  Each of these careers could have been sufficient to live a good life.

I guess it comes down to a few things, primarily, learning new skills.

If you're always learning, you don't get stale.

If you don't set challenges, you get bored.

I now build data warehouses, business intelligence solutions, Analysis Services cubes and dashboards, work with SQL Server related technologies, and perhaps some Hadoop.

Am I the greatest at each?  No.

Decent?  Yeah.

But here's the skinny.

If I don't know something, I learn it, fast.

And I have other skills, soft skills, which I depend on.

Problem solving.

Troubleshooting. 

Driving projects home to completion, finishing the last 10%, shipping.

Focus.

Attention to Detail.

Multitasking.

Taking responsibility.

Accepting projects that others don't want.

Learning on the side.

Ambition.

Dedication.

Logic.

Intuition.

My new job is another challenge along the path.  Are the required skills so complex that, if locked in a room for 10 years, I wouldn't be able to grasp and understand them?  No.  I can learn whatever they throw at me.

So am I nervous about my new role as Business Intelligence Architect for a major consulting firm?

Sure.  However, based on the track record, I think it will turn out just fine.

11/19/2014

Data Solutions have Incrementally Compounded

The world of data has fragmented and splintered into many tiny pieces.  Back in the 1990's as a report writer, we had data in Oracle or SQL Server and a GUI tool called Seagate Crystal Reports to pull the data, export it and send it to the users.

Very simplistic.

Now we have a plethora of options, quite overwhelming.  If you were to look solely at the Microsoft offering alone, you'd quickly see at least 20 variations.

For storing the data, we have SQL Server 2014, Excel, Access, Blob Storage, DocumentDB and Hadoop.

For pulling the data, we have SSRS, Power View, Power Pivot, Power Query, T-SQL, MapReduce, HiveQL, Machine Learning, SSIS, MDX and Excel.

There are on-premise, Azure, Office365, PowerBI and hybrid solutions.

So the variety of options can be customized to fit any client.  It also creates an opportunity for data professionals to learn the gamut of offerings and stay up to speed on new developments.

My approach is to learn as much as possible to keep a wide view, and focus on specific technologies as it becomes necessary.  The other approach is to become deeply knowledgeable in a few specific technologies.  The features listed above are the hard skills.

The soft skills include project management, Agile methodology, gathering specifications, networking, Active Directory, virtual networks, Visio diagrams, documentation, attention to detail, problem solving, etc.

And then there are other skills, like completing the last 10% of a project, staying within budget, meeting client expectations, proper email etiquette, entering hours in timesheets, and meeting new clients for potential sales.

And then there's Architecture.  Recommending the best solution to specific clients based on needs.  Knowing hardware specifications, hard drives, memory, virtual vs. physical, Cloud vs. On-Premise, source code repository solutions, networking, ITIL methodology and architecture frameworks.

So you can see there's quite a bit of a knowledge curve in getting up to speed in the world of data.  And the learning never ends.  It's a challenge and opportunity and a great way to earn a living.

Gone are the days of simple queries to pull data into a report to send to a client.  The Data Ecosystem has incrementally compounded for the better.

11/18/2014

#Pentaho #Kettle #PDI CE Offering

There's a product from Pentaho called Kettle (also known as Pentaho Data Integration, or PDI) to perform Extract, Transform and Load (ETL / ELT), which you can use on Linux or Windows.

There's the free Community Edition (CE):
http://community.pentaho.com/

Download the latest version of Kettle CE (Community Edition):
http://sourceforge.net/projects/pentaho/files/Data%20Integration/5.2/

As well as the Enterprise Edition with a 30 day trial:
http://www.pentaho.com/download

I wanted to install it in a Linux environment, so I downloaded the Hortonworks Sandbox Hadoop VM (CentOS-based version):
http://hortonworks.com/products/hortonworks-sandbox/#install

Link for installing Kettle:
http://wiki.pentaho.com/display/EAI/01.+Installing+Kettle

So I logged onto my Hyper-V Hadoop Sandbox single node cluster as root/Hadoop.

While logged in as root, I created a new user called pentaho, then switched to that user:
$ su - pentaho
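For reference, creating that user on the CentOS sandbox only takes a couple of commands, something like this (a quick sketch, assuming default settings):
useradd pentaho
passwd pentaho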
 
 
 


And back to root:
su - root

I created a folder at the root of the file system called /Pentaho (mkdir /Pentaho).

There's no easy way to copy the zip file from the host computer to the Hyper-V guest, so I used curl to download it directly (the -O flag saves it under the remote file name):

curl -O 'http://tcpdiag.dl.sourceforge.net/project/pentaho/Data%20Integration/5.2/pdi-ce-5.2.0.0-209.zip'

I checked that the Zip file was downloaded as an archive file, not HTML:
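A quick way to check is the file command, which should report something like "Zip archive data" rather than HTML or text:
file pdi-ce-5.2.0.0-209.zip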

To unzip the Zip file, first install unzip from the command line on the Hadoop VM:
yum install unzip -y


Then key in the command:
unzip pdi-ce-5.2.0.0-209.zip -d /Pentaho


Extracted:

Files were loaded:

Next step is to get Kettle PDI running...
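As a rough sketch of what that next step looks like, assuming a Java runtime is installed and the zip was extracted into /Pentaho (the .ktr and .kjb file names below are just placeholders):
cd /Pentaho/data-integration
chmod +x *.sh
./spoon.sh
Spoon is the graphical designer and needs a desktop / X session, so on a headless sandbox you'd run transformations and jobs from the command line instead:
./pan.sh -file=samples/transformations/MySample.ktr -level=Basic
./kitchen.sh -file=MyJob.kjb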

In the meantime, I wanted to explore the Windows version and sure enough the same bits work for both Linux and Windows, in my case, Windows 8.1.


You'll want to click on the spoon.bat file:

And within a few seconds, the app loads, no installation required:

There're existing Samples to get started:
 
There're tons of transformations:

And the thing to notice is the Big Data Transformations:
 
And some connection types:

To create a database connection to Hadoop, follow the instructions at this URL:
http://wiki.pentaho.com/display/BAD/Configuring+Pentaho+for+your+Hadoop+Distro+and+Version
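For what it's worth, in the CE download that configuration mostly boils down to picking the right Hadoop "shim" in a properties file, something along these lines (the exact shim name, hdp21 here, depends on your distro and PDI version):
vi /Pentaho/data-integration/plugins/pentaho-big-data-plugin/plugin.properties
active.hadoop.configuration=hdp21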

What I like about the Pentaho Kettle PDI solution is the ability to get it installed and up to speed quickly.  Once the application is running, you have an arsenal of Extract, Transform and Load functionality at your disposal.  And the best part is there's little to no actual coding.  Its drag-and-drop WYSIWYG interface allows rapid development of robust solutions.

This blog covered how to get started, and some of the features of the Community Edition of Kettle PDI from Pentaho.

Happy coding~!



11/14/2014

Microsoft Data Offerings in the Cloud

This week I spent some time learning more about the Microsoft Cloud offerings.  Azure has quite the number of tools for getting a fully functioning Data Warehouse running in a short time.

First, you can provision a PaaS (Platform as a Service) SQL Database in the Azure Cloud in about 4 seconds (that is, to provision it, not counting the time to create the Azure account, log in, etc.).

You have a number of database options to choose from; what I liked was the SQL Server 2012 SP2 Data Warehouse version, designed for speed.  And they have a 2014 version as well, with many cool new features.

They also have IaaS (Infrastructure as a Service), which allows you to create a VM in Azure on which you are free to load your own software.  You can provision a pre-configured Data Warehouse version of SQL Server using a PowerShell script, or you can load your own database.

However, you must purchase or supply your own licenses as in Windows and SQL Server.  You may be able to port your On-Premise licenses, but they must remain in the cloud for at least 90 days I believe.

By loading SQL Server, you have at your disposal most of the on-premise data tools, such as SQL Server Data Tools to build SSIS, SSAS and SSRS solutions.  You have the SSISDB Catalog to store your SSIS packages, you have SQL Agent, along with a fully functioning SQL Server Reporting Services (SSRS) web front end, with data subscriptions available.  And it pre-loads Analysis Services Multidimensional, with the option of activating the Tabular model.

It does not, however, have PerformancePoint.

And you can set up an FTP site to push your data files to the Cloud if you choose.  And you can activate Active Directory Federation Services to integrate your On Premise AD.

You can set up a Virtual Network to establish a seamless link to your On Premise users.

You can set up a Static IP Address.

What I like is the option of pointing your on-premise Excel to the Cloud for Power Pivot, Power View and Power Map, using Excel 2013 or 2010 with the add-ins.  It seems like a cost savings to get clients to the Cloud sooner rather than later, with the option of adding on more later.

And adding on opens up the arsenal of Office365.  For a subscription price, you get SharePoint, the full Office suite online or available for download, Lync, integration with Active Directory Federation Services and a ton of other stuff, including 24/7 support, all depending on which version you select.

And to top that off, you can also purchase a PowerBI subscription, which opens up self-service offerings.  You can point to on-premise sources to pull data, without having to store the data in the cloud.  You can host Excel files in PowerBI and refresh them manually or set up automatic refreshes.  And with the Data Management Gateway, you can securely link on-premise data to PowerBI.

And PowerBI will point to your Azure OLTP data, and I believe they recently announced the option of pointing to Analysis Services, which is a real selling point for clients.

So I touched on a few of the basic Microsoft Cloud offerings around data: SQL Server, Office365 and PowerBI.  With the Cloud, you no longer have to host this stuff on-premise, which means faster time to production and faster insights, secured, with Active Directory integration and pushing and pulling of data.  And it all runs in the Cloud, which gives you disaster recovery, a major selling point as well.

I hope to get a client or two up into the Cloud to find out where the gotchas are and to better understand the pricing models, as they have changed a few times and there are so many factors and options available.  Suffice it to say, depending on your client's budget, you can really pick and choose a custom architecture to satisfy almost any configuration.

And then throw in HDInsight, Blob Storage and Machine Learning; Microsoft has, to say the least, got their stuff together with their data offerings in the Cloud.

11/13/2014

Microsoft .Net Opens Up

Believe it or not, my first programming language as a computer professional was Microsoft Visual Basic.  I worked with version 4, then 5, then 6.  That was around 1995 or so.  Kind of dates my age a bit, but nonetheless, I thought it was a great programming language.

And back then, Java was starting to build in popularity.  And if you compared VB to Java, you clearly saw the differences.  Java was truly Object Oriented.  It was a real programming language.  And that's what I wanted to program in.  Because it ran on any operating system, not just Windows.

And even back then, I wondered why Microsoft didn't port VB to other operating systems.  Why didn't they have a Microsoft Virtual Machine similar to Java Virtual Machine?  Because they were proprietary. And they had enough market share.

So you could create a procedure in VB, then convert it to a function, then move all the functions to a class, bundle the classes into a DLL, and deploy that DLL as a distributed component.  And that could be called from a new web language called Microsoft ASP.  I worked as an ASP developer for years, yet it wasn't Object Oriented.

Next was .net.  An entirely new IDE, new languages and the thing was, they had a language called Visual Basic.net.  Yet if you looked under the covers, the name was the same, and that was about it.  Because now it was Object Oriented and Microsoft said you must now learn this or else, no choice here.  Visual Basic was soon deprecated and they removed much of the knowledge base off the internet and there was no turning back.  Yet .net was still proprietary.

And now, in November 2014, Microsoft has finally succumbed to opening up the .net languages to open source.  And it will run on most operating systems.  Because of the new CEO, the changes had to come, to compete with up-and-coming languages.  Or lose market share.

Was it a good choice?  You betcha.  I think it should have happened decades ago; better late than never, though.  The Cloud is now the operating system, and Mobile is the new PC, and if you want to compete in the new era, you have to let the programmers use your development software; it's basic supply and demand.  Give the languages to the masses, so they can host on your Azure platform and you clean up on all the paid services.

You see, Microsoft was late to the Reporting / Data world back in the mid 1990's when Seagate Crystal Reports was the only game in town.

Then they missed the boat on the Internet and had to play catchup.

Now Open Source is forcing .net to run on any platform as Open Source software.

I've worked as a Microsoft Developer for close to 20 years and I still believe they are as strong as ever.  I wouldn't count them out just yet.  They've got Windows, Office, .net, Azure, Gaming, HDInsight, Machine Learning, SQL Server, SharePoint, PowerBI, Office365, etc.

I'm just trying to keep up with all the changes, which never end.  I see this latest change as a win.  Better late than never.

Here's a prior post: Microsoft's Data Centric World View

One more: #Microsoft #MSBI Offerings Tough to Beat

Here's the article on the latest news...
http://www.zdnet.com/microsoft-to-open-source-more-of-net-and-bring-it-to-linux-mac-os-x-7000035716/

11/11/2014

Installing #HDInsight #Hadoop to Windows via Web Platform Installer

Today I'm going to document the installation of HDInsight Emulator on Windows.

First step is to go to the URL: http://azure.microsoft.com/en-us/documentation/articles/hdinsight-get-started-emulator/

This will get you started. 

In doing so, I downloaded Microsoft Web Platform Installer:

 
 
There are a few pre-requisites for loading:
 
 

So I began the PowerShell for Azure installation:
http://azure.microsoft.com/en-us/documentation/articles/install-configure-powershell/



As you can see, PowerShell for Azure was loaded:
 


Then I did a search in Web Platform Installer for HDInsight, and the entry appeared:


Clicked on the Add button... then the Install button...


Clicked "I accept" and the installation began..



This part took some time, to download and install all 5 sections... although the install job did all the work...

Then it configured and finished...





A few icons got loaded onto the Desktop...



And there are a few new directories in the c:\ root...


From the c:\hdp directory, I double-clicked the Start_Local_HDP_Service.cmd file...



Double-clicking the NameNode shortcut on the desktop, the web page appeared...


After clicking on YARN Services, the web page appeared...
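If the desktop shortcuts ever go missing, the same pages are reachable directly in a browser; with the default single-node setup they are typically at (ports can vary by version):
http://localhost:50070 for the NameNode
http://localhost:8088 for the YARN ResourceManager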


From the Hadoop command line, I ran the RunSmokeTest job, and it executed...


Which created some folders in the file system...


Reading through the output, many of the tests passed...



And running a hadoop fs -ls gives the following...


And there you have HDInsight up and running in less than a half hour on a Windows 8.1 machine.
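If you want to poke around a bit more, a few basic HDFS commands work from that same Hadoop command line; a quick sketch (the local file and the /user/test folder are just examples):
hadoop fs -ls /
hadoop fs -mkdir /user/test
hadoop fs -put C:\temp\somefile.txt /user/test/
hadoop fs -cat /user/test/somefile.txt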

Hope you enjoyed the blog, and that you can appreciate the ease with which it gets installed, without the tedious process of installing all the pre-requisites that Hadoop traditionally requires.

Thanks for reading~!







 
