7/31/2015

Microsoft SSDT, Birst and Analytics Meetup

I recently attended a community event on data analytics.  The first presentation covered a tool from Microsoft called SQL Server Data Tools (SSDT):

http://blogs.msdn.com/b/ssdt/

https://msdn.microsoft.com/en-us/library/mt204009.aspx

https://social.msdn.microsoft.com/Forums/sqlserver/en-US/home?forum=ssdt

It's a free add-in for managing SQL Server databases.  You can connect to an existing SQL Server database and pull in the entire set of artifacts, make changes, and redeploy, and it keeps track of the scripts.  You can also run compares against other projects, other databases, even dacpacs.
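
The demo was all inside Visual Studio, but as a rough sketch of the same workflow from the command line (my own example, not something shown at the talk), SqlPackage.exe, which ships with SSDT, can extract a dacpac from a live database and report what a deployment would change.  The install path, server, and database names below are placeholders:

```python
# Sketch only: driving SqlPackage.exe from Python. The path below assumes a
# SQL Server 2014-era install; server/database names are placeholders.
import subprocess

SQLPACKAGE = r"C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\SqlPackage.exe"

def extract_dacpac(server, database, out_file):
    """Pull the current schema of an existing database into a .dacpac file."""
    subprocess.run([
        SQLPACKAGE,
        "/Action:Extract",
        f"/SourceServerName:{server}",
        f"/SourceDatabaseName:{database}",
        f"/TargetFile:{out_file}",
    ], check=True)

def deploy_report(dacpac, server, database, out_file):
    """Write an XML report of what publishing the dacpac would change."""
    subprocess.run([
        SQLPACKAGE,
        "/Action:DeployReport",
        f"/SourceFile:{dacpac}",
        f"/TargetServerName:{server}",
        f"/TargetDatabaseName:{database}",
        f"/OutputPath:{out_file}",
    ], check=True)

if __name__ == "__main__":
    extract_dacpac(r".\SQLEXPRESS", "AdventureWorks", "AdventureWorks.dacpac")
    deploy_report("AdventureWorks.dacpac", r".\SQLEXPRESS", "AdventureWorks", "changes.xml")
```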

And the project gets saved to a source code repository like Team Foundation Server.

It allows "snapshots" of the database at points in time, and it's got tons of configuration settings.

If you work someplace where the production database is off limits, you can hand off the dacpac to someone else and they can install it without having to be a DBA.  Good for change management.
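
For that handoff scenario, here's a minimal sketch (again my own, with placeholder server and file names) of publishing a dacpac to a target database with SqlPackage.exe:

```python
# Sketch: the person doing the install only needs rights on the target
# database, not full DBA access. Names below are placeholders.
import subprocess

subprocess.run([
    r"C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\SqlPackage.exe",
    "/Action:Publish",
    "/SourceFile:AdventureWorks.dacpac",
    "/TargetServerName:PRODSERVER",
    "/TargetDatabaseName:AdventureWorks",
    "/p:BlockOnPossibleDataLoss=True",  # common safety setting for production deploys
], check=True)
```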

For smaller teams working without a DBA, it's great for version control, backups, restores, applying changes to other databases, and compares.

It doesn't save the data, but they showed a feature to compare data, best suited to smaller data sets since it could cause a performance hit on larger tables.
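
Conceptually, a data compare boils down to joining the two row sets and flagging differences.  A toy sketch of the idea in pandas (not the SSDT feature itself, with made-up tables):

```python
# Full outer join the two result sets and keep rows that exist on only one
# side or have mismatched values. Fine for small tables; expensive on big ones.
import pandas as pd

source = pd.DataFrame({"id": [1, 2, 3], "name": ["Ann", "Bob", "Cal"]})
target = pd.DataFrame({"id": [1, 2, 4], "name": ["Ann", "Bobby", "Dee"]})

diff = source.merge(target, on="id", how="outer",
                    suffixes=("_source", "_target"), indicator=True)
print(diff[(diff["_merge"] != "both") |
           (diff["name_source"] != diff["name_target"])])
```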

The presenter mentioned Service Broker.  Its objects have dependencies, and making changes can be quite a challenge.  With this tool, you can make the changes and apply them, and they get deployed to the database easily.

I wouldn't necessarily do my development in this tool, but I'm going to start using it immediately.

The next presenter discussed the product Birst, a data tool in the cloud.

http://www.birst.com/

He demonstrated pulling in data from Excel and the Amazon AWS cloud; Birst builds the model for you on the fly, including fact and dimension tables, with no assistance.  That right there is quite awesome, although I'm not sure how one would apply business rules to the ETL or handle NULL values.  It even built the time dimension without being prompted.
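
To make the "builds the model for you" part concrete, here's a rough sketch of the idea, entirely my own and not Birst's actual implementation, splitting a flat extract into a fact table, dimension tables, and a generated time dimension, with made-up column names:

```python
# Illustrative only: derive a star schema from a flat sales extract.
import pandas as pd

sales = pd.DataFrame({
    "order_date": pd.to_datetime(["2015-07-01", "2015-07-02", "2015-07-02"]),
    "product":    ["Widget", "Gadget", "Widget"],
    "customer":   ["Acme", "Beta Co", "Acme"],
    "amount":     [100.0, 250.0, 75.0],
})

# Dimension tables: one row per distinct member, with a surrogate key.
dim_product = sales[["product"]].drop_duplicates().reset_index(drop=True)
dim_product["product_key"] = dim_product.index + 1

dim_customer = sales[["customer"]].drop_duplicates().reset_index(drop=True)
dim_customer["customer_key"] = dim_customer.index + 1

# Time dimension generated from the date range, no prompting required.
dates = pd.date_range(sales["order_date"].min(), sales["order_date"].max())
dim_time = pd.DataFrame({
    "date": dates,
    "year": dates.year,
    "quarter": dates.quarter,
    "month": dates.month,
    "day": dates.day,
})

# Fact table keeps the measures plus foreign keys into the dimensions.
fact_sales = (sales
              .merge(dim_product, on="product")
              .merge(dim_customer, on="customer")
              [["order_date", "product_key", "customer_key", "amount"]])

print(fact_sales)
```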

Once the model was created and the data was available, it offered Data Discovery for poking around the data with different charts, graphs, and tables, which can be added to a Dashboard.  It also has a more granular feature to build other reports, a filter feature that connects to the Dashboard for user interaction, and KPIs.  There's also a client component installed alongside a local database that pushes the data up, although a live connection could take a performance hit.

They said a new release is expected in two weeks, so some of the features should get enhanced.  I thought it was a cool product and ideal for the Power User / Data Citizen, one who sits on both sides of the fence, having domain knowledge and technical skills, and likes to dig into the data.

The last presenter discussed analytics.  He gave the example of Target sending out coupons to people who were thought to be pregnant, including a teenage girl who hadn't yet revealed her pregnancy.  But it helps to explain how much information retailers collect to get a good picture of their client base at the detail level.

He also explained how new algorithms are used to find the optimal marketing campaigns, with a precision and accuracy that humans can't duplicate.  He also talked about social media analytics and sentiment analysis.  It's quite powerful.
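
As a toy illustration of the sentiment analysis piece (my own simplification, not anything shown at the talk), the simplest lexicon-based approach just counts positive and negative words:

```python
# Toy sentiment scoring: real tools use trained models and much richer
# lexicons; this only shows the shape of the idea. Word lists are made up.
import re

POSITIVE = {"great", "love", "awesome", "good", "happy"}
NEGATIVE = {"terrible", "hate", "awful", "bad", "angry"}

def sentiment_score(text):
    """Count positive minus negative words; >0 leans positive, <0 negative."""
    words = re.findall(r"[a-z']+", text.lower())
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

tweets = [
    "Love the new release, awesome dashboards",
    "Support was terrible, really bad experience",
]
for t in tweets:
    print(sentiment_score(t), "->", t)
```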

It was a good event; I met some nice people and learned some new things.  Good stuff~!
