Just Like Old Times

During high school, I played tennis.  I played tennis every day.  Some days I played several times: matches, group lessons and private lessons.  My coach's name was Cid.

Cid played at the University of South Florida, in the #1 spot I believe.  And he was a really good player in addition to being a teacher.  He rode an old motorcycle to the club every day.  He wasn't the head pro at the club, but he had a steady following of students.

I took 2 private lessons a week.  And the lessons were grueling.  Cid gave me a list of exercises to improve strength and stamina.  And we'd do exercises during the lessons.  And these exercises were not easy.  Yet I was diligent in my efforts and was always exhausted at the end of the day after school, tennis and working out.

Cid could have played with any tennis racquet he wanted, as teaching pros always get deals from the reps.  Yet he played with a racquet that was a decade old.  A metal frame, with a round head.  And he had a one-handed slice backhand.  And the funny thing was, he used the same grip for both forehand and backhand.  And his strokes were efficient yet not fancy.  And he rarely came to the net.  And his serve wasn't strong.  Yet every tournament they had for the teaching pros in the area, he always won.

So why did he have so much success?  Well, Cid could place the ball anywhere on the court, wherever he wanted, whenever he wanted.  He had total control of the ball.  In addition, he was the fastest player I had ever seen.  He could track down any ball from anywhere on the court.  So his ball placement, foot speed, ability to read a player and strategy allowed him to win.

And since I was one of his students, he instilled in me some of the same work ethic.  In the lesson, he would have me hit 25 balls in a row, just backhands for example, and I had to hit the ball into the left side of the court, with the ball landing behind the service line, every time.  If I missed one ball, I had to start over.  Then we'd do crosscourt backhands, same thing, 25 in a row.  Then the forehand side.  Then approach shots, same thing.  Then overheads.  After a while, I could place the ball anywhere on the court at any time.

And foot speed.  I did tons of foot speed exercises, running back and forth, touch the net, then back to the baseline, side to sides, around the complex, you name it.  And lots of stretching exercises.  And arm exercises to strengthen the forearm.

Now here's something interesting: I throw left handed, yet play tennis right handed.  Since I hit a two-handed backhand, Cid would have me hit left-handed forehands for much of the lesson, hitting 25 balls to a specific spot on the court.  I got pretty good at playing left handed, and it really helped my backhand, which was stronger than my forehand, and still is today.

And we practiced serves.  Although I lacked a fluid motion with my right arm, I learned to spin the ball and hit for placement.  He would set up targets on the other side and I had to knock them down with my serve.

To be honest, I probably enjoyed taking lessons more than playing matches.  Even today, I prefer to find a hitting partner who just likes to hit the ball back and forth, getting into the groove.  The competition took the fun out of the game for me, but I still played tournaments every weekend and I played on the high school tennis team for three years.  I played the number one spot my sophomore year and got outplayed, going 3-10.  I also played #1 doubles my sophomore year.  I played #1 in both singles and doubles my junior and senior years as well.  Senior year my record was 10-3 and I made it to the finals of the District tournament, losing to a guy who eventually went pro.  And that was my last junior match, as I threw in the towel and quit tennis cold turkey.

The day after graduation from high school, I drove to Jacksonville, Florida to work in a factory for the summer to earn money for college.  I went to the University of Florida in the fall, met a girl and joined a fraternity.  And I played tennis here and there over the next decade and a half, but nothing serious.

Until one day I rode my bike to the tennis club nearby.  I asked if I could play tennis.  The lady there asked if I knew how to play.  I said yes, and was invited to a round robin later that day.  So I drove back to the club, played, and never lost a match.  And the owner of the club asked me to join.

And I climbed the tennis ladder at the club, won the club championship, and then started playing leagues, which I won, and then tournaments.  I then flew to New York to play the National Indoor Championships for Men's 35 and over.  I got waxed pretty good, but I played the number 2 player in the country and put up a good fight.  I then played the National Clay Court Championships in Daytona Beach, managed to win a few rounds and got a national ranking.  And by that time I had quit my computer job and was teaching tennis and doing websites for people and having a great time.  And that's when I got married and went back to my computer job.

So lately I've been thinking to pick up the racquet again.  Just for fun.  Just to hit the ball back and forth to see how many times I can keep the ball in play.  Just like old times.


SQL Server Replication vs Service Broker

I'm currently working on a task to compare and contrast SQL Server Replication versus Service Broker.  There's lots of good information out there on the web, and I've been accumulating it in order to present a final document.

SQL Server Replication

Replication is a good option for moving large (raw) data sets and SQL Objects (Tables, Functions and Stored Procedures) from one instance of SQL Server to another.  This can be achieved over Local and Wide Area Networks, wireless, VPN and the internet.

Publisher - hosts the source database.
Distributor - is the "pass-through" database.
Subscriber - is the final destination of the data.

There are 4 basic types of Replication:

Snapshot Replication - selected data is captured from the Publisher SQL Server and sent to the Subscriber server.  This is a good scenario if your database doesn't change much or is smaller in size.  It's also the easiest type to set up and maintain.

Transactional Replication - this is for server-to-server scenarios that require high throughput.  Transactional consistency is guaranteed within a publication, as the data is transferred in the order it was saved or updated.  Data is read from the Transaction Log files and each change to the data is captured, including Inserts, Updates and Deletes.  DML and DDL schema changes are captured and replicated.  There can be multiple Subscribers to a single Publisher.

Merge Replication - the process of distributing data from Publisher to Subscriber, typically used for mobile applications or distributed server applications like a POS system, or for combining data from multiple sites.  It's the most difficult type to set up and maintain.

Peer-to-Peer Replication - built on the foundation of Transactional Replication, basically used for 'real time' synchronization.  Enterprise Edition allows bidirectional replication.

To create SQL Server Replication, you can use the built-in wizards to assist with the installation.  You define which database will become the Publishing Database, Distribution Database and Subscription Database(s).  Optimally, they should all reside on separate servers; however, this is not mandatory.  Deployment of Replication is a combination of SQL Server Management Studio (SSMS) wizards and scripts for automation purposes.
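For the scripted route, here's a rough T-SQL sketch of setting up a transactional publication and subscription.  It assumes the Distributor is already configured, and the database, publication, table and server names are all made up for illustration:

```sql
-- Enable the (hypothetical) database for publishing
USE SalesDB;
GO
EXEC sp_replicationdboption
    @dbname  = N'SalesDB',
    @optname = N'publish',
    @value   = N'true';
GO
-- Create a transactional publication
EXEC sp_addpublication
    @publication = N'SalesPub',
    @status      = N'active';
GO
-- Add a table as an article of the publication
EXEC sp_addarticle
    @publication   = N'SalesPub',
    @article       = N'Orders',
    @source_object = N'Orders';
GO
-- Add a push subscription pointing at the Subscriber server
EXEC sp_addsubscription
    @publication    = N'SalesPub',
    @subscriber     = N'SUBSCRIBERSERVER',
    @destination_db = N'SalesDBCopy';
GO
```

The wizards can generate equivalent scripts for you, which is handy for saving and replaying deployments across environments.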

SQL Agents
There are SQL Agent jobs that handle certain tasks, including the Snapshot Agent, Log Reader Agent and Distribution Agent.  You can set your Agents to run all the time or at specific intervals.  And you will need to create users to run the SQL Agent jobs.


  • Multiple Subscriber databases can point to a single Publishing database.
  • Transactional consistency.
  • Potentially less coding required to set up and install Replication, thanks to the wizards and T-SQL scripts.
  • Replication has built-in solutions for Synchronization, Monitoring and Recovery.
  • Ability to adjust the frequency of Replication by adjusting when and how often the SQL Agents run.
  • Increased administration time depending on the number of objects moved and subscribers.
  • There's no automatic failover for Disaster Recovery.
  • There is no easy way to Transform (ETL) the data during transit.
  • You can apply filters to limit which data gets copied, although there could be a performance hit on high-traffic servers.

Service Broker

Service Broker is suitable when you are not just trying to maintain identical copies of the data, but rather each application maintains its own data and the apps engage in an exchange of logical messages over a business conversation.  When and how these messages are sent or received is completely driven by the app, rather than the database.  Service Broker solves receiver management issues by activating queue readers as required when messages arrive.

To install and configure Service Broker, one must have a good understanding of the hierarchical objects which comprise the architecture.  This can become quite complex as there are no built in wizards or front end utilities.
  • All messages in a dialog are received in the order sent, with no gaps in the message sequence, even if the server goes down in the middle of a transaction.
  • Database integration enhances application performance and simplifies administration.
  • Message ordering and coordination for simplified application development.
  • Loose application coupling provides workload flexibility.
  • Related message locking allows more than one instance of an application to process messages from the same queue without explicit synchronization.
  • Automatic activation allows applications to scale with the message volume.
  • Asynchronous Queues enable the flexible scheduling of work, which can translate to big improvements in both performance and scalability.
  • Service Broker allows you to transform and manipulate the data as it's moved between databases.
  • Service Broker has the ability to initiate transfer of data on Triggers.
  • Data is encrypted between servers to protect the connection.
  • You can use certificates to secure the machine endpoints with SSL encryption.
  • Servers do not have to reside within same network.
  • Service Broker allows custom Audit tables to track the flow of data.
  • Service Broker allows immediate data transfers with asynchronous transactions.
  • Service Broker allows each database side to own its own schema so they do not have to match.
  • Service Broker allows multiple servers talking to a single back end service as well as forwarding.
  • Service Broker has efficient transactions by leveraging SQL Server's internal transactions.
  • Service Broker Queues are database structures, so they are backed up via standard SQL Server backup procedures.  They gain all the benefits of incremental and differential backup and restore processes provided by SQL Server.
  • Service Broker provides a way to send messages when the receiving queue is not available.
  • Service Broker allows for Prioritization of Queues.
  • Service Broker has the ability to Multicast messages (SQL Server 2012) which enables a single initiator service to send messages to multiple target services.
  • Service Broker requires you to do a fair amount of coding.
  • Although Service Broker can be much faster, it is not tuned for large data transfers.
  • Service Broker could experience performance issues, if the data transfer rate is high.
  • Service Broker has no built in mechanism for monitoring and debugging.
  • Service Broker does require additional configuration in order to work with AlwaysOn Availability Groups.
  • Adding more data centers or servers requires lots of new code on both servers, preventing easy scale-out.
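To give a feel for the hierarchical objects involved, here's a minimal single-database sketch; all of the names are illustrative, and the database is assumed to have Service Broker enabled (ENABLE_BROKER):

```sql
USE DemoDB;
GO
-- Define the message vocabulary and the conversation contract
CREATE MESSAGE TYPE [//Demo/RequestMsg] VALIDATION = WELL_FORMED_XML;
CREATE CONTRACT [//Demo/Contract] ([//Demo/RequestMsg] SENT BY INITIATOR);
-- Queues hold the messages; services expose the queues for conversations
CREATE QUEUE InitiatorQueue;
CREATE QUEUE TargetQueue;
CREATE SERVICE [//Demo/InitiatorService] ON QUEUE InitiatorQueue;
CREATE SERVICE [//Demo/TargetService] ON QUEUE TargetQueue ([//Demo/Contract]);
GO
-- Open a dialog and send a message
DECLARE @h UNIQUEIDENTIFIER;
BEGIN DIALOG CONVERSATION @h
    FROM SERVICE [//Demo/InitiatorService]
    TO SERVICE N'//Demo/TargetService'
    ON CONTRACT [//Demo/Contract]
    WITH ENCRYPTION = OFF;
SEND ON CONVERSATION @h
    MESSAGE TYPE [//Demo/RequestMsg] (N'<request>hello</request>');
GO
-- Pick the message up on the target side
RECEIVE TOP (1) conversation_handle, message_type_name,
    CAST(message_body AS XML) AS body
FROM TargetQueue;
```

In a real deployment the initiator and target services would typically live in different databases or on different servers, with endpoints and routes configured between them, which is where the complexity mentioned above comes in.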

So choosing the correct tool for transferring data between SQL Server instances depends on your unique situation, business and configuration rules, and coding skills.  However, by understanding the benefits and downsides of each, you'll be better equipped to make the right choice.


Fertility Statue

We like to shop at garage sales.  And today I was looking through some stuff like usual when I noticed a peculiar-looking statue.  It had a large round head with intricate carvings, and it had breasts, so it was definitely a female statue.

I asked the guy if he knew anything about it.  He said that he purchased it 25 years ago in Long Island from an estate sale of a wealthy person who had passed away.  Said the house was full of stuff like that as the guy must have been a hunter and collector.

Said he paid $25 for the item 25 years ago, but was offering it for $20.  I bought some other stuff but didn't purchase the statue.  After we got to the car I looked in my wallet to see how much cash was left and said to my wife that I was going back to buy the item.

So I chatted with the guy some more.  He said the piece was supposedly a fertility god figure used by shamans in Africa.  And he pointed out that you could see the worn-down sides, so it was definitely used in ceremonies.  He said the original collector must have had it for a long time, so who knows how old it actually is.

I don't usually buy much stuff for myself.  But as I learned a few weeks ago, after passing up on the meteorite at another garage sale, which that seller said could have been worth $35,000, I no longer pass up a unique item just because of the price.  Granted, $20 is not a huge sum, but in garage sale money that's a lot, as most items are just $1 or $2.

So I'm going to research the wooden carved fertility god statue which I purchased today for $20 and see if it's got any history.

I still like garage sales for precisely this reason: you really never know what you're going to find.


It sort of looks like this, just without the arms and lighter in color; they're called Ashanti Akuaba Dolls:


Create SQL Server Proxy Account to Run SSIS Package

When scheduling SSIS Packages to run in SQL Server 2012 Integration Services, you may need to create a Proxy account to mimic the rights of a specific user, so the package runs successfully.

I can't remember where exactly I found this code, but here it is:

--Script #1 - Creating a credential to be used by the proxy
USE master
GO
--Drop the credential if it already exists
IF EXISTS (SELECT 1 FROM sys.credentials WHERE name = N'SSISProxyCredentials')
DROP CREDENTIAL [SSISProxyCredentials]
GO
--Create the credential (the identity and secret below are placeholders;
--use the Windows account the package should run as)
CREATE CREDENTIAL [SSISProxyCredentials]
WITH IDENTITY = N'Domain\UserName',
SECRET = N'Password'
GO

--Script #2 - Creating a proxy account
USE msdb
GO
--Drop the proxy if it already exists
IF EXISTS (SELECT 1 FROM msdb.dbo.sysproxies WHERE name = N'SSISProxy')
EXEC dbo.sp_delete_proxy @proxy_name = N'SSISProxy'
GO
--Create a proxy and use the credential created above
EXEC msdb.dbo.sp_add_proxy
@proxy_name = N'SSISProxy',
@credential_name = N'SSISProxyCredentials',
@enabled = 1
GO
--To enable or disable the proxy you can use this command
EXEC msdb.dbo.sp_update_proxy
@proxy_name = N'SSISProxy',
@enabled = 1  --@enabled = 0 to disable
GO

--Script #3 - Granting the proxy account to SQL Server Agent subsystems
USE msdb
GO
--You can view all the subsystems of SQL Server Agent with this command
--Note that the SSIS subsystem id is 11
EXEC sp_enum_sqlagent_subsystems
GO
--Grant the created proxy to the SSIS subsystem
--You can grant the proxy to as many subsystems as needed
EXEC msdb.dbo.sp_grant_proxy_to_subsystem
@proxy_name = N'SSISProxy',
@subsystem_id = 11  --subsystem 11 is SSIS
GO
--View all the proxies granted to all the subsystems
EXEC dbo.sp_enum_proxy_for_subsystem
GO

--Script #4 - Granting proxy access to security principals
USE msdb
GO
--Grant proxy account access to security principals, which can be
--a login name, a fixed server role or an msdb role
--Note: members of the sysadmin server role are allowed to use any proxy
EXEC msdb.dbo.sp_grant_login_to_proxy
@proxy_name = N'SSISProxy',
@login_name = N'Domain\UserName'  --placeholder: the login that will own the jobs
GO
--View logins granted access to proxies
EXEC dbo.sp_enum_login_for_proxy
GO


Get Up to Speed Fast and Deliver Results

Programmers no longer work in one single language.  We cross languages and technologies every day.
You go to work one day, boss says you need to get up to speed on a new technology, expert level, as soon as possible.

First thing you do is go to the internet.  Scan every article you can find, blogs, tweets, tutorials, sample code, sample projects, whatever it takes.

Because you have the core foundation of programming and technology, learning a new language is part of the job.

My current job is a good example.  When I got hired, the boss said to learn Service Broker.  I'd heard of it, but had no practical experience.  So I read as much as I could find on the web.  Built sample applications.  On multiple Virtual Machines.  And did a proof of concept.

Next thing you know, a new project started up with a need for someone who knows Service Broker.  Along with SQL Server, queries, views, stored procedures.  And automation.  Many programmers already knew SQL, but nobody else knew Service Broker, so I got tasked with SQL Server and Service Broker.

And now another project needs a solid Service Broker programmer for 6 months.  So I can leverage my experience onto the new project.

My boss didn't have to hire a new programmer; he took an existing one and gave me the opportunity to learn.  I self-taught and got up to speed fast.  In a technology that not everyone knows.  Which has opened doors for more work.

And I inherited a Power Query project mid-flight with no documentation and a 20 minute knowledge transfer.  Although I had to spend some time figuring it out, I got up to speed with minimal assistance and moved the project forward.

So now I have to get up to speed on PowerShell, expert level, fast.  And C#.

I'd say it's good to have a foundation technology, one that you've done for years and that people know you for.

But those skills can be a commodity where you're competing with other programmers for the same work.

You don't have to be the very best at one specific technology.

The trick is to be able to learn new stuff fast and hit the ground running and show results.  That's what makes you stand out amongst the herd.

Technology stops for no one.  And if you stop learning, the demand for limited skill sets will soon dwindle over time.

And so it goes~!


Power Query OData Calls to TFS

The past few weeks I've been working with Microsoft Excel.

My project consists of creating Power Query calls to an OData service from Team Foundation Server.  That entails writing custom M language queries to connect to the server, specify which table to pull data from, transform the data, expand columns and store the results either in Power Pivot or an Excel tab.

Since we are pulling from a hierarchy, first we grab the Features, then the User Stories and Bugs, and finally the Tasks.  Parent, Child, Grandchild relationships.

I first tried to bring in each of the tables to Power Pivot and apply the joins there.  However for one reason or another, the joins are not behaving properly.

Next, I tried to call a recursive function to go out and grab all the linked Ids along the hierarchy.  In doing so, it only keeps the records that have Child relationships, so by the time the query finishes, the data set is stripped down and many records are missing, namely those without matching records in all three tables.

Currently I'm trying another approach: pull each of the three tables separately, then apply a recursive function to Left Join the tables.

For the latter two attempts, I've leveraged code found on Chris Webb's BI blog, much appreciated.


To write the code, you can use Notepad++, which has a plug-in for the M language; you can download it here:


The goal is to have the Model built so that we can create Pivot Table reports for the client.  They are currently using the TFS Plug-In and assembling the data manually.  Our approach should streamline the process.

I was surprised how versatile Power Query is.  One of the hardest parts of Business Intelligence is the Extract, Transform and Load.  This tool allows people to bring in data from just about anywhere, massage, tweak and transform it and create a model for consumption in the Power BI Suite.  After the initial learning curve, I find it very flexible and easy to use.  It's similar to, yet different from, traditional SSIS, and it's built into Excel, so business users have access to it.

Power Query is a cool product from the Microsoft Stack, built into Excel as a plug in.  I recommend it.


Agile Methodology Works

Agile methodology is a huge change from Waterfall methodology.

It used to be, in IT, that the programmers received specifications for an application, feature or enhancement.  And then went back to their caves and worked on it.  Occasionally the Project Manager would ask how things were going, maybe update the client with a demo and eventually the project was finished, maybe.

The downside to this scenario is accountability and visibility.  The client was typically in the dark for the duration of the project and didn't get to see a working product for months or years.  Many times the project went over budget, over time or did not conform to the original requirements.

Thus, IT was given a bad rap, justly so in many instances.

With Agile Methodology, you have Sprints, typically two weeks.  The Sprints are based on User Stories, which are based on the Statement of Work, which dictates what will happen on the project, how long it will take and the delivery dates.

So within the two week Sprint, team members are given specific duties to complete.  Each day, the team gathers together, sometimes via phone calls, where the Scrum Master goes through the list of people and each team member provides an update with three specific pieces of information:

1. What did you do yesterday?
2. What do you plan to do today?
3. Do you have any blockers or impediments?

You don't go into details, just quick hit bullet point items.

After each team member gives their update the meeting is done and people work on their assignments.

The goal of the Sprint is to have a working deliverable at the end, to present to the client.  Any items not completed get rolled into the next sprint.  And the work assignments are given and the process starts over again.

Typically, the Delivery Lead will keep all the information stored in an application, we use Team Foundation Server through Visual Studio or web.  All the Sprints are managed there, with User Stories, tasks, assigned resources and hours.

The main advantage of this process is accountability.  All team members pitch in.  The work is divided evenly.  You can better manage the resources and hours worked.  You can predict with some level of confidence how much work will get done in a specific time frame.  And most of all, the customer is in the loop on what's going on: visibility.

So the Sprints continue, like a steady drum, day after day, week after week, until the project is complete.  The trick is to stay on track, meet your deliverables and have a working set of features every two weeks.

Meeting goals is part of the job of programming.  And with Agile, you have to meet your goals consistently.  This adds much pressure to the job as there's no room for slacking.  And if you run into a problem, you'd better find a solution sooner than later, to keep on track.

You're also part of a team.  Which means there's lots of collaboration.  As your feature may be dependent on another programmer's feature.  As everything gets intertwined.  And that means the Team Leader or Delivery Manager has to keep a close eye on how things are progressing daily, to better manage the project, ensure no delays and keep every resource occupied.

Which means a typical Scrum project may have more overhead than a traditional Waterfall project because there are additional people on the team.  So a simple project with 100 hours of programming will in fact cost more, because you have to factor in the additional staff needed to manage the project.

There are many advantages to Scrum Methodology.  And it's amazing to see it work in action.  It's almost an art to run a successful Agile project.

More and more companies and departments are moving towards this framework for good reason.  Because managing an IT project 20 years ago was anybody's guess.  Now we have repeatable patterns to move a project from initial design to build to quality assurance to delivery.  With accountability and visibility.

So IT can now estimate projects better, meet their deliverables better, keep the client informed and manage issues as they arise, as well as interchange tasks and resources as needed.



Projecting Human Traits to Robots

Will our future robots experience love, loss, fear, pain, happiness, sadness, envy, greed, lust, etc.?

Is that one of the requirements for the Turing Test?

I can just hear, robot A talking to robot B, "So, did you hear what happened to robot C, she just bought a new CPU, twice as fast as ours.  I never liked her anyway."

Envy.  A human trait.  Doesn't seem to fit a robot.

How about greed?  "Did you read the latest poll?  Robot D has accumulated the most massive amount of money."  What use would a robot have for accumulating wealth?

When we think of robots inheriting the earth and enslaving mankind, why would they do such a thing?  Why do we assume they would be evil and lack compassion?

Are we projecting our human quality of conquering people, places and things onto the potential robot like beings?  Out of fear?  Threatened by our loss of Alpha status?

I think one of the difficulties about the potential robot future is that it directly forces humans to look at themselves under a microscope.

Humans are responsible for tapping resources from the earth without replacement, starting countless wars throughout the ages, including genocide and forced enslavement.  We worship the dollar bill over compassion and equality for all.

You can hear them say, "No robot species is going to take all this away from us."

And so there's a segment of the population who would prefer to keep the status quo and see any technological advancement as a threat, projecting our human qualities onto a robotic being with the assumption that it will have all these bad characteristics too.

But I imagine a super intelligent being understanding the ludicrousness of our ways and rising above our animalistic nature.  I think their advanced reasoning abilities would lead them to conclude that killing others, raping the land, greed, envy and hatred are bad qualities, and they would refrain from such activities.

Would robots be spiritual beings?  Would they believe in a higher power such as God and the Universe?  Would they have an algorithm written for such thought?  Or would they get to a point where logic did not compute and the only possible explanation is the Divine Light of a more Supreme Being?

You hear about higher levels of consciousness of spiritual gurus who live in a world of less conflict.  Would robots have this ability?  Would a spiritual robot believe in war and killing and taking others land and creating genocide?

Why are we so insistent that these robots will be killing machines?

Possibly because we fear who controls the robots and how they are designed or for what purpose.  If they are to patrol the earth, fight as soldiers in war and carry out the deeds of evil men, that is reason for concern.

I'm still a believer that the more intelligent a being is, the closer they are spiritually to the higher purpose and less likely to cause harm to other beings.  Who knows, robots could create a Utopia on earth where everyone gets along, there's food and shelter for all mankind, we care for our elders and war becomes obsolete.

When dealing with the unknown, we project our fears and past history onto new things.  So it makes sense that we are fearful of a new advanced species and assume we'll end up in a Dystopian world.  But the jury is still out; our new robot friends could pave the way to a Utopian society for all to enjoy.

50/50 chance either way.  Time will tell.