To deliver speed and agility, “Moneyball” your technology and change teams

 

Technology and change functions looking for speed and agility can still learn much from the lessons about data and measurement engagingly described in Michael Lewis’ book Moneyball: The Art of Winning an Unfair Game.

If you haven’t seen the movie or read the book: in short, Billy Beane, general manager of the Oakland A’s baseball team, along with his coaching and administrative staff, took a radically different and more scientific approach to running a professional baseball club.  The results they produced debunked the widely held view that poorer clubs could never compete with rich clubs like the New York Yankees, whose player payroll budget was roughly three times that of the Oakland A’s.

In the “Afterword” chapter of the most recent edition, Michael Lewis discusses how, following the book’s initial release in 2003, it wasn’t the other baseball clubs that most enthusiastically embraced the book and its lessons. In fact, they stubbornly, and at times aggressively, resisted opening their minds and clubs to the new approaches.  Rather, it was other sports like hockey, basketball and football, along with Fortune 500 companies, that beat down the front door of the Oakland A’s wanting to learn more and take actionable insights back to their own organisations.

In my field, the world of transforming technology and change teams, I see so many powerful lessons that can be taken from the Moneyball story.  The big three that jump out for me are:

  1. It is vital to focus on the right measures of performance.  In Moneyball, Michael Lewis dedicates many pages to explaining why broadly used measures of batting performance, such as batting average and runs batted in, were sub-optimal and not necessarily aligned to team success.  Basically, the problem is that these measures place zero value on “walks”, yet to the team a “walk” has equal value to a first-base “hit”.  This is a clear case of a measure of individual performance encouraging the batter to do something different from what’s in the best interests of the team.  After realising this, Billy Beane’s team was able to recalculate the true worth of a batter, allowing the Oakland A’s to acquire players cheaply in the draft and trade markets because other clubs were undervaluing them.  Key insight: Just because a measure is broadly used doesn’t mean it’s the best or even a good measure.

  2. New data and measures may be needed to inform decisions and take actions that drive results.  In Moneyball we see how new measures of batting, fielding and pitching performance were used to better inform both big, infrequent decisions and small, frequent ones. An example of a big decision was drafting or trading a player, informed by new measures of player worth.  An example of a smaller decision was an individual player deciding when to swing at a pitch and when not to: players were asked to think about on-base percentage rather than batting average as their individual measure of success. Key insight: New metrics and insights can help to change decision making at all levels to lift performance.

  3. Be prepared to experiment, and accept that it will take time.  Some areas of measurement will throw up new actionable insights that drive value; others won’t.  For example, the Oakland A’s wanted to better understand the value of players in the twilight of their careers.  They experimented with ways to measure skills degradation with age so as to help find players whom other clubs were undervaluing simply because they were older. They had no idea whether this experiment would lead to anything valuable, but they were prepared to give it a go in order to learn whether the market was efficiently pricing ageing talent.  Key insight: As in science, not every experiment in business will produce the expected results, so be prepared to experiment and learn over time.
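To make the first two lessons concrete, here is a minimal sketch in Python, with made-up numbers, of how a walk-heavy batter can look weak on batting average yet strong on on-base percentage:

```python
def batting_average(hits, at_bats):
    """Hits divided by at-bats. Walks are not counted as at-bats,
    so a walk contributes nothing to this measure."""
    return hits / at_bats


def on_base_percentage(hits, walks, hit_by_pitch, at_bats, sacrifice_flies):
    """Times on base divided by plate appearances. A walk counts
    exactly like a single here."""
    return (hits + walks + hit_by_pitch) / (
        at_bats + walks + hit_by_pitch + sacrifice_flies
    )


# Two hypothetical batters, each with 100 at-bats:
free_swinger_ba = batting_average(30, 100)                  # 0.300
free_swinger_obp = on_base_percentage(30, 5, 0, 100, 0)     # ~0.333

patient_hitter_ba = batting_average(27, 100)                # 0.270
patient_hitter_obp = on_base_percentage(27, 25, 0, 100, 0)  # 0.416
```

On batting average the patient hitter looks like the worse player; on on-base percentage, the measure that actually correlates with scoring runs, he is clearly the more valuable one. That gap between the popular measure and the useful one is exactly the mispricing the A’s exploited.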

These three lessons can be applied to the world of managing technology and change functions to improve efficiency, reduce risk, create better customer experiences and develop competitive advantage.

It is vital to focus on the right measures of performance

One good example of focusing on the right measures is the productivity of an organisation’s teams delivering change. The more an organisation invests in change, the more vital it is to owners that change functions are efficient. Despite this, it’s very common to find organisations without any measure of change efficiency at all. Instead, these organisations tend to rely on on-time, on-budget delivery of projects as a rough proxy for change efficiency.  As I’ve written previously, measuring on-time, on-budget performance for projects at best delivers no insight into change efficiency and at worst can lead to inefficiency by encouraging the practice of padding cost and schedule estimates.  Throughput and cost per story point are measures of change efficiency that I’m seeing adopted more and more by long-running, persistent change teams.
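As an illustrative sketch (the figures are hypothetical, not drawn from any real team), both measures can be computed from data most delivery teams already track:

```python
def throughput(story_points_delivered, sprints):
    """Average story points delivered per sprint."""
    return story_points_delivered / sprints


def cost_per_story_point(total_team_cost, story_points_delivered):
    """Fully loaded team cost divided by the points actually delivered."""
    return total_team_cost / story_points_delivered


# A hypothetical team: $600,000 of fully loaded cost over 6 sprints,
# delivering 300 story points in total.
print(throughput(300, 6))                  # 50.0 points per sprint
print(cost_per_story_point(600_000, 300))  # 2000.0 dollars per point
```

Tracked over time, a falling cost per story point is a far more direct signal of improving change efficiency than on-time, on-budget reporting, which padded estimates can game.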

New data and measures may be needed to make decisions and take actions that drive results

When I was CIO at NAB we drove a dramatic reduction in outages impacting the business under the strategic priority of being “Always On” for customers and colleagues.  These results have been published in the investor and analyst briefing packs over the last few years.  Part of this change agenda included providing team leaders across the technology organisation with new management information about incident levels, repeat incidents, problem backlogs and other reliability and stability data.  The result was an 80% reduction in customer and colleague impact from IT outages over a three-year period.

Be prepared to experiment as it can take time to move to a more scientific way of making decisions

We all know that it is not just baseball executives who prefer their own gut feel or experience over hard data when making decisions. It could be a lifetime of being fed fishy data that’s contributing to this mindset.  I’ve found it important to have empathy for this reality when introducing new technology and change measures. Allow time for a new measurement system to be adopted to the point where the data is actually being used to change decisions.  The more people see results, the more comfortable they will become with the new measurement approach. But don’t assume everyone will automatically see new measures, and their value, the same way you do.

Also be prepared to experiment.  In the NAB example I mentioned above, we moved away from availability measures like four nines and five nines because they didn’t motivate the right outcomes. As an alternative we experimented with a range of other measures, including problem ageing, incident-to-problem linkages, repeat incidents, mean time to restore and change failure rates.  Some, like problem ageing and repeat incidents, were very helpful, but others were hard to get up and running because the cost of the measurement and tracking system would have been greater than the likely benefit from people using the new measure.  The key outcome of this whole process of experimentation was building a scientific, data-driven culture that would continuously innovate and evolve.
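A minimal sketch of how a few of these reliability measures might be computed. The definitions and sample data here are illustrative assumptions, not NAB’s actual measurement system:

```python
from collections import Counter
from statistics import mean


def mean_time_to_restore(restore_minutes):
    """Average minutes from outage start to service restoration."""
    return mean(restore_minutes)


def repeat_incident_rate(root_causes):
    """Fraction of incidents that recur from an already-seen root cause."""
    counts = Counter(root_causes)
    repeats = sum(n - 1 for n in counts.values() if n > 1)
    return repeats / len(root_causes)


def change_failure_rate(failed_changes, total_changes):
    """Fraction of production changes that led to an incident."""
    return failed_changes / total_changes


# A hypothetical month of incident data:
print(mean_time_to_restore([30, 60, 90]))                    # 60 minutes
print(repeat_incident_rate(["db", "db", "network", "app"]))  # 0.25
print(change_failure_rate(3, 60))                            # 0.05
```

The value of measures like these is less the individual numbers than the conversation they enable: repeat incidents and ageing problems point directly at work that reduces customer impact, which availability percentages alone never did.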


No matter how far down the track you are in constructing the scientific measurement system you need for your technology and change functions, further improvement is possible.  Implementing and using measures differently at all levels in your technology and change organisation, from the Board through to the front-line technology staff, will lead to better decision making that drives results.

If you need help with the governance of your technology performance and risk, feel free to reach out to me at david.boyle@cap2its.com


About the author: David Boyle’s 30-year IT career spans both the buy side and sell side of technology services.  He has worked with Accenture, EY and Commonwealth Bank of Australia, and until recently was the Group CIO at NAB.  David is now the Managing Director of CAP2ITS, a technology advisory firm focusing on technology strategy, performance and risk.