RESOURCES

Why Data Can Be Both Invaluable and Detrimental to Change Initiative Ideas

About this Publication

I have learned the hard way that if an organization lacks an Agile mindset and does not foster an environment of safety, trust, and experimentation, collecting team metrics and insights can actually do more harm than good. However, once the prerequisite of safety is established, data is no longer your enemy as a change agent but your ally, and a powerful tool in your Agile coaching toolbox.

1.      INTRODUCTION

Another movement sweeping through the corporate world with the same momentum as Agile is Business Intelligence and Big Data. More and more organizations are turning to data analytics to identify patterns that help them improve. It should therefore come as no surprise that organizations undergoing Agile transformations seek data to measure the performance of the investment and to identify areas that can be improved. The sad truth, however, is that if the organization is not mature enough to use this data in the right way, it can do more harm than good.

2.      Background

Having grown up as a business analyst, I have always gravitated toward data because I enjoy identifying patterns to tell a story. I am also a passionate Agilist, playing the role of change agent in Agile transformations. My natural inclination, therefore, has always been to couple the two to drive change.

I like to think of collecting team data as just another feedback loop in an Agile environment. After all, the people doing the work know best! Why not, then, collect data points from teams to identify areas for continuous improvement throughout the organization? I have had some success with this approach, but unfortunately it has not always led to the desired outcome.

Like everything else that can be used for good, data can also be leveraged for evil. Organizations that have not acquired the Agile mindset and embraced the Agile values and principles are more likely to use the data to drive a command-and-control leadership style. That approach does not foster an environment of safety, trust, and transparency, which increases the risk that the data will not reflect reality. However, if the organization does embrace the values and principles, and people trust that they can be honest and transparent, the data is more likely to be accurate. Leadership can then trust the data enough to act on it.

2.1      The Failure

I consider myself very lucky to have spent the first ten years of my career in a small software company that embraced Agile values and principles even before Agile was mainstream. The Agile mindset was instilled in me from the beginning. The mindset was easy to adopt since the Agile principles and values align directly with my personal beliefs. Agile has directly impacted my career, outlooks, and personal growth. Therefore, when it came time for a change, it felt right to pursue an Agile coaching career in order to help others share the same experience.

My first engagement was with a rather traditional, monolithic bank. The organization wanted to be Agile in name only, but I was too early in my journey to recognize it. I became part of the Agile Coach Center of Excellence, a community responsible for assisting 55 teams, a tall order given there were only three of us! I was eager to show my knowledge and skills, so I jumped on an opportunity to do some statistical analysis on teams, given my affinity for data. The assignment was a perfect match for my skills: all team backlogs resided in individual Excel sheets! Needless to say, the organization lacked a holistic view of each initiative, since many teams were contributing to it.

I saw it as an opportunity to add value immediately. I created a program to consolidate all the Excel sheets into one. After a few Excel macros, pivot tables, and charts, leadership had the view they had been seeking. The task quickly morphed away from a one-time exercise and became a full-time job! The spreadsheet became known as the “Agile Dashboard,” a document I was asked not only to constantly enhance, but also to administer.
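For readers curious about the mechanics, the sketch below shows the kind of consolidation the program performed, written in Python rather than Excel macros. The folder name, workbook layout, and column names (Team, Initiative, Story Points, Status) are hypothetical stand-ins, not the bank’s actual structure.

```python
# A minimal sketch of the consolidation step, assuming one backlog workbook
# per team in a shared folder and hypothetical column names.
from pathlib import Path

import pandas as pd

BACKLOG_DIR = Path("team_backlogs")  # hypothetical location of the team workbooks

frames = []
for workbook in BACKLOG_DIR.glob("*.xlsx"):
    backlog = pd.read_excel(workbook)      # one team's backlog per workbook
    backlog["Team"] = workbook.stem        # tag every row with the team name
    frames.append(backlog)

dashboard = pd.concat(frames, ignore_index=True)

# Roll the consolidated backlog up by initiative and status so leadership
# gets the holistic, cross-team view they were missing.
summary = (dashboard
           .groupby(["Initiative", "Status"])["Story Points"]
           .sum()
           .unstack(fill_value=0))

summary.to_excel("agile_dashboard.xlsx")
```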

At first, I accepted the role. I saw it as an opportunity to coach leadership on how to appropriately use the data to drive positive change. I encouraged them to leverage the data to identify behavioral patterns, organizational constraints, coaching needs, and continuous improvement opportunities. Unfortunately, leadership was not receptive to the coaching. In fact, they did the exact opposite! My data was used to punish teams, identify poor performers, and hold Agile PMs accountable for something they could not control.

I found myself in a role I had not signed up for. I left the office every day with a sense of having betrayed the Agile community and my friends at the bank. This was not the Agile environment I had come to love, one that fostered safety, trust, and autonomy. Here I was, a so-called coach, fueling anti-patterns such as abuse and reprimand. When I left the engagement, I vowed that I would never put myself in that position again and, frankly, wanted to steer clear of data altogether.

2.2      What I Learned from the Failure

The experience made me a better coach and person. Complexity theory tells us that organizations are complex adaptive systems because they are dynamic networks of interactions that collectively self-organize when change is introduced. As a young coach, I was naïve to this fact and did not appreciate the magnitude of difficulty involved in organizational change. The word “transformation” is misleading when applied to Agile adoption initiatives because it implies there is an end state. Instead, these initiatives should be seen as an evolution, or a journey, characterized by relentless inspection and adaptation to improve the current state.

Bob Hartman coined the term “doing” Agile vs. “being” Agile. In the earlier stages of an Agile adoption initiative, organizations that are transitioning from a traditional methodology are simply “doing” Agile. This adoption stage is characterized by people starting to learn Agile practices and the “mechanics” behind them. In no way, shape, or form does this translate to an organization “being” Agile.

At this stage, the Agile mindset has not yet been acquired by the organization and its leaders. This should come as no surprise! As part of the change, leaders and individual contributors who were very successful in the traditional organization are being asked to change their behaviors and thinking. That change does not happen overnight. Unfortunately, for many organizations this change in mindset will never occur. Agile frameworks typically expose an organization’s dysfunction, but whether or not leadership wants to resolve it is a whole different story. In my experience, organizations often change the framework to accommodate the dysfunction rather than resolve the dysfunction and become an Agile organization.

Organizations will still gain some benefits when “doing” Agile, such as increased visibility, productivity, and the opportunity to change direction. However, the magnitude of these benefits does not compare to those reaped by organizations that acquire the Agile mindset and “become” Agile. Ahmed Sidky defines companies that “become” Agile as those that embody the values and principles and work in small increments to deliver value continuously in order to receive feedback. Thanks to the digital revolution, power has shifted from the supplier to the customer. Customers now have many options right at their fingertips. The organizations that deliver value early and often, and that incorporate customer feedback into their products, have a better chance of achieving better business outcomes.

Figure 1. Agile Adoption Benefit Timeline


Organizations early in their journey, or ones that decide never to “be” Agile, will still have leaders who use a command-and-control approach. Command-and-control leaders use data and metrics to drive behaviors and measure people’s performance. People working in a command-and-control environment typically do not trust those around them, and therefore do not feel safe. When trust and safety are absent, people are fearful of transparency and honesty. As a result, the data is typically fabricated, gamed, and not reflective of reality. Sadly, this leads to strategic decisions being made on false data, which compounds the dysfunction and ultimately prevents the organization from reaching its goal of happy employees and better business outcomes.

For example, I once asked a team why they were committing to more story points than their average velocity. It turned out that they were fearful of reprimand: if they used their velocity to forecast, the project would appear off track and their manager would look bad. This is a perfect example of how a lack of trust and safety within an environment leads to detrimental results and is a losing proposition for everyone.

If the team had felt safe enough to be honest about their capacity and forecast, the product owner would have had the opportunity to manage stakeholder expectations and discuss reprioritization. Since they did not, the customer was not aware of the risk until the last minute, leaving no time to manage expectations and discuss trade-offs. Ultimately, the goal of satisfying the customer was not achieved because of the lack of safety and trust within the environment.

As coaches and change agents, we need to proceed with caution when utilizing data early in an Agile adoption initiative, as it could lead to more harm than good.

Figure 2. Data Effectiveness for Positive Change – Detrimental

2.3      The Success

As luck would have it, a few engagements down the road, I stumbled across an organization that had a mature Agile mindset and the goal to “be” Agile. There was never a doubt that Agile was a good fit. The culture was built on alignment, autonomy, trust, and safety. They wanted to take the next step forward in their Agile journey so they hired a few Agile coaches to help them get there.

As an Agile Coaching Center of Excellence tasked with improving the flow of work within each value stream across the organization, we saw data as our ally. We created a direct feedback channel from those doing the work to those making decisions by collecting multiple data points from teams to identify patterns impeding the organization from achieving its desired outcomes.

The organization had an open-minded leadership team who embraced servant leadership and was willing to be coached. They were receptive to viewing the data as an additional feedback loop that helped them identify behavioral patterns, organizational constraints, coaching needs, and continuous improvement opportunities. The data was seen as a means to an end: helping people and the organization achieve their goals.

Because trust and safety were present, team members reported accurate data. It was reliable and actionable, and it was used to make strategic decisions that benefited everyone in the organization, ultimately leading to better outcomes. For example, velocity was used to forecast the completion date of a project. If that date was unacceptable given business needs, Product Owners and stakeholders had ample time to re-prioritize. Once the Agile mindset is acquired and people feel safe, data becomes a change agent’s ally and an invaluable tool to drive positive change.
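As an illustration of that forecasting conversation, here is a minimal sketch of a velocity-based projection. The function name, sprint length, and numbers are illustrative assumptions, not the organization’s actual figures.

```python
# A minimal sketch of a velocity-based completion forecast (illustrative only).
import math
from datetime import date, timedelta

def forecast_completion(remaining_points: float,
                        average_velocity: float,
                        sprint_length_days: int = 14,
                        start: date = date.today()) -> date:
    """Project a completion date from the remaining backlog and average velocity."""
    sprints_needed = math.ceil(remaining_points / average_velocity)
    return start + timedelta(days=sprints_needed * sprint_length_days)

# Example: 120 points remaining, team averaging 25 points per two-week sprint.
print(forecast_completion(remaining_points=120, average_velocity=25))
```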

Figure 3. Data Effectiveness for Positive Change – Invaluable

3.      Data Gathering Techniques to Drive Positive Change in a Safe Environment

3.1      Agile Team Self-Assessments

The word assessment has become a dirty word in the Agile community, and I can understand why. The market has been littered with an abundance of assessments, bringing their credibility and value into question. On top of that, organizations abuse them by using them as team audits. We thought long and hard about whether or not to adopt assessments, but concluded that, given the safe environment, they could be used as a tool for learning Agile practices, identifying team coaching needs, and recognizing systemic organizational constraints.

We recognized that assessments are not one-size-fits-all! Rather, they should be tailored to both the team’s and the organization’s challenges and objectives. All the Agile coaches collaborated on a catalog of over 100 tried-and-true Agile and Lean practices compiled directly from the relevant framework literature. Each practice was mapped to one or many general business outcomes, such as time to market, employee satisfaction, customer satisfaction, innovation, reliability, responsiveness, and predictability. The catalog was used to identify “when” to experiment with a practice and “how” to properly apply it and measure its effectiveness. Each practice consisted of a set of criteria for each of the five agility stages: blocked, developing, emerging, adapting, and optimizing.
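To make the catalog structure concrete, the sketch below models a single entry. The data-class layout, the practice chosen, and the stage criteria text are illustrative assumptions, not excerpts from our actual catalog.

```python
# A minimal sketch of a practice catalog entry: each practice maps to one or
# more business outcomes and carries criteria for the five agility stages.
from dataclasses import dataclass, field

STAGES = ["blocked", "developing", "emerging", "adapting", "optimizing"]

@dataclass
class Practice:
    name: str
    outcomes: list[str]                                            # business outcomes supported
    stage_criteria: dict[str, str] = field(default_factory=dict)   # stage -> criteria text

tdd = Practice(
    name="Test-Driven Development",
    outcomes=["reliability", "time to market"],
    stage_criteria={
        "developing": "A few developers write tests before code on some stories.",
        "adapting": "The whole team writes tests first and refactors continuously.",
    },
)
```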

The team was empowered to assess themselves. After all, the value is in the conversation! We worked with teams on facilitation and consensus-voting techniques, and helped conduct assessments in which team members felt comfortable expressing their opinions. There was no scoring system, just a set of criteria for each agility stage enabling teams to assess their fluency on each practice and set goals for improvement.

Finally, we created dashboards that consolidated cross-assessment and cross-team results in a consumable and actionable format. Aggregated results were produced for the teams within a specific business unit or program. This approach provided a holistic view of the group’s performance on the practices that contributed to each general business outcome. Each unit or program had slightly different objectives and would assess itself on the practices that led to its desired business outcomes.
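The roll-up behind those dashboards amounted to averaging stage results per outcome within each group. The sketch below shows one way to do it, assuming the assessment results can be exported as rows of team, business unit, practice, outcome, and stage; the file and column names are hypothetical.

```python
# A minimal sketch of the cross-team dashboard roll-up, assuming an export of
# assessment results with hypothetical column names.
import pandas as pd

STAGE_ORDER = {"blocked": 0, "developing": 1, "emerging": 2,
               "adapting": 3, "optimizing": 4}

results = pd.read_csv("assessment_results.csv")             # hypothetical export
results["stage_score"] = results["stage"].map(STAGE_ORDER)  # ordinal stage value

# Average fluency per business outcome within each business unit or program.
rollup = (results
          .groupby(["business_unit", "outcome"])["stage_score"]
          .mean()
          .unstack())
print(rollup.round(2))
```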

Figure 4. Self-Assessment Dashboard


For example, we worked closely with the business unit responsible for supporting the organization’s payments application. Given that this was a mission-critical application, the team’s focus was mainly on reliability and quality. After two assessments, it became obvious the team felt they lacked knowledge in key technical engineering practices that lead to high quality and reliability. Our customized self-assessment framework had identified a coaching need for a team supporting an application vital to the organization’s success. The team was assigned a technical coach who focused on the adoption of Agile engineering practices such as test-driven development, continuous integration, and test automation.

The overall agility stage of each team and group was also tracked. The overall agility stage of a team was determined by averaging all the questions it answered in a given assessment. The motive behind this data point was to ensure that the overall Agile fluency of the team continued to improve. In one scenario, we identified a team that had not improved its overall agility stage after six quarters. After engaging the team, we found they lacked the skillsets to support the technical stack of the product and were forced to hire contractors to fill the void. The organization did not retain a contractor for more than six months, so the team received new team members on that same cadence. It was obvious that the churn of team members prevented the team from forming team norms and forced them to essentially start over every six months. After further discussion, it turned out the group’s management did not quite understand the product’s technology, or the level of effort and knowledge needed to best support customers. The conclusion was that the benefit of hiring a full-time employee in this role far exceeded the cost.
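The trend check itself can be sketched the same way. The snippet below assumes each answered question has already been mapped to an ordinal stage score and exported with team and quarter columns; the data layout is an assumption, not our actual tooling.

```python
# A minimal sketch of the overall agility-stage trend check, assuming one
# ordinal stage score per answered question (hypothetical export layout).
import pandas as pd

answers = pd.read_csv("assessment_answers.csv")  # columns: team, quarter, stage_score

# Overall agility stage per team per quarter = average of all question scores.
overall = answers.groupby(["team", "quarter"])["stage_score"].mean().unstack()

# Flag teams whose overall stage has not improved between the first and the
# latest observed quarters, like the team described above.
stalled = overall[overall.iloc[:, -1] <= overall.iloc[:, 0]]
print(stalled)
```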

3.2      Cross-Team Impediment Roll-Ups

Another data-collection approach we used was collating the impediments that teams could not resolve on their own. The goal was to identify common impediments and attempt to quantify their impact in order to drive prioritization of continuous improvement opportunities. We created an internal tool, dubbed the “Sprint Diary,” that enabled Scrum Masters to record events and blockers in real time over the course of the sprint. At the end of each sprint we would collect the data and identify common impediments encountered by the teams. The output of this exercise was items for our continuous improvement backlog.

In order to prioritize the continuous improvement backlog, we implemented a lightweight impact categorization system to help teams classify blockers and quantify their impact. After collaborating with many team members and leaders, we concluded that the majority of team blockers fell into one of six general categories: staffing, unplanned work, dependencies, environments, administration, and business collaboration.

The impact of each blocker on team performance was rated on a scale of one to three, with three representing the highest impact. We encouraged teams to rank each blocker relative to the others, but also provided some guidelines based on the percentage of productivity lost to the blocker. We acknowledged that there was no hard-and-fast science for calculating the impact. Instead, we empowered the teams to determine the impact in whatever manner they deemed appropriate for their context.
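The sketch below shows how a single Sprint Diary entry could be modeled with the six categories and the one-to-three impact scale described above; the field names, the example values, and the optional days-lost estimate are illustrative assumptions.

```python
# A minimal sketch of a Sprint Diary blocker record (field names are illustrative).
from dataclasses import dataclass
from enum import Enum

class Category(Enum):
    STAFFING = "staffing"
    UNPLANNED_WORK = "unplanned work"
    DEPENDENCIES = "dependencies"
    ENVIRONMENTS = "environments"
    ADMINISTRATION = "administration"
    BUSINESS_COLLABORATION = "business collaboration"

@dataclass
class Blocker:
    team: str
    sprint: str
    description: str
    category: Category
    impact: int             # 1 = low, 3 = high, relative to the team's other blockers
    days_lost: float = 0.0  # optional estimate of productivity lost

entry = Blocker(team="Payments", sprint="2017-14",
                description="Shared QA environment down",
                category=Category.ENVIRONMENTS, impact=3, days_lost=1.5)
```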

At the end of each sprint, data was collated across all teams to provide a holistic picture of “what” was impacting team performance. The coaches, in conjunction with leadership, would analyze the data in search of common patterns and trends among the blockers inhibiting team performance. For example, this technique enabled the coaching practice to identify that the shared QA environment averaged nearly a day and a half of downtime per sprint, impacting over 13 teams. When we presented this case to executive leadership and quantified the delays, an investment was made in upgrading the QA environment and increasing its stability.
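The end-of-sprint collation itself was simple aggregation. The sketch below shows one way to surface patterns like the QA-environment downtime, assuming the diary entries can be exported as rows with team, sprint, category, impact, and days-lost columns (all hypothetical names).

```python
# A minimal sketch of the cross-team impediment roll-up (hypothetical export).
import pandas as pd

diary = pd.read_csv("sprint_diary.csv")  # columns: team, sprint, category, impact, days_lost

# Which categories of blockers hit the most teams, and how hard?
rollup = (diary
          .groupby("category")
          .agg(teams_affected=("team", "nunique"),
               total_days_lost=("days_lost", "sum"),
               avg_impact=("impact", "mean"))
          .sort_values("total_days_lost", ascending=False))
print(rollup)
```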

3.3      Happiness Index

Employee engagement and happiness were a key pillar of the organization. Therefore, we would periodically take the pulse of individuals on the teams to see how they were feeling and how that changed in response to specific events. The data gathering technique became known as the “Happiness Index.” Thanks to some of the iOS developers, we created an app that randomly pushed notifications to team members over the course of the sprint. The notification would ask the recipient, “How are you feeling?” The response options were mad (1), sad (2), indifferent (3), and happy (4). Team members could respond autonomously, or not respond at all. All responses were aggregated so that teams could see their happiness trends over time. This metric became the one most closely monitored by leadership because they saw happy employees as a leading indicator of success.
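On the analysis side, aggregating the responses was straightforward. The sketch below assumes the app’s responses are exported as rows of team, sprint, and score on the one-to-four scale above; the file and column names are illustrative.

```python
# A minimal sketch of the Happiness Index aggregation (hypothetical export layout).
import pandas as pd

responses = pd.read_csv("happiness_responses.csv")  # columns: team, sprint, score (1-4)

# Average mood per team per sprint, plus the number of responses received.
trend = (responses
         .groupby(["team", "sprint"])["score"]
         .agg(["mean", "count"])
         .rename(columns={"mean": "avg_happiness", "count": "responses"}))
print(trend)
```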


Once we had enough data points, we overlaid the Happiness Index with the Sprint Diary. Combining these two data sets was a game changer for the organization. The view enabled everyone to see how team and organizational events were affecting people’s feelings and employee satisfaction.
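The sketch below shows one way that overlay could be built, joining the per-sprint happiness trend with the Sprint Diary roll-up; it carries over the hypothetical file layouts from the earlier sketches.

```python
# A minimal sketch of overlaying the Happiness Index with the Sprint Diary
# (hypothetical file layouts carried over from the earlier sketches).
import pandas as pd

happiness = pd.read_csv("happiness_by_sprint.csv")  # columns: team, sprint, avg_happiness
diary = pd.read_csv("sprint_diary.csv")             # columns: team, sprint, category, days_lost

# Summarize diary events per team per sprint.
events = (diary
          .groupby(["team", "sprint"])
          .agg(blockers=("category", "count"),
               days_lost=("days_lost", "sum")))

# Join mood and events so dips in happiness can be read next to what happened.
overlay = (happiness
           .set_index(["team", "sprint"])
           .join(events, how="left")
           .fillna(0))
print(overlay.sort_values("avg_happiness").head())
```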

Figure 7. Team Sprint Happiness Index/Sprint Diary

4.      Lessons Learned from My Experience

  • Collecting data in an Agile environment provides an additional feedback loop you can use to make decisions and identify organizational constraints.
  • Trust and safety are typically absent in a command-and-control work environment, which leads people to be fearful of transparency and honesty. As a result, the data collected from teams is often fabricated, so strategic decisions are made on false data, which compounds the organization’s dysfunction and ultimately prevents it from reaching its goals. In this case, change agents should be very cautious when introducing new team data gathering techniques.
  • Once an organization acquires the Agile mindset and embraces the Agile values and principles, people will feel safe enough to be transparent and honest. In these environments, data can be counted on to make strategic decisions. In this case, change agents should encourage new data gathering techniques.
  • Aggregating data points to tell a meaningful story helps Agile coaches and leadership to identify behavioral patterns, organizational constraints, coaching needs, and continuous improvement opportunities.
  • Agile team self-assessments and happiness metrics are leading indicators of whether or not an organization is on the right path to achieving better business outcomes.

5.      Acknowledgements

I would like to take this opportunity to thank my colleagues, Troy Lightfoot and James Gifford. Troy and James actively collaborated with me to produce these unique data gathering techniques. Together, we created www.leanagileintelligence.com so that other organizations can use these techniques to achieve better business outcomes.

I would also like to thank Sue Burk, my mentor in this process. This is the first time I have published an experience report, so I leaned heavily on Sue for guidance and feedback. Her vast experience in this area was evident, and I have learned a great deal from her in a short time.

6.      References

Gupta, Amit. “Insights from Complexity Theory: Understanding Organizations Better.” http://tejas.iimb.ac.in/articles/12.php

Sidky, Ahmed. “Agile vs. Agility: Doing Agile vs. Being Agile.” https://www.stickyminds.com/presentation/agile-vs-agility-doing-vs-being

Sahota, Michael. “Doing Agile vs. Being Agile.” http://agilitrix.com/2016/04/doing-agile-vs-being-agile/

Hartman, Bob. “Doing Agile Isn’t the Same as Being Agile.” https://www.slideshare.net/lazygolfer/doing-agile-isnt-the-same-as-being-agile

Copyright 2017 is held by the author.

About the Author

Michael McCalla is a technology leader, transformation specialist, and avid Agile practitioner. He has a passion for building great products and coaching organizations to create a value-driven environment that fosters collaboration, empowerment, safety, and learning. Michael has spent the last five years leading enterprise-wide Agile adoption initiatives. He has a broad range of experience applying Agile to small teams, large distributed teams, and portfolio management. Michael is President of Achieving Agility, a consultancy with a hands-on, pragmatic, and adaptable coaching approach that uses a blend of Agile and Lean practices to enable its clients to reach their desired outcomes. He is also a board member of the Agile Uprising, a purpose-built network that focuses on the advancement of the Agile mindset and global professional networking between leading Agilists. He was recently named to the review board of the Agile Practice Guide, a collaborative effort between the Agile Alliance and PMI. Michael’s latest initiative is Lean Agile Intelligence (https://www.leanagileintelligence.com), the industry’s first customizable organizational change and learning tool. Lean Agile Intelligence leverages the power of big data and business intelligence to identify continuous improvement and investment opportunities that contribute to an organization’s vision and objectives. Its mission is to provide Agile teams with the guidance needed to adopt and experiment with Agile practices the way they were intended to be applied, and to create a direct feedback channel from the people doing the work to those making decisions so that organizations can quickly identify impediments preventing them from achieving desired business outcomes.