When we moved to a dual-track delivery approach for our Bank of New Zealand Digital product teams, it felt more like dueling tracks. People knew why we were doing it but were confused about how to link discovery and delivery. Who should do what, when and how? It became a clash of collective egos, stepping on each other’s areas of expertise. This is the story of how we brought clarity, alignment and direction to those teams by creating a Product Management playbook, and how it had a profound effect on how we work and communicate with one another.
Anyone who has been part of a high-achieving product delivery team knows how rewarding it can feel. A team empowered to focus on achieving valuable outcomes for the customer, rather than just turning the output handle and efficiently building a pre-determined thing. In the increasingly competitive and disruptive fintech market in which the Bank of New Zealand (BNZ) Digital group operates, we need to set the pace through innovation and make better product decisions. This is why we are guided by a ‘Product Management’ culture that is aligned to Jeff Gothelf’s definition, where teams:
- have ownership of, and are responsible for, the process, solution and its success
- focus on creating an outcome for customers, rather than producing a deliverable
- do the smallest amount of work possible and start measuring the change
These Product teams follow a dual-track development approach, a term first coined by Jeff Patton and Marty Cagan, where there are two activities happening continuously and in tandem:
- discovery work, focusing on fast learning and validation
- development work, focusing on predictability and quality
Each Product team is jointly responsible for both these types of work and needs all the skills to conduct both discovery and delivery work. While a Triad made up of a Product Manager, UX Designer and Senior Engineer may lead discovery, they must involve the wider team. Both types of work need to be practiced with agile and lean principles and discipline. The problem is that not many people are talking practically about this.
The teams at BNZ Digital have been using agile since 2009. BNZ Digital consists of over 300 people and is structured along the lines of the ‘Spotify snapshot’. There are three main tribes who focus on ‘Platform infrastructure’, ‘Business banking’ and ‘Retail banking’. The latter tribe is the largest and has five subtribes that we term ‘hapū’ (which is the Māori term for subtribe). Each of our tribes and hapū have their own cross-functional product delivery squads (with an embedded Product Owner) that we call ‘clans’. These tribes, hapū and clans are supported by specialist areas, including a data analytics team and a ‘creative services’ group that consists of UX designers (UI designers and interaction designers), UX researchers, content and comms. The purpose of BNZ Digital is to deliver positive outcomes to the lives of our customers through our digital channels (focusing on internet and mobile).
I joined them at the end of 2015 as an agile coach, to work across the Digital tribes and supporting groups as a whole. I noticed very early on that while we were building things very efficiently, clans felt like they were being told ‘what to build and how to build it’ by Product and Creative Services. When I questioned them about this, we came into conflict and they told me agile was an ‘engineering thing’ that ‘did not involve the customer’. The designers liked the ‘Lean process’ while the UX researchers favoured a Human Centered Design approach… hang on! This felt like a really awkward three-way handshake, and I was in the middle. Individual disciplines were focused on their own set of practices, so a lot of time was spent in philosophical debates with each other over which was better: Agile or Lean. It was exhausting. Being in the middle, I needed to work with Creative Services to find some common ground. In mid-2016, I collaborated with Claire, a senior UX designer at BNZ. We both loved the Lean UX book and thought it could change the way we worked, from focusing on outputs to outcomes. It was our common ground. We initially ran this as an experiment for three months with two clans in the business tribe, to align and integrate customer discovery and delivery, Lean and Agile. The measures of success were the valuable outcomes we could deliver to our customers, and how empowered clan members felt over what to build and how to build it.
During that three-month period, the most valuable outcome was when one of the clans decided not to build a specific logging feature. They had sized it at around 12 months, but the biggest assumption we had was ‘our customers want this’. Claire and I were the team’s conscience: they talked to customers, looked at analytics and discovered no-one wanted it. Six months earlier we would have focused on building this feature efficiently, releasing it in 11 months and then celebrating. Instead, this clan decided not to build the feature, based on actual learning from feedback. They continued with Lean UX and went on to build things customers actually valued and needed. The clans felt empowered over what to build and how to build it. The three-month experiment was seen as a success, and Claire and I started to roll Lean UX out across all seven clans in the business tribe. Then, in December 2016, the Product Owners, Leadership Team (LT) and Creative Services went on Jeff Patton’s Passionate Product Ownership course. This inspired them, and Digital as a whole, to move to a ‘Product Management’ dual-track delivery approach focusing on continuous discovery and delivery activities. This would build on the Lean UX roll-out we had done to date. This was awesome! Wasn’t it?!
3. My Story Starts with the Power of Language
Our biggest insight from the initial Lean UX experiment was the power of language and how it has a massive impact on our mental models and the way we view the world. Language can inspire and unite, it can also confuse and alienate. The vocabulary we used as part of Lean UX worked to empower teams to make better decisions by uniting people around the ‘why’, not ‘how’ or ‘what’. Trying to convince people that ‘your way of working’ is better than ‘their way of working’ is stressful. Like Simon Sinek’s ‘Golden Circle’, we realised we were all too busy focusing on ‘how’ we do things, rather than ‘why’ we did them. When we started to communicate with ‘why’, it inspired others and enabled positive collaboration and change to occur. The core ‘why’ vocabulary we used was focused on:
- Customer: the reason we do what we do; who is the actual BNZ customer we are looking to make a difference to?
- Problem: what is the problem we are looking to solve, does the need exist?
- Opportunity: we don’t always have problems, sometimes there are opportunities to make a difference to our customers
- Assumption: everything we do is a best guess that we have permission to test
- (In)validate: assumptions need to be tested, is this thing worth doing?
- Not now: we need to wait for feedback on our experiments
- Experiment: what is the smallest thing that can be undertaken to learn something and/or deliver enough value
- Outcome: value our Digital products bring to the lives of our customers
These ‘why’ words are the fundamental reasons we practice agile, lean, or any customer centric, people-oriented approach. The empirical process of Build-Measure-Learn (B-M-L) is what enables us to both deliver value and discover what is valued.
We had empowered, customer-focused product teams, working more closely with UX researchers and designers and talking the language. But as we scaled our experiment to three tribes, 22 clans and around 250 people (our business has grown since then) across Digital in the first few months of 2017, the wheels started to wobble. There was confusion over what dual-track delivery and a Product team-based approach meant. Everyone had their own interpretation of what should be happening in discovery and delivery, and who should be involved. There was still a perception of the dual-track approach being a ‘duel’ approach, with Product triads and UX trying to feed ideas into engineering teams, with the latter focused on delivery. There was confusion over how to approach work, who should do what when, what the teams should build next, and how they knew to prioritize feature X over feature Y. Clans were responsible for the process, solution and its success, but they hadn’t all been on the Lean UX product management journey. To top it all off, there were widely varying opinions, a lack of empirical discipline and overall confusion as to what Product Management meant. What was a Triad? What was discovery in dual-track? Who should be running discovery and how should they hand over to the delivery track? Most importantly, how was the move to Product Management different from what the teams were doing before, from a work and collaboration point of view?
I realised that one of the things that had made the Lean UX experiment so successful, was the focus Claire and I could give the teams. We supported them through that transition and helped clarify the approach. We were asking a lot more of our teams with this new way of working, but we weren’t giving them the support and clarity that they needed. This scaling problem was compounded by the fact we didn’t have enough designers, UX researchers and data analysts to be embedded in each clan. There was a Designer for every two to three teams, and these specialised skillsets were now in hot demand; we were victims of our own success. We needed to find a way to optimise their time with the teams.
There was also resistance from Product Managers, who felt that I, as the Agile Coach, was telling them how to do their jobs, and from UX researchers and designers, who felt their work was being taken away as I started to run more discovery workshops. It was a clash of collective egos, which was causing conflict and angst. I’m not a Product Manager. I’m not a UX researcher, designer, developer or tester. I don’t question the decisions these experts make; I question whether their decision-making process is effective. I question if they’re tracking towards their goal, and how they know. I needed a mechanism to allow me to have those conversations in a positive way that didn’t ruffle feathers.
3.2 Enter the Product Management Playbook
As a coach, I look to act as a conscience for the team and people I work with. I needed an agreed frame of reference that would enable me to be that conscience. As a Digital team, we needed to define what Product Management and dual-track meant to us. I needed a way of pulling all this thinking into a holistic framework that in turn incorporated the best of various approaches and expertise from all our disciplines.
The thing that really sparked change when we introduced Lean UX into the organisation was collaborating on the change itself. Linda Rising, in her book Fearless Change, describes the pattern of inviting others to support you in the change. We’d successfully used this approach before, working with others in the design team to implement Lean UX. As a coach working across all teams and areas of Digital, I had the opportunity to collaborate with all disciplines to drive this change. I could draw on their expertise, specialism and experience.
We had created a playbook when we first introduced Lean UX. It had limited success but had been useful in aligning our approach. It was very Design Thinking centric and far too long; so long that hardly anyone read it. I was also getting cross at seeing ‘playbooks’ that were 100+ pages long, with huge cut-and-paste tracts of frameworks and practices from the Internet. Why re-invent the wheel? These were detailed manuals and checklists, not playbooks. Agile has no brain; it needs experts to apply it and make decisions in the real world. A comprehensive playbook document in our complex environment was not going to work. I chose the name playbook because it should be enjoyable to read! The focus needed to be on the common language of why we do what we do, and the practices surrounding that. The practices came from the Build-Measure-Learn loop of empiricism, along with a set of disciplined learning steps. The goal was to create a high-level framework to understand where we are today, where we want to be tomorrow and how to pursue success through exploration, experimentation and validated learning. This meant combining tools and techniques from all three mindsets we were drawing on. Jonny Schneider describes Design Thinking as exploring the problem, Lean as building the right thing, and Agile as building the thing right.
I think that ‘Agile done right’ covers all three mindsets, but to prevent a clash of egos, I had to acknowledge and promote the others. Each mindset brings its own benefits. They each have a different lens but share similarities and are complementary with one another. They have more in common than not and, like us, they seemed to be more like squabbling siblings. We knew we loved and needed all three. It wasn’t a case of picking one over the others - we needed to mash them together to create our own approach to Product Management.
The playbook needed to be a collaborative effort across our Digital teams. It needed to be endorsed by the Digital leadership team and involve expertise from all our skillset areas, including development teams, product owners, UX researchers, designers, content, project managers and data analysts. I started to draft a skeleton structure in a keynote presentation format in late 2017. I then invited others to collaborate and fill in the detail.
It was also important to make it feel like a playbook; it should be no more than 30 pages, self-explanatory, endorsed by key skillset areas, have lots of pictures and be enjoyable to read. The key test was whether it could be easily understood by anyone who picked it up. That meant no jargon or nonsense buzzwords, so we set out to use the language of why and make it inclusive. When we talk about the customer, we literally mean a customer of the bank, not a proxy for that customer. It became a ‘wisdom of crowds’ effort, curated by me. I wanted to keep it visual and engaging; what we needed were visuals that encapsulated our approach in a nutshell. The heart of the playbook (as in the heart of agile) was the fact that everything we do is based on empiricism: the realisation that everything is an assumption waiting to be (in)validated through an experiment. The perception from UX and Product Managers was that agile delivery team experiments were still engineering focused (how we can ship a product feature more efficiently), whereas their discovery experiments were focused on value and the customer.
We were running dual-track delivery, but it felt like two separate ‘duel’ tracks. We needed a shift to bring the discovery and delivery tracks together. Dual-track is generally depicted as two separate parallel tracks. That is not the intent, but we were becoming a manifestation of that visual representation. It should be one team, running one track, after all. The problem for me was how to describe this. The ‘aha’ moment came in November 2017, seeing my son playing with a ‘lazy Susan’ at a local Chinese restaurant. As he was spinning it, the items he wanted on the inside reached him much quicker than the vegetables on the outside. If everything we do as a Product team is an experiment, surely they all follow the same ‘lazy Susan’ feedback loop? A Build-Measure-Learn cycle, repeated over and over again. What changes is how long those experiments take to make a full circle of that feedback loop. One team, on one circular track, running many different types of experiments. It was the start of the visual that encapsulated the approach we take.
Figure 1: Our Product Management B-M-L loop (lazy susan!) in a nutshell
With that feedback loop in mind I applied the same language we had introduced as part of Lean UX. Language that had been endorsed and adopted by Digital:
- Mission. Define customer value and create a mission around it to give the team purpose and create alignment.
- Target metric. Set target metrics that enable the team to track value to the customer and business. Teams usually want to ‘just get going’ and brainstorm features, but how can they prioritise one feature over another with no reference point for why they are doing it and who it’s for?
- Current condition. Establish where the metric is now. It may be an estimate of the current state until the team can establish an accurate baseline.
- Assumptions. Establish assumptions that need to be (in)validated and prioritise ideas that may move our metrics.
- Build. The smallest amount of effort you can build to (in)validate your next assumption.
- Measure and Learn. Learn from those experiments, (in)validate the assumption and update our current condition and assumptions.
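The loop above can be sketched as a few lines of code. This is purely an illustration, not BNZ tooling; the class, field and method names are ours for this sketch.

```python
from dataclasses import dataclass, field


@dataclass
class ProductLoop:
    """One team's pass around the B-M-L loop described above (illustrative sketch)."""
    mission: str                  # why the team exists and who it serves
    target_metric: str            # the value we want to move
    current_condition: float      # where the metric is today (may start as an estimate)
    assumptions: list = field(default_factory=list)  # riskiest assumption first

    def build(self) -> dict:
        # Build: the smallest experiment that can (in)validate the next assumption.
        return {"assumption": self.assumptions[0]}

    def measure_and_learn(self, new_condition: float, validated: bool) -> None:
        # Measure and Learn: update the current condition and retire the assumption.
        self.current_condition = new_condition
        self.assumptions.pop(0)
```

A team with a #selfservice mission might instantiate this as `ProductLoop(mission="#selfservice", target_metric="calls to helpdesk", current_condition=1200.0, assumptions=["Most calls are password resets"])` and go around the loop until the assumptions run dry (the metric values here are invented for the example).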
The mission and metrics were foundational, because I don’t believe you can take a Product Development approach unless you have a clear mission and a set of (quantifiable) metrics. If everything we did followed that loop to learn and deliver value to our customers, the next step was establishing the types of experiments to run. One team I worked with, which had a mission of #selfservice, was struggling with a metric of ‘reduce calls to helpdesk’. Using the playbook approach, I questioned: Which helpdesk? We have three, right? How many calls does that helpdesk receive today? After they had discovered the answers to these questions, they had their baseline measure. They could now look at making assumptions, asking questions and running experiments. What is the number one call? Why is it the number one call? If we made that a self-service experience, would customers stop phoning in, reducing calls and moving our clan target metric? We’d figured out patterns in the types of experiments teams run, having worked with and observed them running Lean UX over the past two years. They broadly fell into four buckets or, as we called them, ‘Categories of work’:
- Talk to a customer. Get feedback from actual (did I say real?!) customers to learn what they really need
- Gather some data. Find data that allows you to make informed decisions on what to do next
- Prototype something. Specs are boring, bring your idea to life and reduce doubt through rapid feedback
- Put it in production. We don’t deliver any value until it ends up in our customers’ hands. This is still a learning activity
When we first came up with the Categories of Work, we numbered them in chronological order. One of our product managers argued the numbers weren’t necessary and the steps could be carried out in any order. He was right. He also wanted to add ‘Make a change’ as a subtitle to ‘Put it in production’, as not everything we do is a software change (sometimes the customer impact is the result of a change to a bank process). These four categories enable us to ask questions about what a clan should do next on the B-M-L loop:
- What is our highest risk/most unknown assumption that we need to get early feedback on?
- Which Category of Work can help us run an experiment to (in)validate that assumption?
- Who do we need to get involved with the experiment, what experts do we need?
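These questions lend themselves to a simple sketch. Only the four category labels come from the playbook; the tuple shape, risk scores and assumption texts below are hypothetical, invented for the illustration.

```python
from enum import Enum


class CategoryOfWork(Enum):
    """The four Categories of Work from the playbook."""
    TALK_TO_A_CUSTOMER = "Talk to a customer"
    GATHER_SOME_DATA = "Gather some data"
    PROTOTYPE_SOMETHING = "Prototype something"
    PUT_IT_IN_PRODUCTION = "Put it in production"


def next_experiment(assumptions):
    """Answer the first two questions: pick the riskiest/most unknown assumption,
    and note which Category of Work could (in)validate it.

    `assumptions` is a list of (description, risk_score, category) tuples;
    this shape is illustrative, not part of the playbook itself.
    """
    description, _, category = max(assumptions, key=lambda a: a[1])
    return {"assumption": description, "category": category}


# A hypothetical clan backlog of assumptions, scored for risk (10 = riskiest).
backlog = [
    ("Customers want this feature", 9, CategoryOfWork.TALK_TO_A_CUSTOMER),
    ("The current process causes the calls", 6, CategoryOfWork.GATHER_SOME_DATA),
    ("The new flow works end to end", 4, CategoryOfWork.PUT_IT_IN_PRODUCTION),
]
```

Here `next_experiment(backlog)` surfaces “Customers want this feature” first, which echoes the clan that talked to customers before committing to the 12-month logging feature.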
The benefit of this approach is that techniques used by various experts could be part of this framework. A Design Sprint, for example, is another way to prototype something (with a bit of Gather some data and some Talk to a customer). Ultimately, it’s lots of experiments over a focused week. For each of the categories we established a set of guiding characteristics, and drafted up some examples. We then asked the experts in those areas to add the detail. We applied the following common characteristics to each of the categories of work:
- Guided by a specialist (e.g. design, UX research, content)
- Involving the wider product team
- Tracked on the team board
We wanted everyone to be part of discovery work, without feeling overwhelmed. We learned pretty quickly that not all of our developers felt comfortable talking to customers. But others loved it. There’s nothing better than hearing a tester say, “it’s the first time in my 20-year career that I’ve actually talked to customers”. Getting a specialist to lead the experiment meant the team didn’t need to be involved with everything, but they were still engaged and supportive. The categories enabled people to ask the right questions to ensure a good decision-making process, starting with “which category of work?” and “who should be involved or lead?”, and then “what would the details of that experiment look like?” The table below gives a high-level view of our examples, which we’re refining all the time. The key is that no one person can be across all of these: it requires a collaborative approach, with an expert leading them.
3.3 Making the Playbook habitual
The product playbook had achieved Digital-wide buy-in, but I knew we needed to ensure it was tied to each Product team’s way of working, and that they had the discipline to track progress. This borrowed from the Lean mindset of deliberate practice, making it habitual. I had seen Melissa Perri’s Product Kata approach (based on the Improvement Kata by Mike Rother) and spoken to her about it at Agile2017. This is a tool that helps a team take small, purposeful steps toward solving actual customer problems. A team needs to start with a good question before it can start to discover what the answer might be. In line with the language of ‘why’, we adapted the Product Kata to reflect the everyday vocabulary and approach we used with our Digital teams doing dual-track delivery. This resulted in a Product Experiment map that sets out the key steps of the empirical feedback loop:
- Current Condition - What is the current behaviour (of your user / system)? How do you measure it?
- Assumption - What must be true for your product to be successful?
- Experiments - Pick a learning activity: Talk to a Customer, Gather Some Data, Prototype Something or Put it in Production
- Learned - Results, observations and insights from your experiment.
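As a rough illustration, one card on the map could be captured as a simple record. The field names mirror the four steps above, but the class itself and the example values (including the call numbers) are hypothetical, not part of the playbook.

```python
from dataclasses import dataclass


@dataclass
class ProductExperimentMapRow:
    """One experiment on the Product Experiment Map (illustrative field names)."""
    current_condition: str  # current behaviour of the user/system and how it is measured
    assumption: str         # what must be true for the product to be successful
    experiment: str         # one of the four Categories of Work
    learned: str = ""       # results, observations and insights, filled in afterwards


# A hypothetical row based on the helpdesk example earlier (numbers invented).
row = ProductExperimentMapRow(
    current_condition="Helpdesk receives ~1,000 password-reset calls a week",
    assumption="A self-service reset flow will remove most of those calls",
    experiment="Prototype something",
)
```

The `learned` field stays empty until the experiment has run, which is what makes the stickies-on-a-wall version so visible: unfinished loops stand out.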
This formed one of our templates at the back of the playbook (see Figure 2), which we encourage teams to print out in A3 (Tabloid or Ledger), put on the wall and use stickies to keep track of what they are doing.
Figure 2: The Product Experiment Map
The playbook was created to get people aligned behind a framework for how we build Digital products at BNZ. It has achieved this by getting all of the teams involved. It has been endorsed by different disciplines (it’s not often you see project managers, UX designers and coaches agreeing), by respecting their expertise and focusing on the high-level framework. We’ve done this by being tool and framework agnostic, and focusing on the mash-up of Design Thinking, Lean and Agile mindsets. Some of the key content areas that make up our playbook are:
- Product Management as a mash-up of Design Thinking, Lean and Agile
- Build-Measure-Learn empirical approach at the heart of everything we do
- Team missions and importance of clarity
- Metrics, tracking progress and key characteristics
- Assumptions, how to map them and start an experiment
- Categories of Work, who to get involved and examples
- Powerful questions everyone should be able to ask and answer
- Practices and patterns of effective Product delivery teams, focusing on the Agile Fluency ‘Optimizing’ zone
- Opportunity Canvas template, to determine the problem/ opportunity, who it is for and how we will measure success (based on Jeff Patton’s Opportunity Canvas)
- Product Experiment map template, to enable disciplined discovery steps (based on Melissa Perri’s Product Kata)
It weighed in at 27 pages, so we met our self-imposed limit of 30 pages. It has been circulated widely and its use has grown. What started out as a playbook for one Digital tribe soon grew into a practical reference for the whole Digital team, and it is now being used more broadly across the bank in our product delivery, customer journeys and operations teams. It also passed the key test of being understandable to people not involved in its creation.
We see this playbook as something everyone can use. It’s an over-arching field guide to running product experiments and delivering value to BNZ’s customers and users (in fact that’s its subtitle). Our Product Management approach follows Build-Measure-Learn empiricism, and dual-track delivery using Categories of Work experiments to discover and deliver value through collaboration and empowering experts. It borrows from Agile, Lean and Design Thinking to create an overall mindset of experimentation and customer feedback. We’re looking at creating other playbooks, following the same approach, next up will be a Technical Dev-Ops Playbook.
The Product Management playbook was a huge success, providing a mechanism to build shared understanding and bring people and mindsets together. As one of our Product Owners noted: “[it] brings together all the best parts of our practices in an easy to understand way, providing a framework that our team have adopted and then adapted”.
There is now a common view on what we mean by dual-track delivery and a Product team-based approach at BNZ. It isn’t a ‘Duel Track’ between discovery and delivery; instead, all teams use Categories of Work experiments, which are led by experts. These UX designers, researchers and data analysts are brought in by the clans as ‘lead specialists’ when needed, making much better use of their highly-sought-after time. They attend discovery workshops and stand-ups when working with a clan, and keep the clan updated and included on discovery activities. Overall, it has brought these tracks and disciplines together to create an ‘All Track’ approach.
The Product Playbook now enables me, along with everyone else in Digital, to be the conscience of the team. It helps us to answer:
- Who is your customer or user?
- What is the problem you are looking to solve?
- What are your measures of success?
- What does your next experiment look like?
- When will you complete your next experiment?
- How long till you release to production next?
- What’s scaring you or stopping you from succeeding?
These powerful questions enable me to question the decision-making process and Build-Measure-Learn experiment steps, without coming across like I’m questioning their product decisions and expertise.
Due to its success, the biggest “problem” has been people wanting to add more to the playbook. We have to treat the playbook as we would a product, which means saying no to features that aren’t core to the vision, and reviewing existing features for value to our users. It was designed as a simple, high-level framework to provide guidance and help define what Product Management and dual-track development means to BNZ. Keeping that simplicity and ensuring we don’t ‘design in’ complexity and confusion will need to be our joint focus. It is a ‘living document’ that resides as a keynote presentation and an interactive set of pages on our Digital wiki. We encourage people to post learnings and suggestions so that we can evolve the Playbook as our experience grows. The next step for us is to create a working group from each of the discipline areas, to own and curate it going forward.
4. What We Learned
The big (re)learning for us was that to enact change, you need to collaborate and get others to help you. The playbook was the mechanism we used to achieve that, helping us to define what Product Management and dual-track delivery means to us and remove any ambiguity. Rather like the Categories of Work in the playbook, the key for us was to let specialists use their expertise, by keeping the initial draft to a skeleton and inviting their feedback. Keep it simple! If you have to spend more than five minutes explaining it, don’t use it.
By all means ‘think agile’, but you don’t always have to ‘talk’ agile. Agile to me might mean a mash-up of all three mindsets, but when I talk like that to others, it may sound like I’m trying to impose my mindset over theirs. I had to suppress my own agile ego (or at least the perception of it) and listen to where others were coming from; then we could drop the collective ego and embrace Agile, Lean and Design Thinking mindsets and tools to bring different skillsets together. Respect the mindsets of others and involve them rather than trying to change them, so that you can move from experts ‘dueling’ to All Track inclusiveness.
To echo the words of Jeff Bezos, “Good process serves you, so you can serve customers. But if you’re not watchful, the process can become the thing. This can happen very easily in large organizations. The process becomes the proxy for the result you want. You stop looking at outcomes and just make sure you’re doing the process right.” The playbook is our key to making sure we don’t let that happen.
It was a team effort to bring the playbook together, and it will be a team effort to keep it together. My role is the curator and to tell the team’s story. Thank you BNZ and the BNZ Digital leadership team, tribes, and experts who enable me to have a go with all this stuff. Many thanks and a special call out to those who have helped bring this experience report together; Rebecca Wirfs-Brock and Sue Burk (I was greedy and got two great shepherds!), Stu Collins, Elena Babitcheva, Pera Barrett, Claire Jaycock, Michelle Anderson and Michelle Maxwell.
Gothelf, Jeff and Seiden, Josh, “Lean UX: Applying Lean Principles to Improve User Experience”, O’Reilly Media, 2nd edition, 2016
Patton, Jeff, “Dual-track development” blog, https://jpattonassociates.com/tag/dual-track-development/
Cagan, Marty, “Process vs model” blog, https://svpg.com/process-vs-model/
Boobier, Ant and Jaycock, Claire, “Language of Lean UX”, http://www.methodsandtools.com/archive/leanuxlanguage.php
Sinek, Simon, “Start with why”, https://startwithwhy.com
Schneider, Jonny, “Understanding Design Thinking, Lean, and Agile”, O’Reilly Media, 2017
Rising, Linda and Manns, Mary Lynn, “Fearless Change: Patterns for Introducing New Ideas”, Addison-Wesley, 2015
Larsen, Diana and Shore, James, “Your Path through Agile Fluency”, https://martinfowler.com/articles/agileFluency.html
Perri, Melissa, “The Product Kata”, https://melissaperri.com/blog/2015/07/22/the-product-kata
Boobier, Ant, “An Opportunity Canvas story” blog, https://nomad8.com/an-opportunity-canvas-story/
Boobier, Ant, “Product Experiment map template” blog, https://nomad8.com/making-product-development-habitual/