SAFe®, Scrum, Kanban Share a Bottleneck and It’s Not What You Think
Even in today’s economic downturn, where U.S. job postings are down overall compared to 2019, tech hiring is showing signs of being more resilient than average. In fact, seven of the ten most in-demand tech jobs are for developers—and none are for QA.
That’s a bit alarming, because the Flow Metrics of several Fortune 100 companies now practicing value stream management (VSM) in their software portfolios have revealed an acute resource scarcity in QA. Precisely when the business is trying to improve market responsiveness and deliver faster with fewer resources, the flow of value creation grinds to a halt outside the gates of QA.
No one is immune to this finding: whether implementing SAFe®, Scrum, Kanban, or Waterfall, the fingerprints of insufficient QA resources are all over nearly every value stream.
The False Developer Quality Equivalency
Finding a product value stream’s bottleneck sounds easy in theory. Ask around and you’ll get plenty of opinions. Get together once a year for a value stream mapping exercise, and you’ll emerge with a list of potential improvement hypotheses.
But from all those hotspots, identifying the system constraint, the one big, juicy bottleneck that at this very moment is undermining the benefits of your optimization efforts? That’s virtually impossible without real-time systems data and analytics.
For that very reason, IT leadership often defaults to throwing more engineers at any problem. “Every time we were seeing results from a project, people wanted to add more developers. As a result, many teams have 18 developers and only one QA,” said Heather Munoz, Head of the Retail Brokerage Practice at E*Trade, on a recent podcast interview.
E*Trade isn’t alone. With nearly every company modernizing its software practices with agile, full-stack engineers (a myth in itself), test-driven development, and increased automation, many have come to believe QA is no longer the constraint it once was. As a result, QA capacity is often no longer taken into consideration when a team takes on work.
But reality tells a different story. Poor code quality ends up drowning teams in rework and pulling resources away from innovation work. Speed without quality creates rework that ultimately hurts time-to-market. “In the 17 transformations to agile at scale I’ve accompanied in my career, quality has always been at the root of the problem. If you deliver a product on-time but it has defects, then you have not delivered it on-time,” says Dan James, an agile and DevOps transformation coach at Icon Agility Services. “Companies gain speed by eliminating rework, which requires a test-first approach.”
What Does Under-Resourced QA Look Like?
The emergence of VSM solutions is helping CIOs replace intuitions and presumptions (like those about QA) with hard data and scientific methods. Flow diagnostics for software delivery, similar to crime scene forensics, can reveal the impediments to faster, better delivery.
Specifically, bottleneck diagnostics measure what’s causing the most impactful delays. VSM solutions integrate and analyze cross-tool value stream data in real-time and visualize all the workflows involved. The system bottleneck will appear in those visualizations as a clear outlier, characterized by the highest concurrent counts and longest durations.
In many of those cases, the outlier is QA. Work piles up in states like Dev Done, Dev Test Complete, or Ready for Test and stagnates in states like In QA.
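To make the idea concrete, here is a minimal sketch of this kind of bottleneck diagnostic: given a log of work-item state transitions, total up the time items spend in each workflow state and flag the state with the largest total as the likely constraint. The data, state names, and logic here are purely illustrative assumptions, not the Tasktop Viz data model or algorithm.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical work-item state transition log: (date, item_id, new_state).
# Illustrative data only -- not taken from any real value stream.
transitions = [
    ("2023-01-02", "A", "In Dev"),
    ("2023-01-04", "A", "Ready for Test"),
    ("2023-01-12", "A", "In QA"),
    ("2023-01-13", "A", "Done"),
    ("2023-01-03", "B", "In Dev"),
    ("2023-01-05", "B", "Ready for Test"),
    ("2023-01-15", "B", "In QA"),
    ("2023-01-16", "B", "Done"),
]

def time_in_state(log):
    """Return total days each work item spent in each state."""
    per_item = defaultdict(list)
    for date, item, state in log:
        per_item[item].append((datetime.fromisoformat(date), state))
    totals = defaultdict(float)
    for events in per_item.values():
        events.sort()
        # Time in a state = gap between entering it and the next transition.
        for (t0, state), (t1, _) in zip(events, events[1:]):
            totals[state] += (t1 - t0).days
    return dict(totals)

totals = time_in_state(transitions)
bottleneck = max(totals, key=totals.get)
print(totals)      # {'In Dev': 4.0, 'Ready for Test': 18.0, 'In QA': 2.0}
print(bottleneck)  # 'Ready for Test' -- work is queuing ahead of QA
```

Note that in this toy data the bottleneck surfaces not as time spent *in* QA but as work stagnating in the queue *ahead of* it, which is exactly the pattern described above.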
“I couldn’t get scrum teams to understand how disturbing the lack of QA resources was until we got real-time Flow Metrics in place. I had to bring in visualization to show product folks and my management the value of QA,” says Munoz.
Data Points to More Investment in QA, Not Development
For CIOs and senior IT leaders, it’s tremendously hard and stressful to make budgeting decisions and resource allocation calls across a vast portfolio of digital products and services. And nothing is more frustrating than funding a strategic initiative, only to see it hit the market six months too late because it couldn’t be tested adequately and on-time.
“Most of my management believed developers solved all your problems. Having so many years of working as a development leader, I understand that my QA team are the ones that prevent us from having an embarrassing moment and more importantly can tell me whether we’re putting out good code or bad code,” said Munoz.
Development teams can and should be scaled up — that much is true — but only when accompanied by a proportionate investment in QA. The cost of delay can only be conquered with the right prioritization of test engineers, test automation suites, and modern, self-service test environments.
Real-time Flow Metrics with Tasktop Viz™
Many fleet-footed IT leaders are already reaping the benefits of measuring and improving value stream flow in real-time with out-of-the-box Flow Metrics in Tasktop Viz:
- A healthcare leader doubled its feature Flow Velocity, introducing new business capabilities faster than ever
- A telecom giant was able to change its outsourcing strategy and renegotiate better terms with its managed service partners
- An online investment firm discovered the system bottleneck impeding digital channel innovation and began working to dismantle it
- IT and business consulting firm CGI unlocked capacity equivalent to over 7,500 days per year
Speak to us today if you want to learn more about how Tasktop Viz can help you baseline your software portfolio’s performance against key business results to begin your data-driven continuous improvement journey.