From 22 to 38 Engineers: How One Team 10x'd Output in 6 Months [Case Study]

February 26, 2026 · 11 min read · By Bill Parker

The Problem: Growing Without Getting Faster

Twenty-two engineers across three teams: frontend, backend, and platform. On paper, the organization was healthy. Hiring was on track. Code was shipping. But something wasn't adding up.

New engineers took three or more months to become productive. Code reviews were backed up in a queue that grew longer every sprint. Knowledge was siloed — the platform team didn't know what frontend was building, and nobody could tell which engineers were actually carrying the load.

The metrics told a story that felt wrong:

  • 397 commits/month across the organization (~18 per person)
  • 710 code reviews/month (~32 per person, but heavily concentrated)
  • PR cycle time: 4–5 days on average
  • Team satisfaction: 6.8 out of 10, trending downward

The engineering director's assessment captured it well: they were growing, but it didn't feel like growth. Something was invisible.

The Diagnosis: Measuring the Wrong Things

The root cause wasn't effort — it was visibility. Like most engineering organizations, this team was measuring commits to the main branch and calling that "productivity." But main branch commits are the end of the pipeline, not the pipeline itself.

When they audited where work was actually happening, the picture shifted dramatically. Feature branches contained 3x more activity than main. Junior engineers were shipping meaningful work on branches that never surfaced in their analytics tools. Three senior engineers were shouldering 70% of all code reviews — a bottleneck that was throttling the entire team's throughput.
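You can run a crude version of this audit on any repository. Here's a minimal sketch in Python, assuming a local clone with a branch named main and the git CLI on your PATH (our illustration, not the team's actual tooling):

```python
# Crude audit: how much commit activity never reaches main?
# Assumes a local clone with a branch named "main" and git on PATH.
import subprocess

def count_commits(*rev_args: str) -> int:
    """Count commits reachable from the given rev-list arguments."""
    out = subprocess.run(
        ["git", "rev-list", "--count", *rev_args],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.strip())

main_only = count_commits("main")
all_refs = count_commits("--all")  # every branch and tag, not just main

print(f"main only:    {main_only}")
print(f"all branches: {all_refs}")
print(f"ratio:        {all_refs / main_only:.1f}x")
```

On the numbers above, that ratio would have come out near 4x: main, plus three times as much activity again on feature branches.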

The measurement system was lying to them. Not maliciously — but by omission.

We've written about why main-branch-only measurement misses 80% of work.

The Intervention: Three Changes, Run in Parallel

This team didn't make one big bet. They made three simultaneous changes, each reinforcing the others.

Change 1: All-Branch Visibility

They switched from main-branch-only analytics to all-branch telemetry using Warclick. Instead of seeing commits that reached production, they could now see commits, PRs, and reviews across every branch in every repository.

The immediate discovery was staggering. Work they'd never been able to quantify — feature branch development, experimental prototypes, infrastructure changes, code review mentorship — suddenly had numbers attached to it. Junior engineers who appeared inactive in their old tools were, in reality, some of the most prolific contributors on feature branches.

Immediate actions taken:

  • Redistributed code review load based on actual workload.
  • Onboarded new junior engineers into active feature branches instead of isolated starter projects.
  • Made team specializations explicit, using the data to prevent single points of failure.
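The review-redistribution step is easy to sanity-check once the data exists. A hypothetical sketch, with reviewer names and counts invented for illustration:

```python
from collections import Counter

# Hypothetical monthly review counts per engineer, pulled from
# your Git host's API or an analytics export.
reviews = Counter({
    "alice": 210, "bob": 180, "carol": 110,  # three seniors
    "dave": 45, "erin": 30, "frank": 25,
})

total = sum(reviews.values())
top3 = sum(count for _, count in reviews.most_common(3))
print(f"top-3 reviewer share: {top3 / total:.0%}")  # 83% in this example

# One simple rebalancing rule: flag anyone above 2x the mean load.
cap = 2 * total / len(reviews)
for name, count in reviews.items():
    if count > cap:
        print(f"{name} is overloaded: {count} reviews (cap {cap:.0f})")
```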

Change 2: Developer Leaderboards

With complete data in hand, they turned on Warclick's leaderboard — the Clan Table. Every engineer could see their contributions ranked alongside their peers, across all work types: commits, reviews, PRs merged, and coding days.
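Warclick doesn't publish how the Clan Table weighs these work types, so treat the following as a hypothetical composite score rather than the product's actual formula. The idea it illustrates is the important part: reviews and merged PRs count alongside raw commits, so reviewers and mentors don't rank last for writing less code.

```python
from dataclasses import dataclass

# Assumed weights (our invention): reviews and merged PRs count for
# more than raw commits, so review work isn't invisible.
WEIGHTS = {"commits": 1.0, "reviews": 2.0, "prs_merged": 3.0, "coding_days": 1.5}

@dataclass
class Engineer:
    name: str
    commits: int
    reviews: int
    prs_merged: int
    coding_days: int

    def score(self) -> float:
        return sum(WEIGHTS[field] * getattr(self, field) for field in WEIGHTS)

team = [
    Engineer("alice", commits=40, reviews=70, prs_merged=12, coding_days=20),
    Engineer("dave", commits=55, reviews=8, prs_merged=9, coding_days=18),
]
for rank, eng in enumerate(sorted(team, key=Engineer.score, reverse=True), 1):
    print(f"{rank}. {eng.name:<8} {eng.score():6.1f}")
```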

The effects showed up within the first month. Visible commit activity increased 150%. Code review participation broadened — junior engineers who had never reviewed code before started volunteering because they wanted to climb the board. Knowledge sharing spiked as engineers asked top performers how they worked.

But the most telling metric was retention. Over the six-month observation window, the team lost zero engineers. Historically, they'd expected 3–4 departures in that timeframe.

The psychology behind why leaderboards drive retention and motivation.

Change 3: AI Adoption Tracking

Warclick's AI detection revealed that 88.5% of active engineers on the team were already using AI coding tools — primarily Claude, which accounted for 85.9% of AI-assisted commits. More importantly, 90.1% of new code had some level of AI involvement.
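Warclick's detection method isn't public, but you can compute a crude proxy yourself: the share of commits carrying an AI co-author trailer. Claude Code, for example, adds a Co-Authored-By trailer to its commits by default. A sketch under that assumption:

```python
import subprocess

# Markers to look for in commit messages; extend for other tools.
# This is our assumed proxy, not Warclick's actual detection method.
AI_MARKERS = ("co-authored-by: claude",)

# %B = full commit message, %x00 = NUL separator between commits.
raw = subprocess.run(
    ["git", "log", "--all", "--format=%B%x00"],
    capture_output=True, text=True, check=True,
).stdout
messages = [m for m in raw.split("\x00") if m.strip()]

ai_assisted = sum(
    any(marker in msg.lower() for marker in AI_MARKERS) for msg in messages
)
print(f"AI-assisted commits: {ai_assisted}/{len(messages)} "
      f"({ai_assisted / len(messages):.1%})")
```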

This data shifted the conversation from "should we adopt AI?" to "how do we optimize the AI workflow we already have?" Junior engineers were adopting AI tools twice as fast as seniors, and the new hires leaning on those tools were reaching productivity in 6 weeks instead of 3 months.

We measured AI adoption across hundreds of engineers — here's the complete data.

Want to see numbers like these for your team? Warclick captures every commit, review, and PR across all branches — plus AI adoption data. See your real engineering output in 30 minutes. Start Your Free 7-Day Trial.

The Results: 6-Month Before and After

Here's the full dataset. None of this is cherry-picked — you're looking at the complete picture across the observation window.

Volume and Scale

Metric                Q4 (Before)   Q2 (6 Months After)   Change
Team size             22            38                    +72.7%
Commits/month         397           4,028                 +914.9%
Code reviews/month    710           3,783                 +433.5%
Lines shipped/month   121K          2.89M                 +2,183.8%
PRs merged/month      276           900                   +226%
Active contributors   12            38                    +216%

A note on the commits number: the jump from 397 to 4,028 doesn't mean engineers suddenly started working 10x harder. Much of that increase came from visibility — work that was always happening on feature branches but was never counted before. The real productivity gain, once you normalize for the measurement change, is still substantial — but the headline number reflects a measurement correction as much as a behavior change. Transparency about this matters.
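A quick back-of-envelope makes the split concrete, using the audit's finding that feature branches carried 3x the activity of main (so true pre-change volume was roughly 4x the main-only count):

```python
# Headline vs. normalized growth, using numbers from this case study.
before_main_only = 397       # commits/month visible before
before_true_total = 397 * 4  # ~1,588, if branches held 3x main's activity
after_total = 4_028

print(f"raw headline growth:  {after_total / before_main_only:.1f}x")   # ~10.1x
print(f"visibility-adjusted:  {after_total / before_true_total:.1f}x")  # ~2.5x
print(f"per engineer (22 -> 38 heads): "
      f"{(after_total / 38) / (before_true_total / 22):.2f}x")          # ~1.47x
```

Even on the conservative per-engineer view, output per head rose roughly 47% in six months, on top of the headcount growth and the measurement correction.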

Velocity and Quality

Metric                      Before     After
PR cycle time               4–5 days   2–3 days
Deployment frequency        2x/month   3x/week
Code review participation   32%        68%
Defect rate                 Baseline   Flat (no regression)

Team Health

Metric                        Before         After
Avg daily logins              2.6            4.5 (+75.8%)
Team satisfaction             6.8/10         8.4/10
Departures (6-month window)   3–4 expected   0 actual
New hire ramp-up time         3 months       6 weeks

Why It Worked: The Reinforcing Loop

These three changes weren't independent — they created a compounding cycle.

Visibility Reveals Capacity

When you can see work across all branches, you discover how much work was actually being done, who has real capacity vs. who's overloaded, and which teams are bottlenecked. This team stopped hiring based on gut feeling and started hiring based on data.

Recognition Drives Motivation

Leaderboards sound like they should create toxic competition. The opposite happened here. Engineers saw their peers' contributions for the first time and responded with curiosity, not jealousy. Recognition led to knowledge sharing. Competitive drive was channeled into helping the team.

Recognition drives retention more than compensation when the recognition is fair. And fairness requires complete data.

AI Accelerates Everything

With 88.5% of active engineers using AI tools, and Claude behind the large majority of AI-assisted commits, the friction that slows down traditional engineering teams largely disappeared. Junior engineers could ask Claude before asking a senior teammate. Async work became viable. Onboarding compressed from months to weeks.

AI didn't replace engineers. It removed the friction between their intention and their output.

What This Means for Your Team

This case study comes from a team that was already functional. They weren't in crisis. They weren't failing. They were just growing — and their tools couldn't keep up with the complexity of that growth.

Most teams try to scale by hiring more people without improving visibility, adding process without adding clarity, and measuring metrics that only capture a fraction of the work. The result is typically modest growth paired with rising burnout.

This team's approach was different: visibility gave them clarity, recognition gave engineers motivation, and AI leverage gave everyone the tools to be effective faster. The result: 10x output growth with a culture that got stronger, not weaker.

How to Run This Playbook Yourself

  1. Switch to all-branch analytics. If your current tools only measure main branch, you're making decisions based on incomplete data. Start with a free Warclick trial and compare what you see to what you thought was happening.
  2. Introduce leaderboards with context. Share the data with your team before you turn on rankings. Build trust first.
  3. Measure AI adoption, then optimize it. Don't guess whether your team is using AI tools — measure it.

We built a detailed 30-day playbook for rolling this out step by step.

See your team's real numbers. Warclick gives you all-branch visibility, developer leaderboards, and AI adoption tracking — starting at $4/warrior/month. Setup takes 30 minutes. Start Your Free Trial.
Bill Parker

Founder & CEO

Engineering leader and founder of Warclick. Helping teams measure what actually matters since 2021.

Ready to see what your team is really building?

Connect your GitHub in 30 seconds. See real data in minutes.

Schedule a Call