
Meta's surprising key to faster code reviews

Jenna Nobs
4 min read
Contents
How code reviews impact your team
The state of code reviews at Meta
Cutting review times while preserving quality
The results
Accelerate your dev team with ML & AI

Developers at Meta discovered the secret to faster code reviews in an unexpected way: binge watching Netflix.

Today, we'll lay out Meta's unusual approach to optimizing code reviews. But first, why did engineering put so much effort into driving down review times?

How code reviews impact your team

Code reviews are a critical step in the software development process. They ensure the quality of your source code by catching bugs, upholding coding standards, and sharing knowledge.

Most code reviews combine manual and automated processes. For example, developers might engage in peer reviews, where they examine each line of code for logic, code style, readability, maintainability, and adherence to project guidelines. Automated unit testing can be used to flag syntax errors and potential bugs in the codebase.

The benefits of code reviews can’t be overstated. However, like all processes that require human oversight, code reviews can take a long time. Reasons might include:

  • Complex changes that require thorough examination

  • High volume of pull requests

  • Insufficient or unclear review guidance

  • Inadequate code review tools

  • Reviewer experience level

Whatever the cause, inefficient code reviews are silent killers of productivity. The longer you have to wait for code reviews, the more likely you are to miss deadlines. 

All programmers want faster quality assurance, but speeding up reviews can have unintended consequences for the agile software development lifecycle. Reviewers who rush the code review process are more likely to miss bugs and approve substandard code. Over time, this can introduce significant vulnerabilities and necessitate refactoring.

With these complexities in mind, Meta decided to embrace the challenge: how do you cut review times without undermining code review best practices?

The state of code reviews at Meta

Meta's interest in driving down review times emerged from internal survey data.

In particular, Meta noticed a correlation between two survey metrics: "time in review" and "user satisfaction." The longer an engineer's slowest 25% of code changes (or diffs) took to review, the less satisfaction the engineer reported. This correlation persisted even when most of the engineer's reviews were completed promptly.

Meta predicted that decreasing "time in review" for the slowest 25% of diffs would boost engineering performance (as indicated by developers' self-reported satisfaction).

However, Meta needed to ensure that faster reviews didn't lead to "rubber stamping" – cursory approvals that diminish code quality. To avoid this negative side effect, Meta chose "eyeball time" (the amount of time reviewers spend looking at a diff) as the guardrail metric.

In other words, Meta set out to decrease "time in review" without decreasing "eyeball time."
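To make the two metrics concrete, here is a minimal sketch of how a team might compute them from its own review logs. The data, field names, and units below are hypothetical illustrations, not Meta's actual telemetry:

```python
from statistics import quantiles

# Hypothetical review log: (time_in_review_hours, eyeball_time_minutes)
# per diff. The values and schema are illustrative only.
reviews = [
    (2.0, 15), (3.5, 20), (1.0, 10), (26.0, 40),
    (48.0, 55), (5.0, 25), (30.0, 35), (4.0, 18),
]

def p75_time_in_review(reviews):
    """75th-percentile time in review -- the 'slowest 25%' threshold
    Meta set out to reduce."""
    times = [t for t, _ in reviews]
    # quantiles(n=4) returns the three quartile cut points; [-1] is P75.
    return quantiles(times, n=4)[-1]

def mean_eyeball_time(reviews):
    """Guardrail metric: average minutes reviewers spend looking at a diff.
    This should hold steady while time in review drops."""
    return sum(e for _, e in reviews) / len(reviews)

threshold = p75_time_in_review(reviews)   # target metric to drive down
guardrail = mean_eyeball_time(reviews)    # must not decrease alongside it
```

Tracking both numbers over time is what lets you tell genuine efficiency gains apart from rubber stamping: if the P75 threshold falls but the guardrail falls with it, reviews are getting faster for the wrong reason.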

Cutting review times while preserving quality

So how do you decrease total time in review without decreasing eyeball time?

Target the time a diff is waiting on reviewer action.

Code reviews take a lot of time and mental energy, especially when team members have to context switch between reviews and other tasks. Code reviews become more efficient when developers can enter a flow state and complete several in a row.

This is where Meta turned to Netflix for inspiration.

Engineers realized that entering a flow state for reviewing is similar to binge watching a Netflix show. The autoplay feature queues up episodes so that you can keep watching without lifting a finger. This seamless experience encourages you to watch more episodes than you might otherwise.

To create a similar flow state, Meta engineers came up with "Next Reviewable Diff" – kind of like autoplay for code reviews.  

Next Reviewable Diff uses machine learning to identify a "recommended next diff" for the current reviewer (i.e., a diff they're likely to want to review). This recommendation system considers criteria like:

  • Work hours

  • Awareness

  • File ownership information

The next diff automatically pops up when the reviewer completes their current review, just like a Netflix recommendation.

Of course, this system isn't perfect. That's why Meta engineers made it easy to decline recommended reviews and cycle through upcoming diffs to find the right fit.
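Meta's actual recommender is a learned ML model, but the recommend-and-decline loop described above can be roughly illustrated with a hand-weighted score over the same kinds of signals. Every name, weight, and field in this sketch is a hypothetical stand-in, not Meta's implementation:

```python
from dataclasses import dataclass

@dataclass
class Diff:
    diff_id: str
    reviewer_aware: bool         # has the reviewer already seen this diff?
    owned_files_fraction: float  # share of touched files the reviewer owns

def score(diff: Diff) -> float:
    """Higher score = more likely the reviewer wants this diff next.
    Weights are arbitrary; a real system would learn them from data."""
    s = 2.0 if diff.reviewer_aware else 0.0
    s += 3.0 * diff.owned_files_fraction
    return s

def next_reviewable_diff(queue, reviewer_in_work_hours, declined=frozenset()):
    """Recommend the top-scoring diff, like autoplay queuing the next
    episode. Declined diffs are skipped, letting the reviewer cycle
    through candidates; outside work hours, recommend nothing."""
    candidates = [d for d in queue if d.diff_id not in declined]
    if not reviewer_in_work_hours or not candidates:
        return None
    return max(candidates, key=score)

queue = [
    Diff("D1", reviewer_aware=False, owned_files_fraction=0.2),
    Diff("D2", reviewer_aware=True, owned_files_fraction=0.5),
    Diff("D3", reviewer_aware=False, owned_files_fraction=0.9),
]
best = next_reviewable_diff(queue, reviewer_in_work_hours=True)
runner_up = next_reviewable_diff(queue, True, declined={best.diff_id})
```

Declining the top recommendation simply re-runs the ranking over the remaining diffs, which is what makes "cycling through upcoming diffs" cheap for the reviewer.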

The results

Next Reviewable Diff had a significant impact on Meta's code review outcomes:

  • 17% overall increase in review actions per day

  • 44% more review actions per reviewer

Critically, the increase in review actions did not correlate with decreased eyeball time. Reviewers were more productive because they entered a flow state, not because they were less attentive to code changes.

Accelerate your dev team with ML & AI

Meta's Next Reviewable Diff is just one example of how Machine Learning and AI can optimize your team's workflows.

The key to Meta's success is a deep understanding of:

  1. ML and AI capabilities

  2. The importance of guardrails

Next Reviewable Diff doesn't try to replace developers or fully automate reviews. It reduces friction between code reviews so that developers can apply their expertise more efficiently. The point of this ML feature is to optimize software engineering’s most valuable resource: human discernment.

Generative AI skills are increasingly non-negotiable for development teams, regardless of your methodology. Even if you're not yet building AI features, the highest-performing teams leverage AI to work more efficiently.

To keep up with the rapidly evolving AI landscape, make sure your team has the skills and knowledge to:

  • Work directly with AI models

  • Build apps on AI tools

  • Write prompts that generate effective outputs for LLM tools

DevPath makes it easy to upskill developers in ML and AI, so you can futureproof your teams without slowing down current projects.

Upskill your team in Machine Learning & Gen AI

1000+ interactive courses, CloudLabs, and projects

DevPath by Educative. Copyright ©2025 Educative, Inc. All rights reserved.