
GitLab Code Review Best Practices in 2024


Code reviews are an essential part of the development process at GitLab. As our codebase grows more complex with thousands of developers contributing code, it’s crucial that we have consistent, efficient, and effective code reviews. This allows us to maintain velocity while upholding quality, security, readability, and knowledge sharing.

In this article, I’ll cover GitLab’s code review workflow, values, tools, and best practices to perform code reviews successfully in 2024.

Introduction

Code reviews involve a developer submitting their code changes as a merge request (MR), which then gets reviewed, approved, and merged by maintainers with the appropriate domain expertise.

The primary goals of code reviews are to:

  • Catch bugs, errors, and edge cases early.
  • Improve code quality and readability.
  • Verify that changes meet product and security requirements.
  • Spread knowledge and learning throughout the organization.

At GitLab, code reviews are mandatory for all code changes, no matter how small, to uphold our standards. Reviewing code is also a great way to coach and mentor developers in skills like security, testing, and programming languages.

Since code quality is cumulative, it’s essential that our reviews are consistently high quality. Poor code that gets merged will make improving GitLab harder over time. That’s why we’ve created a standardized code review process outlined in this article.

Best Practices for Everyone

For both the merge request author and reviewers, here are some overall guidelines to ensure effective and friendly code reviews.

Be Kind, Explicit, and Humble

  • Provide feedback tactfully, keeping things friendly and focused on the code, not the person. Saying “What do you think about doing X?” is better than demanding changes.
  • Be explicit in communications. Remember that tone and context can be lost online. Comments like “This code is dumb” can come across as overly harsh.
  • Be humble and avoid hyperbole. Saying “I’m not sure, let’s discuss” is better than “This will never work”.
  • Summarize any offline discussions in MR comments so others are kept in the loop.

Seek to Understand Others’ Perspectives

  • Consider tradeoffs the author may have already made that led to the current solution.
  • If you disagree with something, first seek to understand the reasoning behind it.
  • Identify aspects you feel strongly about versus minor preferences. Focus feedback appropriately.
  • Make suggestions politely in the form of questions. For example: “What do you think about doing X?”

Keep Feedback Focused on the Code, Not the Person

  • Critique code, not coding abilities. Assume good intent and intelligence in contributors.
  • Don’t take feedback personally. The review is of the code, not you as a person.
  • Discuss any overly harsh feedback privately to resolve misunderstandings.

Ask Questions, Don’t Make Demands

  • Ask the author if you don’t understand a piece of code. Chances are that others would also be confused.
  • After suggesting changes, clarify if they are recommendations or requirements.
  • Don’t assume your solution is the only way. Suggest alternatives politely.

Avoid Hyperbole and Sarcasm

  • Avoid exaggerated statements like “This will never work!” Even if a section needs heavy changes, be constructive.
  • Be careful with sarcasm. Inside jokes with longtime colleagues may seem exclusionary to new team members.

Having Your Code Reviewed

As the merge request author, you are the “driver” of getting your changes reviewed, approved, and merged. Here are some tips for a smooth process:

Perform a Thorough Self-Review First

Before requesting a formal review, take the time to self-review your own MR first:

  • Re-read every line of the diff. Does it make sense? Did you include anything unrelated?
  • Test the code locally – don’t rely on CI tests alone.
  • Write a clear description of the changes, including screenshots and examples.
  • Include comments on important decisions or complex sections that warrant explanation.
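A quick way to do that first pass is to diff your branch against the target branch before requesting a review. The scratch-repo sketch below (branch and file names are illustrative) shows how a stray file such as a leftover log jumps out in `--stat` output; in a real clone you would diff against `origin/main` or your project's default branch instead:

```shell
# Scratch demo: re-read your own diff the way a reviewer will see it.
git init -q -b main demo-selfreview && cd demo-selfreview
git config user.email you@example.com && git config user.name you
echo base > app.txt && git add . && git commit -q -m "initial"
git checkout -q -b my-feature
# A stray debug file sneaks into the commit alongside the real change
echo change > app.txt && echo stray > debug.log && git add . && git commit -q -m "wip"
git diff --stat main...HEAD   # quick scan for unrelated files like debug.log
git diff main...HEAD          # then re-read every changed line
```

The `--stat` summary makes unrelated files like `debug.log` easy to spot and drop before anyone else spends time on them.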

Keep MRs Small and Focused

  • Split large or complex changes into multiple MRs when possible. Huge MRs are hard to review thoroughly.
  • Keep the scope of each MR focused. Fixing multiple issues or mixing refactors with functional changes makes reviews difficult.

Rebase Locally and Push Fixes as Isolated Commits

  • As you address review feedback in iterations, rebase your changes locally and push fixes as isolated commits. Don’t squash everything until the end.
  • This allows reviewers to clearly see incremental progress rather than having to re-review the entire MR each time.
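The iterate-on-feedback flow above can be sketched in a scratch repo (branch names, file names, and commit messages are illustrative):

```shell
# Scratch demo: rebase onto the updated target branch, then commit each
# review fix separately so reviewers can diff just the new work.
git init -q -b main demo-rebase && cd demo-rebase
git config user.email you@example.com && git config user.name you
git commit -q --allow-empty -m "initial"
git checkout -q -b my-feature
echo v1 > feature.txt && git add . && git commit -q -m "add feature"
git checkout -q main                 # main moves ahead while the MR is in review
echo base > base.txt && git add . && git commit -q -m "unrelated change"
git checkout -q my-feature
git rebase -q main                   # replay the MR branch on the new main
echo v2 > feature.txt && git commit -q -am "address review: rename field"
git log --oneline                    # the fix is visible as its own commit
```

In a real MR you would then push with `git push --force-with-lease origin my-feature`, which is safer than plain `--force` because it refuses to overwrite remote commits you have not yet fetched.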

Request Re-Reviews from Earlier Reviewers

  • Ping earlier reviewers once you’ve addressed their suggestions to confirm before passing on to a maintainer.
  • They may catch additional aspects after seeing updates, so looping them back in is helpful.

Don’t Add TODOs to Code

  • Avoid adding TODO comments in the code itself unless explicitly requested by reviewers. Create issues instead for future improvements.
  • TODOs increase technical debt. We want to avoid merging these unless absolutely necessary.

Reviewing Code

As a reviewer, your role is to thoroughly analyze the changes before passing the MR on to a maintainer for final approval and merging.

Understand the Purpose and Approach

  • Talk with the author to understand what problem is being solved and why this approach was taken.
  • Get familiar with any associated issues or domain context needed to review effectively.

Be Thorough But Keep Feedback Kind

  • Try to perform a thorough review to minimize back-and-forth iterations. But keep wording friendly and constructive.
  • Focus on higher-level design, tests, correctness, error handling, edge cases, and security issues. Nit-picking formatting is less important.

Focus on Design, Tests, Errors, Edge Cases

  • Suggest ways to simplify logic and improve the overall design while still solving the problem.
  • Identify gaps in test coverage or existing tests that may no longer be valid.
  • Carefully inspect for potential errors, exceptions, or edge cases that could lead to bugs down the line.
  • Try to reproduce issues locally if possible to confirm fixes.

Use Conventional Comments for Clear Intent

  • Use conventional comments like nitpick, question, or suggestion to make your intentions clear.
  • For example: nitpick: Add a newline here for readability.
  • This helps the author prioritize which feedback is required versus optional.
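As a sketch, a set of labeled comments in this style (following the conventionalcomments.org convention; the labels and wording here are illustrative) might look like:

```
suggestion: Extract this into a helper so both callers share the validation.
question: Does this handle the case where the project has no default branch?
nitpick (non-blocking): Add a newline here for readability.
praise: Nice use of a guard clause here.
```

Marking a comment `non-blocking` tells the author explicitly that merging does not depend on addressing it.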

Check Linked Issues and Blockers

  • Verify linked issues and make sure nothing is blocking the MR from being merged.
  • Set MR dependencies on any other issues or MRs that must be resolved first.

Merging a Merge Request

As a maintainer, you are responsible for enforcing quality and consistency before allowing changes into the codebase.

Confirm Tests Pass and Requirements Are Met

  • Ensure all pipeline jobs are passing, including code quality checks. Fix warnings when possible.
  • Confirm with the author that the change meets its original requirements.

Start a Pipeline to Re-Test Right Before Merging

  • Start a new pipeline to re-test the MR against the latest target branch right before merging. This avoids hidden conflicts.
  • If tests don’t pass, work with the author to fix instead of merging broken code.

Use Squash Judiciously Based on Commit History

  • In general, avoid squashing commits unless the author agrees or the history is messy; squashing destroys the context each individual commit carries.
  • When you do squash, rewrite the final commit message to summarize all the changes rather than accepting a generic one.
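The squash-with-a-summary-message step can be sketched in a scratch repo (repo, branch, and file names are illustrative):

```shell
# Scratch demo: squash a reviewed branch into a single commit with a
# descriptive summary message instead of a generic one.
git init -q -b main demo-squash && cd demo-squash
git config user.email you@example.com && git config user.name you
git commit -q --allow-empty -m "initial"
git checkout -q -b feature
echo parser > parser.txt && git add . && git commit -q -m "add parser"
echo fix > fix.txt && git add . && git commit -q -m "fix edge case"
git checkout -q main
git merge --squash -q feature        # stages the combined diff, no commit yet
git commit -q -m "Add config parser with edge-case handling"
git log --oneline                    # one summary commit on top of initial
```

In GitLab itself the "Squash commits" checkbox on the MR does the equivalent; the point is to edit the resulting message so it summarizes the whole change.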

Take Over Any Revisions Needed After Setting Auto-Merge

  • Once a merge request is set to auto-merge, the maintainer should own any revisions needed after that point to unblock the MR.

GitLab-Specific Concerns

In addition to general code quality, GitLab reviewers and maintainers should watch for the following common areas of concern:

Test Database Queries at Scale

  • Any database changes should be tested for performance at GitLab.com scale before merging. Significant slowdowns can impact customers.

Categorize Migrations Properly

  • Database migrations must be correctly categorized as post-deployment or background migrations based on their expected runtime.

Don’t Make Backwards-Incompatible Sidekiq Changes

  • Don’t remove Sidekiq workers outright. Deprecate first and remove later once queues are drained to avoid job failures.

Change Cache Keys When Cached Data Changes

  • If the format of cached data changes, the cache keys must also change to avoid inconsistencies across versions.

Minimize Use of New Persistent Settings

  • Adding application settings requires extra care at large scales. Favor feature flags instead when possible.

Community Contributions

Reviewing external contributions requires additional diligence:

Review Thoroughly for Malicious Code

  • Scrutinize community MRs more closely, especially new dependencies which could introduce vulnerable packages.
  • Verify no malicious links or images have been added, especially in documentation.

Be Prepared to Take Over and Finish MRs

  • If a community MR goes stale, be ready to take it over or close it if unresponsive after a reminder.

Reviewer Tools and Process

GitLab provides tools to support efficient and consistent code reviews:

Use Danger Bot for Automated Checks

  • Danger bot runs on all MRs and leaves automated comments about violations of code quality, size, etc. Fix these first.

Follow the Code Owners Approval Process

  • Check CODEOWNERS rules so the right domain experts approve MRs modifying their areas of the codebase.
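As a sketch, a `CODEOWNERS` file routes approvals by path; the section names, paths, and group handles below are illustrative, and the section syntax follows GitLab's Code Owners feature:

```
# Require approval from the right domain experts per area
[Database]
/db/migrate/ @gitlab-org/database-team

[Documentation]
/doc/ @tech-writers

[Frontend]
*.vue @frontend-maintainers
```

When an MR touches files matching a rule, GitLab requires approval from the listed owners before it can be merged (given the appropriate protected-branch settings).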

Adhere to Review Response SLOs Based on Author

  • GitLab team members should respond to review requests within 2 days, while requests from the wider community are given 4 days.

Set Your Review Capacity Status Clearly

  • Set your GitLab status emoji (like :red_circle: for at capacity) so assigners can see if you have availability.

Check the Reviewer Roulette for Recommendations

  • Reviewer roulette suggests reviewers and maintainers to request based on code area and availability.

Conclusion

Performing effective code reviews is a crucial skill for GitLab engineers to maintain velocity and quality as we scale. By following these workflows and best practices, we can meet our goals of being an efficient, secure, and transparent organization. We must continue iterating on our code review process as GitLab evolves.

The practices outlined here require diligence and care from everyone involved. But the benefits are well worth the effort. Code reviews allow us to sustainably improve our codebase, prevent bugs, share knowledge, and uphold our standards.

Anand Das

Anand is Co-founder and CTO of Bito. He leads technical strategy and engineering, and is our biggest user! Formerly, Anand was CTO of Eyeota, a data company acquired by Dun & Bradstreet. He is co-founder of PubMatic, where he led the building of an ad exchange system that handles over 1 trillion bids per day.

Amar Goel

Amar is the Co-founder and CEO of Bito. With a background in software engineering and economics, Amar is a serial entrepreneur and has founded multiple companies including the publicly traded PubMatic and Komli Media.


This article is brought to you by Bito – an AI developer assistant.

