
How to Properly Review a Merge Request in GitLab

Merge requests are a critical component of the GitLab workflow. They allow developers to propose changes to code repositories, get feedback from team members, and integrate approved changes into the main branch. As more teams adopt GitLab for version control, it’s important to follow best practices for reviewing merge requests in order to maintain code quality and enable effective collaboration. This comprehensive guide covers how to structure merge request reviews, provide constructive feedback, leverage automation, and customize practices based on your team’s needs. Properly reviewing merge requests in GitLab is key for upholding standards while empowering developers.

Establishing a Structured Review Process 

A structured review process sets clear expectations and optimizes the workflow for evaluating proposed code changes. Here are some best practices:

Create Standards for Reviews

  • Define a review checklist that covers important aspects like security, functionality, and coding conventions. This ensures a consistent quality bar.
  • Set expectations on turnaround time for reviews so developers aren’t left waiting indefinitely. For example, require reviewers to provide feedback within 1 business day.
  •  Require approvals from subject matter experts or team leads before merging non-trivial changes. This prevents unintended consequences.
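A checklist like this can live in a merge request description template so every MR starts with it. The path and items below are illustrative, not a prescribed standard:

```markdown
<!-- .gitlab/merge_request_templates/Default.md (hypothetical example) -->
## Review checklist
- [ ] Meets the stated requirements / acceptance criteria
- [ ] Adequate test coverage for new and changed code
- [ ] No security concerns (input validation, authorization, secrets)
- [ ] Follows the team coding conventions doc
- [ ] Approved by a subject matter expert or team lead (non-trivial changes)
```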

Optimize the Review Workflow

  • Integrate merge requests into the agile process by reviewing related MRs in each sprint. 
  • Limit the scope of changes in each MR to make reviews more manageable. For example, create separate requests for frontend and backend changes instead of combining them.
  • Prioritize urgent merge requests for critical issues or releases over non-critical changes. Review SLAs can reflect priority.

Leverage Automation

  • Configure branch protection rules to require reviews before merging and automatic closing of stale MRs.
  • Integrate security scanning tools like static analysis into pipelines to identify vulnerabilities. Fixes can be made prior to human review.
  • Automatically request reviews from owners of affected files based on code changes rather than manually assigning reviewers.
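Automatic reviewer assignment can be driven by a CODEOWNERS file; combined with the protected branch option to require approval from code owners, GitLab requests reviews from the listed owners whenever their files change. The paths and group names here are examples only:

```
# .gitlab/CODEOWNERS (paths and group names are illustrative)
/frontend/        @frontend-team
/backend/         @backend-team
/db/migrations/   @dba-lead
```

Similarly, static analysis can be added to the pipeline with GitLab's bundled SAST template by adding `include: - template: Security/SAST.gitlab-ci.yml` to `.gitlab-ci.yml`.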

Providing Effective Feedback

High-quality feedback on merge requests allows contributors to improve their code and catch defects before merging. Reviewers should:

Focus Reviews on Requirements and Tests

  • Verify proposed changes meet the intended requirements and acceptance criteria.
  • Ensure new and modified code has adequate test coverage. Request additional tests if needed.
  • Run integration and system tests locally before approving to catch issues early.

Provide Clear and Actionable Feedback 

  •  Be specific in describing any issues found and improvements needed. Provide line numbers.
  • Suggest alternative implementations, libraries, or approaches as warranted to improve the code. 
  • Use screenshots, annotations, code snippets, or links to documentation to illustrate feedback.
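GitLab diff comments also support suggestion blocks, which render the proposed replacement inline and let the author apply it with one click. In a comment on the offending line, you would write (the constant here is a made-up example):

````markdown
```suggestion
const RETRY_LIMIT = 3; // replace the magic number with a named constant
```
````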

Limit Style Critiques

  • Focus feedback on correctness and design rather than stylistic preferences.
  • Defer style guidance such as naming conventions and spacing to a coding standards doc.
  • Don’t block approval over minor style issues; address them in a follow-up MR.

Customizing Practices for Your Team

Merge request practices should be tailored to your team’s culture and needs. Here are some areas to focus on:

Set Review Expectations

  • Communicate standards for review quality and turnaround expected of all reviewers. 
  • Establish a mentorship program to train new reviewers on providing effective feedback.
  • Recognize reviewers who provide exceptionally thorough and constructive feedback.  

Adjust Automation to Your Needs 

  • Start with basic branch protections and add more automation gradually as needed.
  • Monitor results of automated checks and fine-tune rules and thresholds over time.
  • Balance automation with the value of human judgment; don’t over-automate.

Solicit Feedback and Improve Process   

  • Gather team feedback on the MR process to identify pain points. Is feedback helpful? Are reviews timely?
  • Implement changes to address issues raised. For example, reduce review workload by limiting MR scope. 
  • Continuously refine practices as the team grows. Re-survey periodically to check for new issues.


Here are some examples of providing effective feedback on a merge request:

// Vulnerable implementation: the id is concatenated directly into the SQL string
function getUser(id) {
  return db.query('SELECT * FROM users WHERE id = ' + id);
}

This directly exposes us to SQL injection attacks. Please use parameterized queries or an ORM.

// Improved implementation
const getUser = async (id) => {
  return UserModel.findById(id);
};

Much better: using an ORM prevents SQL injection vulnerabilities.

// Complex conditional logic
if (order.status === 'processing' && order.paymentMethod &&
    order.shipmentType !== 'pickup' && order.country === 'US' &&
    inventory > 0
  ) {
  // process order
}

Could we simplify this conditional for readability, perhaps by extracting parts of the condition into well-named variables or functions?


Properly reviewing merge requests is key for maintaining code integrity while enabling collaboration. By establishing structured workflows, providing constructive feedback, leveraging automation, and customizing practices based on team needs, organizations can realize huge benefits. Reviewers should focus on requirements, tests, and actionable feedback while avoiding bike-shedding. With a sound MR review process in place, teams can deliver higher quality, more reliable software.

Amar Goel

Bito’s Co-founder and CEO. Dedicated to helping developers innovate to lead the future. A serial entrepreneur, Amar previously founded PubMatic, a leading infrastructure provider for the digital advertising industry, in 2006, serving as the company’s first CEO. PubMatic went public in 2020 (NASDAQ: PUBM). He holds a master’s degree in Computer Science and a bachelor’s degree in Economics from Harvard University.
