How To Conduct Code Reviews Effectively

Explore top LinkedIn content from expert professionals.

Summary

Learning how to conduct code reviews helps software teams maintain high-quality code, catch hidden problems, and support better collaboration. A code review means checking someone else’s changes before they are finalized, making sure the updates are clear, reliable, and easy for others to handle in the future.

  • Communicate clearly: When submitting or reviewing code, always explain your reasoning, mark suggestions that are optional, and encourage open discussion to avoid confusion.
  • Break down changes: Submit small, logical updates with clear descriptions so others can focus and understand your intentions without being overwhelmed.
  • Think beyond bugs: Don’t just look for errors—ask how the code will work in real business scenarios, if it will be maintainable over time, and whether it could cause issues as the system grows.
Summarized by AI based on LinkedIn member posts
  • Sanchit Narula

    Sr. Engineer at Nielsen | Ex-Amazon, CARS24 | DTU’17

    100 lines of code: reviewed in 10 minutes. 1000 lines of code: reviewed never.

    Code reviews exist to catch bugs, improve maintainability, and help teams write better software together. But most engineers treat them like assignments to pass instead of collaborative checkpoints. That mindset kills the process before it starts.

    ➧ When you're submitting a PR:

    1. Keep it small
    Aim for 10-100 lines of code per pull request. Past 100 lines, reviewers start skimming. Past 500, they stop caring entirely. Large PRs are harder to review, take longer to approve, and make it nearly impossible to catch real bugs. Break your work into isolated, logical chunks. Yes, it's more work upfront. But it ships faster.

    2. Write a description
    Give context. Always. Your reviewer might be on a different team, in a different timezone, or new to the codebase. Don't make them guess what you're solving. If you're fixing a bug, explain what broke and link to the ticket. If it's a visual change, add before/after screenshots. If you ran a script that generated code, paste the exact command you used. Context turns a confusing diff into a clear story.

    3. Leave preemptive comments
    If part of your diff looks unrelated to the main logic, explain it before your reviewer asks. "Fixed a typing issue here while working on the main feature." "This file got reformatted by the linter, no logic changes." These small clarifications save back-and-forth and show you're thinking about the reviewer's experience.

    ➧ When you're reviewing a PR:

    1. Be overwhelmingly clear
    Unclear comments leave people stuck. If you're making a suggestion but don't feel strongly, say it: "This could be cleaner, but use your judgment." If you're just asking a question, mark it: "Sanity check, is this intentional? Non-blocking, just curious." Over-communicate your intent, especially with remote teams or people you don't know well.

    2. Establish approval standards with your team
    Decide as a team when to approve vs. block a PR. At Amazon and now at Nielsen, we approve most PRs even with 10+ comments because we trust teammates to address feedback. The only exception: critical bugs that absolutely can't go to production. Without clear standards, people feel blocked by style comments and approvals feel arbitrary. Talk to your team. Set the rules. Stick to them.

    3. Know when to go offline
    Some conversations don't belong in PR comments. If the code needs a major rewrite, if there's a design disagreement, or if you're about to write a paragraph, stop. Ping your teammate directly. Have a quick call. Save everyone time. Leave a comment like "Let's discuss this offline" so they know you're not ignoring it.

  • Sujeeth Reddy P.

    Software Engineering

    In the last 11 years of my career, I’ve participated in code reviews almost daily. I’ve sat through 100s of review sessions with seniors and colleagues. Here’s how to make your code reviews smoother, faster, and easier:

    1. Start with Small, Clear Commits
    Break your changes into logical, manageable chunks. This makes it easier for reviewers to focus and catch errors quickly.

    2. Write Detailed PR Descriptions
    Always explain the “why” behind the changes. This provides context and helps reviewers understand your thought process.

    3. Self-Review Before Submitting
    Take the time to review your own code before submitting. You'll catch a lot of your own mistakes and improve your review quality.

    4. Ask for Specific Feedback
    Don’t just ask for a “review”—be specific. Ask for feedback on logic, structure, or potential edge cases.

    5. Don’t Take Feedback Personally
    Code reviews are about improving the code, not critiquing the coder. Be open to constructive criticism and use it to grow.

    6. Prioritize Readability Over Cleverness
    Write code that’s easy to understand, even if it’s less “fancy.” Simple, clear code is easier to maintain and review.

    7. Focus on the Big Picture
    While reviewing, look at how changes fit into the overall system, not just the lines of code. Think about long-term maintainability.

    8. Encourage Dialogue
    Reviews shouldn’t be a one-way street. Engage in discussions and collaborate with reviewers to find the best solution.

    9. Be Explicit About Non-Blocking Comments
    Mark minor suggestions as “nitpicks” to avoid confusion. This ensures critical issues get addressed first.

    10. Balance Praise and Criticism
    Acknowledge well-written code while offering suggestions for improvement. Positive feedback encourages better work.

    11. Always Follow Up
    If you request changes or leave feedback, follow up to make sure the feedback is understood and implemented properly. It shows you’re invested in the process.

    P.S: What would you add from your experience?

  • Kai Krause

    VP Engineering & AI @ Speechify | 50M+ Users | I still ship code

    Code reviews aren't about finding bugs. If that's all you're doing in reviews, you've already lost.

    I've seen teams spend hours debating variable names and missing the actual problem: the code works, but nobody else can maintain it.

    Here's what code reviews actually catch:

    The junior engineer who hard-coded a feature that should be configurable. Not a bug. But you just locked yourself into technical debt that'll take 6 months to fix.

    The senior engineer who built something clever. Too clever. It works perfectly but breaks the moment someone else touches it.

    The architect who designed a system only they understand. No tests. No docs. Just "trust me, it works." Then they go on vacation and the system breaks.

    These aren't bugs. They're decisions that will hurt you later.

    Good code reviews ask different questions:
    • Can someone else debug this at 2am?
    • Will this still make sense in 6 months?
    • What happens when we scale 10x?
    • Are we building the right thing?

    Most teams optimize code reviews for finding syntax errors. Your IDE already does that. The real value is catching the decisions that look fine today but become disasters tomorrow.

    If your code reviews only find bugs, you're using them wrong.

    What's the worst "it works but..." code you've caught in review?

    #SoftwareEngineering #CodeReview #EngineeringLeadership

  • Chad Dalton

    Founder @ Cloud Beacon | D365 F&O + AI/Copilot for enterprise | Specializing in food & bev and consumer goods

    We were days away from D365 go-live. One code review question changed everything. 🚀

    The code looked clean. It passed the base technical review. Then someone asked: "What happens when a sales order has multiple allowances?"

    Silence.

    That one question in the deeper code review uncovered a bug that would have stopped order processing on day one.

    Most code reviews focus on the technical:
    - Are tts blocks handled correctly?
    - Is error handling in place?
    - Are we using set-based operations?
    - Are there performance implications?
    - Are there hardcoded values that should be parameters?

    These matter. But they're not enough. 🔍

    The deeper code reviews also challenge the code functionally:
    - Where could this break in real business scenarios?
    - What edge cases will users hit?
    - Does this actually solve the problem it was meant to?

    This is why technical resources who understand the business catch what others miss. We now include functional scenario questions in every code review.

    What's your code review process? Have you ever caught a critical bug just by asking the right question?

  • Gilad Naor

    Building something new

    3:47 AM on a Tuesday. My phone buzzes. PagerDuty alert. The system is down.

    I scramble to my laptop. Database connections maxed out. API timeouts everywhere. Users can't access the service. We get it back up. Block the offending caller. System stabilizes.

    The post-mortem hits differently. Two experienced engineers reviewed the PR. Tests passed. Code worked exactly as specified. But nobody asked one question: "How would someone abuse this?"

    That single question would have saved us. One line of code. Five minutes. Crisis prevented.

    Here's what I learned after years of causing (and fixing) production incidents: Code review isn't about what to check. It's about how you think.

    Most engineers do one of two things:
    • Rubber-stamp with "LGTM"
    • Spend hours arguing about formatting

    Both miss the real problems. I tried comprehensive checklists. Ran formal review sessions. Eventually everyone burned out. Then I found something that actually works.

    Three focused passes. Each with a different persona. Each asking different questions.
    Pass 1: Does it work and make sense?
    Pass 2: Can we live with this code in six months?
    Pass 3: How would I break this?

    I wrote the full breakdown of the three-pass system, including exactly what to look for in each pass and how AI can help. https://lnkd.in/ehSMw8ka

  • Chandrasekar Srinivasan

    Engineering and AI Leader at Microsoft

    if (!high_Quality_Code_Review) { code_Quality.suffers(); }

    The biggest challenge when it comes to code reviews is time. We all want to provide thoughtful feedback that improves the codebase without spending endless hours dissecting lines of code. The goal:

    – Deliver high-quality feedback.
    – Do it efficiently.
    – Minimize the back-and-forth between the author and reviewer.

    Here’s how to do it:

    1. Start with the Change Description
    Always start with the PR (Pull Request) description:
    ↳ Understand the Goal: What problem is this change solving? Why is it necessary now? Does it align with the product or technical vision?
    ↳ Key Design Insights: Look for architectural decisions and trade-offs. Is the proposed solution justified, or are there better alternatives?
    ↳ Clarity Check: If the PR description is vague or incomplete, ask the author to refine it. It’s better to clarify the intent than to misunderstand the implementation.

    2. Focus on the Interface First
    Next, move on to the interface, not the implementation.
    ↳ Abstraction: Does the interface present a clean abstraction? Is it intuitive for others to use? A good abstraction hides unnecessary details and provides a natural way for other parts of the system to interact with the component.
    ↳ Naming Conventions: Are the names of methods, classes, and variables clear and self-explanatory? Names should reflect their purpose without needing additional comments.
    ↳ Contracts: Does the interface define clear inputs, outputs, and side effects?
      - Inputs and Preconditions: What does the function or class expect?
      - Outputs and Postconditions: What does it guarantee?
      - Side Effects: Are there implicit changes, such as modifying global state?

    3. Review the Implementation and Tests Last
    Once the interface is solid, dive into the implementation. Here’s how to structure your review:
    ↳ Correctness: Does the implementation meet the intended functionality? Test it against the stated goals in the PR description.
    ↳ Edge Cases: Does the code handle unexpected inputs gracefully? What happens if something goes wrong (e.g., network issues, null inputs)?
    ↳ Readability: Can someone unfamiliar with the change understand the logic quickly? Is there proper indentation, modularization, and logical flow? Is the solution more complex than necessary?
    ↳ Efficiency: Does the code perform well under expected load? Are there unnecessary loops or expensive operations? Is memory or CPU usage optimized?
    ↳ Test Coverage Checklist:
      - Do the tests cover all important scenarios, including edge cases?
      - Tests should be simple and obvious. Avoid abstracting tests too much, even if it involves repetition.
      - If this change fixes a bug, ensure there’s a regression test to prevent it from resurfacing.
      - For changes with performance implications, validate them under real-world conditions.
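The contract review above (inputs and preconditions, outputs and postconditions, side effects) can be made concrete with a small sketch. This is an illustrative example only; the function name, units, and validation rules are hypothetical, not taken from the post:

```python
# Hypothetical example of a function with an explicit contract, the kind of
# interface a reviewer can check before reading any implementation details.

def apply_discount(price_cents: int, discount_pct: float) -> int:
    """Return the discounted price in cents.

    Preconditions:  price_cents >= 0 and 0.0 <= discount_pct <= 100.0
    Postconditions: 0 <= result <= price_cents
    Side effects:   none (pure function; no global state is modified)
    """
    if price_cents < 0:
        raise ValueError("price_cents must be non-negative")
    if not 0.0 <= discount_pct <= 100.0:
        raise ValueError("discount_pct must be between 0 and 100")
    return round(price_cents * (1 - discount_pct / 100))

print(apply_discount(1000, 25))  # → 750
```

Because the contract is written down, a reviewer can ask pointed questions ("what does a 100% discount guarantee?") and a regression test can pin the boundary values directly.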

  • Dhirendra Sinha

    SW Eng Manager at Google | Mentor | Advisor | Author | IIT

    9 code review practices your team should follow to go from Good → Great projects. (These helped my team deliver 100s of projects without wasting hours fixing bugs.)

    🟢 As a team:

    ➡️ Establish goals and expectations beforehand, for example:
    + functional correctness
    + algorithmic efficiency
    + improving code quality
    + ensuring code standards are met

    ➡️ Use code review tools (GitHub PRs, GitLab MRs, Atlassian Crucible) to:
    + easily track changes
    + streamline the review process

    ➡️ Automate code checks. This helps you:
    + find syntax errors
    + avoid common issues
    + reduce code style violations and potential bugs

    🟡 As a reviewer:

    ➡️ Start early, review often, to:
    + catch issues early
    + prevent technical debt
    + ensure that code meets project requirements

    ➡️ Keep reviews small and focused. You get:
    + an easier process
    + shorter turnaround time
    + better collaboration in the team

    ➡️ Balance speed and thoroughness:
    + do comprehensive reviews
    + but avoid excessive nitpicking

    ➡️ Give constructive feedback. Always:
    + be specific, actionable, and respectful
    + focus on improvement rather than criticizing
    + make space for open communication to answer questions and give clarifications

    🟠 As a reviewee:

    ➡️ Follow up on feedback:
    + don’t take the comments personally
    + actively work on feedback after the session
    + make necessary revisions and follow up to confirm

    ➡️ Follow coding standards, focusing on:
    + readability
    + maintainability

    Remember: mutual respect during code reviews is crucial for a great team culture!

    P.S: If you’re a Sr. Software Engineer looking to become a tech lead or manager, I’m doing a webinar soon. Stay tuned :)
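The "automate code checks" advice above usually means running linters in CI, but even a tiny custom check can keep mechanical issues out of human review. A minimal sketch, assuming you want to flag leftover top-level debug prints; the function name, pattern, and rule are illustrative (real teams would typically use established linters such as flake8 or ESLint):

```python
# Sketch of a custom automated check that flags likely leftover debug prints,
# so reviewers can spend their time on design and correctness instead.
import re

DEBUG_PATTERN = re.compile(r"^\s*print\(")  # matches lines that start a print call

def find_debug_prints(source: str) -> list[int]:
    """Return 1-based line numbers that look like leftover debug prints."""
    return [
        lineno
        for lineno, line in enumerate(source.splitlines(), start=1)
        if DEBUG_PATTERN.match(line)
    ]

sample = "x = compute()\nprint(x)  # forgot to remove\nreturn x\n"
print(find_debug_prints(sample))  # → [2]
```

A check like this would run in a pre-commit hook or CI job and fail the build when it finds matches, so the style conversation never reaches the PR thread.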

  • Mayank A.

    Follow for Your Daily Dose of AI, Software Development & System Design Tips | Exploring AI SaaS - Tinkering, Testing, Learning | Everything I write reflects my personal thoughts and has nothing to do with my employer. 👍

    Wish someone had told me this about code review etiquette 🚀 The goal is shipping reliable code, not winning arguments. 😊

    𝐀𝐬 𝐚 𝐑𝐞𝐯𝐢𝐞𝐰𝐞𝐫:

    𝐃𝐨'𝐬 ✅
    ◾ Review the code within 24 hours — blocking teammates kills productivity
    ◾ Start with "What problem is this code solving?"
    ◾ Look for security vulnerabilities first, then architecture, then style
    ◾ Ask questions instead of making accusations ("What's the reason for...?" vs "This is wrong")
    ◾ Suggest alternatives with code examples when possible
    ◾ Acknowledge good patterns and clever solutions

    𝐃𝐨𝐧'𝐭𝐬 ❌
    ◾ Don’t nitpick about style if there's an automated linter
    ◾ Don’t rewrite the code in your preferred style
    ◾ Never make it personal, critique the code, not the coder
    ◾ Don’t approve without actually reviewing
    ◾ Don’t block PRs for minor issues

    𝐀𝐬 𝐚 𝐂𝐨𝐝𝐞 𝐀𝐮𝐭𝐡𝐨𝐫:

    𝐃𝐨'𝐬 ✅
    ◾ Keep PRs small (under 400 lines when possible)
    ◾ Add context in the PR description (screenshots for UI changes)
    ◾ Self-review before requesting others
    ◾ Break down large changes into smaller PRs
    ◾ Respond to comments within one business day
    ◾ Add tests for new code
    ◾ Document non-obvious decisions

    𝐃𝐨𝐧'𝐭𝐬 ❌
    ◾ Don’t take feedback personally
    ◾ Don’t push back without explanation
    ◾ Don’t mark conversations resolved without addressing them
    ◾ Don’t submit PRs without testing locally
    ◾ Don’t expect instant reviews for massive changes

    You can add more, based on your experience. 👍

    But let me leave you with one final thought. Code reviews are still mostly manual, but the landscape around us is shifting fast. With tools like Cursor, Replit, Devin, and others, AI-generated code is becoming the norm. Teams are shipping faster, and the volume of code is growing. But our review processes haven’t caught up. And this gap is only going to widen.

    That’s why I find tools like Korbit AI interesting. Instead of reviewing PRs in isolation, it brings full codebase context into the review. It also helps engineering managers track things like security risks, code health, and developer insights, all of which get harder as AI-generated code scales. korbit.ai

  • Allen Holub

    I help you build software better & build better software.

    Last night, I was chatting in the hotel bar with a bunch of conference speakers at Goto-CPH about how evil PR-driven code reviews are (we were all in agreement), and Martin Fowler brought up an interesting point. The best time to review your code is when you use it.

    That is, continuous review is better than what amounts to a waterfall review phase. For one thing, the reviewer has a vested interest in assuring that the code they're about to use is high quality. Furthermore, you are reviewing the code in a real-world context, not in isolation, so you are better able to see if the code is suitable for its intended purpose. Continuous review, of course, also leads to a culture of continuous refactoring. You review everything you look at, and when you find issues, you fix them.

    My experience is that PR-driven reviews rarely find real bugs. They don't improve quality in ways that matter. They DO create bottlenecks, dependencies, and context-swap overhead, however, and all of that pushes out delivery time and increases the cost of development with no balancing benefit.

    I will grant that two or more sets of eyes on the code leads to better code, but in my experience, the best time to do that is when the code is being written, not after the fact. Work in a pair, or better yet, a mob/ensemble. One of the teams at Hunter Industries, which mob/ensemble programs 100% of the time on 100% of the code, went a year and a half with no bugs reported against their code, with zero productivity hit. (Quite the contrary—they work very fast.) Bugs are so rare across all the teams, in fact, that they don't bother to track them. When a bug comes up, they fix it. Right then and there.

    If you're working in a regulatory environment, the Driver signs the code, and then any Navigator can sign off on the review, all as part of the commit/push process, so that's a non-issue.

    There's also a myth that it's best if the reviewer is not familiar with the code. I *really* don't buy that. An isolated reviewer doesn't understand the context. They don't know why design decisions were made. They have to waste a vast amount of time coming up to speed. They are also often not in a position to know whether the code will actually work. Consequently, they usually focus on trivia like formatting. That benefits nobody.

  • Fatima Taj

    Senior Software Engineer at Yelp • LinkedIn Learning Instructor • I help software engineers go from offer → impact → promotion.

    TIPS FOR YOUR INTERNSHIPS AND NEW GRAD POSITIONS - 2024 EDITION

    There are some things you learn better once the roles are reversed: I learned the importance of a good pull request (PR) once I started reviewing them myself. Here is a checklist you can refer to:

    1. Getting your work reviewed doesn't shift the responsibility of catching issues to your reviewers. The prime responsibility of ensuring your work is defect-free and won't cause problems in prod is always on you, the author. The code review process is a guardrail, but don't treat it as a crutch: 'I'll have a senior engineer review my work, so I don't have to worry about testing the edge cases, they'll catch those.' This is the wrong mentality to create a PR with. If all your PRs involve reviewers pointing out edge cases, you're not doing your job diligently.

    2. Document your PRs properly. Provide context, and don't take this for granted. Just because someone reviews your PR doesn't mean they'll have the complete background. Include the WHAT, WHY, and HOW.
    WHAT: Provide background on the issue. Example: this PR fixes an uncaught exception (include details about the exception).
    WHY: Why is this fix necessary? Example: the fix prevents the app from crashing unexpectedly because of the uncaught exception.
    HOW: How is it fixed? Example: by encapsulating this block of code within a try-catch and logging the error.

    3. Add instructions on how to reproduce the error and verify the fix locally.

    4. For UI changes, including before and after screenshots can be helpful.

    5. Add tests! You'd be surprised how often this step is forgotten.

    6. Keep the PR small, so it's easy to review. The usual guideline is less than 250 lines of code per PR. If it's too large, break it down into multiple PRs.

    7. Review it first yourself. You'd be surprised by how many print statements you'll find that you forgot to clean up.

    8. Assign the right reviewers.

    9. Call out things you want to bring specific attention to, and cc specific people where relevant.

    10. You don't have to agree with every piece of feedback. If there's something you feel strongly about, feel free to discuss it. If the discussion is getting too long, consider switching to a different medium; my go-to is to jump on a quick call.

    11. Give people enough time to review large PRs. If you're planning on merging a big feature on Friday afternoon (which isn't a great idea to begin with), don't create the PR on Thursday evening. There can be exceptions to this rule, but rushed reviews should be avoided. In the worst case, keep in mind that your PR could be reverted, which is why keeping the PR detailed is necessary.

    Got any more suggestions? Drop them in the comments below!

    #softwareengineering #technology
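The WHAT/WHY/HOW example in the checklist above describes a fix that wraps failing code in a try-catch and logs the error. A minimal sketch of what that fix might look like; `parse_payload` and the JSON payload format are hypothetical stand-ins, not taken from the post:

```python
# Illustrative HOW from the checklist: wrap the call that previously raised an
# uncaught exception in a try/except, log the error, and return a safe default.
import json
import logging

logger = logging.getLogger(__name__)

def parse_payload(raw: str) -> dict:
    """Parse a JSON payload, returning {} instead of crashing on bad input."""
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        # Previously an uncaught exception that crashed the app;
        # now it is logged and handled gracefully.
        logger.error("Malformed payload: %r", raw)
        return {}
```

A PR description for this change would state WHAT (uncaught `JSONDecodeError`), WHY (the app crashed on malformed input), and HOW (the try/except plus logging shown here), along with a regression test feeding in the malformed payload.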
