Tips for Rethinking Coding Assessments


Summary

Coding assessments are tests used during the hiring process to measure a candidate’s programming skills, but many traditional formats focus on trivia or artificial challenges rather than real-world abilities. Rethinking these assessments can help companies identify candidates who are stronger problem-solvers, communicators, and collaborators—qualities that matter most on the job.

  • Focus on real-world tasks: Choose exercises that reflect the practical challenges developers actually face, such as reviewing past projects or discussing system design.
  • Encourage collaboration: Include interactive activities like pair programming or technical discussions to gauge teamwork and adaptability.
  • Prioritize clear communication: Ask candidates to explain their thought process and decision-making so you can understand how they approach problem-solving.
Summarized by AI based on LinkedIn member posts
  • View profile for Mike Kyle

    Finding Elite Talent in HPC, Fintech, and Technology!

    6,562 followers

    “Would your own team pass the test?”

    A CTO I spoke with admitted they were struggling to hire senior engineers. Candidates kept dropping out after seeing their coding test. I asked, “Would your own team pass it?” His silence said everything.

    Many coding tests don’t reflect the real work engineers do daily. Instead, they test for algorithm trivia, unrealistic time constraints, or problems engineers would just Google on the job. So top candidates, especially experienced ones, opt out.

    Some of the best companies are ditching traditional coding tests for better alternatives like:
    ✔ Pair programming – solve a problem together in real time.
    ✔ Portfolio reviews – discuss past projects and decision-making.
    ✔ Code walkthroughs – review and refactor real-world code.
    ✔ Technical deep dives – whiteboard system design challenges instead.

    One of our clients replaced their coding test with a live technical discussion and saw interview completion rates jump by 50%.

    If candidates are avoiding your coding test, it’s not because they lack skills; it’s because the test isn’t worth their time. Would you hire your own team if they had to take your assessment? If not, it might be time for a change.

  • View profile for Harshit Sharma

    SWE • Google, Amazon • 75K+ @ LinkedIn • 150+ Interviews taken • Tech Interview Mentor • Story Teller

    77,468 followers

    After conducting 75 software engineer interviews at Google in under 7 months, I’ve seen the range of mistakes all of us make in coding interviews. Here’s a compiled list to help you (and me) avoid these pitfalls in our future interviews!

    1️⃣ Not Clarifying Requirements > Many candidates jump straight into coding, often without fully understanding the problem. This wastes time and leads to errors. Tip: Always ask clarifying questions to make sure you understand the requirements, and confirm edge cases and input constraints early on.

    2️⃣ Overcomplicating Solutions > In the heat of the moment, it is easy to overthink a problem, which complicates the solution for both you and your interviewer. Tip: Start with a brute-force approach (just explain it), then iterate towards an optimized version (code that one up). Easy-to-understand solutions get bonus points.

    3️⃣ Under-Communication > Interviews are not just about coding; they are also about conveying your thought process. Silence takes away the only help you have during the interview: your interviewer. Tip: Think out loud! Explain your reasoning and approach as you code. This helps interviewers understand you and even guide you if needed.

    4️⃣ Ignoring Edge Cases > Many candidates produce a working solution but fail to consider edge cases, which can lead to catastrophic failures. Tip: After arriving at a solution, always discuss potential edge cases and explain how your code handles them. This shows thoroughness.

    5️⃣ Neglecting to Optimize > Even if your solution works, failing to consider optimization can cost you points. Tip: After solving the problem, re-read your solution and discuss ways to improve time and space complexity. Skip micro-optimizations; interviewers appreciate candidates who think about efficiency in big-O terms.

    6️⃣ Skipping Dry Runs > More than 80% of candidates skip the dry run of their code, leading to overlooked mistakes. Tip: Walk through your code with sample inputs. This helps catch errors early and makes you look proactive.

    7️⃣ Getting Flustered > Interviews are stressful, and it is easy to panic if you hit a roadblock. Tip: If you’re stuck, ask for a minute or two to gather your thoughts, and ask for hints if necessary. Interviewers appreciate candidates who are willing to seek help.

    Those were my 2 cents on how to tackle coding interviews. But believe it or not, the best way to recognize your interview mistakes is to start conducting interviews yourself (even mock ones). After conducting so many interviews at Google, I realized how often I fell into the same traps as everyone else, like going completely silent or forgetting to do a dry run for the interviewer. Conducting interviews altered my perspective, and now I advise everyone preparing for interviews to conduct a couple of them first. Total game changer! #codingInterviews #jobPrep #softwareEngineering #Google #interviewTips
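Tips 2, 5, and 6 (brute force first, then optimize, then dry-run) can be made concrete with a small sketch. The problem below, "two sum" (find two indices whose values add to a target), is an illustrative choice of ours, not one from the post:

```python
def two_sum_brute_force(nums, target):
    """Brute force: check every pair. O(n^2) time, O(1) space."""
    for i in range(len(nums)):
        for j in range(i + 1, len(nums)):
            if nums[i] + nums[j] == target:
                return [i, j]
    return []  # edge case (tip 4): no valid pair exists

def two_sum_optimized(nums, target):
    """Optimized: trade space for time with a hash map. O(n) time, O(n) space."""
    seen = {}  # value -> index where it was seen earlier
    for i, x in enumerate(nums):
        if target - x in seen:
            return [seen[target - x], i]
        seen[x] = i
    return []

# Dry run (tip 6) on nums=[2, 7, 11, 15], target=9:
# i=0: 9-2=7 not yet seen -> seen={2: 0}
# i=1: 9-7=2 seen at index 0 -> return [0, 1]
```

Explaining the brute force out loud and then narrating the hash-map upgrade covers several of the tips in a single pass.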

  • View profile for Aimee Thompson

    Recruitment Consultant | Software Development Specialist | Championing Inclusive & Equitable Hiring | Launch Recruitment (Female‑Founded & Female‑Led)

    16,581 followers

    The smartest developer I’ve ever hired failed the coding test. Here’s why: the test didn’t measure what mattered most, namely problem-solving, creativity, and adaptability.

    The story: we had a candidate who blew us away during discussions and real-world problem-solving, but they stumbled on a timed whiteboard exercise. Instead of writing them off, we:
    - Reviewed their portfolio.
    - Discussed real-world challenges.
    - Conducted a pair programming session.

    The result? They not only thrived in the role but became a mentor, driving critical projects and elevating the entire team.

    This taught me a powerful lesson: traditional coding tests often fail to identify top talent. They focus on artificial environments, not real-world impact. Instead, we should consider:
    ✔️ Project-based tasks that replicate real challenges.
    ✔️ Portfolio reviews that highlight past achievements.
    ✔️ Collaborative exercises like pair programming.

    Now, over to you: do you think coding tests accurately measure a developer’s ability? Why or why not? Let’s discuss. #codingtests #softwaredevelopment #hiringtrends

  • View profile for Puneet Patwari

    Principal Software Engineer @Atlassian| Ex-Sr. Engineer @Microsoft || Sharing insights on SW Engineering, Career Growth & Interview Preparation

    67,779 followers

    7 general tips that helped me get 300% better results at coding interviews and flipped my results from rejections to success (Salesforce, Atlassian, Deliveroo, Uber). If you have a Leetcode-style round coming up, pay attention.

    [1] Start with constraints before you touch the keyboard
    - Ask 4 to 6 sharp questions on input size, value range, edge cases, and “impossible” scenarios.
    - Use those answers to rule out dumb ideas in your head instead of jumping into half-baked solutions.
    - Say it out loud: “Given these constraints, O(n²) will / will not fly, so I am thinking of pattern X.”

    [2] Always show the brute-force path first, then walk to the better one
    - Sketch the simplest correct idea in plain language, even if you would never code it in production.
    - Use it to introduce your data structures, invariants, and what “valid answer” means.
    - Then upgrade: “This works in O(n²). To make it scale, I will turn this into a sliding window / hash map / BFS.”

    [3] Name the pattern early so the interviewer can follow your mental model
    - Say what you see: “This feels like a sliding window problem” or “This is BFS on a grid with extra state.”
    - Anchor the discussion on that pattern so the interviewer knows you are not guessing randomly.
    - Reuse that pattern vocabulary while you code: “left and right pointer”, “current window state”, “queue of positions”.

    [4] Dry-run your idea on the example like a human interpreter
    - Before coding, walk through the example by hand: move pointers, grow and shrink windows, run BFS layers.
    - Say where your current state is stored: “target map”, “current frequency”, “distance matrix”, “matches vs needed”.
    - Use the dry run to catch missing pieces early, like duplicate counts, disconnected components, or off-by-one mistakes.

    [5] Code in small, logical chunks instead of one giant block
    - Set up core structures first: maps, counters, queues, and pointers, and say why each one exists.
    - Implement one clear loop or helper at a time, then mentally test it on part of the example before moving on.
    - Keep your names meaningful so you can talk through them: targetFreq, currentMatches, needs, distanceGrid.

    [6] Talk through trade-offs instead of just time complexity formulas
    - After you finish, do more than “O(n + m), O(1) space”. Say what actually makes it faster than the naive version.
    - Call out where it can break in real systems: duplicates, empty cases, disconnected components, large grids, long strings.
    - Mention alternatives briefly: “We could also do X, but it would recompute too much / use too much space.”

    P.S.: Say hi on Twitter: https://lnkd.in/g9H82Q98

    P.P.S.: Feel free to reach out to me if you’re preparing for a switch, want to chat about interview preparation, or how to move to the next level in your career: https://lnkd.in/guttEuU7
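As a sketch of points [3] and [5], here is the sliding-window vocabulary ("left and right pointer", "current window state") applied to one classic problem. The problem choice, longest substring without repeating characters, is ours, not the author's:

```python
def longest_unique_substring(s):
    """Length of the longest substring of s with no repeated characters."""
    last_seen = {}  # current window state: char -> most recent index
    left = 0        # left pointer of the window
    best = 0
    for right, ch in enumerate(s):  # right pointer expands the window
        # repeat inside the current window: jump left past its last occurrence
        if ch in last_seen and last_seen[ch] >= left:
            left = last_seen[ch] + 1
        last_seen[ch] = right
        best = max(best, right - left + 1)
    return best
```

Naming the pattern first ("this is a sliding window over s"), then introducing each structure as it is typed, is exactly the narration style the post recommends.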

  • View profile for Chandrasekar Srinivasan

    Engineering and AI Leader at Microsoft

    50,096 followers

    I’ve reviewed close to 2,000 code review requests in my career. At this point, it’s as natural to me as having a cup of coffee. Still, from senior engineer to engineering manager, I’ve learned a lot along the way. If I had to learn to review code all over again, this is the checklist I would follow (drawn from my experience):

    1. Ask clarifying questions:
       - What are the exact constraints or edge cases I should consider?
       - Are there any specific inputs or outputs to watch for?
       - What assumptions can I make about the data?
       - Should I optimize for time or space complexity?

    2. Start simple:
       - What is the most straightforward way to approach this?
       - Can I explain my initial idea in one sentence?
       - Is this solution valid for the most common cases?
       - What would I improve after getting a basic version working?

    3. Think out loud:
       - Why am I taking this approach over another?
       - What trade-offs am I considering as I proceed?
       - Does my reasoning make sense to someone unfamiliar with the problem?
       - Am I explaining my thought process clearly and concisely?

    4. Break the problem into smaller parts:
       - Can I split the problem into logical steps?
       - What sub-problems need solving first?
       - Are any of these steps reusable for other parts of the solution?
       - How can I test each step independently?

    5. Use test cases:
       - What edge cases should I test?
       - Is there a test case that might break my solution?
       - Have I checked against the sample inputs provided?
       - Can I write a test to validate the most complex scenario?

    6. Handle mistakes gracefully:
       - What’s the root cause of this mistake?
       - How can I fix it without disrupting the rest of my code?
       - Can I explain what went wrong to the interviewer?
       - Did I learn something I can apply to the rest of the problem?

    7. Stick to what you know:
       - Which language am I most confident using?
       - What’s the fastest way I can implement the solution with my current skills?
       - Are there any features of this language that simplify the problem?
       - Can I use familiar libraries or tools to save time?

    8. Write clean, readable code:
       - Is my code easy to read and understand?
       - Did I name variables and functions meaningfully?
       - Does the structure reflect the logic of the solution?
       - Am I following best practices for indentation and formatting?

    9. Ask for hints when needed:
       - What part of the problem am I struggling to understand?
       - Can the interviewer provide clarification or a nudge?
       - Am I overthinking this?
       - Does the interviewer expect a specific approach?

    10. Stay calm under pressure:
        - What’s the first logical step I can take to move forward?
        - Have I taken a moment to reset my thoughts?
        - Am I focusing on the problem, not the time ticking away?
        - How can I reframe the problem to make it simpler?
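Checklist item 5 ("use test cases") is easy to rehearse in code. A minimal sketch, using binary search as a stand-in problem of our choosing:

```python
def binary_search(nums, target):
    """Index of target in sorted nums, or -1 if absent."""
    lo, hi = 0, len(nums) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if nums[mid] == target:
            return mid
        if nums[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

# Edge cases from the checklist: empty input, single element,
# absent target, and targets at both boundaries.
assert binary_search([], 3) == -1
assert binary_search([3], 3) == 0
assert binary_search([1, 3, 5], 2) == -1
assert binary_search([1, 3, 5], 1) == 0
assert binary_search([1, 3, 5], 5) == 2
```

Writing the asserts before declaring the solution done is a compact way to show items 5 and 6 (test cases, graceful mistake handling) to an interviewer.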

  • View profile for Brijesh Deb

    Principal Consultant, Infosys · Founder, The Test Chat · I help organisations turn quality from a late testing conversation into a leadership discipline that protects revenue, reputation, speed, and trust.

    48,669 followers

    The way we hire testers today has taken a concerning turn. Interviews for SDETs and test automation specialists, even for core testing roles, are heavily focused on tools and coding skills, often sidelining the core testing mindset and skills. While automation is important, prioritizing it above all else in interviews leads to a dangerous imbalance. The result? Testers enter the workforce with weak testing foundations, and the quality of products inevitably suffers. Hiring someone who can write impeccable code but lacks the ability to think like a tester is akin to buying a high-speed car without a skilled driver: it might look promising, but it’s destined for a crash.

    What’s gone wrong?
    1. Overemphasis on Tools: Tools are enablers, not the end goal. Interviews dominated by tools and syntax fail to evaluate whether a candidate understands what to test, how to test, and why to test.
    2. Neglecting Testing Fundamentals: Many interviews gloss over essential testing skills like critical thinking, exploratory testing, root cause analysis, and risk assessment.
    3. Theoretical Overload: Questioning often revolves around textbook definitions or hypothetical scenarios that don’t reflect real-world challenges.

    This rut has produced a generation of testers who excel at operating tools but falter at actual testing.

    What needs to change? It’s time to redefine how we evaluate testing talent:
    1. Balance Coding with Testing: Yes, automation and coding are important, but they are not substitutes for testing skills. Interviews must equally assess the ability to think critically, question requirements, and design effective test strategies.
    2. Focus on Practical Assessments: Move away from theoretical Q&A. Instead, give candidates practical challenges. For instance:
    • Ask them to test a simple application, evaluate their approach, and observe their thought process.
    • Present a flawed requirement or user story and see how they identify gaps or inconsistencies.
    • Include a collaborative session where they discuss risks and test ideas with others.
    3. Assess the Tester’s Mindset: Evaluate their curiosity, attention to detail, and ability to uncover hidden assumptions. A strong tester thinks beyond the obvious.
    4. Educate Hiring Managers: It’s crucial to help decision-makers understand the true value of testing skills. A tester’s role goes beyond automation; it’s about safeguarding quality.

    The industry needs to step back and rethink how we define, evaluate, and value testing expertise. Testing is about uncovering the unknown, surfacing and highlighting risks, and questioning the obvious, not just automating steps. If we continue down the current path, we risk compromising the essence of testing and, ultimately, product quality. It’s time to prioritize hiring testers who can think critically, test creatively, and drive quality beyond the tools they know. #softwaretesting #softwareengineering #hiringtesters #brijeshsays

  • View profile for Neha Bhargava

    Senior Software Engineer | JavaScript, React, Angular, Node | WTM Ambassador

    36,235 followers

    After giving and conducting 100+ coding interviews over the past 12+ years at startups and MAANG+ companies, I’ve realized one thing: interviews aren’t about who writes the most code; they’re about who thinks in the most structured way.

    Here are key insights I’ve learned from sitting on both sides of the table, as an interviewee striving to prove my skills and as an interviewer evaluating candidates:
    - Communicate trade-offs clearly
    - Test your code with multiple cases
    - Start with a brute force, then improve
    - Clarify edge cases & constraints early
    - Know when to trade off time vs. space
    - Explain time & space complexity upfront
    - Know when to use BFS vs. DFS in graphs
    - Optimize only after correctness is ensured
    - Don’t overcomplicate: simple solutions win
    - Think out loud & show your thought process
    - Handle errors & unexpected inputs gracefully
    - Use a stack for problems with nested structures
    - Understand system design even at junior levels
    - Recognize patterns (many problems are variations)
    - Cache results for repeated calculations (memoization)
    - Understand when to use heaps & priority queues
    - Confidence matters (believe in your approach)
    - Master sliding window for subarray questions
    - Use binary search for optimization problems
    - Use modular functions for better readability
    - Know when recursion vs. iteration is better
    - Use meaningful variable & function names
    - Write clean, readable, and modular code
    - Divide & conquer for large problem sets
    - Refactor before finalizing your solution
    - Understand the problem before coding
    - Use two pointers for array problems
    - Use hash maps for quick lookups
    - Know how to debug efficiently

    At the end of the day, coding interviews aren’t about memorization; they’re about structured thinking. Which lesson do you wish you knew earlier?
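One item in the list above, "cache results for repeated calculations (memoization)", has a very short idiomatic form in Python. The climbing-stairs problem here is an illustrative choice, not one from the post:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def climb_stairs(n):
    """Ways to climb n steps taking 1 or 2 at a time.

    The naive recursion is O(2^n); the cache makes it O(n)
    by remembering each subproblem's answer.
    """
    if n <= 1:
        return 1
    return climb_stairs(n - 1) + climb_stairs(n - 2)
```

Mentioning the before/after complexity, as in the docstring, is one way to "explain time & space complexity upfront" while writing very little code.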

  • View profile for Cristina Guijarro-Clarke

    PhD Principal Bioinformatics Engineer | DevOps | Nextflow | Cloud | Leader | Mentor | Scientist

    7,533 followers

    Can we just stop and rethink the interview/recruitment process for #Bioinformaticians? No #recruitment process should ever need more than three #interviews, including the initial screening call. Putting that out there, but that’s not the focus here. There’s another important consideration: how bioinformaticians are being assessed at a technical level.

    For technical roles in bioinformatics, competency tests are often part of the process. For early-career roles, that can make sense; it’s a chance to see how someone approaches a problem and writes code under reasonable conditions. However, for senior bioinformatics engineers, data scientists, or computational biologists, standard coding tests are often totally meaningless. In this day and age, with AI tools widely available, a timed coding test proves little about ability or impact. Even if candidates are allowed to use AI, what exactly are we testing? Typing speed? Prompts? Memory under pressure? Writing code in isolation (including alone in front of an audience or panel) tells you nothing about how someone reasons through complexity, communicates, or builds for maintainability and reproducibility. The best engineers and scientists don’t simply code: they make design decisions, guide others, manage trade-offs, and think in systems.

    So what should be done instead? For senior roles, the focus should shift to thought process and team fit.
    🔹 Code review discussions – how do they evaluate quality, scalability, and clarity?
    🔹 Scenario-based conversations – how do they choose a tech stack or approach a complex data integration problem?
    🔹 Walkthrough of a real-world pipeline or data challenge
    🔹 Best-practice debates – what trade-offs do they consider, and why?

    These give genuine insight into how someone thinks, collaborates, and leads. As bioinformaticians progress in their #careers, raw coding ability matters less and less. What matters far more is how they apply their experience to drive projects forward, enable teams, and deliver reproducible, reliable science.

    Hiring managers, if you’re assessing senior+ level bioinformaticians with coding tests, please reconsider what you’re actually measuring. It’s likely not what truly matters. And quite frankly, for a candidate with years of #experience in industry, it can be quite insulting.

  • View profile for Rajya Vardhan Mishra

    Engineering Leader @ Google | Mentored 300+ Software Engineers | Building High-Performance Teams | Tech Speaker | Led $1B+ programs | Cornell University | Lifelong Learner | My Views != Employer’s Views

    114,202 followers

    Here are the 11 most actionable tips I can give you on approaching coding problems in technical interviews, after having interviewed 1000+ software engineers across Google, Paytm, Amazon, and various startups over the last 15+ years of my journey.

    Step 1: Start With Clarifying Questions
    |
    |__ ➝ Don’t rush into coding.
    |      Ask about edge cases, constraints, and input formats:
    |         – Can parameters be empty?
    |         – Are there duplicates?
    |         – Are inputs always lowercase?
    |         – What should I return if there’s no valid answer?
    |      These answers shape your approach and avoid rework.
    |
    v
    Step 2: Manual Walkthrough With Examples
    |
    |__ ➝ Use the given test cases.
    |      Draw out the example, underline or highlight key words, and manually reduce the problem.
    |      This helps you:
    |         – Find optimal substructures (e.g., shortest valid substring)
    |         – Catch mistakes before coding
    |      If you can “see” the answer by hand, you can code it more confidently.
    |
    v
    Step 3: Start Naive, Think Out Loud
    |
    |__ ➝ Always share your brute-force approach.
    |      Describe it step-by-step:
    |         – Use nested loops to anchor possible start/end indices
    |         – Check validity at each step
    |         – Keep track of the best result (length, indices)
    |      This shows the interviewer you understand the basics before optimizing.
    |
    v
    Step 4: Recognize Patterns Early
    |
    |__ ➝ Ask yourself:
    |         – Is there a window I can slide over the input?
    |         – Can I avoid redundant work using two pointers?
    |      If yes, transition to a sliding window approach.
    |      Don’t stick with brute force if a better pattern fits.
    |
    v
    Step 5: Build the Right Data Structures
    |
    |__ ➝ Use hash maps, not just sets.
    |      When frequency or duplicates matter, always track counts, not just presence.
    |      E.g., if a substring must contain all required words with their counts, you need a map for both “target” and “current window.”
    |
    v
    Step 6: Dry Run Your Optimized Approach
    |
    |__ ➝ Before you code, walk through your window logic by hand:
    |         – Expand the right pointer to include more words
    |         – Shrink the left pointer to minimize the window once all requirements are met
    |         – Update the best answer (start, end, length) as you go
    |      Keep track of when your window is valid and when it isn’t.
    |
    v
    Step 7: Implement, Then Tighten the Loop
    |
    |__ ➝ When you start coding:
    |         – Set up all maps and pointers first
    |         – Incrementally update your window
    |         – Always check: Did you match all targets? Can you shrink further?
    |      Use variables like minLength, bestStart, bestEnd to track answers.
    |
    v
    Step 8: Check Edge Cases (Empty/No Solution)
    |
    |__ ➝ Always handle what to return if there’s no valid solution.
    |      Don’t forget: if bestStart/bestEnd were never updated, return an empty string (or -1, depending on the problem).
    |
    v
    Continued in Comments ↓
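Steps 5 through 8 above can be sketched end to end. The code below is our reading of the walkthrough, using a character-level minimum-window problem for brevity (the post discusses words, but the map-plus-two-pointers shape is the same) and the post's variable names minLength, bestStart, bestEnd:

```python
from collections import Counter

def min_window(s, t):
    """Smallest substring of s containing every character of t, with counts."""
    target = Counter(t)        # Step 5: "target" map of required counts
    window = Counter()         # Step 5: "current window" map
    needed, matched = len(target), 0
    minLength, bestStart, bestEnd = float("inf"), -1, -1
    left = 0
    for right, ch in enumerate(s):       # Step 6: expand the right pointer
        window[ch] += 1
        if window[ch] == target[ch]:     # this char's requirement just met
            matched += 1
        while matched == needed:         # window valid: shrink from the left
            if right - left + 1 < minLength:
                minLength, bestStart, bestEnd = right - left + 1, left, right
            window[s[left]] -= 1
            if window[s[left]] < target[s[left]]:
                matched -= 1             # requirement broken; stop shrinking
            left += 1
    # Step 8: bestStart was never updated -> no valid window exists
    return "" if bestStart == -1 else s[bestStart:bestEnd + 1]
```

Note how the no-solution edge case falls out naturally from initializing bestStart to -1, exactly as Step 8 prescribes.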

  • View profile for Arslan Ahmad

    Author of Bestselling ‘Grokking’ Series on System Design, Software Architecture & Coding Patterns | Founder DesignGurus.io

    189,475 followers

    I’m often asked what to do if you can’t solve a coding problem after pondering it for 15-20 minutes. People often hit a wall while preparing for coding interviews. I did too. Here are my suggestions:

    👉 Practice actively: Instead of passively reading the explanation after 15-20 minutes, struggle with the problem a bit longer. This is where the learning really happens. If you can’t solve it, try to identify which part of the problem you find challenging. Is it the initial approach? Is it a tricky corner case? Once you’ve identified your weak point, you can focus on solutions to that specific issue.

    👉 Understand the concepts, not just the solutions: While it’s tempting to memorize solutions, interviewers are more interested in how you think and approach a problem. So focus on the underlying techniques, patterns, and algorithms. Once you deeply understand a concept, you’ll be able to apply it to a variety of questions.

    👉 Spaced repetition: Instead of reviewing all questions every day, use the spaced repetition technique. For example:
    1. Review a question you’ve solved today.
    2. Review it again in two days.
    3. If you solve it successfully, review it again in a week.
    4. If you still solve it successfully, review it again in two weeks.
    This technique will help you remember the approach over the long term.

    👉 Discuss with peers: Talking through your solution, or even your confusion, with someone else can be very beneficial. This could be in online forums, study groups, or with friends preparing for similar interviews. Explaining your thought process to someone else can help solidify your understanding.

    👉 Categorize problems: Many problems can be grouped into categories like sliding window. Once you’ve solved a few problems in a category, try to summarize the general approach that applies to that category. This way, when faced with a new problem, you can try to fit it into a known category and apply the corresponding techniques.

    👉 Mock interviews: Consider mock interviews with friends or using platforms that offer this service (check https://lnkd.in/gwrarnyD). This not only helps with problem-solving but also gets you comfortable with explaining your thought process.

    👉 Variation is key: Instead of solving similar problems repeatedly in a short span, try a mix. For instance, after two-pointer problems, move on to recursion, then sliding window, and then come back to two pointers. This cyclic variation helps cement your learning better.

    👉 Understand mistakes: Whenever you can’t solve a problem, instead of just reading the solution, ask yourself why you couldn’t solve it. Is there a pattern or concept you’re consistently missing? By recognizing your weak spots, you can focus on improving in those areas. #codinginterview #datastructures
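The spaced-repetition schedule described above (today, two days, a week, two weeks) can be sketched as a tiny scheduler. The doubling beyond the listed intervals is our own assumption, not something from the post:

```python
from datetime import date, timedelta

# Intervals from the post: review again in 2 days, then a week, then two weeks.
INTERVALS = [2, 7, 14]

def next_review(last_review, successes):
    """Date of the next review after `successes` successful reviews so far."""
    if successes < len(INTERVALS):
        days = INTERVALS[successes]
    else:
        # beyond the post's schedule: keep doubling (assumption)
        days = INTERVALS[-1] * 2 ** (successes - len(INTERVALS) + 1)
    return last_review + timedelta(days=days)

today = date(2024, 1, 1)
assert next_review(today, 0) == date(2024, 1, 3)   # two days later
assert next_review(today, 1) == date(2024, 1, 8)   # a week later
assert next_review(today, 2) == date(2024, 1, 15)  # two weeks later
```

A spreadsheet or flashcard app implements the same idea; the point is that review dates are computed from past successes rather than re-reviewing everything daily.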
