
TOWARDSDATASCIENCE.COM
Beyond the Code: Unconventional Lessons from Empathetic Interviewing
Recently, I’ve been interviewing Computer Science students applying for data science and engineering internships with a 4-day turnaround from CV vetting to final decisions. With a small local office of 10 and no in-house HR, hiring managers handle the entire process.
This article reflects on the lessons learned across CV reviews, technical interviews, and post-interview feedback. My goal is to help interviewers and interviewees make this process more meaningful, kind, and productive.
Principles That Guide the Process
Foster meaningful discussions rooted in real work to get maximum signal and provide transferable knowledge
Ensure applicants solve all problems during the experience
Judge excellence by how much inspiration arises unprompted
Make sure even unsuccessful applicants walk away having learned something
Set clear expectations and communicate transparently
The Process Overview
Interview Brief
CV Vetting
1-Hour Interview
Post-Interview Feedback
A single, well-designed hour can be enough to judge potential and create a positive experience, provided it’s structured around real-world scenarios and mutual respect.
The effectiveness of these tips will depend on company size, the rigidity of existing processes, and the interviewer's personality and leadership skills.
Let’s examine each component in more detail to understand how they contribute to a more empathetic and effective interview process.
Photo by Sven Huls on Unsplash
Interview Brief: Set the Tone Early
Link to sanitized version.
The brief provides:
Agenda
Setup requirements (debugger, IDE, LLM access)
Task expectations
Brief Snippet: Technical Problem Solving
Exercise 1: Code Review (10-15 min)
Given sample code, comment on its performance characteristics using Python and computer science concepts
What signals this exercise provides
Familiarity with IDE, filesystem and basic I/O
Sense of high performance, scalable code
Ability to read and understand code
Ability to communicate and explain code
No one likes turning up to a meeting without an agenda, so why offer candidates any less context than we expect from teammates?
Process Design
When evaluating which questions to ask, well-designed ones should leave plenty of room for expanding the depth of the discussion. Interviewers can show empathy by providing clear guidance on expectations. For instance, sharing exercise-specific evaluation criteria (which I refer to as “Signals” in the brief) allows candidates to explore beyond the basics.
Code or no code
Whether I include pre-written code or expect the candidate to write it depends on the time available. I typically reveal it at the start of each task to save time, especially since LLMs can often generate the code, as long as the candidate demonstrates the right thinking.
CV Vetting: Signal vs Noise
You can’t verify every claim on a CV, but you can look for strong signals.
Git Introspection
One trick is to run git log --oneline --graph --author=gitgithan --date=short --pretty=format:"%h %ad %s" to see all the commits authored by a particular contributor.
You can see what type of work it is (feature, refactoring, testing, documentation), and how clear the commit messages are.
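To make the introspection concrete, here is a self-contained sketch that builds a throwaway repository and then runs the log command described above; the author name, email, and commit messages are invented for illustration.

```shell
# Build a throwaway repo with two commits by a hypothetical author.
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.name="gitgithan" -c user.email="gitgithan@example.com" \
    commit -q --allow-empty -m "feat: add CSV loader"
git -c user.name="gitgithan" -c user.email="gitgithan@example.com" \
    commit -q --allow-empty -m "docs: explain data cleaning steps"

# One line per commit (short hash, date, subject), filtered by author.
git log --author="gitgithan" --date=short --pretty=format:"%h %ad %s"
```

The subject lines alone are often enough to classify the work (feature, refactoring, testing, documentation) and to judge the clarity of the commit messages.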
Strong signals
Self-directed projects or open-source contributions
Evidence of cross-functional communication and impact
Weak or Misleading signals
Guided tutorial projects are less effective in showing vision or drive
Bombastic phrasing like "passionate member" or "indispensable position".
Photo by Patrick Fore on Unsplash
Interview: Uncovering Mindsets
Reflecting on the Interview Brief
I begin by asking for thoughts on the Interview Brief.
This has a few benefits:
How conscientious are they in following the setup instructions? Are they prepared with the debugger and LLM ready to go?
What aspects confuse them? I realized I should have specified "Pandas DataFrame" instead of just "dataframe" in the brief. Some candidates without Pandas installed experienced unnecessary setup stress. However, observing how they handled this issue provided valuable insight into their problem-solving approach. It also highlights their attention to detail and how they engage with documentation, often leading to suggestions for improvement.
What tools are they unfamiliar with? If there's a gap in concurrent programming or AWS knowledge, it's more efficient to spend less time on Exercise 3 and focus elsewhere. If they've tried to learn these tools in the short time between receiving the brief and the interview, it demonstrates strong initiative. The resources they consult also reveal their learning style and resourcefulness.
Favorite Behavioral Question
To uncover essential qualities beyond technical skills, I find the following behavioral question particularly revealing:
Can you describe a time when you saw something that wasn’t working well and advocated for an improvement?
This question reveals a range of desirable traits:
Critical thinking to recognize when something is off
Situational awareness to assess the current state and vision to define a better future
Judgment to understand why the new approach is an improvement
Influence and persistence in advocating for change
Cultural sensitivity and change management awareness, understanding why advocacy may have failed, and showing the grit to try again with a new approach
Effective Interviewee Behaviours (Behavioural Section)
Attuned to personal behavior: both its effect on others and how it is affected by them
Demonstrates the ability to overcome motivation challenges and inspire others
Provides concise, inverted pyramid answers that uniquely connect to personal values
Ineffective Interviewee Behaviours (Behavioural Section)
Offers lengthy preambles about general situations before sharing personal insights
Tips for Interviewers (Behavioural Section)
I've never been a fan of questions focused on interpersonal conflicts, as many people tend to avoid confrontation by becoming passive (e.g., not responding or mentally disengaging) rather than addressing the issue directly. These questions also often disadvantage candidates with less formal work experience.
A helpful approach is to jog their memory by referencing group experiences listed on their CV and suggesting potential scenarios that could be useful for discussion.
Providing instant feedback after their answers is also valuable, allowing candidates to note which stories are worth refining for future interviews.
Technical Problem Solving: Show Thinking, Not Just Results
Measure Potential, Not Just Preparedness
Has high agency, jumps into back-of-the-envelope calculations instead of making guesses
Re-examines assumptions
Low ego: willing to reveal what they don't know and to make good guesses about why something is so, based on limited information
Makes insightful analogies (e.g. database cursor vs file pointer) that show deeper understanding and abstraction
Effective Interviewee Behaviours (Technical Section)
Exercise 1 (file reading with generators): admitting upfront their unfamiliarity with yield syntax invites the interviewer to hint that it's not important
Exercise 2 (data cleaning after a JOIN): caring about data lineage and the constraints of the domain (units, collection instrument) shows systems thinking and a drive to fix the root cause
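For context on the yield point in Exercise 1, here is a minimal sketch of generator-based file reading; the file contents and function name are invented for the demo.

```python
import os
import tempfile

def read_records(path):
    # `yield` makes this a generator: lines are produced lazily,
    # one at a time, so memory stays flat even for very large files.
    with open(path, encoding="utf-8") as f:
        for line in f:
            yield line.rstrip("\n")

# Demo on a small temporary file.
fd, path = tempfile.mkstemp(suffix=".csv")
with os.fdopen(fd, "w") as f:
    f.write("a,1\nb,2\n")

rows = list(read_records(path))
os.remove(path)
print(rows)  # ['a,1', 'b,2']
```

Nothing is read until the generator is iterated, which is the behaviour candidates are invited to reason about.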
Ineffective Interviewee Behaviours (Technical Section)
Remains silent when facing challenges instead of seeking clarification
Fails to connect new concepts with prior knowledge
Calls in from noisy, visually distracting environments, thus creating friction on top of existing challenges like accents.
Tips for Interviewers (Technical Section)
Start with guiding questions that explore high-level considerations before narrowing down. This helps candidates anchor their reasoning in principles rather than trivia.
Avoid overvaluing your own prepared “correct answers.” The goal isn’t to test memory, but to observe reasoning.
Withhold judgment in the moment, especially when the candidate explores a tangential but thoughtful direction. Let them follow their thought process uninterrupted. This builds confidence and reveals how they navigate ambiguity.
Use curiosity as your primary lens. Ask yourself, “What is this candidate trying to show me?” rather than “Did they get it right?”
Photo by Brad Switzer on Unsplash
LLM: A Window into Learning Styles
Modern technical interviews should reflect the reality of tool-assisted development. I encouraged candidates to use LLMs — not as shortcuts, but as legitimate creation tools. Restricting them only creates an artificial environment, divorced from real-world workflows.
More importantly, how candidates used LLMs during coding exercises revealed their learning preferences (learning-optimized vs. task-optimized) and problem-solving styles (explore vs. exploit).
You can think of these two dichotomies as sides of the same coin:
Learning-Optimized vs. Task-Optimized (Goals and Principles)
Learning-Optimized: Focuses on understanding principles, expanding knowledge, and long-term learning.
Task-Optimized: Focuses on solving immediate tasks efficiently, often prioritizing quick completion over deep understanding.
Explore vs. Exploit (How it’s done)
Explore: Seeks new solutions, experiments with various approaches, and thrives in uncertain or innovative environments.
Exploit: Leverages known solutions, optimizes existing strategies, and focuses on efficiency and results.
4 styles of prompting
In Exercise 2, I deleted a file.seek(0) line, causing pandas.read_csv() to raise EmptyDataError: No columns to parse from file.
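To make the planted bug concrete, here is a minimal reproduction using an in-memory buffer in place of the exercise's real file; the column names and values are invented.

```python
import io
import pandas as pd

buf = io.StringIO()
buf.write("timestamp,value\n2024-01-01,1\n2024-01-02,2\n")

# After writing, the cursor sits at the end of the buffer, so pandas
# sees no bytes to read and raises EmptyDataError.
try:
    pd.read_csv(buf)
except pd.errors.EmptyDataError as e:
    print(f"EmptyDataError: {e}")

# The deleted line: rewinding the cursor restores normal behaviour.
buf.seek(0)
df = pd.read_csv(buf)
print(len(df))  # 2
```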
Candidates prompted LLMs in 4 styles:
Paste error message only
Paste error message and erroring line from source code
Paste error message and full source code
Paste full traceback and full source code
My interpretations
(1) is learning-optimized, taking more iterations
(4) is task-optimized, context-rich, and efficient
Those who choose (1) start looking at a problem from the highest level before deciding where to go. They consider that the error may not even be in the source code, but the environment or elsewhere (See Why Code Rusts in reference). They optimize for learning rather than fixing the error immediately.
Those with poor error-reproduction discipline who choose (4) may not learn as much as those choosing (1), because they can't see the error again after fixing it.
My ideal is (4) for speedy fixes, combined with good notes along the way, so the root cause is understood and you come away with sharper debugging instincts.
Red Flag: Misplaced Focus on Traceback Line
Even though (2) included more detail in the prompt than (1), more isn't always better. In fact, (2) raised a concern: it suggested the candidate believed the line highlighted in the traceback (---> 44 df_a_loaded = pd.read_csv) was the actual cause of the error.
In reality, the root cause could lie much earlier in the execution, potentially in a different file altogether.
Prompt Efficiency Matters
After Step (2), the LLM returned three suggested fixes; only the third one was correct. The candidate spent time exploring Fix #1, which wasn't related to the bug at all. However, this exploration did uncover other quirks I had embedded in the code (NaNs sprinkled across the joined result from misaligned timestamps as the joining key).
Had the candidate instead used a prompt like in Step (3) or (4), the LLM would’ve provided a single, accurate fix, along with a deeper explanation directly tied to the file cursor issue.
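As an aside, the NaN quirk from using misaligned timestamps as the join key is easy to reproduce; the column names and readings below are invented.

```python
import pandas as pd

# Two sensor feeds whose first readings are one second apart.
a = pd.DataFrame({"ts": ["00:00:00", "00:01:00"], "temp": [20.1, 20.3]})
b = pd.DataFrame({"ts": ["00:00:01", "00:01:00"], "humidity": [55, 54]})

# An outer join on the raw timestamps matches only the exact-equal key,
# sprinkling NaNs across the unmatched rows.
joined = a.merge(b, on="ts", how="outer")
print(joined)
```

Spotting that the NaNs trace back to the join key, rather than patching them downstream, is exactly the root-cause instinct the exercise rewards.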
Style vs Flow
Some candidates added pleasantries and extra instructions to their prompts, rather than just pasting the relevant code and error message. While this is partly a matter of style, it can disrupt the session's flow, especially under time constraints or with slower typing, delaying the solution.
There’s also an environmental cost.
Photo by Anastasia Petrova on Unsplash
Feedback: The Real Cover Letter
After each interview, I asked candidates to write reflections on:
What they learned
What could be improved
What they thought of the process
This is far more useful than cover letters, which are built on asymmetric information, vague expectations, and GPT-generated fluff. Here's an example from the candidate who received an offer.
Excelling in this area builds confidence that colleagues can provide candid, high-quality feedback to help each other address blind spots. It also signals the likelihood that someone will take initiative in tasks like documenting processes, writing thorough meeting minutes, and volunteering for brown bag presentations.
Effective Interviewee Behaviours (Feedback Section)
Communicates expected completion times and follows through with timely submissions.
Formats responses with clear structure — using paragraph spacing, headers, bold/italics, and nested lists — to enhance readability.
Reflects on specific interview moments by drawing lessons from good notes or memory.
Recognizes and adapts existing thinking patterns or habits through meta-cognition
Ineffective Interviewee Behaviours (Feedback Section)
Submits unstructured walls of text without a clear thesis or logical flow
Fixates solely on technical gaps while ignoring behavioural weaknesses.
Tips for Interviewers (Feedback Section)
Live feedback during the interview was time-constrained, so give written feedback after the interview on how they could have improved in each section, along with learning resources.
If this is done independently of the interviewee's own feedback and the observations match, that's a strong signal of alignment.
It's an act of goodwill towards unsuccessful candidates, a building of the company brand, and an opportunity for lifelong collaboration.
Carrying It Forward: Actions That Matter
For Interviewers
Develop observation and facilitation skills
Provide actionable, empathetic feedback
Remember: your influence could shape someone’s career for decades
For Interviewees
Make the most of the limited information you have, but try to seek more
Be curious, prepared, and reflective to learn from each opportunity
People will forget what you said, people will forget what you did, but people will never forget how you made them feel – Maya Angelou
As interviewers, our job isn’t just to assess — it’s to reveal. Not just whether someone passes, but what they’re capable of becoming.
At its best, empathetic interviewing isn’t a gate — it’s a bridge. A bridge to mutual understanding, respect, and possibly, a long-term partnership grounded not just in technical skills, but in human potential beyond the code.
The interview isn’t just a filter — it’s a mirror. The interview reflects who we are. Our questions, our feedback, our presence — they signal the culture we’re building, and the kind of teammates we strive to be.
Let’s raise the bar on both sides of the table. Kindly, thoughtfully, and together.
Photo by Shane Rounce on Unsplash
If you’re also a hiring manager passionate about designing meaningful interviews, let’s connect on LinkedIn (https://www.linkedin.com/in/hanqi91/).
I’d be happy to share more about the exercises I prepared.
Resources
Writing useful commit messages: https://refactoringenglish.com/chapters/commit-messages/
Writing impactful proposals: https://www.amazon.sg/Pyramid-Principle-Logic-Writing-Thinking/dp/0273710516
http://highagency.com/
Glue work: https://www.noidea.dog/glue
The Missing Readme: https://www.amazon.sg/dp/1718501838
Why Code Rusts: https://www.tdda.info/why-code-rusts