
Voice AI · Workforce Engagement

The Conversation Nobody's Having Anymore

The stay interview disappeared. The exit interview became a checkbox. The automation wave that erased them has quietly handed us the tool to bring them back.

There was a time when someone actually sat down with you. Not to review your performance metrics or walk you through a benefits renewal. Just to ask: How are things going? What's making your job harder than it needs to be? What would make you stay?
That conversation had a name. The stay interview. And for organizations that did it well — really did it, not as a scripted HR exercise but as a genuine exchange — it was one of the most valuable instruments in the entire talent management toolkit. It surfaced things that surveys never would. It caught people before they mentally checked out. It told you, in plain language, what was actually happening inside the organization.
Then the budgets got cut. Then the restructures came. Then HR got stretched across twice the headcount with half the bandwidth. And somewhere in that slow erosion, the stay interview quietly disappeared from most organizations' calendars. So did the meaningful exit interview. So did the kind of informal check-in that a good manager used to do simply because they were present enough to notice when something was off.
What replaced them? A survey. Sent annually. With a five-point scale and a progress bar. The kind of instrument that every employee learns to fill out the same way, every year, because they filled it out honestly the first time and watched nothing change.

The Survey Was Never the Point

I spent years designing engagement initiatives inside large, distributed healthcare organizations. Not from an HR office — from operations. My concern was never theoretical. A disengaged clinician in the field is the beginning of a very expensive cascade: visit quality erodes, patient trust erodes, retention erodes, and eventually census erodes. The stakes were concrete and measurable.
What I learned, slowly and sometimes the hard way, is that the survey was never the problem. The problem was the assumption underneath it — that people would tell the truth on a form, to a company, about how they felt about working for that company. Most won't. Not because they're dishonest. Because they're not safe.
Psychological safety is not a buzzword. It is the only condition under which a person will tell you something inconvenient. And the traditional engagement survey, no matter how thoughtfully designed, rarely produces it. The employee sees the company logo at the top. They remember that their manager will see the results. They answer accordingly.
The stay interview was different precisely because it was a conversation. A skilled interviewer could follow a thread. They could hear the hesitation behind an answer and go gently toward it. They could create, in real time, the conditions under which someone might actually say the true thing instead of the safe thing. That is a fundamentally different instrument than a Likert scale.

The Irony Nobody Is Talking About

Here is where it gets interesting. The same automation wave that hollowed out HR — that eliminated positions, flattened management layers, and made the budget for a dedicated stay interview program an easy line to cut — has simultaneously produced a technology that is almost perfectly designed to restore what was lost.
Voice AI in conversational mode does not look like a survey. It does not have a progress bar. It does not present eleven questions in a predetermined order and thank you for your time at the end. When it is designed well, it simply starts a conversation and follows wherever that conversation goes.

01 · It Listens
Asks one question. Hears the answer. Notices what was said and what was conspicuously not said, and goes toward the space between them.

02 · It Doesn't Flinch
It does not get tired. It does not get uncomfortable when the conversation gets honest. It has no relationship with the employee's manager.

03 · It's Actually Anonymous
Designed to surface themes rather than identify individuals. No one in the room who eats lunch with your director every Tuesday.

And — this is the part that I think most organizations have not sat with long enough — it can be genuinely anonymous in a way that no human interviewer ever fully can be. People know, rationally, that the HR business partner is bound by confidentiality. They also know, viscerally, that she eats lunch with their director every Tuesday. That knowledge shapes what they say. An anonymized voice interaction, designed to surface themes rather than identify individuals, removes that equation entirely.
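That theme-first anonymity is a design decision, not a side effect, and it can be made concrete. A minimal sketch, assuming transcripts have already been coded into themes per participant; the data structure and the minimum group size here are illustrative, not a specification:

```python
from collections import Counter

# Smallest cohort a theme may come from before it is reported at all.
# Anything rarer is suppressed, so a finding can never point back to
# one identifiable person. (Threshold is an assumption, tune per org.)
MIN_GROUP = 5

def reportable_themes(themes_by_employee, min_group=MIN_GROUP):
    """Count each theme across participants; drop anything too rare
    to report without risking re-identification."""
    counts = Counter()
    for themes in themes_by_employee:
        counts.update(set(themes))  # count each person once per theme
    return {theme: n for theme, n in counts.items() if n >= min_group}

# Usage: only themes raised by at least min_group people survive.
interviews = [
    {"workload", "recognition"},
    {"workload", "scheduling"},
    {"workload"},
    {"workload", "recognition"},
    {"workload"},
    {"one-off grievance"},  # unique to one person, so suppressed
]
print(reportable_themes(interviews, min_group=5))  # → {'workload': 5}
```

The point of the threshold is exactly the lunch-with-the-director problem: leadership sees that a theme exists and how widespread it is, never who voiced it.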

What Actually Comes Out of These Conversations

I want to be specific here, because the instinct will be to frame this as a softer, friendlier version of the same old survey. It is not. The data that emerges from a well-designed conversational AI engagement interaction is categorically different from what a traditional survey produces.
| | Traditional Survey | Conversational Voice AI |
|---|---|---|
| Output | Distributions — 67% feel recognized | Texture — what recognition means to the people not getting it |
| Language | Organization's vocabulary | Employee's actual vocabulary |
| Depth | Satisfaction on a 1–5 scale | The three specific things in the last 90 days that made someone feel invisible |
| Insight | Aggregate benchmarks | The gap between how leadership talks and how employees experience it |
Conversations produce texture. They produce the language that people actually use to describe their experience — which is almost never the language that leadership uses to talk about it. That gap between vocabularies is itself data. It tells you something real about how well the organization understands itself.
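That gap can even be measured, crudely. A rough sketch, assuming both corpora exist as plain text; the tokenizer, stop list, and sample phrases are placeholders, not real data:

```python
import re
from collections import Counter

STOP_WORDS = {"the", "a", "and", "to", "of", "in"}  # placeholder stop list

def top_terms(text, n=5, stop_words=STOP_WORDS):
    """Crude term-frequency profile of a corpus."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(w for w in words if w not in stop_words).most_common(n)

def vocabulary_gap(leadership_text, employee_text, n=5):
    """Terms prominent in one corpus that never crack the other's top terms."""
    lead = {w for w, _ in top_terms(leadership_text, n)}
    emp = {w for w, _ in top_terms(employee_text, n)}
    return {"leadership_only": sorted(lead - emp),
            "employees_only": sorted(emp - lead)}

# Illustrative only: leadership talks alignment, employees talk overtime.
leadership = "alignment and engagement and culture and synergy and alignment"
employee = "overtime and scheduling and paperwork and overtime and overtime"
print(vocabulary_gap(leadership, employee))
```

Real pipelines would use heavier tooling, but the design idea is the same: treat the divergence between the two word lists as a finding in its own right.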
The conversational AI doesn't need to ask "On a scale of one to five, how satisfied are you with your workload?" It can ask: "Tell me about your week. What took more out of you than it should have?" The answer to that question, analyzed across hundreds of employees over time, is an organizational intelligence asset of the first order. It is the kind of information that good leaders used to gather through proximity — by being present enough, often enough, to simply notice.

The Design Problem Most People Are Getting Wrong

Honest caveat

This does not work automatically. The technology is capable. The failure mode is in how it's deployed.
Most organizations that try to use AI for engagement do one of two things. They build something that is essentially a survey in a voice wrapper — the same predetermined questions, the same linear structure, just spoken aloud instead of typed. Or they deploy something so open-ended that it generates conversation without direction, producing data that is rich but impossible to analyze at scale.
The design challenge is to build something that feels genuinely open while still surfacing the information the organization actually needs. That requires someone who understands both sides of the conversation: what the employee needs in order to feel safe enough to be honest, and what the organization needs in order to act on what it hears. Those are not the same design requirement, and optimizing for one at the expense of the other is the most common mistake.
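One way to picture that middle ground: the interview follows threads freely, but a small topic checklist guarantees coverage that can be analyzed at scale. A minimal sketch; the topic list, prompts, and the `ask`, `follow_up_needed`, and `next_probe` hooks are hypothetical stand-ins for whatever voice layer actually runs the conversation:

```python
# Topics the organization needs covered, each opened with a genuinely
# open prompt rather than a scaled question. (Illustrative wording.)
REQUIRED_TOPICS = {
    "workload": "Tell me about your week. What took more out of you than it should have?",
    "recognition": "When did work last feel worth it? What made it so?",
    "intent": "What would make you stay another year?",
}
MAX_FOLLOW_UPS = 3  # open exploration, bounded so the data stays comparable

def run_interview(ask, follow_up_needed, next_probe):
    """Walk the checklist, but follow each thread while the last answer
    suggests there is more underneath, up to a fixed budget per topic."""
    transcript = {}
    for topic, opener in REQUIRED_TOPICS.items():
        turns = [ask(opener)]
        while len(turns) <= MAX_FOLLOW_UPS and follow_up_needed(turns[-1]):
            turns.append(ask(next_probe(turns)))
        transcript[topic] = turns
    return transcript
```

The checklist answers the organization's requirement (every conversation yields comparable coverage); the open prompts and the follow-up budget answer the employee's (the thing actually behaves like a conversation).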
I spent years working on how to ask questions that don't lead, don't comfort, and don't perform neutrality while actually encoding an expected answer. That work was in service of engagement surveys that were trying to get to something real. The principles transfer exactly to conversational AI design — and they are not intuitive. Most people who build these interactions have never had to sit across from someone and earn the kind of trust that produces a true answer.

The Exit Interview That Finally Tells the Truth

Everything I've said about stay interviews applies with even more force to the exit interview — and the stakes are higher.
The traditional exit interview is one of the most dishonest transactions in organizational life. The departing employee has often already accepted another offer. They may want a reference. They have colleagues still inside the organization. The incentive to tell the truth is almost entirely absent, and so they say something gracious about seeking new opportunities and career growth, and the organization learns nothing.
An anonymized voice AI exit conversation, conducted after the employee's last day — after the key card is returned and the reference has been confirmed — changes that equation completely. There is no longer anything to protect. There is no one in the room who can be hurt by the truth. And the human instinct to make sense of an experience, to tell the story of why something didn't work, is surprisingly powerful once the barriers come down.
What comes out of those conversations, aggregated and analyzed over time, is some of the most actionable organizational intelligence available. Not because it reveals anything shocking — usually the themes are things the organization already suspected — but because it confirms them with enough specificity and enough volume that they can no longer be rationalized away.

The Conversation That Changes Things

I began my career in rooms with patients. Some of them were scared. Some were resistant. Some had been told by three other providers that what they were experiencing was normal and they should learn to live with it. My job — the actual job, beneath the clinical protocols — was to create the conditions under which they would tell me what was really going on.
That skill, practiced over years, shaped how I thought about employee engagement when I moved into operations. And it shapes how I think about AI design now. The question is never whether the technology is capable of having the conversation. The question is whether whoever built it understood what the conversation was actually for.
Organizations are sitting on a tool that could restore something genuinely valuable — the kind of honest, recurring, human conversation that used to happen before the budgets got cut and the headcount got reduced and the calendar got too full.
Most of them are using it to send appointment reminders.

The gap between those two things is where the real opportunity lives.
