Why Do We Expect Bosses to Care?
The expectation is historically brand new. AI is about to test whether it ever made sense.
Welcome to FullStack HR, and an extra welcome to the 52 people who have signed up since last week. If you haven’t yet subscribed, join the 10,200+ smart, curious leaders by subscribing here:
I have a confession.
This article got out of hand.
It started as a short discussion with Claude about the industrial revolution and books to listen to while driving to our ski holiday. That turned into hours, real hours, discussing why we expect bosses to care about us in the first place. I spent the holidays thinking about Prussian military organizations, listening to lectures on industrialization, and reading E.P. Thompson on handloom weavers whose lives were crushed by the factory system.
In my article from two weeks ago, I asked: Would an AI be a better boss than your current one?
But that question assumes something. It assumes the boss’s job is to care.
Why do we expect that at all?
The foreman didn’t care
If you’d told a factory foreman in 1893 that his job was to “care” about the workers, he would have laughed at you.
The foreman’s job was to get work done. Supervise and discipline. Period.
Frederick Taylor was explicit in his writings on scientific management: all possible brain work, he argued, should be removed from the shop floor. The boss thinks. The worker does. Feelings are irrelevant.
Then came the Hawthorne experiments at Western Electric in the 1920s. Elton Mayo tested how lighting affected productivity, only to find that whatever he changed, productivity increased. The attention mattered, not the lighting.
That cracked the door open. Since then: Drucker and the knowledge worker, the coaching boom, Andy Grove’s one-on-ones becoming standard, and the rise of psychological safety.
In two generations, we’ve gone from boss as supervisor to boss as coach, mentor, and, in practice, often something resembling a therapist.
We’ve loaded the manager role with expectations it was never designed to carry.
But what do workers want?
Here’s what surprised me when I dove into this topic: decades of workplace surveys tell the same story. “A boss who cares” doesn’t rank high. What workers want is respect, fairness, influence over decisions, and security. Recent research confirms it: dignity at work, not emotional connection, drives whether people stay or leave.
Amy Edmondson’s research on psychological safety points to the same thing: what matters most isn’t the presence of care but the absence of fear. You don’t need a boss who loves you. You need a boss who isn’t destructive.
For me, this reframes everything.
If fear-removal matters more than care, it changes what we should ask of managers. And of organizations.
Yet companies keep pushing for “caring managers” and telling people, “we’re like a family.” But if the boss is expected to care, what happens when the organization downsizes? Suddenly the truth comes out. It was never a family. It was a business relationship with emotional packaging. (And I assume this is no surprise to most of us.)
Netflix said it straight: “We’re a team, not a family.”
Harsh. But honest, and I like that.
Here’s where AI comes in
I think that AI forces us to ask directly: what is the boss’s job, really?
If it’s delegating work, AI already does that. DoorDash doesn’t have a human manager assigning deliveries.
If it’s giving feedback, AI can do that consistently, immediately, without being in a bad mood.
And the emotional part? The supposedly “human” part?
A study in Nature Human Behaviour found AI-generated responses were rated as more compassionate than those from expert crisis responders. An Oracle study of 12,000 employees found 68% would prefer talking to a robot over their manager about stress and anxiety at work.
Sixty-eight percent.
That number haunted me. Then I thought about the managers I’ve had. The ones who asked “how are you?” but never listened to the answer. Who made every conversation about themselves. Who had too much going on to see me.
Sixty-eight percent started to make sense.
What I built
I recently built a prototype. An AI that checks in with you each morning and adapts based on how you respond.
Here’s what a conversation looks like:
AI: Good morning. How are you feeling about today?
You: Stressed. Too much on my plate.
AI: That sounds heavy. Do you want to talk through what’s weighing on you, or would it help more to prioritize together?
You: Let’s prioritize.
AI: OK. You mentioned the investor deck is due Thursday and you’re behind on the Q1 analysis. Which one has more consequences if it slips?
The AI remembers you mentioned a tough project last week. It notices patterns, that you’re more stressed on Mondays, that you work better with morning check-ins. It follows up.
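For the technically curious, here’s a rough sketch of how the “notices you’re more stressed on Mondays” part could work. This is not the prototype’s actual code; all names are hypothetical, and a real system would do much more (memory of past conversations, follow-ups, and so on). The core idea is simple: log each morning’s check-in and look for a weekday that consistently stands out.

```python
from collections import defaultdict
from datetime import date

class CheckInLog:
    """Hypothetical sketch: log daily stress ratings (1-5) and
    surface a weekday that consistently stands out."""

    def __init__(self):
        self.entries = []  # list of (date, stress_rating)

    def record(self, day: date, stress: int):
        self.entries.append((day, stress))

    def weekday_averages(self):
        # Group ratings by weekday name and average each bucket.
        buckets = defaultdict(list)
        for day, stress in self.entries:
            buckets[day.strftime("%A")].append(stress)
        return {wd: sum(v) / len(v) for wd, v in buckets.items()}

    def stress_pattern(self):
        """Return the weekday with the highest average stress,
        but only if it clearly stands out from the rest."""
        avgs = self.weekday_averages()
        if len(avgs) < 2:
            return None
        worst = max(avgs, key=avgs.get)
        rest = [v for wd, v in avgs.items() if wd != worst]
        # Arbitrary threshold: one full point above the other days.
        if avgs[worst] - sum(rest) / len(rest) >= 1.0:
            return worst
        return None
```

Feed it a few weeks of check-ins where Mondays rate high and other days low, and `stress_pattern()` returns `"Monday"`; the AI can then open Monday’s check-in differently. The point isn’t the code, it’s that “noticing” is just remembering plus a little arithmetic.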
Is it better than a great human manager? Probably not.
Better than the median manager? I’m honestly not sure. But it’s infinitely better than a bad one.
Qualitative research on AI mental health tools surfaces the same theme. Users say things like: “It helps me talk through things without all the added unhelpful human reactions, judgments and passive projections.”
That might be AI’s killer feature. Not that it cares, but that it doesn’t judge.
The advantage?
Here’s what I’ve landed on after all these hours of thinking:
AI can become the best boss many people have ever had. Not because it’s smarter, but because it can individualize in ways no human can.
Your current boss has one leadership style. You adapt to it.
If you’re introverted and your boss is extroverted? Tough luck. If you want direct feedback and your boss prefers to soften things? Tough luck. If you need clear instructions and your boss likes to “coach” the answer out of you? Tough luck.
AI adapts to you. Completely. Every time.
No human can deliver that to fifteen people simultaneously.
A 10-month randomized controlled trial found AI coaching was equally effective as human coaching for goal attainment, with the same levels of working alliance between participants and AI as between participants and human coaches.
The same technology, weaponized
But I’d be lying if I said I wasn’t worried.
The same AI that knows you work better with deadlines can give you deadlines on everything, constantly, no margin. The same AI that knows you’re most productive in the morning can load you with heavy tasks then, every day, until you burn out.
That’s not hypothetical, by the way. That’s the Amazon warehouse: algorithms optimizing without humanity, movements tracked, pace monitored, performance algorithmically managed. The research documents the health impacts: stress, fatigue, injury, psychological distress.
Same technology. Completely different purpose.
The difference between liberation and exploitation isn’t the technology. It’s the choices behind it. Is the system designed to help people perform sustainably, or to extract maximum short-term output? Is there transparency, or black boxes? Do workers have voice, or just metrics?
If we don’t think this through, the default will be Amazon. Not because anyone is evil, but because short-term incentives point that way.
Where I’ve landed
I’ve been dancing around this, so let me be direct.
Do I think AI can be a better boss than most humans? Yes.
Do I think it will be, by default? No.
Research shows a paradox: people rate AI-generated empathy as less empathetic when they know it’s from AI, even when the language is identical.
We value intention, not just results.
But here’s what I keep coming back to: if the result is that you feel seen, supported, and able to do your best work, does the source matter?
I think, eventually, for most people, it won’t.
That’s not a prediction I’m comfortable with. But I think it’s true.
The forgotten story
Historian E.P. Thompson wrote about the losers of industrialization, the handloom weavers crushed by the factories. He wanted to rescue them from what he called “the enormous condescension of posterity.”
We always tell history as a success story. Those who didn’t adapt disappear from the narrative.
Twenty years from now, if AI managers are standard, how will we talk about those who didn’t adapt? The middle managers who lost their role. The HR professionals whose expertise became obsolete.
The winners’ story: “They clung to outdated ideas.”
The forgotten story: “An entire generation lost their professional identity.”
Both will be true.
I don’t know if AI bosses mean liberation or dystopia.
Probably both, depending on how we design it.
But this discussion needs to happen now, not in five years when the choices have already been made.
If you’re a boss, ask yourself what you do that an AI couldn’t do better.
The answer might be more than you think.
Or less.
Next week: I've argued AI can be a better boss. But how does it feel to be on the other side? What happens to us when we're managed, coached, and evaluated by machines?