You're Getting Faster. Is Your Team Getting Lonelier?

TL;DR: Shifting your team from "doers" to "directors" is the right move — but it comes with a hidden cost nobody is talking about: the social fabric of your team is quietly fraying. New research confirms that adding AI to a human team measurably reduces cohesion, trust, and shared identity. Your next leadership challenge isn't just what your team does with AI. It's making sure they still feel like a team while they do it. And if you're a manager already stretched thin — this post is written for you too.

In my last post, I talked about the essential shift your team needs to make right now — from "doers" to "directors." When AI handles the generation, your people need to step up as the strategic editors, the context-keepers, and the judgment-makers.

I believe that deeply.

But here's what I've been sitting with since I wrote it: the transition itself is a threat to the very team you're trying to empower.

And the data is beginning to back that up.

What the Research Is Telling Us

A study published in March 2026 in the International Journal of Human-Computer Interaction looked at something most leaders aren't measuring yet. Researchers compared teams of two humans working together versus the same teams after an AI was added as a third "teammate." The results were striking — and sobering.

Human-AI teams showed lower cohesion and lower team identification than their human-only counterparts. Critically, this wasn't just about performance. The effect showed up because team members reported lower trust and rated their own collective performance more poorly once the AI was introduced. The AI didn't just change the work. It changed how people felt about each other.

A separate 2026 study found something nuanced but equally important: how your team frames the AI matters enormously. When people perceived the AI as a tool, team spirit dropped. When they perceived it as a partner or teammate, cohesion stayed stronger — but only when trust in the AI was high. The framing you set as a leader isn't just philosophical. It has a direct, measurable effect on whether your team holds together.

And Amy Edmondson — she literally wrote the book on psychological safety — is showing up in this space now too. A February 2026 Harvard Business Review piece she co-authored found that despite expected productivity gains from AI integration, team performance is frequently declining, and trust is eroding in ways that are hard for leaders to name or pinpoint.

We are getting faster. And lonelier.

Why Executives Should Be Paying Attention to This Now

If you're leading at the organizational level, here's the business case in plain language: cohesion erosion is a leading indicator of the lagging metrics you actually get measured on — retention, quality, and delivery. By the time it shows up in your attrition numbers or delivery quality, you've already lost ground that's expensive to recover.

The good news is that this is visible before it becomes a crisis — if you know what to look for. Treat team health as a first-class metric alongside throughput during any AI adoption initiative. Not annual engagement surveys. Consistent, lightweight pulse checks that give you an early signal. Are people still collaborating voluntarily, beyond what the process requires? Are retrospectives generating new ideas or just rubber-stamping outputs? Is your best talent leaning in or going quiet?

If your AI transformation roadmap doesn't have a team health track running alongside it, it has a blind spot. The organizations that will win in the long run aren't the ones who adopted AI fastest. They're the ones who adopted it without losing the people who knew how to use it well.

The Hidden Cost of the "Doers to Directors" Shift

Here's the friction worth naming, because I think it's the next conversation we all need to have.

Think about what cohesion is actually built from: shared struggle, mutual dependence, the experience of figuring something out as a unit. When AI absorbs the "doing," it also absorbs much of the raw material that bonds people together. The collaborative problem-solving. The brainstorm that goes sideways before it goes somewhere brilliant. The back-and-forth that turns a good idea into a great one.

AI can mimic the output of those moments. It cannot replicate their bonding function.

Research published in Computers in Human Behavior makes this concrete: AI agents impose less social and emotional burden than human teammates — and that absence of emotional involvement can reduce the cohesion and support that hold teams together when things get hard.

Less friction, yes. Less glue, too.

A Note for the Manager Who's Already Maxed Out

If you're a middle manager reading this, I want to acknowledge something directly: you are being asked to do a genuinely hard thing. You're navigating the AI transition and keeping delivery on track and trying to hold your team together — often without additional bandwidth, budget, or recognition for the invisible labor of all three.

The ideas below are not meant to add three new programs to your plate. They're meant to be simple pivots inside the work you're already doing. A different question at the end of a review meeting. A two-minute check-in added to a sync you're already running. A norm-setting conversation that takes twenty minutes once and saves you significant confusion later.

Small and consistent beats big and occasional every time. You don't need to get this perfect. You need to stay intentional.

Three Things Leaders Can Do Right Now

This isn't about slowing down your AI adoption. It's about being intentional about what you protect while you accelerate.

1. Design for Collective Wins, Not Just Efficient Outputs

Retrospectives and reviews can't just be quality gates on AI output. They need to be spaces where the team experiences shared ownership of the outcome, wins and misses alike.

Try this: when the AI produces something good, don't just approve it. Ask the team, "What did we bring to this that the AI couldn't?" Make the human contribution visible and specific. That's not just a morale exercise. It's how you rebuild the narrative that the human team is the source of value. For executives, this practice also surfaces the institutional knowledge and judgment your organization depends on — knowledge that rarely shows up in a capacity plan.

2. Reframe How You Introduce AI Into the Team

If you hand your team a new AI tool without a deliberate onboarding conversation, you've essentially added a new "teammate" on Day One and skipped every social norm-setting ritual you'd run for a human hire. No wonder things feel off.

Have the conversation explicitly: What is this tool to us? Where does it earn our trust and where do we verify it? What does it handle, and what will we always own? Research is clear — teams that build a shared, trusted mental model of their AI's role maintain stronger cohesion than those left to figure it out individually. For a stretched manager, that single conversation will pay dividends in fewer one-off clarifications and more consistent adoption. Do it once. Do it well.

3. Protect the Spaces Where Humans Connect as Humans

Your team's identity is built in moments of unmediated human interaction. A weekly sync where you lead with "how is everyone actually doing?" A walking retrospective. A low-stakes team lunch with no laptops. Whatever form it takes — protect it deliberately, because in the age of AI, that kind of intentional human space doesn't survive on its own.

Psychological safety research consistently shows that teams with high safety are trusting, open, and willing to take interpersonal risks. That environment doesn't build itself in Slack threads and AI-reviewed pull requests. It builds in the in-between moments — and as leaders, we have to stop letting those moments get squeezed out by efficiency. For executives: these are also your earliest warning system. What your people say — and don't say — in an unstructured human moment tells you more about organizational health than most metrics you'll see next quarter.

The Bottom Line

The core question of any team transformation isn't "how do we go faster?" It's "what is this team for?"

The answer has never been "to generate output." It's always something richer — to solve hard problems, to serve customers in ways that matter, to grow, to innovate, to build something worth building together.

AI can supercharge every one of those things. But it can also, quietly and without malicious intent, hollow out the relational infrastructure that makes a team more than a collection of individuals processing tasks.

Your job as a leader — whether you're setting strategy from the C-suite or holding your team together from the middle — is to hold both things at once. Empower your team to direct the machine. And protect the human glue that makes them a team worth directing with.

The shift from doers to directors is right. The next shift is making sure they remain, through all of it, each other's.

We all win together,

Coach Dan


