There are corporate crises.

And then there was OpenAI in November 2023.

Within days, one of the most powerful AI companies in the world went from firing its CEO to reinstating him under pressure from an employee revolt. The board removed Sam Altman citing a lack of candor, and offered no context, no supporting explanation, nothing beyond a few vague sentences.

What followed was chaos in real time. Employees revolted. Microsoft stepped in. Negotiations imploded. Even leaders who had supported the firing signed the letter demanding his return. The board folded. Altman came back stronger than before.

But here's what most people missed.

The crisis didn't spiral because of the decision.

It spiraled because no one explained it.

OpenAI didn't just lose control of the narrative. They never owned it in the first place.

They communicated a decision without communicating a reason

The board fired Altman for being "not consistently candid."

That's it. No context. No detail. No grounding. No emotional truth. Just a vague statement that created more questions than it answered and gave everyone watching — employees, investors, the press, the public — the same thing: an open gap.

And when clarity is missing, people fill the gap with fear, bias, and suspicion. Every version they create is worse than whatever the truth actually was.

If your message creates more confusion than direction, you haven't announced a decision. You've announced a problem — and handed the interpretation to everyone except yourself.

They underestimated the emotional impact on stakeholders

OpenAI wasn't a normal company.

It was a mission-driven organization with employees deeply tied to a specific vision, a specific identity, and a specific leader who had become inseparable from the existential ethos of the organization. Removing a symbolic leader without preparing the emotional landscape wasn't just a governance misstep.

It was a psychological one.

Employees weren't reacting to the firing. They were reacting to the void of meaning around the firing. Nobody told them what it meant for the mission. Nobody told them what it meant for their work. Nobody told them what it meant for them.

In mission-driven organizations, clarity isn't operational. It's psychological. The meaning of the work is part of the compensation, and when that meaning is suddenly unclear, the response isn't quiet acceptance. It's existential alarm.

They didn't anticipate what people would do

Most boards assume employees will grumble quietly. OpenAI's employees did the opposite.

Roughly 700 of the company's 770 employees signed a letter saying: bring him back, or we all leave.

The board had misread almost everything: the emotional loyalty to Altman, the internal power dynamics, Microsoft's influence, the external pressure, the cultural moment, the willingness of a technically elite workforce to actually walk out the door. This wasn't a vacuum. It was a narrative environment with enormous latent energy, and the board fired into it without understanding what would happen next.

Good communicators ask what people will feel. Great communicators ask what people will do. OpenAI asked neither.

They left a clarity vacuum — and Microsoft filled it

When OpenAI failed to explain what was happening, Microsoft stepped in and did it for them.

Quickly. Cleanly. Strategically.

Microsoft effectively said: we hired Altman, we support him, we support OpenAI employees.

The translation was unmistakable: we have clarity. They don't.

In the absence of clarity, power shifts to whoever provides it. The organization that controls the narrative isn't necessarily the one with the most authority — it's the one with the most coherent story at the moment everyone else is confused. OpenAI surrendered that position the moment they issued a statement that explained nothing.

If you don't own the narrative, someone with more confidence will own it for you.

They failed the rhythm test

The cadence of OpenAI's communication throughout the crisis was slow, reactive, inconsistent, contradictory, and defensive. Each statement seemed to arrive too late and say too little. Each clarification created new questions.

In a crisis, timing is tone. Silence signals disorganization. Vagueness signals instability. And a pattern of reactive communication — always one step behind the story that's already forming — signals that nobody is actually steering.

Predictability matters more than positivity in a crisis. Cadence matters more than comfort. Clarity matters more than certainty.

OpenAI had none of the three.

The real lesson

Behind the scenes there were likely deeper governance disputes about AI safety and commercialization — genuinely complex issues worth serious debate. But none of that context was communicated. None of the reasoning was shared. None of the human reality of the decision was acknowledged.

The board didn't create chaos by firing Altman.

They created chaos by withholding the why, misreading the emotional reality of their organization, underestimating their stakeholders, losing their cadence, and communicating like a governance body rather than like leaders.

This wasn't a tech crisis. It wasn't a governance crisis.

It was a clarity crisis.

And it's a reminder that leadership isn't about forcing decisions. It's about communicating them in a way that preserves trust.

Because when clarity collapses, everything else collapses with it.

Until next time,
Ana

Clarity isn't corporate. It's human.
