Long before electricity or algorithms, the ancient Greeks told a story about Prometheus, the Titan who defied the gods by stealing fire from Mount Olympus and giving it to humanity. Until that moment, humans lived in darkness; with fire, they could cook, forge, build, and begin to control their destiny. More than that, fire symbolized the raw powers that shape our world: knowledge, creativity, and invention. Prometheus’ gift was transformative, but it came at a cost.
Zeus, king of the gods, was enraged by this act of rebellion and punished Prometheus for eternity: he was chained to a remote mountain, where an eagle would devour his liver each day, only for it to regenerate overnight, repeating the torment endlessly. Zeus’ punishment wasn’t for giving humanity warmth or light, but for giving them power without permission, without a plan, and without ensuring they were ready for it. That story, thousands of years old, still echoes with me today; now we face a similar moment, where AI is our modern fire.
We’ve created something powerful: a technology that can analyze massive datasets, generate insights, automate decisions, and learn and adapt on its own. Like fire, AI has the potential to accelerate progress in ways we’re only beginning to understand. But also like fire, it doesn’t come out of the box with wisdom or intention; it doesn’t know what matters. AI reflects what we give it and scales what we teach it.
In many organizations, AI is being introduced with great enthusiasm: new platforms, fresh pilots, powerful models. But often, the outcomes don’t quite match the promise. Projects stall, insights get ignored, and models underperform or fail to scale. The problem, more often than not, isn’t the technology but the environment it’s introduced into.
AI doesn’t transform organizations; it scales them. It reflects the systems, behaviours, and dynamics already in place. If your teams don’t collaborate, your AI won’t learn across silos. If knowledge in your environment is hoarded, your models will miss the context that matters. If decisions are made behind closed doors, AI will simply accelerate the same inefficiencies. AI won’t fix what’s broken; it will codify it and make it harder to change.
That’s why culture, not data or infrastructure, is the real foundation of an effective AI strategy. If you want intelligence to flow through your organization, your people need to do the same. If you want learning to accelerate, your culture must support reflection, experimentation, and shared understanding. Without these conditions, AI is just another tool, one that risks automating dysfunction rather than solving it.
The organizations seeing meaningful returns from AI aren’t necessarily the ones with the most technical sophistication but the ones where people communicate openly, where knowledge moves without friction, and where learning is valued as much as efficiency. In these environments, AI enhances human thinking rather than replacing it and becomes part of the way decisions are made.
Creating that kind of culture requires intentionality. It means examining how knowledge is shared across teams, how insights are surfaced, and whether failure is treated as a threat or an opportunity to grow. It means creating space for real conversations between technical and non-technical teams, so that context drives decisions alongside computation and the language of AI is translated in ways that are accessible, engaging, and meaningful.
And perhaps most importantly, it means asking what kinds of behaviours your culture rewards: if you celebrate only control, certainty, and speed, your teams will hide uncertainty and resist change. But if you reward curiosity, reflection, and collaborative problem-solving, your AI efforts will benefit from the full intelligence of your people.
Right now, most AI systems still require human supervision: we build them, feed them data, and guide their outputs. But that won’t always be the case. AI is evolving to become more autonomous, more capable of self-learning, and more deeply embedded in the systems we rely on. In time, it will make decisions we don’t fully understand, and it will do so at a scale and speed beyond what we can manually oversee.
When Prometheus brought fire down from Olympus, he set something irreversible in motion. Once that power was in human hands, it could no longer be contained, and the same is true of AI. We’ve already released it into the world; what remains is to decide what kind of intelligence we want it to become and what values we want it to carry forward.
AI won’t grow wise on its own; it will continue learning from our data, as well as from our habits, our structures, and our blind spots. If we embed short-term thinking, it will optimize for short-term outcomes. If we normalize exclusion or bias, it will reproduce those dynamics at scale. But if we teach it in environments grounded in trust, transparency, and accountability, it has the potential to amplify our best thinking.
Prometheus gave humanity fire, and in doing so, accelerated our evolution. But he also gave us a warning: power alone is never enough. Without foresight and responsibility, even the brightest gift can become dangerous. Today, we stand at a similar threshold. AI is here, and its power is real. But its value depends entirely on the culture it grows within: if we want it to drive progress alongside productivity and insight alongside automation, then we need to focus on the human systems surrounding it. It won’t be AI that transforms our companies or communities; it will be us, the people who shape it, guide it, and decide what it learns to value.