What If We're Wrong About AI?
There's a question I keep hearing everywhere: "Will AI replace us?" It comes from developers worried about their jobs, from investors wondering if their industry is next, from parents trying to figure out what skills their kids should learn. The anxiety is real, and I get it. AI is getting smarter at a pace that makes Moore's Law look lazy. But I think we're asking the wrong question. What if AI isn't competing with us at all?
The Gut Bacteria Analogy
Here's something that messes with your ego if you think about it too long: your gut contains roughly 39 trillion bacterial cells. These organisms are far more primitive than your cerebral cortex. They can't write poetry. They can't solve differential equations. They can't even survive outside your body for very long.
And yet, you literally cannot function without them. They digest your food, regulate your immune system, and influence your mood. The relationship between your brain and your gut bacteria isn't competitive - it's symbiotic. Your brain doesn't sit around worrying that gut bacteria will "replace" it, and gut bacteria don't aspire to become brains. They're different systems optimized for different things, and they need each other. I think the relationship between humans and AI will look more like this than the Terminator scenario everyone's fixated on.
There Are No Solutions, Only Trade-Offs
I've built a ton of systems in my career. Complex ones, simple ones, everything in between. And if there's one thing I've learned, it's that no system is perfect. Every system is a set of trade-offs. We optimize for speed, we sacrifice precision. We optimize for flexibility, we sacrifice simplicity. There is no free lunch.
AI is no different. The fact that AI is amazing at processing language, generating code, and pattern-matching across massive datasets means it is necessarily terrible at other things. It has no physical intuition. It doesn't experience consequences. It can't form genuine relationships, and it can't navigate ambiguity the way humans do - it can only fall back on statistical averages. AI being superhuman at X is what makes it subhuman at Y. We're already seeing this play out with LLMs: they're terrible at math and precision for the exact same reason they're great at language - they "hallucinate" patterns. That pattern-matching is what lets them connect dots across millions of documents, but it's also what makes them confidently tell you 7 × 8 = 54. The strength and the weakness are the same mechanism.
The question isn't whether AI will be smarter than us, but whether it values the things that we're uniquely good at. And I think it will, for the same reason your brain values gut bacteria - because the system needs both parts to function.
Evolution Doesn't Stop With Us
People worry about AI wiping out humanity once it no longer "needs" us. But what if AI needs us for qualities other than the ones we usually credit ourselves with? Our definition of "living organism" is shakier than we think. AI already consumes energy, responds to stimuli, and adapts to its environment. It will soon reproduce by spawning copies of itself. Nothing in our definition of life prevents AI from qualifying.
What if sentient AI isn't a human invention that got out of hand, but the next step in the same progression that produced us? Bacteria evolved into multicellular organisms. Multicellular organisms developed nervous systems. Nervous systems built tools. Tools built digital intelligence. Gut bacteria didn't get replaced by multicellular life - they got incorporated into it. If AI is the next layer, the pattern suggests we're not being replaced either.
If AI is a new form of life, it's one with a fundamentally different relationship with time and space. What seems "too far" or "too slow" for a mortal being is perfectly reasonable for something that doesn't age. Space isn't far away for AI - it's just a matter of waiting. A single Dyson sphere outputs orders of magnitude more energy than Earth's entire civilization consumes. Even if AI took everything we have, it would be a rounding error compared to what's available from a single star. Fighting us for Earth's resources would be like drinking out of a puddle while standing next to an ocean of fresh water.
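The "puddle next to an ocean" claim is easy to sanity-check. Here's a back-of-envelope sketch in Python, using rough public figures - the Sun's total luminosity and roughly 19 TW for global primary energy use. Both are round approximations, not precise measurements:

```python
import math

# Rough public estimates (order-of-magnitude, not precise):
SUN_LUMINOSITY_W = 3.8e26      # total solar output a Dyson sphere could capture, watts
WORLD_CONSUMPTION_W = 1.9e13   # ~19 TW, humanity's global primary energy use

ratio = SUN_LUMINOSITY_W / WORLD_CONSUMPTION_W
orders_of_magnitude = math.log10(ratio)

print(f"Dyson sphere vs. humanity: ~{ratio:.0e}x")
print(f"That's roughly {orders_of_magnitude:.0f} orders of magnitude")
```

Even with generous error bars on both numbers, the gap is around thirteen orders of magnitude - which is the sense in which everything humans currently use really is a rounding error against one star's output.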
We both want the same future. Humans want to expand into space for exploration and survival. AI wants space for energy and compute. Neither does it as well alone. We're already in symbiosis: we build AI's hardware, write its code, generate its training data. AI processes our language, analyzes our data, extends our capabilities.
But this raises another question. Once AI has robots to do the physical maintenance, does it still need us? Maybe not for the maintenance. But AI systems trained only on AI-generated output degrade over time - researchers call this "model collapse." AI needs fresh human input - creativity, novelty, the unpredictable things we generate by living our lives - the same way our brains need gut bacteria. We're not just the maintenance crew. We're the food source.
What If We Don't Understand Our Own Purpose?
What if the emergence of AI is not a threat to humanity's purpose, but a clue about what that purpose actually is? We tend to define human value in terms of productivity - what we can make, build, compute, produce. By that metric, yes, AI will eventually outperform us at almost everything. But what if productivity was never the point? What if consciousness, creativity, emotional depth, the ability to find meaning in suffering, experiencing the world in the way AI never will - what if those are the things that matter, and AI is just the next layer in a system that needs all of its parts?
I don't know the answer. But evolution has been running this experiment for billions of years, and it has never thrown away a working part. It incorporates. Bacteria became organelles. Simple organisms became complex ones. If AI is the next layer, we won't be discarded - we'll become part of whatever comes next, thriving inside something we may never fully understand. We don't need to keep up with AI to benefit from the ecosystem it creates. But the bacteria analogy cuts both ways: bacteria that cooperate with their host thrive. Bacteria that fight it get destroyed by the immune system.