Discussion about this post

Stefan

Great piece! You highlight how metaphors shape thought, but I'd push further—metaphors don’t just frame thinking, they create self-reinforcing feedback loops that shape behavior and reality. When we frame AI as intelligence, we don’t just misinterpret it; we build systems and policies that reinforce the illusion. The real challenge isn’t just critiquing flawed metaphors but introducing better ones—perhaps AI as a mirror that reflects biases rather than a "brain" that thinks. What metaphor do you think could break the current bind?

WinstonSmithLondonOceania

I'm a huge Orwell fan (my handle says it all) and I did read that awesome essay. Orwell knew what he was talking about, and everything he wrote has proven prescient about the times we're living in, some seventy-odd years later.

With that being said, guilty as charged. I too have succumbed to comparing the brain to a computer. I think part of the reason is that computers were designed from the start to offload the more repetitive tasks requiring "brain power" (that is, calculations), which they excel at. It was a simple emulation. Once this was accomplished, then came the inevitable escalation: if we can get machines to perform the repetitive tasks, maybe we can coax them to perform some analysis too. And on and on.

Comparing the brain to a computer might be called reverse anthropomorphizing. We have a tendency to anthropomorphize everything. Bugs Bunny predates ENIAC by a decade. Have you ever seen a rabbit walk on its hind legs and talk? Me neither.

So, we resort to metaphors.

It's easy to compare a brain to a computer because computers were designed from the start to emulate what the brain does. However, we kid ourselves if we think we can come even close to the real complexity of this amazing machine we call a brain. Even modern brain science, armed with fMRIs and PET scans, still doesn't have a complete grasp of how the brain/mind works. Its function depends on physical structures at every level, from microscopic to macroscopic. It depends on a delicate balance of neurotransmitters. Most of all, it depends on electricity, something that makes the metaphorical temptation that much greater.

Transistors as neurons? Well, not quite.

It's just as easy to compare our senses to input devices. This is one area where AI falters. We can connect a camera so it can "see". We can connect a microphone so it can "hear". We can even attach tactile sensors so it can "feel" and chemical detectors so it can "smell" and "taste". None of these devices match what our senses do instinctively, what they were shaped to do over millions of years of evolution.

Computers can't have an "instinct to survive", although we can emulate one and make a system act as if it really had it. But it's just an emulation.

In closing - before I start babbling like a ChatGPT hallucination - as I've often stated, the gravest danger of AI is people treating it as real, when it's far from it.
