21 Comments
David Harris:

Our growing separation from direct experience has always been with us. Printing technology gave us books in which to bury our noses and ignore the turmoil around us. Before that, we had narrators to tell us about the world removed from our direct access. I agree that 'this time', the nature of the insulation seems radically different. But culturally, we have been prepared for its arrival.

The One Percent Rule:

You are right: the desire to engage with the world indirectly, whether through a storyteller or a printed book, isn't new. Christine Rosen herself acknowledges this historical arc when she discusses how even earlier technologies altered human interaction and perception.

This time feels "radically different", as you so aptly put it. It is perhaps the pervasiveness, the interactivity, and the algorithmic nudging of today's digital mediations that are so transformative. As Rosen argues, and as I attempted to convey, the issue isn't just that we're burying our noses in something, but that the "something" is now designed to constantly harvest our attention and data, subtly reshaping our experiences and even our ontology, to get us addicted.

The idea that "culturally, we have been prepared for its arrival" is a fascinating and slightly unsettling thought.

Justin Reidy:

You raise a good point, and it shows the complexity of the matter. Does technology “extend” experience or abstract it? The printing press provided insulation, but it also provided access to knowledge and “indirect experience” that would not otherwise have been possible.

The same can be said for AI. But does technology add to and enhance lived experience, or supplant it?

All technology is meant to be a tool. Not a world or home. And that’s the crisis we face in this moment.

The One Percent Rule:

Thank you Justin. The crux is whether the current wave of digital, algorithmic, and immersive technologies is fundamentally different, whether it's crossing a threshold from enhancement to wholesale supplantation. Is AI, as you mention, a sophisticated tool that augments our abilities, or is it, as my essay puts it, part of a system that "colonized our ontology" and turns lived reality into a "confusion of simulacra"?

Your point is so accurate: when the tool becomes the environment, when the "User Experience" becomes the dominant mode of interacting with reality, then we are indeed in a new kind of crisis, one where the very nature of direct, embodied experience is at stake. Thank you for crystallizing that so clearly.

The One Percent Rule:

There is a new video with the former CEO of Google, Eric Schmidt, on "AI Replacing Your Friend Group": https://www.youtube.com/watch?v=GIvOw5YI_4A

Marginal Gains:

So the plan is to isolate people by making them rely solely on AI instead of real human interaction, leading to mental health issues—and then conveniently offer AI therapy as the solution? What a genius business model!

Marginal Gains:

With the rise of AI, we are entering a world rife with grifters and snake oil salesmen. Without a governing authority to verify their claims, the industry has become an "anything goes" culture, where exaggerated promises and unchecked hype often overshadow accountability and truth.

This reminds me of what led to the formation of the FDA in the early 1900s. Back then, the market was flooded with fraudulent "cure-all" medicines, false claims, and dangerous products, putting public health at risk. Public outcry over these deceptive practices eventually led to the Pure Food and Drug Act of 1906 and the establishment of the FDA, ensuring safety, efficacy, and truth in marketing. Similarly, today's unchecked nature of the AI industry highlights the urgent need for regulatory frameworks to prevent harm, protect the public, and foster trust in this transformative technology.

Marginal Gains:

This is an excellent summary of a book I have not read. However, I have now downloaded it to read.

We may or may not be living in a simulation, but we are undeniably building a digital world, slowly moving us, one skill and experience at a time, toward living as though we are part of one. As the post and Christine Rosen’s book “The Extinction of Experience” highlight, there are profound consequences to this shift. Technology is no longer just a tool—it is becoming an extension of us, often at the expense of real-life experiences. We are introducing technology to children earlier and earlier, prioritizing screens over skills and connections with the real world. Instead of teaching children to navigate the complexities of life, we offer them the simplicity of mediated experiences. I wouldn’t be surprised if, in the future, robots or AI become our primary companions, replacing human relationships—a trend I’ve commented on before.

As technology fulfills more of our needs and desires, it reduces the necessity of engaging with the world and other people. If everything we want can be delivered with a click or a call, why bother with the messy, unpredictable nature of human interaction? Yet, as Rosen argues, this avoidance has serious consequences. Much of the mental health crisis we see today in both children and adults can be tied to this lack of interaction with the world because the real world, unlike the digital one, is messy. Building relationships with people takes effort, empathy, and resilience—qualities that cannot be cultivated through screens alone. Technology has become a full-time companion for many with its ease and immediacy, but we cannot blame the tech industry alone. We, too, are responsible for allowing it to dominate our lives. While the industry profits from addictive designs, we must consciously prioritize the real world—for ourselves and our children.

Both parents and schools, for instance, should teach children how to interact with others and navigate relationships. One way to support this is by restricting the use of phones and laptops during school hours so children are encouraged—if not forced—to engage with their peers and teachers. This could help them learn the foundational social and emotional skills for a healthy, connected life.

I’ve learned over time that you understand a person better by engaging with them face-to-face and observing their emotions through their body language and expressions. Even a video call cannot replicate the depth of understanding of being physically present with someone. The lived experience of interacting with others teaches far more than any digital medium ever could. This is why AI and robots struggle with the real world—because it is full of infinite possibilities and uncertainties. If we continue retreating into the certainty of the digital world, we may become like AI ourselves: highly specialized in narrow skills but unable to navigate the unpredictability of real life.

Uncertainty, as Ursula K. Le Guin so beautifully put it, is what makes life meaningful:

"The only thing that makes life possible is permanent, intolerable uncertainty—not knowing what comes next."

Technology and the digital world, however, are leading us toward a false certainty in a world where true certainty is impossible. While humans naturally dislike uncertainty, embracing it gives life its richness and vibrancy. We risk losing what makes life worth living by chasing the illusion of control and predictability that technology offers.

I will end with this quote from Sebastian Junger, which perfectly encapsulates the heart of the issue:

"Humans don’t mind hardship; in fact, they thrive on it; what they mind is not feeling necessary. Modern society has perfected the art of making people not feel necessary. It's time for that to end."

This speaks directly to the dangers of allowing technology to replace real-world interactions and experiences. We lose our sense of purpose and belonging when we outsource so much of what makes us human to digital devices. Belonging to something bigger than ourselves—a family, a community, or even the natural world—is essential for a meaningful life. It is not the hardship that breaks us; it is the lack of connection and purpose.

This is why we must consciously fight for the real. As Rosen suggests, reclaiming experience does not require grand gestures—it starts with small, daily acts of presence. Whether it’s having an unrecorded conversation, sharing a meal without posting it online, or simply meeting someone’s gaze, these moments remind us of what it means to be human. Technology may make life easier, but it should not become a substitute for living.

The One Percent Rule:

Thank you MG. I’m particularly glad the essay prompted you to download her book. I think you'll find her perspective even more compelling in its full scope.

Your reflections capture and expand on the core anxieties Rosen voices, and I share, about our gradual immersion into a digitally mediated existence. That sense you described, of technology becoming an "extension of us" often at the "expense of real-life experiences," is exactly the kind of subtle but profound shift Rosen is trying to reveal.

The concern you raise about children being introduced to screens so early, and how we might be offering them "the simplicity of mediated experiences" instead of teaching them to navigate life's complexities, is exactly what we should be highlighting. This is hugely problematic and connects with Rosen’s discussion of Robert Michael Pyle's "extinction of experience," particularly how children might grow up thinking of nature primarily through digital representations. I like your practical suggestion that schools restrict device use to foster direct engagement; we need that kind of "humanist resistance" and active choice.

So true, "building relationships with people takes effort, empathy, and resilience—qualities that cannot be cultivated through screens alone." This gets to the crux of what Rosen details in her chapter on face-to-face interaction, which I highlight in the "Emoji Nations" section. The degradation of our "vagal tone" and the rise of the "text shrug" are, as Rosen argues and you so clearly see, symptoms of a much larger societal shift away from the "messy, unpredictable nature of human interaction."

The Ursula K. Le Guin quote, the "permanent, intolerable uncertainty" that makes life possible, is just perfect, brilliant. That is the essence of what is lost when we allow technology to engineer away the friction and delay that are not bugs in the human system, but rather the fertile ground for "attention, grace, and memory". Your insight that we risk becoming "like AI ourselves" if we retreat too far into digital certainty is terrifying, and exactly why these conversations are essential.

The Sebastian Junger quote about the human need to feel necessary ties everything together so well. Rosen's work, and what I hoped to convey, is that this "extinction of experience" isn't just about individual sensory deprivation; it's about the erosion of our roles within communities and the very fabric of a shared, embodied reality. When we "outsource so much of what makes us human to digital devices," that sense of purpose and belonging inevitably frays.

Thank you again for enriching and taking the conversation further.

Marginal Gains:

As also said here (https://tinyurl.com/26epdfyh):

This is what Langdon Winner called “reverse adaptation,” or “the adjustment of human ends to match the character of the available means.” In short, we’ll lose sight of our purpose and goals and instead adapt them to fit the constraints of a chatbot.

Don’t reduce your mental health to chatbot form — or even the emotional work of getting over your ex! The work of being a human in this messy world is deeper, harder, and more complex than the shallow interface of a chatbot affords. You might leave the interaction feeling better — chatbots have become total sycophants — but the problem will be there waiting for you when you’re done prompting.

The One Percent Rule:

That is a hugely relevant point, MG, connecting directly to the anxieties Christine Rosen explores, and which I tried to convey. Langdon Winner's concept of "reverse adaptation" is a perfect lens through which to view this phenomenon.

You're absolutely right; Rosen's concern, as I understand it, isn't just that we're using new tools, but that these tools, particularly those like chatbots designed to simulate human interaction, are subtly, or not so subtly, reshaping our expectations and even our human ends to fit their own limited capacities. I touch on this in describing how the "User Experience" is optimized not to deepen our humanity but to "bypass it, to turn us from embodied persons into dopamine-seeking behavior loops".

Your warning, "Don’t reduce your mental health to chatbot form," is so very, very true.

Veronika Bond:

Thank you for introducing me to this book and author. I managed to find and download the book and started reading. One thing I found confusingly absent is a definition of the word 'experience'. In English, this word covers such a wide range of meanings that it seems essential to let the reader know what the author has in mind.

(see also my wordcast https://veronikabondsymbiopaedia.substack.com/p/the-wilderness-of-experience)

As you point out, Rosen has adopted the phrase from another author, where it is defined broadly as 'the loss of human–nature interactions'... this is helpful.

In general understanding, however, 'experience' is an internal phenomenon related to the perception of the individual. Experience is entirely subjective. For this reason, something like 'consensus reality' doesn't really exist... (see the concept of Maya = physical reality as illusion: https://en.wikipedia.org/wiki/Maya_(religion))

Compared with virtual reality, this makes the experience of so-called 'consensus reality in nature' not hugely different...

When quoting Frisch's Homo Faber, it might be helpful to add that Faber's view of technology and his relationship to experiencing transform radically over the course of the novel.

If experiencing is an internal process ~ triggered either by interaction with living nature or by virtual events on a dead screen ~ I wonder whether it is appropriate to call it 'extinction of experience'. Humans have been avoiding their own experience for generations, as described by Frisch in Homo Faber and lamented by Jean Gebser in his writings and talks about 'Erfahrung' (= something we need to travel through).

Virtual reality doesn't enable humans to avoid their own experiences. Perhaps interaction with technology numbs and distorts the experience to a certain extent, but in the long run I don't see how it would be possible to get away from 'one's own experience', which is such an essential part of life itself.

I admit, I haven't read Rosen's book to the end yet. Is she warning of the extinction of human life through technology?

The One Percent Rule:

Thank you, I hope you find Rosen's book as thought-provoking as I did.

Very good points about the definition of "experience". It's true that Rosen, in her introduction, acknowledges the breadth of the term, stating that "Experiences, broadly considered, are the ways we become acquainted with the world. Direct experience is our first teacher". While she doesn't offer a single, narrow definition, she focuses on the disappearance of "certain types of experience," particularly those "rooted deeply in our evolutionary history, such as face-to-face interaction," and others reflective of cultural norms like "patience and our sense of public space and place". I attempt to follow her lead by exploring these specific categories of experience she identifies as fading. Her adoption of Pyle's phrase, as you note, initially referred to the loss of direct engagement with nature, but she expands it to a much broader "human ecology".

I had no idea about the Maya concept. Rosen touches on 'consensus reality' when she writes, "We can no longer assume that reality is a matter of consensus", acknowledging that technology enables individuals to "create their own realities". Hence, my attempt to reflect this concern, particularly when discussing how online immersion can lead individuals to feel their offline experiences are "surreal or cinematic". While Rosen might not delve into the concept of Maya, her work definitely grapples with the idea that our shared understanding of what is "real" is becoming increasingly fragmented and personalized due to technology. I wonder how that compares with the Maya philosophy?

Thank you for the important nuance regarding Frisch's Homo Faber, it's a valuable reminder that characters and their perspectives can evolve within a narrative, and Faber's transformation is indeed significant. I read Rosen as using Frisch's initial stark statement as a powerful framing device for her concerns about how technology can "arrange the world so that we need not experience it".

The question of whether technology truly allows us to "avoid" our own internal experiences, or merely "numbs and distorts" them, is essential. Rosen seems to argue that while the internal spark of experience might always be present, the nature, quality, and shared understanding of those experiences are being profoundly altered by mediation/technology. She suggests that by constantly opting for the "virtual" or the "mediated," we are losing practice in processing direct, unmediated experiences, leading to a kind of "deskilling" or "disorientation". This is what I tried to capture from her work.

To your final question, Rosen is not necessarily warning of the extinction of human life in a literal, biological sense, but rather the extinction of certain fundamental human experiences that have historically shaped our consciousness, communities, and understanding of 'reality'. She is concerned that the "User Experience" is supplanting the "Human Condition", and that this shift has profound implications for what it means to be human in an increasingly disembodied world.

Please let me know your overall thoughts on the book.

Veronika Bond:

"She suggests that by constantly opting for the "virtual" or the "mediated," we are losing practice in processing direct, unmediated experiences, leading to a kind of "deskilling" or "disorientation"."

Yes, this is what I understood too, and what confused me at the same time. As someone who has studied 'subjective experience' for the past 27 years, I'm aware that 'experience' is a word that has also been appropriated by the marketing industry, selling the 'experience' of anything, from a cheeseburger to a sunset cocktail in the Maldives, or a safari in Africa, or skydiving in Switzerland, or...

In other words, before the 'virtual experience industry' of which Rosen warns us, we already had the 'experience industry', trying to sell 'peak experiences' while distracting humans from the real experience of their lives.

About 60 years before Rosen's 'Extinction of Experience', Jean Gebser lamented that most humans don't process their own experience. This was before the 'peak experience industry' had started, I guess, and definitely before the 'user experience' brought to us through virtual technology. Unmediated experience, according to Gebser, is not something people know how to process, or do automatically. And I would agree. We can see this especially in relation to traumatic experience, or so-called 'adverse childhood event' experience, or historic trauma experience (such as the Holocaust). If humans knew how to process unmediated experience, perhaps they would not be drawn so easily into the world of virtual experience (promising infinite control over our own experience, while undermining it all).

I understand the argument Rosen is trying to make. What I struggle with is the conclusion, made perhaps on a false premise, that humans are processing unmediated experience. What if they don't? Why do so many people find themselves in a kind of 'Groundhog Day' situation, repeating the same experience over and over because they, in Gebser's words, are unable to travel through and out?

What if the virtual experience has the power to do that...?

Ruth Gaskovski:

Thanks for bringing my attention to Rosen's book. I deeply resonate with the issues that you highlight and am encouraged that her book not only contains critique (which people are already growing tired of), but also a hopeful perspective on where we need to turn. My husband and I first started writing in this vein in 2023 with "The 3Rs of Unmachining", and it seems the emphasis on returning to true experience is gaining momentum. Thanks again for your writing!

The One Percent Rule:

That's wonderful to hear! Thank you. I agree it is definitely encouraging that Rosen's book moves beyond critique to offer that "hopeful perspective of where we need to turn". I tried to capture that closing sense of "moral urgency" and the idea that "extinction is not fate, but surrender," pointing towards the "daily acts of presence" that can help us reclaim those vital experiences.

It's also fascinating to hear about your own work; I was reading about your upcoming pilgrimage, and that will be magnificent. It sounds like you and your husband are very much part of this 'awareness' that emphasizes a return to "true experience." It certainly feels like there's a growing recognition of what's at stake, and voices like yours and Rosen's are crucial in navigating this complex shift in how we live.

Ruth Gaskovski:

Colin, not only are you a thoughtful writer, but you have some of the most thoughtful comment responses I have encountered. I noted in your bio that you are a professor of AI and will be sure to follow your writing on this topic (we likely have very different perspectives on this, but given your attention to "reclaiming vital experiences" I am curious to read your take). My husband and I recently attended a talk on AI and the future of education and wrote about it here: https://schooloftheunconformed.substack.com/p/learning-fast-and-slow-why-ai-will. It was the most hopeful perspective on AI and education I have come across.

The One Percent Rule:

Thank you, Ruth. I try to have a balanced view on AI, but I am deeply concerned about over-reliance on it, and I believe this will get worse if we do not, as educators, encourage responsible use of AI. I've advocated for a significant AI education program in many posts.

I note your surname. I am based in Poland and teach at the University of Warsaw (e.g. http://en.bwz.uw.edu.pl/solidarity-with-ukraine-4eu-for-ukraine-ii-ai-for-higher-education-institutions/). I also teach at the London School of Economics and, once a year, in the US, typically short-term workshops, this year at Berkeley. Most of my focus is on using AI to augment human performance, thinking, etc., as I mentioned in the comment on your post. It is good that the talk you attended offered a 'hopeful perspective'; we need more of that.

Two weeks ago I ran a workshop for the management board of a major bank. When asked about my biggest fear, my answer was not job displacement, as most expected, but the 'dumbing down of society due to over-reliance on LLMs to do the thinking work!'

Maxwell E:

Thank you so much for bringing this book to my attention. I will seek it out. I can’t tell you how strongly I resonated with the themes you covered in this post.

Joshua Bond:

Great article, thank you. We humans are experiencers of experiences. So I wouldn't say experience is being extinguished, but the fully human experience is being castrated, and its richness and depth are being lost.

This is a general atrophying of consciousness (of conscious, accurate discernment and wisdom) as the important 'experiences' of life become less valued than the new technologically mediated experiences.
