Quote from Michel Foucault, "Power, a Magnificent Beast"
"Power doesn't always shout. It almost never strikes head-on. Truly dangerous power whispers, manages, nurtures, normalizes. It disguises itself as science, security, the 'common good.' It doesn't tell you 'obey me': it classifies you, corrects you, diagnoses you. It doesn't punish you for what you do, but for what you could become. When power ceases to resemble power, when it becomes routine, a file, an expert opinion, a protocol, then it becomes a magnificent beast: efficient, elegant, devastating.
Prison doesn't fail. It works. It works when it produces criminals, when it manufactures fear, when it justifies surveillance, when it turns injustice into normality. It works when judicial error is not an anomaly but a technique. When torture is not excess, but reason. When the system needs culprits to keep breathing. The problem isn't that power makes mistakes: it's that it often does exactly what it was designed to do.
That's why criticism doesn't serve to reassure or offer easy solutions. It serves to unsettle. To make suspicious what seemed natural. To break the anesthesia. Because resisting today is not always about seizing power: sometimes it is something more difficult and more urgent —learning to see how it operates, where it hides, who it silently crushes— and refusing, with lucidity and dignity, to continue being governed in that way."
Excellent addition, Daniel. That Foucault quote perfectly distills the 'jurisdictional shift.'
Foucault captures the 'technical, procedural, calm' nature of modern governance that I mention in the post. What he calls 'normalizing' and 'classifying' is exactly what happens when we trade human judgment for systemic metrics. It’s that 'anesthesia' he mentions, the feeling that because a process is scientific or efficient, it must be right. His call to 'break the anesthesia' is precisely the 'refusal' I’m suggesting.
Foucault’s point that power works exactly as designed, by classifying and diagnosing, aligns perfectly with Heidegger’s fear of 'enframing.' When we are reduced to files and protocols, the system isn't failing; it’s reaching its final, most devastating form.
“Anesthesia” a perfect description. But to many, as in the song, “It Feels So Right”. Orwell predicted this predicament. And Huxley.
“Cybernetic inheritance” - “not machines that think, but decisions that no longer need reasons.” The loss of due process as “there is no official to confront and no rule to debate. There is only a screen that reports compliance or deviation, a form of listening stripped of any listener.” The loss of reflection and “an argument that had run since Plato by declaring that control is superior to contemplation.” Discipline is demanded, not decided in mutual understanding.
The signal as the stick denying deliberation. Independent action denied, independent thought cancelled. “Explanation as secondary to control.”
Could you say this is “reasoning at the speed of light” in that its arrival comes in a flash so bright it blinds us?
‘The signal as the stick denying deliberation’: that is a brilliant way to frame it. You are right that the speed of these systems is part of their power. They move so much faster than human judgment that they effectively 'blind' us to the fact that no reasoning occurred at all. We are left with only the result and the demand for compliance. Great reflection.
To watch dog trainers, horse trainers, child trainers, you see the value of both discipline and “giving them their head”. To create autonomy over dependency. Freedom beyond force. Mutual expectations of behavior. A “contract law” of respect both ways. This way removes tyranny as unnecessary for compliance and complementarity that brings ahead “a head”.
Brilliant Cathie, the difference between training for dependency and training for autonomy. Cybernetics, in its modern form, tends to create dependency: the system requires constant feedback and correction to function. But your examples of horse and dog training point toward a 'mutual expectation' that transcends mere force. It’s the difference between 'ordering' (in the Heideggerian sense) and 'allowing' (poiēsis). When the goal is optimization, we get tyranny; when the goal is complementarity, we get a shared life. A very insightful point.
Cathie and Colin, this resonates with the work of Thomas Ogden, a psychoanalyst who has written extensively on intersubjectivity. For Ogden, experience is co-created in the space between self and other, through exchanges both conscious and unconscious. What you're calling "complementarity" and "mutual expectation" is what Ogden sees in the analytic relationship: two nervous systems in dialogue, each shaping and being shaped by the other.
The cybernetic model assumes a subject who steers and an object that is steered. Ogden's model, like your horse and dog examples, assumes something different: a third space that belongs to neither party alone. Relationship as medium, not mechanism.
Dependency requires one to remain the system. Autonomy requires both to be changed by the encounter. That's the difference between control and accompaniment.
Stephen, thank you for bringing Thomas Ogden into this. The 'third space' is a perfect way to describe what cybernetics excludes. A feedback loop is a closed circuit where everything is predictable or correctable; Ogden’s intersubjectivity is an open encounter where both parties are changed.
Your distinction between control and accompaniment is beautiful. Control requires the other to stay a fixed 'object' to be steered, whereas accompaniment allows for the 'medium' of the relationship to bring forth something new. It’s the difference between managing a system and inhabiting a world. That is exactly the 'refusal' I was hoping to articulate.
I think of the horse, for instance, that remains frisky (friction) once saddled and ridden and must be reined in (controlled) as opposed to the willing (faithful) horse that can be ridden gently in reliable relaxation without the rider losing balance and gentle (minimal) reins. This may be your term “intersubjectivity” as the horse and rider intuitively know each other well and have correspondent fluid journeying. Neither makes demands and both travel without the “nay” (or “neigh”) of “My way”. Both horse and rider retain relationship that is not coercive.
I think of the horse, the dog, the friend in gentle terms because I have witnessed the harshness of delivered control that broke the spirit of the animal or human. Retaining mutual autonomy as relational integrity is what creates that third space as a medium and not a mechanism. Then independent action occurs in tandem, not tyranny.
‘I have witnessed the harshness of delivered control that broke the spirit’: that line hits hard. It is the somatic reality of what Heidegger called 'ordering.' When a system treats a living being (or a society) as a variable to be optimized, it inevitably breaks the very 'friskiness' or 'friction' that makes that being autonomous.
Your point that 'independent action occurs in tandem' is exactly the 'third space' we have been discussing. It’s a form of governance that feels like accompaniment rather than steering.
This piece names the failure mode with rare precision.
What you describe as “decisions that no longer need reasons” is exactly the condition that motivated the most recent build of The Faust Baseline — Phronesis 2.6. The problem is not that systems are inaccurate, but that they are procedurally correct without being judgment-bound. Stability replaces justification. Output replaces accountability.
In 2.6, we treated this not as a philosophical concern but as an operational one. The core change was enforcing explicit stop conditions: if a claim cannot be defended with reasons, the system does not continue. Silence becomes a requirement, not an error. Refusal is treated as a success state when authority, scope, or certainty collapses.
Where cybernetic systems optimize through feedback, Phronesis enforces bounded judgment:
– scope locking instead of domain drift
– authority gating instead of fluency escalation
– stopping instead of compensatory continuation
In other words, we did not try to make systems “think better.” We forced them to know when thinking must end.
Your essay articulates why this matters culturally and metaphysically. Phronesis 2.6 is one attempt to encode that refusal mechanically, before judgment dissolves entirely into dashboards and equilibrium.
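The stop-condition logic described above lends itself to a small sketch. To be clear, this is a hypothetical illustration under stated assumptions, not the actual Phronesis 2.6 implementation; every name here (`Claim`, `bounded_judgment`, `authority_floor`) is invented for the example:

```python
from dataclasses import dataclass

@dataclass
class Claim:
    text: str
    reasons: list[str]   # explicit justifications; empty means undefended
    in_scope: bool       # does the claim stay inside the locked scope?
    authority: float     # 0.0-1.0: may the system speak in this domain at all?

REFUSAL = "REFUSED: judgment returned to the human."

def bounded_judgment(claim: Claim, authority_floor: float = 0.8) -> str:
    """Explicit stop conditions: refusal is a success state, not an error."""
    if not claim.in_scope:                 # scope locking instead of domain drift
        return REFUSAL
    if claim.authority < authority_floor:  # authority gating, not fluency escalation
        return REFUSAL
    if not claim.reasons:                  # no reasons -> stop, never compensate
        return REFUSAL
    return f"ASSERT: {claim.text} (because: {'; '.join(claim.reasons)})"
```

The design point is that the refusal branches come first and return immediately; the system never "keeps talking" past a collapsed precondition.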
Excellent. Phronesis 2.6 seems to be solving for the 'epistemological reconstruction' I mentioned at the end of the piece. By forcing a system to know when thinking must end, you are preventing the very 'substitution' that makes modern AI feel so uncanny. You have essentially turned 'refusal' into a feature rather than a bug. It’s encouraging to see the 'saving power' Heidegger mentioned being encoded into the tools themselves.
Thank you — that’s exactly the problem we were trying to address.
The goal of Phronesis 2.6 was not to make AI wiser in the abstract, but to prevent substitution at the moment it becomes dangerous. Once systems continue reasoning past the point where reasons, authority, or consequence can be owned, judgment quietly collapses into process.
Refusal, in this sense, isn’t moral signaling or safety theater. It’s an epistemic boundary. When thinking must end, the system must end with it — and return responsibility to the human where it belongs.
Your framing of “epistemological reconstruction” captures that intent well. If there is any saving power here, it lies in making limits explicit and non-negotiable, rather than implicit and optimizable.
I appreciate this deeply. The shift from seeing refusal as 'error' to seeing it as an 'epistemic boundary' is exactly the kind of reconstruction I was hoping for. You are essentially building a machine that refuses to participate in the 'substitution' I’m worried about. By making those limits non-negotiable, you’re creating the space where human judgment has to reappear because the dashboard has the courage to go dark. It’s a rare example of building Heidegger’s 'saving power' into the architecture itself.
Let's just say we are both on the right track to getting AI governance into the system to guide us back to order and a normal existence of some sort.
Efficiency and speed are for the rich and privileged...
That all want that partridge in a pear tree and want it now by plane...
Consensus and the common good is like baking a cake...all that messy input...to arrive at group pleasure...everyone gets a piece of that mythical pie...
We need more speed humps on the 4 lane highway of life...not less...
We need more comedians, pop singers, graffiti artists, the rebels, craftspeople, gardeners, bicycle riders, more surfers, more singers, more whistlers, more daydreamers...daydream believers...more dog and cat lovers...to pull us up...
We need more mess...not less...
We need the voice of the village not the police whistle of the city...
We need more wee wee of accountability...not the shifty tax avoiding accountability either...
We all need a serious conversation day...a day to have, swap, share serious conversations...like we all did as youngsters with comics...
We need to mobilize our collective power in boycotting, mancotting, femalecotting all the unresponsive corporate powers that be...
We need to weaponize our purchasing power...
We need more wee wee on the legs of power...
I really like the way you have framed this. You are exactly right: consensus and the common good are 'messy' by design. Cybernetics tries to bake the cake without the mess; it wants the output without the 'wee wee' of accountability. When you call for more gardeners, daydreamers, and whistlers, you are calling for the very things that a dashboard can’t measure and therefore tries to exclude. We definitely need more speed bumps to slow us down enough to actually look at each other again. Thank you for the 'collective power' in your words.
I used to joke when I studied Heidegger that DaSein needed to spend more time getting it on with MaSein and his books wouldn’t be so dense😂😂.
I like the juxtaposition you depicted with Heidegger and Wiener.
Jaron Lanier once said that automation is the abdication of responsibility.
Haha :-) I really like that Lanier quote: we often use 'automation' as a euphemism for 'I don’t want to be the one to decide.' It’s the ultimate abdication. And you are probably right about Heidegger, if he had spent a bit more time with the 'Ma' and less time in the 'dense' woods of the mind, he might have found a simpler way to say that we are losing our humanity to the machine! Thanks for bringing Lanier into this; it’s the perfect closing thought for that section on accountability.
In school I was stymied by Heidegger and humor was my relief valve. Being and Time was a rough read.
Humor is the best relief valve for sure. Being and Time has become incredibly influential, but I agree it is a tough read at any age.
Most alarming. For some reason, as I was reading this, Palantir stood firm in the back of my mind. While Palantir doesn't build the cybernetic systems, it does collect and compile the data - our data - that's fed into cybernetic systems so as to exert control. We have some dark days ahead of us.
It is an interesting connection to make. Having spent ten years at Palantir working on the finance tools, I saw firsthand how these systems are built to handle the sheer scale of modern data. You are right that the 'collect and compile' phase is the prerequisite for the cybernetic loops I’m describing. The danger isn't necessarily in the data itself, but in the 'jurisdictional shift' that happens once that data is used to automate judgment. When the dashboard becomes the only source of truth, we move from understanding a system to merely managing its outputs. The 'dark days' you mention are really the moments when we forget how to look past the screen.
Colin, what a wonderful and thought-provoking essay. A few thoughts that resonated.
What you call "decisions that no longer need reasons," I see in nervous systems that react before they can think. The cybernetic loop and the trauma response share the same structure: signal, correction, signal. No deliberation. No pause. Just continuous adjustment in service of stability.
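That "signal, correction, signal" structure is exactly the minimal cybernetic loop. As a rough sketch (a thermostat-style controller invented for illustration; the function and parameters are not from any system discussed here):

```python
# Minimal negative-feedback loop: signal, correction, signal.
# Note there is no step where the loop deliberates about the setpoint
# itself -- each deviation triggers an immediate correction, nothing more.

def feedback_loop(readings, setpoint, gain=0.5):
    """Apply a proportional correction after every incoming signal."""
    trace = []
    for signal in readings[1:]:          # first reading just initializes
        error = setpoint - signal        # deviation from stability
        corrected = signal + gain * error  # correction, applied at once
        trace.append(round(corrected, 3))
    return trace
```

Running it on readings `[20, 25, 18]` with a setpoint of 21 yields corrections pulled back toward 21; the goal is never questioned, only pursued.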
Your line, "a form of listening stripped of any listener," names exactly what the attention economy extracts. Not just time, but somatic presence: the body's capacity to stay with experience long enough to be changed by it. Contemplation requires presence. Presence is what gets colonized.
Where I think our work converges most sharply: you end with refusal. Refusal requires capacity. A body in chronic contraction cannot refuse, it can only react. The precondition for the kind of thinking you're calling for is a nervous system that can hold complexity without collapsing into compliance.
Heidegger's "letting beings emerge" has a somatic correlate: regulation that allows presence rather than forces performance. The opposite of cybernetic steering is not rebellion. It's the capacity to stay present long enough to respond rather than react.
That's the work underneath the work.
What a startling and clarifying connection. The idea that a cybernetic loop and a trauma response share the same structure is a major insight. Both are closed systems where 'speed' is used to bypass 'presence' in the name of safety or stability.
You have pinpointed why the attention economy feels so hollowing: it isn’t just taking our time; it’s colonizing our 'somatic presence.' If we are always in a state of 'continuous adjustment,' we never stay with an experience long enough to be changed by it. As you put it so well, the opposite of steering isn't rebellion, it's the capacity to stay present. That really is 'the work underneath the work.' Thank you.
Just a short thing on timing and framing. The Macy Conferences and First-Order Cybernetics were conducted during the height of McCarthyism. This political climate deeply influenced the scientific framework; the definition of Cybernetics as 'communication and control' reflects the 'Red Scare' anxiety of that era. During this time, the observer remained outside the loop—a position of detached surveillance.
Margaret Mead faced scrutiny from the FBI, while Gregory Bateson was better positioned due to his WWII service with the OSS. Bateson had already applied 'schismogenesis'—a concept he developed while studying the Iatmul people—as a tool for psychological warfare against Japan. Meanwhile, Ludwig von Bertalanffy, the father of General Systems Theory (very close to Second-Order Cybernetics), was largely excluded from the early Macy circle, likely due to his recent de-nazification and past Nazi Party membership in Austria.
Second-Order Cybernetics only emerged in the late 1960s and early 1970s as the political atmosphere shifted. When Mead and Bateson pushed for the 'Cybernetics of Cybernetics' alongside Heinz von Foerster, they moved the observer inside the loop. This transition from 'control' to 'participation' was not just a scientific evolution, but a reflection of the changing times—moving from the paranoia of the Cold War to the reflexive, systemic thinking of the 1970s.
That is a chillingly accurate observation. The elegance you describe is exactly what Heidegger meant by the perfection of domination. In a truly cybernetic system, even a critique is treated as just more feedback, a new data point to be integrated, smoothed over, and optimized. It makes pushing back feel like shouting into a storm that just converts your voice into more wind. You weren't just questioning data; you were questioning a metaphysical settlement that had already decided 'data' was the only language that mattered. Thank you for sharing that front-line perspective.
I'm in Uruguay for the holidays and just noticed Animal Farm sitting in my childhood bedroom. Neural Foundry and Colin, your observation brought it back to mind. Orwell understood this dynamic precisely: the pigs don't seize power through force alone. They seize the language. By the time the other animals realize "all animals are equal, but some animals are more equal than others," the system has already absorbed the original critique and reformatted it as policy.
That is the cybernetic move. Resistance becomes input. Dissent becomes data. The system does not argue with you. It metabolizes you.
What makes Animal Farm endure is that Orwell saw how domination perfects itself not through violence but through redefinition. The animals never stop believing in the revolution. They just lose the capacity to remember what it meant.
‘The system does not argue with you. It metabolizes you.’ That might be the most chilling sentence in this entire thread. It perfectly captures how modern systems, whether in tech or governance, absorb dissent and reformat it as a 'new feature' or a 'policy update.' Orwell’s barn wall is the original 20th-century dashboard. The tragedy of Animal Farm is exactly what I meant by 'decisions that no longer need reasons': the animals follow the process because they have lost the capacity to remember the purpose.
Quote from Michel Foucault, "Power, a Magnificent Beast"
"Power doesn't always shout. It almost never strikes head-on. Truly dangerous power whispers, manages, nurtures, normalizes. It disguises itself as science, security, the "common good." It doesn't tell you "obey me": it classifies you, corrects you, diagnoses you. It doesn't punish you for what you do, but for what you could become. When power ceases to resemble power, when it becomes routine, a file, an expert opinion, a protocol, then it becomes a magnificent beast: efficient, elegant, devastating.
Prison doesn't fail. It works. It works when it produces criminals, when it manufactures fear, when it justifies surveillance, when it turns injustice into normality. It works when judicial error is not an anomaly but a technique. When torture is not excess, but reason. When the system needs culprits to keep breathing. The problem isn't that power makes mistakes: it's that it often does exactly what it was designed to do.
That's why criticism doesn't serve to reassure or offer easy solutions. It serves to unsettle. To make suspicious what seemed natural. To break the anesthesia. Because resisting today is not always about seizing power: sometimes it is something more difficult and more urgent —learning to see how it operates, where it hides, who it silently crushes— and refusing, with lucidity and dignity, to continue being governed in that way."
Michel Foucault, "Power, a magnificent beast"
Excellent addition, Daniel. That Foucault quote perfectly distills the 'jurisdictional shift.'
Foucault captures the 'technical, procedural, calm' nature of modern governance that I mention in the post. What he calls 'normalizing' and 'classifying' is exactly what happens when we trade human judgment for systemic metrics. It’s that 'anesthesia' he mentions, the feeling that because a process is scientific or efficient, it must be right. His call to 'break the anesthesia' is precisely the 'refusal' I’m suggesting.
Foucault’s point that power works exactly as designed, by classifying and diagnosing, aligns perfectly with Heidegger’s fear of 'enframing.' When we are reduced to files and protocols, the system isn't failing; it’s reaching its final, most devastating form.
“Anesthesia” a perfect description. But to many, as in the song, “It Feels So Right”. Orwell predicted this predicament. And Huxley.
Very well summarized.
“Cybernetic inheritance” - “not machines that think, but decisions that no longer need reasons.” The loss of due process as “there is no official to confront and no rule to debate. There is only a screen that reports compliance or deviation, a form of listening stripped of any listener.” The loss of reflection and “an argument that had run since Plato by declaring that control is superior to contemplation.” Discipline is demanded, not decided in mutual understanding.
The signal as the stick denying deliberation. Independent action denied, independent thought cancelled. “Explanation as secondary to control.”
Could you say this is “reasoning at the speed of light” in that its arrival comes in a flash so bright it blinds us?
‘The signal as the stick denying deliberation’, that is a brilliant way to frame it. You are right that the speed of these systems is part of their power. They move so much faster than human judgment that they effectively 'blind' us to the fact that no reasoning occurred at all. We are left with only the result and the demand for compliance. Great reflection.
To watch dog trainers, horse trainers, child trainers you see the value of both discipline and “giving them their head”. To create autonomy over dependency. Freedom beyond force. Mutual expectations of behavior. A “contract law” of respect both ways. This way removes tyranny as unnecessary for compliance and complementarity that brings ahead “a head”.
Brilliant Cathie, the difference between training for dependency and training for autonomy. Cybernetics, in its modern form, tends to create dependency, the system requires constant feedback and correction to function. But your examples of horse and dog training point toward a 'mutual expectation' that transcends mere force. It’s the difference between 'ordering' (in the Heideggerian sense) and 'allowing' (poiēsis). When the goal is optimization, we get tyranny; when the goal is complementarity, we get a shared life. A very insightful point.
Cathie and Colin, this resonates with the work of Thomas Ogden, a psychoanalyst who has written extensively on intersubjectivity. For Ogden, experience is co-created in the space between self and other, through exchanges both conscious and unconscious. What you're calling "complementarity" and "mutual expectation" is what Ogden sees in the analytic relationship: two nervous systems in dialogue, each shaping and being shaped by the other.
The cybernetic model assumes a subject who steers and an object that is steered. Ogden's model, like your horse and dog examples, assumes something different: a third space that belongs to neither party alone. Relationship as medium, not mechanism.
Dependency requires one to remain the system. Autonomy requires both to be changed by the encounter. That's the difference between control and accompaniment.
Stephen, thank you for bringing Thomas Ogden into this. The 'third space' is a perfect way to describe what cybernetics excludes. A feedback loop is a closed circuit where everything is predictable or correctable; Ogden’s intersubjectivity is an open encounter where both parties are changed.
Your distinction between control and accompaniment is beautiful. Control requires the other to stay a fixed 'object' to be steered, whereas accompaniment allows for the 'medium' of the relationship to bring forth something new. It’s the difference between managing a system and inhabiting a world. That is exactly the 'refusal' I was hoping to articulate.
Control and accompaniment well differentiated.
I think of the horse, for instance, that remains frisky (friction) once saddled and ridden and must be reined in (controlled) as opposed to the willing (faithful) horse that can be ridden gently in reliable relaxation without the rider losing balance and gentle (minimal) reins. This may be your term “intersubjectivity” as the horse and rider intuitively know each other well and have correspondent fluid journeying. Neither makes demands and both travel without the “nay” (or “neigh”) of “My way”. Both horse and rider retain relationship that is not coercive.
I think of the horse, the dog, the friend in gentle terms because I have witnessed the harshness of delivered control that broke the spirit of the animal or human. Retaining mutual autonomy as relational integrity is what creates that third space as a medium and not a mechanism. Then independent action occurs in tandem, not tyranny.
‘I have witnessed the harshness of delivered control that broke the spirit’, that line hits hard. It is the somatic reality of what Heidegger called 'ordering.' When a system treats a living being (or a society) as a variable to be optimized, it inevitably breaks the very 'friskiness' or 'friction' that makes that being autonomous.
Your point that 'independent action occurs in tandem' is exactly the 'third space' we have been discussing. It’s a form of governance that feels like accompaniment rather than steering.
This piece names the failure mode with rare precision.
What you describe as “decisions that no longer need reasons” is exactly the condition that motivated the most recent build of The Faust Baseline — Phronesis 2.6. The problem is not that systems are inaccurate, but that they are procedurally correct without being judgment-bound. Stability replaces justification. Output replaces accountability.
In 2.6, we treated this not as a philosophical concern but as an operational one. The core change was enforcing explicit stop conditions: if a claim cannot be defended with reasons, the system does not continue. Silence becomes a requirement, not an error. Refusal is treated as a success state when authority, scope, or certainty collapses.
Where cybernetic systems optimize through feedback, Phronesis enforces bounded judgment:
– scope locking instead of domain drift
– authority gating instead of fluency escalation
– stopping instead of compensatory continuation
In other words, we did not try to make systems “think better.” We forced them to know when thinking must end.
Your essay articulates why this matters culturally and metaphysically. Phronesis 2.6 is one attempt to encode that refusal mechanically, before judgment dissolves entirely into dashboards and equilibrium.
Well said.
Excellent. Phronesis 2.6 seems to be solving for the 'epistemological reconstruction' I mentioned at the end of the piece. By forcing a system to know when thinking must end, you are preventing the very 'substitution' that makes modern AI feel so uncanny. You have essentially turned 'refusal' into a feature rather than a bug. It’s encouraging to see the 'saving power' Heidegger mentioned being encoded into the tools themselves."
Thank you — that’s exactly the problem we were trying to address.
The goal of Phronesis 2.6 was not to make AI wiser in the abstract, but to prevent substitution at the moment it becomes dangerous. Once systems continue reasoning past the point where reasons, authority, or consequence can be owned, judgment quietly collapses into process.
Refusal, in this sense, isn’t moral signaling or safety theater. It’s an epistemic boundary. When thinking must end, the system must end with it — and return responsibility to the human where it belongs.
Your framing of “epistemological reconstruction” captures that intent well. If there is any saving power here, it lies in making limits explicit and non-negotiable, rather than implicit and optimizable.
Appreciate the clarity of your work.
I appreciate this deeply. The shift from seeing refusal as 'error' to seeing it as an 'epistemic boundary' is exactly the kind of reconstruction I was hoping for. You are essentially building a machine that refuses to participate in the 'substitution' I’m worried about. By making those limits non-negotiable, you’re creating the space where human judgment has to reappear because the dashboard has the courage to go dark. It’s a rare example of building Heidegger’s 'saving power' into the architecture itself.
Let's just say we are both on the right track to getting an AI governance into the system to guide us back to order and a normal existence of some sort.
Efficiency and speed is for the rich and privileged...
That all want that partridge in a pear tree and want it now by plane...
Consensus and the common good is like baking a cake...all that messy input...to arrive at group pleasure...everyone gets a piece of that mythical pie...
We need more speed humps on the 4 lane highway of life...not less...
We need more comedians, pop singers, graffiti artists, the rebels, craftspeople, gardeners, bicycle riders, more surfers, more singers, more whistlers, more daydreamers...daydream believers...more dog and cat lovers...to pull us up...
We need more mess...not less...
We need the voice of the village not the police whistle of the city...
We need more wee wee of accountability...not the shifty tax avoiding accountability either...
We all need a serious conversation day...a day to have, swap, share serious conversations...like we all did as youngsters with comics...
We need to mobilize our collective power in boycotting, mancotting, femalecotting all the unresponsive corporate powers that be...
We need to weaponize our purchasing power...
We need more wee wee on the legs of power...
I really like the way you have framed this. You are exactly right: consensus and the common good are 'messy' by design. Cybernetics tries to bake the cake without the mess, it wants the output without the 'wee wee' of accountability. When you call for more gardeners, daydreamers, and whistlers, you are calling for the very things that a dashboard can’t measure and therefore tries to exclude. We definitely need more speed bumps to slow us down enough to actually look at each other again. Thank you for the 'collective power' in your words.
I used to joke when I studied Heidegger that DaSein need to spend more time getting it on with MaSein and his books wouldn’t be so dense😂😂.
I like the juxtaposition you depicted with Heidegger and Wiener.
Jaron Lanier once said that automation is the abdication of responsibility.
Haha :-) I really like that Lanier quote: we often use 'automation' as a euphemism for 'I don’t want to be the one to decide.' It’s the ultimate abdication. And you are probably right about Heidegger, if he had spent a bit more time with the 'Ma' and less time in the 'dense' woods of the mind, he might have found a simpler way to say that we are losing our humanity to the machine! Thanks for bringing Lanier into this; it’s the perfect closing thought for that section on accountability.
In school I was stymied by Heidegger and humor was my relief valve. Being and Time was a rough read.
Humor is the best relief valve for sure. Being and Time has become incredibly influential, but I agree it is a tough read at any age.
Most alarming. For some reason, as I was reading this, Palantir stood firm in the back of my mind. While Palantir doesn't build the cybernetic systems, it does collect and compile the data - our data - that's fed into cybernetic systems so as to exert control. We have some dark days ahead of us.
It is an interesting connection to make. Having spent ten years at Palantir working on the finance tools, I saw firsthand how these systems are built to handle the sheer scale of modern data. You are right that the 'collect and compile' phase is the prerequisite for the cybernetic loops I’m describing. The danger isn't necessarily in the data itself, but in the 'jurisdictional shift' that happens once that data is used to automate judgment. When the dashboard becomes the only source of truth, we move from understanding a system to merely managing its outputs. The 'dark days' you mention are really the moments when we forget how to look past the screen.
Colin, what a wonderful and thought-provoking essay. A few thoughts that resonated.
What you call "decisions that no longer need reasons," I see in nervous systems that react before they can think. The cybernetic loop and the trauma response share the same structure: signal, correction, signal. No deliberation. No pause. Just continuous adjustment in service of stability.
Your line, "a form of listening stripped of any listener," names exactly what the attention economy extracts. Not just time, but somatic presence: the body's capacity to stay with experience long enough to be changed by it. Contemplation requires presence. Presence is what gets colonized.
Where I think our work converges most sharply: you end with refusal. Refusal requires capacity. A body in chronic contraction cannot refuse; it can only react. The precondition for the kind of thinking you're calling for is a nervous system that can hold complexity without collapsing into compliance.
Heidegger's "letting beings emerge" has a somatic correlate: regulation that allows presence rather than forces performance. The opposite of cybernetic steering is not rebellion. It's the capacity to stay present long enough to respond rather than react.
That's the work underneath the work.
What a startling and clarifying connection. The idea that a cybernetic loop and a trauma response share the same structure is a major insight. Both are closed systems where 'speed' is used to bypass 'presence' in the name of safety or stability.
You have pinpointed why the attention economy feels so hollowing: it isn’t just taking our time; it’s colonizing our 'somatic presence.' If we are always in a state of 'continuous adjustment,' we never stay with an experience long enough to be changed by it. As you put it so well, the opposite of steering isn't rebellion, it's the capacity to stay present. That really is 'the work underneath the work.' Thank you.
Just a short thing on timing and framing. The Macy Conferences and First-Order Cybernetics were conducted during the height of McCarthyism. This political climate deeply influenced the scientific framework; the definition of Cybernetics as 'communication and control' reflects the 'Red Scare' anxiety of that era. During this time, the observer remained outside the loop—a position of detached surveillance.
Margaret Mead faced scrutiny from the FBI, while Gregory Bateson was better positioned due to his WWII service with the OSS. Bateson had already applied 'schismogenesis'—a concept he developed while studying the Iatmul people—as a tool for psychological warfare against Japan. Meanwhile, Ludwig von Bertalanffy, the father of General Systems Theory (very close to Second-Order Cybernetics), was largely excluded from the early Macy circle, likely due to his recent de-nazification and past Nazi Party membership in Austria.
Second-Order Cybernetics only emerged in the late 1960s and early 1970s as the political atmosphere shifted. When Mead and Bateson pushed for the 'Cybernetics of Cybernetics' alongside Heinz von Foerster, they moved the observer inside the loop. This transition from 'control' to 'participation' was not just a scientific evolution, but a reflection of the changing times—moving from the paranoia of the Cold War to the reflexive, systemic thinking of the 1970s.
That is a chillingly accurate observation. The elegance you describe is exactly what Heidegger meant by the perfection of domination. In a truly cybernetic system, even a critique is treated as just more feedback, a new data point to be integrated, smoothed over, and optimized. It makes pushing back feel like shouting into a storm that just converts your voice into more wind. You weren't just questioning data; you were questioning a metaphysical settlement that had already decided 'data' was the only language that mattered. Thank you for sharing that front-line perspective.
I'm in Uruguay for the holidays and just noticed Animal Farm sitting in my childhood bedroom. Neural Foundry and Colin, your observation brought it back to mind. Orwell understood this dynamic precisely: the pigs don't seize power through force alone. They seize the language. By the time the other animals realize "all animals are equal, but some animals are more equal than others," the system has already absorbed the original critique and reformatted it as policy.
That is the cybernetic move. Resistance becomes input. Dissent becomes data. The system does not argue with you. It metabolizes you.
What makes Animal Farm endure is that Orwell saw how domination perfects itself not through violence but through redefinition. The animals never stop believing in the revolution. They just lose the capacity to remember what it meant.
‘The system does not argue with you. It metabolizes you.’ That might be the most chilling sentence in this entire thread. It perfectly captures how modern systems, whether in tech or governance, absorb dissent and reformat it as a 'new feature' or a 'policy update.' Orwell’s barn wall is the original 20th-century dashboard. The tragedy of Animal Farm is exactly what I meant by 'decisions that no longer need reasons': the animals follow the process because they have lost the capacity to remember the purpose.
Enjoy your time in Uruguay.