Tech by Blaze Media


Cybernetics promised a merger of human and computer. Then why do we feel so out of the loop?

Why 'do it for us' tends to turn into 'do it to us.'

It began in the crucible of a world at war. The word cybernetics was coined in 1948 by the MIT mathematician Norbert Wiener, a man wrestling with the urgent problem of how to make a machine shoot down another machine. He reached back to the ancient Greek kybernētēs, the steersman, the one who guides and corrects. Plato had used it as a metaphor for governing a polis. Wiener used it to describe a new science of self-governing systems, of control and communication in the animal and the machine. The core idea was feedback, a circular flow of information that allows a system to sense its own performance and steer itself toward a goal.

The idea was not about mechanics but about behavior. The focus shifted from what things are to what they do. A thermostat maintaining the temperature of a room, a human body maintaining homeostasis, a pilot correcting the flight path of an airplane; all were, in this new light, functionally the same. They were all steersmen. The conciseness of the concept was seductive, its implications unsettling. It suggested a universal logic humming beneath the surface of both wired circuits and living tissue, blurring the line between the made and the born.
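The loop of sensing, comparing, and correcting can be sketched in a few lines. Below is a toy model of the thermostat, with made-up constants rather than real physics: a heater switches on when the room falls below the setpoint, off when it rises above, while heat leaks back outside. Every name and number here is illustrative, not drawn from any actual device.

```python
def run_thermostat(steps, setpoint=20.0, outside=5.0, band=0.5,
                   leak=0.1, power=2.0):
    """Toy feedback loop: a bang-bang thermostat in a leaky room.
    All constants are illustrative, not physical."""
    temp, heater = outside, False
    for _ in range(steps):
        # Sense and steer: compare the room to the goal, switch the heater.
        if temp < setpoint - band:
            heater = True
        elif temp > setpoint + band:
            heater = False
        # Act: the heater warms the room; heat leaks back outside.
        temp += (power if heater else 0.0) - leak * (temp - outside)
    return temp
```

Run long enough, the temperature settles into a narrow oscillation around the setpoint: the system senses its own performance and corrects its course, the same circular flow of information Wiener described.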

You shape the algorithm, and the algorithm shapes you.

The primordial cybernetic device was James Watt’s centrifugal governor, that elegant pirouette of spinning weights that tamed the steam engine in 1788. As the engine raced, the rotating balls swung wide, closing a valve to slow it; as the engine slowed, they fell, opening the valve again. It was a perfect, self-contained conversation.

But it was the Second World War that gave birth to the theory. Human reflexes were no longer fast enough for the new calculus of aerial combat. Wiener and his colleagues were tasked with solving the “air defense problem,” which was really a problem of prediction. They treated the enemy pilot, the gun, and the radar as a single, closed-loop system, each reacting to the other in a lethal dance. By the war’s end, as one analyst starkly put it, autonomous machines were shooting down other autonomous machines in the “first battle of the robots.”

In the Cold War that followed, cybernetics became a tool of ideological contest. In the West, it was the logic of the military-industrial complex, of corporate automation and the game theory of nuclear deterrence humming away in the computers at Project RAND. It promised optimization and control.

Yet the idea proved too fluid to be contained. While men in uniform were designing command-and-control networks, Stewart Brand was on the West Coast, publishing the Whole Earth Catalog. He filled its pages with cybernetic theory, reimagining it not as a tool for top-down control but for bottom-up, self-regulating communities. The catalog itself was a feedback loop, constantly updated by its readers. For a generation of commune-dwellers and future Silicon Valley pioneers, cybernetics was the grammar of personal liberation and ecological harmony. Computers, Brand wrote in Rolling Stone, were “coming to the people.”



The Soviets, meanwhile, followed a more jagged path. Initially denouncing cybernetics as a “bourgeois pseudoscience,” they performed a complete reversal after Stalin’s death. Here was a science, they realized, that could perfect the planned economy. Visionaries like Anatoly Kitov and Victor Glushkov dreamed of a vast, nationwide computer network called OGAS, an electronic nervous system that would link every factory to a central hub in Moscow. It was an ambitious plan for “electronic socialism,” a rational, data-driven alternative to the brute-force dictates of the past. The system, they hoped, would offer a technocratic antidote to personal tyranny. OGAS was never fully built, stalled by bureaucracy and technical limits, but the dream itself was telling. Both superpowers saw in the feedback loop a reflection of their own ambitions: one for market efficiency, the other for state perfection.

Perhaps the most vivid incarnation of the cybernetic dream was Project Cybersyn in Salvador Allende’s Chile. From 1971 to 1973, the British cybernetician Stafford Beer designed a nerve center for the Chilean economy. In a futuristic operations room that looked like a set from "Star Trek," managers sat in molded white chairs, surrounded by screens displaying real-time production data fed from factories across the country via a network of telex machines. It was an attempt to steer a national economy in real time, to keep it in a “dynamic equilibrium” against the shocks of strikes and embargoes. Cybersyn was a short-lived project, ending with the 1973 coup, but it remains a powerful symbol of the cybernetic ideal: a society as a single, responsive, controllable system.

The feedback loop was not confined to the physical world. It began to shape our fictions, which in turn shaped our reality. William Gibson, who famously knew little about computers, coined the word “cyberspace” in his 1984 novel "Neuromancer." The vision was so compelling it seemed to will itself into existence, providing the language and the imaginative blueprint for a generation of technologists building the early internet and virtual reality. Neal Stephenson’s 1992 novel "Snow Crash" gave us the “metaverse” and the “avatar,” terms that have since migrated from fiction to corporate strategy. Cyberpunk literature provided the prototypes for the world we now inhabit.

Today, the word “cybernetics” feels archaic, a relic of a retro-futurist past. Yet its principles are more deeply embedded in our lives than Wiener could have imagined. We are all entangled in cybernetic loops. The social media algorithms that monitor our clicks to refine their feeds, which in turn shape our behavior, are feedback systems of astonishing power and intimacy. You shape the algorithm, and the algorithm shapes you. A self-driving car navigating city traffic is a cybernetic organism, constantly sensing, processing, and acting. Our smart homes and wearable devices are nodes in a network of perpetual, low-grade feedback.

We have built a world of steersmen, of systems that regulate themselves. The question that lingers is the one Wiener implicitly asked from the beginning. In a world of automated, self-correcting systems, who, or what, is charting the course?

Stephen Pimentel

Stephen Pimentel is an engineer and essayist in the San Francisco Bay Area, interested in the classics, political philosophy, governance futurism, and AI.