
The future has never looked older

Tyler Coburn

Ergonomic Futures

Published: 31.12.2017


However convincingly space agencies justify their enormous budgets, there’s one gratuitous expense: training astronauts for free fall. Simply pay a visit to fraternity row, where potential spacemen routinely defy the laws of gravity.

Zero-G is the post-planetary equivalent of a keg stand. Astronauts don’t hang upside-down, but the effect is much the same: the body’s fluids congregate in the chest and head, puffing faces and pressurizing skulls. Balance soon goes out of whack; G-force makes the adrenaline flow. Continual swelling of the optic nerve causes some to become…farsighted!

For spacemen of the sixties, the poison of choice wasn’t alcohol, but drugs. In the early years of the decade, NASA commissioned two researchers to design a human best adapted to space—who lived in “space qua natura.” Their model being, astoundingly, could breathe without lungs and space walk without suits.

To describe this new human, the researchers invented the term “cyborg.”

Essential to their vision were exogenous devices: fuel cells that replaced the lungs, intravenous feeding tubes, pressure pumps injecting pharmacological cocktails to keep radiation and high blood pressure at bay. When functioning effectively, this cybernetic system would be so integrated into the user as to operate “unconsciously.”

When the system didn’t function effectively, the human element was presumed to be the problem. Spacemen deprived of sensory and motor variation, for example, have been known to experience “psychotic-like states.” In such instances, the researchers advised that drug infusions be triggered remotely from earth or by a fellow crew member.

In other instances, a human could exhibit maladaptive behavior. The exogenous devices and subcutaneous tubes might be misperceived as threatening and controlling, not ingenious and benign. The pharmacological pumps, despite their avowed function, appear as “palliation” for the depressions of the cyborg complex—the anxieties of being haplessly invaded by the future.

For these (and most other) scenarios, drugs were the prescribed solution.

The first cyborg was thus a human freed from biological limitations, yet bound by imperfect devices and doped to ease the pain of those imperfections. In the decades since, genetic engineering has been learning to fix such problems on the assembly line. Speaking at a 2014 symposium, George Church—the man who gives the field an avuncular face—identified gene variants pertinent to survival in extra-terrestrial environments: LRP5 G171V for extra-strong bones, MSTN for lean muscles, GHR for lowered cancer risk, and so on. Future generations won’t suffer come-downs or crushing hangovers; they’ll be built to party from our solar system to wherever.


A few years ago, a group of UFO believers approached Shara Bailey, an anthropologist working on dental morphology in hominins and early humans. They claimed to have found an ancient jaw of tantalizingly unknown provenance…

Shara agreed to talk with the television reporter covering the story, stating something to the effect of: “In my professional opinion, this jaw is a fake. There’s nothing on earth that looks like this.”

Her first sentence was cut from the broadcast segment. Shara has been a darling of the believer community ever since.


Nothing under the sun, no matter how unbelievable or fantastic, is immune to the pressures of evolution. Take science-fiction. The Force, the mind meld—the entire field of psionics, for that matter—have the look of yellowing comic books, the taste of stale popcorn. They would have gone the way of the dodo, if not for the magic of capital. Hollywood has proved to be more powerful than natural selection, building menageries in the form of franchises, gilding cages for endangered ideas. The future has never been better preserved; the future has never looked older.

It wasn’t always this way. What had historically loitered around myth and spiritualism, as second sight and sixth sense, only approached the field of science in the 1930s. Laboratories began to run experiments in “extra-sensory perception,” tasking subjects of exceptional ability to guess cards from a custom-made deck, or “receive” drawings at a distance, then recreate them by hand.

ESP drew still more interest from science-fiction writers, whose protagonists could deploy their psionic powers far beyond the lab. Still, these abilities came at a cost; the crippled, the deaf-blind, and the mutant were frequent recipients. Our brains already monopolize the body’s energy reserves, and additional fuel must come from somewhere…

Psionics plays several roles in the sci-fi imaginary—most politically, as the binding force of a group mind. One community in Robert A. Heinlein’s Methuselah’s Children, for example, makes no distinction between its members, who collectively manipulate the genetics and ecology of their world. The group mind, in this scenario, bypasses the strictures of possessive individualism and breaks with social norms. Brains may be bigger or smaller—may belong to different genders, races, and creeds—but they all have a place in the noosphere.

Alas, such ideals are rarely forthcoming, and like many technologies of the twentieth century, psionics found immediate application in warcraft. Spurred by reports of psychic training on the far side of the Iron Curtain, the CIA funded two initiatives, beginning in the 1970s, to practice “remote viewing.” Early prototypes of drones, these psychic spies flew the extra-sensory airstreams, scanning for enemy bases, terrorists, and missing fighter-bombers. Their success rate was better than what a pigeon photographer would have achieved, but not sufficient to keep the program intact. It ended in 1995, conceding defeat to the ascendant machine eye.

The Cold War was a renaissance of psionic sci-fi, before “espionage” became “counter-terrorism,” when secrets lent themselves less to torture than to telepathic extraction. But the genre changed with the times. We no longer need to imagine a group mind, because we’ve found one in cyberspace, nor wait for telepathy, as telepathy-like machines will do. We might find irony in the fact that disability—once a seeming prerequisite for ESP—now gets support from these machines, wherein a thought can move an avatar arm, or reach brains in other countries as quick flashes of light, to be decoded as digits, then letters. Perhaps it’s time to revise that famous phrase: She thinks, therefore I am.

A moving arm, a flashing light may be as far as we get. Miguel Nicolelis, who predicted the coming of “neurosocial networking,” doubts that emotions, memories, and higher cognitive states will ever be capable of transmission. Could it be that these qualities, so integral to our notion of self, are resilient to telepathic capture? Is the soul digging trenches and fortifying ranks?

Or is this a matter less of the soul than of the science of individuation, which holds that no two brains are exactly alike, and thus no thought can share the same neuronal position? If so, then achieving a group mind would require a feat of Borgesian proportions: seven billion, four hundred million dictionaries to be written, along with the means of translating between them. Before the printing press, we enslaved ourselves to transcription; soon, transcription may enslave the machines.


What is the face, the figure of humanity—life lived on the 50th percentile? We can’t define the deviant without first inventing the “norm.”

This concept joined the social sciences, in the mid-nineteenth century, through the work of Adolphe Quetelet. Drawing on biological and criminological data, the statistician sought to determine the physical and moral qualities of “the average man.” A harbinger of the eugenics movement, this “man” was an empirical fiction who grew less lifelike, rather than more precise, with each addition of quantitative information.

Quetelet’s model made skeptics of people like Francis Galton, because it implied that taller, smarter, and morally superior individuals also deviated from the norm. Galton responded by devising his own model, one that emphasized the median over the mean, or in other words, where one stood in the rank, not who comprised the average. The norm here became less observed than ideal: an aspiration for social betterment that gave cause for selective breeding.

“The average man” was the first of the century’s many hallucinations, culminating with Galton’s attempt to visualize the biological aspects of the “criminal type.” The resulting images, composited from multiple photographs of unique individuals, were received at the time as optical equivalents of large statistical tables. They showed the faces of true evil to be data manifest.

Composite photography went the way of many nineteenth-century pseudosciences, growing as obscure as the hair and ears of its subjects. Nowadays, though digital technology can intensify this technique—pixel by pixel, layer upon infinite layer—the norm no longer lives on the surface of images, but deep in the grain of the self.

Thanks to the Human Genome Project, we’ve at last revised the “rough draft” of humanity, producing a “consensus” DNA sequence that, as David Serlin qualifies, is “like all composites, a fiction.” What, after all, defines the normality of a genome in constant change?

By sequencing and patenting our genes, we follow in the footsteps of Quetelet: adding data to “the average man,” observing the scope of deviation. Engineering the perfect human, however, requires a leap in Galton’s direction. Just as Galton had to rework Quetelet’s model, in justifying eugenic practices, so too must genetics do more than plot and “explain” our genome, by claiming the authority to improve its every last fault.

What is the figure, the future of humanity—life engineered for the 50th percentile? In assimilating to the genomic norm, we’ll forgo much more than deviance. Genetic diversity will lessen, and vulnerabilities increase: minor viruses that grow to epidemic proportions, endgames that prey on our lack of divergence. The tree, stripped of its branches and leaves, will memorialize something we’ve forgotten to remember.


At one time, for practical reasons, the heating systems of museums found lodging in their sofas. The ottoman of the Louvre’s Salon Carré—that infamous object from Henry James’s The American, where the art of seduction was ever on display—contained a coal grate to keep bodies and passions inflamed. In warmer seasons, when the libido can more or less heat itself, such seats became park benches: resting stops for “aesthetic headaches” as much as for wearied amblers and would-be picnickers.

Contemporary museums, in contrast, are decidedly less commodious. We still chance upon a sublime artwork from time to time, then stumble back in disbelief, yet rarely are our Stendhal swoons caught by plush upholstery. A wooden bench might break our fall, or a daybed on holiday from its analyst. More often than not, we hit the floor.

The museum seat is one in a constellation of display structures that increasingly cater to the “disembodied” spectator. This peculiar human, comprising merely two eyeballs and a brain, began haunting museums as early as the mid-nineteenth century—and prompting shifts in institutional design. Joel Sanders and Diana Fuss have traced the seating of London’s National Gallery, for example, which began in a private residence in the early century, where furniture could be moved at the viewer’s discretion. However, once the Gallery relocated to the heart of the city, only a few chairs remained, implicitly fixed in their positions. An engraving of the era depicts viewers familiarizing themselves with the new norm; they stand, they look, and they contemplate.

By the time MoMA opened its doors in 1939, the disembodied spectator had assumed modern airs, no longer soft-shoeing in search of moral education, but flowing through the galleries like a shopper through a department store. Amidst this marvelous circulation, the museum seat appeared increasingly lost: a relic of the time when a body could suffer the exhaustion, an eye the strain, that it uniquely relieved. The first MoMA benches came with backs, but those would disappear soon after, reducing the museum seat to a signpost for significant artwork—an entreaty to give a little more from our shrinking attentional wallets.

Nowadays, a few museums in the world carry new types of seats, as uncomfortable as MoMA’s ascetic units, albeit for a very different reason. Ergonomically designed for future bodies, they prescribe corporeal norms that no living human can fit. And so they’ll wait, like the museums themselves, until these bodies come along to fill them…


Tyler Coburn

is an artist and writer based in New York. Coburn’s writing has appeared in frieze, e‑flux journal, Mousse and Rhizome, among others. His performances, sound works and installations have been presented at the Whitney Museum of American Art, New York; South London Gallery; Kunstverein Munich; LAXART, Los Angeles, among others.
