

CITY JOURNAL
Spring 2013

A New Moral Treatment
by James Panero

Humane institutionalization can help the mentally ill and protect society.

If it’s true that “men moralise among ruins,” as Benjamin Disraeli wrote, the ruins of America’s nineteenth-century mental institutions should invite some serious reflection. Built between 1850 and 1900, these crumbling edifices speak to our onetime dedication to caring for the mentally ill. Almost all were designed on the Kirkbride Plan, named for Pennsylvania physician Thomas Story Kirkbride, author of an influential treatise on the role of architecture and landscape in treating mental disorders. Even in their dilapidated state, it’s possible to see how the buildings, which followed a method of care called the “moral treatment,” gave the mentally ill a calming refuge from the gutters, jails, and almshouses that had been the default custodians of society’s “lunatics.”

Unfortunately, in the middle of the twentieth century, as asylums became grossly overcrowded and invasive treatments aroused public concern, the moral treatment came to seem immoral. The eventual result was the process known as deinstitutionalization, which steadily ejected patients from the asylums. Instead of liberating the mentally ill, however, deinstitutionalization left them—like the asylums that once sheltered them—in ruins. Many of today’s mentally ill have returned to pre-Kirkbride conditions and live on society’s margins, either sleeping on the streets or drifting among prisons, jails, welfare hotels, and outpatient facilities. As their diseases go untreated, they do significant harm to themselves and their families. Some go further, terrorizing communities with disorder and violence. Our failure to care for them recalls the inhumane era that preceded the rise of the state institutions. The time has come for new facilities and a new moral treatment.

When Kirkbride published On the Construction, Organization, and General Arrangements of Hospitals for the Insane in 1854, he expressed concerns that remain relevant today. “The plan of putting up cheap buildings in connection with county or city almshouses for the care of the insane poor, and under the same management, cannot be too severely condemned,” he wrote. “Such structures are sure to degenerate into receptacles of which all humane persons will, sooner or later, be heartily ashamed.” Proper care, Kirkbride continued, occurs “only in institutions specially provided for this class of disease.” He concluded that “the simple claims of a common humanity . . . should induce every State to make a liberal provision for all its insane, and it will be found that it is no less its interest to do so, as a mere matter of economy.”

Kirkbride’s arguments were compelling, and state legislatures across the country began building asylums to his specifications, spending millions to care for the mentally ill during the late nineteenth century. Set in serene landscapes and composed of substantial brick edifices, the asylums resembled palatial estates. The Hudson River State Hospital for the Insane in Poughkeepsie, New York, was designed in 1867 by Frederick Clarke Withers in High Victorian Gothic; its grounds were the work of Frederick Law Olmsted and Calvert Vaux, the same duo responsible for Central Park in Manhattan. Buffalo State Asylum for the Insane, begun in 1870, was designed by H. H. Richardson in a style that marked the advent of his Romanesque Revival period, with grounds again planned by Olmsted and Vaux. The enormous New Jersey State Lunatic Asylum at Morristown owed its founding to the persistence of Dorothea Dix, a nurse who famously lobbied state legislatures for funding for the mentally ill. Designed by Samuel Sloan in Second Empire Victorian in the early 1870s, the building was said to have the largest continuous foundation in the United States until the construction of the Pentagon about 70 years later.

Under the Kirkbride Plan, patients were treated in the beautiful, calming surroundings of such institutions as the Buffalo State Asylum for the Insane.

 

At a time when the medical science of mental illness was in its infancy, the Kirkbride Plan created alternative, protected worlds for patients. It echoed many of today’s more holistic approaches to treatment by encouraging patients to participate in social activities, games, and crafts. Kirkbride institutions often sported their own baseball diamonds, golf courses, bakeries, bowling alleys, ice cream shops, dairy farms, gardens, and stages for plays and other performances.

But in the twentieth century, a shadow fell over the Kirkbride asylums, as doctors there began using more invasive procedures. The Austrian psychiatrist Manfred Sakel introduced insulin shock therapy, now known as insulin coma therapy, in the 1930s. Electroshock therapy arrived from Italy soon after. Both treatments induced seizures to alter brain chemistry in patients with depression and schizophrenia. In 1949, the Portuguese neuropsychiatrist Egas Moniz won a Nobel Prize for developing the frontal lobotomy, which he had invented in 1935. Walter Freeman, a clinical neurologist in Washington, D.C., further popularized the treatment through his own outpatient procedure, which came to be known as the transorbital, or “ice-pick,” lobotomy. Freeman performed the ten-minute operation—in which he inserted long metal rods around the eyeballs of his patients and penetrated, stirred, and severed their frontal brain matter—some 3,500 times. Among the recipients of lobotomy was 23-year-old Rosemary Kennedy, the future president’s sister, who wound up severely disabled by the procedure in 1941.

As her siblings embarked on a lifetime crusade against institutionalization and invasive treatment, they joined a growing chorus that included civil libertarians and conscientious objectors who had been assigned to work in the asylums during World War II. Starting in the 1950s, these critics could tout powerful new antipsychotic drugs, such as Thorazine, as an alternative to institutionalization. By temporarily blocking receptors in the dopamine pathways, the drugs could inhibit psychotic hallucinations and produce a semblance of clarity for many patients—so long as the drugs were regularly administered. The miracle medicines seemed to obviate the need for separating the mentally ill from the rest of society.

Critics of institutionalization protested, too, about the asylums’ crowded conditions. By midcentury, some psychiatric wards designed to house 50 to 60 patients were struggling to accommodate twice that number. In 1946, Life ran a photo essay subtitled MOST U.S. MENTAL HOSPITALS ARE A SHAME AND A DISGRACE. Two years later, journalist Albert Deutsch published an exposé of the institutions called The Shame of the States. The apparent ease with which the government could commit patients and subject them to invasive treatments also alarmed critics. In 1955, fringe elements of the American Right, along with L. Ron Hubbard’s new Church of Scientology, described a congressional proposal to expand mental health care in the Alaska territory as “Siberia, USA.” The Santa Ana Register ran an editorial denouncing the proposed Alaska-based institution as a “concentration camp for political prisoners under the guise of treatment of mental cases [and] our own version of the Siberia slave camps run by the Russian government.”

In 1963, President Kennedy laid the groundwork for an alternative to the asylums, proposing a federal law that, once passed, provided the seed money for a system of decentralized Community Mental Health Centers (CMHCs). These facilities sought to remove the mentally ill from the state-run asylums and to incorporate them into more social, usually urban, settings. And two years later, Medicare and Medicaid, creations of President Lyndon Johnson’s Great Society, erected a new funding apparatus that effectively, if unintentionally, drove patients out of the asylums. Since mental institutions had always been a state responsibility, the two federal programs deliberately excluded state mental-hospital patients between the ages of 22 and 65 from coverage. Patients passing through the CMHCs, by contrast, could now be eligible for partial federal reimbursements. The ability to shift 45 percent of treatment costs, on average, to Washington proved too great a temptation for the states, which promptly began emptying their asylums.


Further, at the same time that patients were driven from institutional care, civil libertarians set up legal barriers to committing patients to the institutions that remained. Needing treatment was no longer enough to be committed; “patients now had to be a danger to themselves or others,” writes E. Fuller Torrey, a psychiatrist and the nation’s leading critic of deinstitutionalization. That standard may sound reasonable in theory, but in practice, it meant that only extremely violent patients could be committed—often, only after they’d acted on their violent impulses. Adjusting for population growth, Torrey calculates that over 90 percent of patients who would have been committed before deinstitutionalization are now out in the world.

As for the Kirkbride buildings, many fell into ruin. Some were converted into condominiums, stores, and retirement villages. In upstate New York, the Matteawan State Hospital for the Criminally Insane is now part of the Fishkill Correctional Facility—a telling transformation that mirrors the journey, taken by so many mental patients, from asylums to prisons. And in recent years, documentarians and photographers have shown a renewed interest in the once-grand buildings, often gaining access surreptitiously to record what remains there. The fascination can seem lurid, as though the voices of schizophrenia still echoed through the halls. Layers of paint have flaked into mountains of dust. In the winter, stalagmites of ice reach up from the floors. An odd piece of furniture, an old wheelchair, or an abandoned loom may serve as a reminder of former residents’ lives and activities. But what the cameras mainly find is the buildings’ haunting beauty. Soaring ceilings, ornate door hinges, and elaborate tile work all speak to the care and expense that went into their design and construction. Websites like Kirkbridebuildings.com now compile photographs and host forums for discussion. Community preservation groups have fought to save the buildings from demolition.

When the CMHCs replaced the asylums, some backers of the new arrangement argued, in the words of President Carter’s Commission on Mental Health, that it would allow the “greatest degree of freedom, self-determination, autonomy, dignity, and integrity of body, mind, and spirit for the individual.” Unfortunately, this self-determination proved far worse for the severely mentally ill than the state institutions ever were. Within a year of leaving institutional care, according to researchers, half of all mentally ill patients fail to take their prescribed antipsychotic medications—a terrifying prospect for the vast numbers of patients who left the asylums under deinstitutionalization. Between 50 and 60 percent of patients discharged from state institutions were schizophrenic. Another 10 to 15 percent had been diagnosed with manic-depressive illness or severe depression.

The CMHCs have proven woefully inadequate at caring for this massive population. They offer far less supervision, professional care, and patient coordination than the old state institutions did. Also, with an eye on the bottom line, the managed-care companies that run them generally avoid taking on costly patients with severe illnesses.

For evidence of the failure of the CMHCs, just look at the way so many mentally ill people actually live today. Deinstitutionalization has consigned them to a terrifying roller-coaster ride among prisons, emergency rooms, and the streets. Public psychotic episodes, now a common sight in American cities, are, at the very least, frightening examples of the loss of social order. Last year, a New York homeless man made headlines when he was caught on video making angry outbursts in front of children. Calling himself “Adam Sandler” and dressed as the Sesame Street character Elmo, the man posed for money in Central Park and routinely shouted anti-Semitic rants while in costume, sending children and parents fleeing. The behavior earned him a profile in the New York Times, which revealed that he had once operated a pornographic website in Cambodia called Welcome to the Rape Camp. The article reported that “Sandler” had been sent to Metropolitan Hospital Center after his most recent episode, but he was neither arrested nor committed. Days later, he was back in costume, posing with children for money. “Obviously, they saw I was not a threat to myself or anybody,” he said, adding that doctors had described him as “a little paranoid.”

Far more frightening than episodes like that is the violence that a small percentage of the severely mentally ill inflict on society. In recent years, untreated mentally ill people have committed many of America’s mass homicides. The list includes Seung-Hui Cho, who in 2007 killed 32 and injured 24 at Virginia Tech; Jared Lee Loughner, a diagnosed schizophrenic who in 2011 killed six people and injured 14, including U.S. Representative Gabrielle Giffords; and possibly James Holmes, who is currently awaiting trial for opening fire in 2012 on a crowded movie theater in Aurora, Colorado, killing 12 and injuring at least 58. It may include, too, Adam Lanza, who last December killed 26 people at Sandy Hook Elementary School in Newtown, Connecticut. While Lanza’s condition at the time of the shootings remains a mystery, it has already been determined that he had a history of mental disorder.

Psychotic breakdowns on a smaller scale pose an even greater public concern. A 2008 study in Indiana found that 10 percent of inmates imprisoned for homicide had been diagnosed with severe mental illness, a number consistent with similar studies in Europe. Or consider the New Yorker’s recurring nightmare: being pushed under an oncoming subway train. In 1999, Andrew Goldstein, a schizophrenic who had stopped taking his medications, shoved Kendra Webdale to her death beneath a train in New York City. The incident led to the creation of Kendra’s Law, which gave New York courts the modest power to compel the mentally ill to accept treatment as a condition of living in society. (Kendra’s Law was nonetheless opposed by the ACLU.) This past December, a homeless drifter named Naeem Davis was seen exhibiting erratic behavior on a subway platform before allegedly shoving Ki-Suck Han onto the tracks and killing him. Later that month, Sunando Sen was killed the same way. Erika Menendez—a woman with “a history of psychiatric problems,” according to the Daily News—confessed to the crime.

These incidents suggest a correlation between deinstitutionalization and violent crime, a relationship that Torrey and others confirm. According to a report issued by the Treatment Advocacy Center (TAC), a think tank founded by Torrey, several studies have shown that “having fewer public psychiatric beds was statistically associated with increased rates of homicide.” Christopher Ferguson, an associate professor of psychology and criminal justice at Texas A&M, has written in Time that the rise of mass homicide “began in the late 1960s and coincided with the deinstitutionalization movement, when mental asylums were closed down and services diminished.”

The connection between mental illness and crime would come as no surprise to law enforcement professionals. Since deinstitutionalization, police and sheriffs’ departments have reported an overwhelming increase in mental illness–related calls, a trend that continues today. A 2011 survey of 2,400 law enforcement officials reported that responding to these calls had become “a major consumer of law enforcement resources nationally.” A TAC study in 2010 found that there were now “three times more seriously mentally ill persons in jails and prisons than in hospitals.” Many county sheriffs’ associations estimate that over a quarter of their jail population is mentally ill. The Los Angeles County Jail has become the largest de facto inpatient psychiatric facility in the United States, says Torrey; New York’s Rikers Island Prison Complex is the second-largest.

Though the proponents of deinstitutionalization claimed that it would save money, even that claim hasn’t stood the test of time. Yes, expensive institutional beds have been eliminated. But weigh those savings against the costs that must be borne by other facilities, such as emergency rooms, prisons, jails, and nursing homes. “Untreated mentally ill individuals revolve endlessly through hospitals, courts, jails, social services, group homes, the streets and back again,” reports TAC. “It is a spectacularly inefficient and costly system, perhaps best symbolized by ‘Million Dollar Murray,’ a mentally ill homeless man who cost Nevada more than $1 million, mostly in emergency department costs, as he rotated through the system for 10 years.” Consider, too, the dollar burden that the mentally ill have piled on law enforcement agencies.

But even more important is the human cost of preventing sick people from receiving proper treatment. The current legal barrier to commitment “is not just unfeeling, it is uncivilized,” writes the columnist Charles Krauthammer, a former chief resident in psychiatry at Massachusetts General Hospital. “The standard should not be dangerousness but helplessness. Society has an obligation to save people from degradation, not just death.”

While the backers of deinstitutionalization recognize these problems, they have largely doubled down on their own solution, calling for even more funding of the poorly managed local facilities that replaced the asylums. But recently, a few psychiatrists and other members of the mental health profession have joined urbanists and law enforcement officials in questioning the wisdom of deinstitutionalization. Last April, H. Steven Moffic, a tenured professor of psychiatry at the Medical College of Wisconsin, wrote an article called “Is It Time for Re-institutionalization?” in the Psychiatric Times. “Have we gone too far in making it difficult to hospitalize someone,” he asked, “and are our hospitalizations generally too short anyways to help clarify diagnosis and carefully make any medication adjustments?” His answer: Yes. Moffic went on to praise recent expansions at a handful of psychiatric hospitals in Massachusetts and Vermont. The tide, he said, was turning back toward the institutions.

Moffic’s article provoked a flurry of online responses, showing that after decades of condemnation, institutionalization is becoming suitable for discussion again within the mental health community. “I do think something needs to change,” wrote Rebecca Trewyn, a psychiatric nurse in Wisconsin. “Most of the patients I am currently seeing in private practice . . . will soon end up in prison because we can’t treat them properly in the 15 minutes we have.” Frank Miller, who works near an original Kirkbride building, wrote to say how “many many things went wrong” as “the state hospitals were downsized” and as patients were transferred to private mental health providers. Like Moffic, Miller happily reports that a new state hospital building is under way at his institution “to provide relief to the private hospital ERs and jails where the chronically mentally ill are ‘parked.’ ” David Bell, who began working in an Australian mental asylum in 1956, lamented how in his country, “we have committed all the same reforms in the name of de-institutionalization, closed the asylums and opened many new jails, walked past the homeless lying on benches or surrounded by their bags in doorways and reluctantly poured increasing funds into ‘community mental health.’ Like Dr. Moffic I do not look back with horror at my work in the institution, but with some fondness.” Bell ended: “What is to become of the mentally ill and retarded? Give them asylum for as long as they need it.”

Sixty years ago, Jo Garfield took asylum exactly that way. Battling a severe eating disorder and underlying manic-depressive symptoms, Garfield was anorexic at 16, grew to 225 pounds a year later, and became addicted to the prescription drug Dexedrine. At college in Wisconsin, after she stole a prescription pad and filled her scripts at drugstores, her mental state deteriorated further.

Her parents sent her to Chestnut Lodge, a small private mental institution in Rockville, Maryland. When she went, she tells me, “they made sure I didn’t have meds, but I had concealed them in my sneaker.” Yet Garfield grew to like the institutional setting, and her condition began to improve. She willingly extended her initial treatment of a few months to a few years. She was surrounded by patients with far worse disorders: “For the first time, I felt like I was at least as healthy or more so than the people around me.”

Garfield spent two and a half years at Chestnut Lodge. “The whole idea was interpersonal relationships,” she recalls. “You learn to cope with life and people by interacting with patients and nurses and doctors. They had arts and crafts. A newspaper. On the floor I started out was the less seriously disturbed, but it was a locked ward. Then once they decided I could be trusted not to get pills, they moved me to this smaller thing called Little Lodge. There was a lot of bridge playing. I got involved in this small store.” When she finally left, she discovered, “I had done some behavior modification. Before that I couldn’t control eating. Now I had changed my habits. I had retrained myself.” Examples like Garfield’s demonstrate how institutionalization can offer a path to recovery. In 1986, she wrote about her experiences in a book called The Life of a Real Girl: A True Story. Today, she’s a respected writer and a prominent collector of modern art.

A close friend of mine, born into the deinstitutionalized era, wasn’t so fortunate. Mel, as I’ll call him, was described by his parents as brilliant from an early age. He read at age two, asked to study cello at four, and entered a preeminent music school at 12. Mel was as good at math and English as at music. He was also one of the wittiest people I’ve known. Yet soon after he enrolled in college, his eccentricities became more pronounced. He took time off and traveled to the far reaches of Asia—where, he told me, maps didn’t yet exist. Sometime after his return, I learned that he had been diagnosed with schizophrenia. He drifted among flophouses in Trenton, New Jersey. By phone, we talked about his difficulty taking antipsychotic medication; he felt much better without it, he said. Soon after the terrorist attacks of September 11, 2001, Mel left a message on my answering machine to say hello—one in a series of calls that he made to friends, as I later learned. Two days later, he jumped to his death from a New York high-rise. He was 25.

It’s impossible to be certain that Mel’s fate would have been different without deinstitutionalization. But it’s certainly clear that deinstitutionalization has made life worse for those whose illness prevents them from independently following their own best course of treatment. And it’s a troubling thought that the lifetime suicide rate, about 1 percent for the general population, is over 10 percent for schizophrenics.

Even as more critics come forward, deinstitutionalization continues. In 1955, nearly 600,000 severely mentally ill patients were in the care of state psychiatric hospitals. By 2010, only 43,000 state psychiatric beds remained available for the mentally ill. That equals about 14 psychiatric beds per 100,000 Americans, far fewer than the minimum of 50 recommended by TAC and nearly identical to the per-capita number that existed in 1850, when the institutional movement first began.

A century and a half ago, the need for a moral treatment of the mentally ill led to institutions that offered the most advanced care of the day. The fiscal and legal barriers to repeating that achievement may seem insurmountable. But undoing 50 years of bad policy is easy compared with what today’s mentally ill must endure.


Fred Dicker Live: Hydrofracturing

 

James writes:

This morning I talked with Fred Dicker, dean of the Albany press corps, on AM 1300 about hydrofracturing in Pennsylvania and New York and speculated on the reasons for its opposition downstate. I am at work on a story about hydrofracturing for an upcoming issue of City Journal.

Be sure to tune in also for some nice words about The New Criterion and Hilton Kramer. The broadcast is below.


James Panero touring a drill rig in Susquehanna, Pennsylvania, the first step in natural gas extraction by means of hydrofracturing. Similar reserves exist twenty miles north across the New York state line, but Albany has so far blocked such gas development.


The Culture of the Copy

THE NEW CRITERION
January 2013

The Culture of the Copy
by James Panero

On the printing press, the Internet & the impact of duplication.

 

We now live in the early part of an age for which the meaning of print culture is becoming as alien as the meaning of manuscript culture was to the eighteenth century. “We are the primitives of a new culture,” said Boccioni the sculptor in 1911. Far from wishing to belittle the Gutenberg mechanical culture, it seems to me that we must now work very hard to retain its achieved values.
—Marshall McLuhan, The Gutenberg Galaxy, 1962

Technological revolutions are far less obvious than political revolutions to the generations that live through them. This is true even as new tools, for better and worse, shift human history more than new regimes do. Innovations offer silent coups. We rarely appreciate the changes they bring until they are brought. Whether or not we become the primitives of a new culture, as the Futurist Umberto Boccioni observed, most of us still live behind the times and are content to do so. We expect the machines of the present to fulfill the needs of the past even as they deliver us into a future of unknowns.

World-changing inventions almost always create new roles rather than fill old ones. “It’s a great invention, but who would ever want to use one?” was the classic response to the telephone, variously attributed to Ulysses S. Grant or Rutherford B. Hayes but probably said by neither of them. Life-altering technologies often start as minor curiosities and evolve into major necessities with little reflection on how they reform our perceptions or even how they came to be.

In the eighteenth century, Edmund Burke could see the significance of the French Revolution while observing its developments in real time. Yet “in the sixteenth century men had no clue to the nature and effects of the printed word,” writes Marshall McLuhan in The Gutenberg Galaxy, his 1962 book on the printing revolution and the dawning of the electronic age. It wasn’t until nearly 200 years on that Francis Bacon located the printing press alongside gunpowder and the compass as changing “the whole face and state of things throughout the world.” Writing in his 1620 book Novum Organum (“New Instrument”), Bacon maintained that “no empire, no sect, no star seems to have exerted greater power and influence in human affairs than these mechanical discoveries.” In the nineteenth century, Victor Hugo called the invention of printing the “greatest event in history” and the “mother of revolution.” Political revolution began in this technological upheaval.

An argument can be made, and so I will make it here, that the invention of the Internet is the under-recognized revolution of our time. The world-changing technology of the Internet, of course, is already apparent and barely needs retelling. The Internet is more significant than the telephone, the television, the transistor, or the personal computer because it subsumes all these prior inventions into a new accumulation that is greater than the sum of its parts. As the network of networks—the “inter-network”—the Internet is a revolution of revolutions.

Yet while we appreciate the Internet’s technological wonders, the cultural landscape it leads to is less explored. We acknowledge the Internet’s effect on information but are less considering of its influence on us. Even as we use its resources, most of us have no understanding of its mechanics or any notion of the ideas, powers, and people that led to its creation.

One way to situate the Internet is to see it as inaugurating the next stage of copy culture—the way we duplicate, spread, and store information—and to compare it to the print era we are leaving behind. New technologies in their early development often mimic the obsolete systems they are replacing, and the Internet has been no different. Terms like “ebook” and “online publishing” offer up approximations of print technology while revealing little of the new technology’s intrinsic nature.

Just as the written word changed the spoken word and the printed word changed the written word, so too will the digital word change the printed word, supplementing but not replacing the earlier forms of information technology. Speaking and writing both survived the print revolution, and print will survive the Internet revolution. The difference is that the Internet, with its ability to duplicate and transmit information to an infinite number of destinations, will increasingly influence the culture of the copy.

“What the world is today, good and bad, it owes to Gutenberg,” wrote Mark Twain. “Everything can be traced to this source, but we are bound to bring him homage, . . . for the bad that his colossal invention has brought about is overshadowed a thousand times by the good with which mankind has been favored.”

The Gutenberg revolution occurred around 1440 in near obscurity. The life of Johannes Gutenberg, the German metalsmith from Mainz, is largely unknown. The exact nature of the invention that he first unveiled in Strasbourg remains a source of debate. Even as the technology of book printing spread through Germany and Italy, Gutenberg died a financial failure. His recognition as the inventor of typography only came at the start of the sixteenth century, over three decades after his death.

Gutenberg did not invent every component that gave rise to the printed page. His innovation, as commonly understood, was to put existing technologies together in a press that used oil ink and movable type to stamp Roman letters arranged in rows onto a page. Gutenberg’s expertise in metalwork helped him develop a metal alloy for the letter punches that could withstand the pressures of the printing process. He also devised a simple hand mold to recast the punches. This not only led to the rise of a book’s standardized font but also enabled the reproduction of the printing machine itself.

The rapid development of print culture in Europe occurred across two trajectories at once. Each printing press could produce hundreds and soon thousands of pages a day, just as the printing machines themselves could be duplicated. In the 1450s, the greatest early demonstration of the new technology was the production of the Gutenberg Bible. Copies of Gutenberg’s rare original Bibles are today considered among not only our most valuable printed books but also the most beautiful. Thirty years after this initial run—a start-up operation that landed Gutenberg in court with his disgruntled investors—there were 110 printing presses in operation across Europe, with fifty in Venice alone. By 1500, European presses had already produced over twenty million books. A century after that, the number was 150 to 200 million copies. The printing press made bestsellers out of writers in their own lifetimes. Erasmus sold a remarkable 750,000 copies. Luther distributed 300,000 printed tracts.

The rise of print culture had countless benefits, but it also overturned many of the achievements of the manuscript culture it replaced. The great proliferation of printed books meant that scholars no longer had to seek out rare copies of written material, but literature also lost the protection of a scholarly class, and the culture of scholasticism went into decline. Even as the sixteenth century saw a renewed interest in ancient writing, thanks to the wide reproduction of classical works, Latin lost ground as the lingua franca of high culture. Increasingly literate middle-class readers, unfamiliar with Latin, sought out books in their own vernaculars, which helped give rise to new national identities. As reading became a silent activity to be accomplished alone, the printed book challenged the oral tradition. Likewise, grammar and syntax were regularized to illuminate sense rather than stress.

The printed page made literature individual. Before the printing press, the medieval student would not have regarded the “contents of the books he read as an expression of another man’s personality and opinion,” writes E. P. Goldschmidt. “He looked upon them as part of that great and total body of knowledge, the scientia de omni scibili, which had once been the property of the ancient sages.” The manuscript era belonged to the scribe—the one writing out the manuscripts. The print era belonged to the author, because a book could now be set just as the author intended. The printed book, in fact, distinguished finished and completed work from drafts and papers in a way that exclusively written technology could not. Printed matter powered ideas with new range and fixity.

The development of the Internet was a more collaborative process than the invention of the printing press, but the two events share many similarities, including an initial disregard for the figures who made them possible. Leading up to his 2000 Presidential run, Al Gore said in an interview that he “took the initiative in creating the Internet,” based on his sponsorship of legislation that removed commercial restrictions on Internet traffic. This statement was widely recast into a claim that Gore had “invented” the Internet. What this absurdity revealed was that, even if a politician had not invented the Internet, almost no one knew who did. This fact remains true even as the Internet continues to expand through the products of widely celebrated but ultimately less significant industrialists and developers.

Starting in 1960, the American computer scientist J. C. R. Licklider was among the first to speculate on the potential for close “man-computer symbiosis” and the possibility of an “intergalactic” network to foster “on-line man-computer communication.” Marshall McLuhan likewise observed the dawning of a new electronic age that is “not mechanical but organic” and would be navigated by “surf-boarding” from one point to the next.

At the same time, the U.S. Department of Defense was identifying a far more practical need for networked computer communication. In the event of a nuclear strike, traditional circuit-based communication systems that required fixed lines would be rendered inoperable. The Advanced Research Projects Agency (ARPA) of the U.S. Department of Defense therefore set about developing the technology for a networked information infrastructure with multiple communications pathways, one that could be robust enough for “survivable communications” to be maintained even if portions of the network were destroyed. The computer network that resulted was called ARPANET, the progenitor and first networked component of the Internet that was switched on in 1969.
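The “survivable communications” requirement can be illustrated with a toy sketch: model the network as a graph with redundant links, destroy a node, and check whether the survivors can still reach one another. The topology below is invented for illustration and is not ARPANET’s actual map.

```python
from collections import deque

def reachable(adj, start):
    """Breadth-first search: return the set of nodes reachable from start."""
    seen = {start}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in adj.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

def without(adj, dead):
    """Copy the network with one node (and all links to it) destroyed."""
    return {n: [m for m in links if m != dead]
            for n, links in adj.items() if n != dead}

# A hypothetical mesh with redundant paths between A and E.
mesh = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C", "E"],
    "E": ["D"],
}

# Destroying relay B still leaves a route from A to E, via C and D.
print("E" in reachable(without(mesh, "B"), "A"))  # True
# D, by contrast, is a single point of failure for E -- exactly the
# vulnerability a multi-path design is meant to avoid.
print("E" in reachable(without(mesh, "D"), "A"))  # False
```

A circuit-based system corresponds to a graph with exactly one path between endpoints; the redundancy above is what the multiple-pathway design buys.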

The key development that made networked routing possible was what became known as packet switching. The notion of sending information in distinct bursts that could be routed and rerouted through a networked system was conceived independently in the early 1960s by Paul Baran at the RAND Corporation and Donald Davies at the National Physical Laboratory in England. Packet switching was soon implemented by Lawrence Roberts, the project manager at ARPA’s Information Processing Techniques Office, drawing on mathematical models developed by Leonard Kleinrock, first at MIT and later at UCLA.
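The idea can be sketched in a few lines of Python (a toy illustration, not any historical implementation): a message is cut into numbered packets, the packets are scrambled as if each had taken a different route through the network, and the receiver uses the headers to reassemble the original.

```python
import random

def to_packets(message, size=8):
    """Split a message into packets, each carrying a small header
    (sequence number and total count) alongside its payload."""
    chunks = [message[i:i + size] for i in range(0, len(message), size)]
    return [{"seq": n, "total": len(chunks), "payload": chunk}
            for n, chunk in enumerate(chunks)]

def reassemble(packets):
    """Reorder packets by sequence number and rebuild the message."""
    ordered = sorted(packets, key=lambda p: p["seq"])
    assert len(ordered) == ordered[0]["total"], "packet lost in transit"
    return "".join(p["payload"] for p in ordered)

packets = to_packets("distinct bursts routed through a networked system")
random.shuffle(packets)  # packets may arrive out of order on different routes
print(reassemble(packets))  # the original message, intact
```

Because each packet is self-describing, no single fixed circuit has to survive for the message to get through.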

Few of the Internet’s founders, mostly academic computer scientists, have become rich or even well known for their early inventions. In 2011, the death of Apple co-founder Steve Jobs received worldwide recognition, but few noted the passing of Baran, a father of the Internet whose contribution to history will ultimately be more consequential than the development of the iPhone.

The history of the Internet is not a “story of a few heroic inventors,” writes Janet Abbate in her book Inventing the Internet. “It is a tale of collaboration and conflict among a remarkable variety of players.” Yet if any one invention could be considered the Internet’s Gutenberg moment, it was the development of the Transmission Control Protocol and Internet Protocol, together known as TCP/IP.

In the early 1970s, Vint Cerf and Bob Kahn came together at ARPA to solve the problem of inter-network communication. The question was how to create packets of information that could be sent not only over the single ARPANET network but also from network to network, regardless of where they came from, where they were going, or what they passed through. “Some were larger, some went faster, some had packets that got lost, some didn’t,” says Cerf of this variable landscape. “So the question is how can you make all the computers on each of those various networks think they are part of one common network despite all these variations and diversity.”

TCP/IP was their answer, a dual protocol in which IP deals with addressing and forwarding and TCP contends with flow control and error correction. Together TCP/IP became the backbone for the inter-networked communication upon which today’s Internet expands and communicates. More than a mere technical innovation, TCP/IP, like the printing press, is where several technologies came together in one revolutionary development. “It made possible Wi-Fi, Ethernet, LANs, the World Wide Web, e-mail, FTP, 3G/4G,” in the words of Wired magazine, “as well as all of the inventions built upon those inventions.”
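That division of labor can be glimpsed from the application’s side with Python’s standard socket API. This is a loopback sketch on one machine, not a real internetwork: the IP layer supplies the addressing (here the loopback address 127.0.0.1), while TCP presents whatever packets travel underneath as a single reliable byte stream.

```python
import socket
import threading

HOST = "127.0.0.1"  # IP's job: addressing (loopback, i.e., this machine)

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)  # SOCK_STREAM = TCP
server.bind((HOST, 0))  # port 0 lets the OS pick any free port
server.listen(1)
addr = server.getsockname()

def echo_upper_once():
    # TCP's job: the application sees one reliable, ordered sequence of
    # bytes, however the packets underneath actually traveled.
    conn, _ = server.accept()
    data = conn.recv(1024)
    conn.sendall(data.upper())
    conn.close()

threading.Thread(target=echo_upper_once, daemon=True).start()

client = socket.create_connection(addr)
client.sendall(b"a supercomputer is treated as equal to a laptop")
reply = b""
while True:  # read until the server closes the connection
    chunk = client.recv(1024)
    if not chunk:
        break
    reply += chunk
client.close()
server.close()
print(reply.decode())
```

Neither endpoint here knows or cares how the bytes were packetized and routed; that indifference is precisely what let TCP/IP span networks “despite all these variations and diversity.”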

This technology allowed the Internet to become infinitely modular and adaptable. Because no one entity controlled its pathways, TCP/IP left the Internet open and provided the world’s greatest conveyance of unregulated information. “The design of the internet architecture is captured in the TCP and IP protocols,” says Cerf, who is now the “Chief Internet Evangelist” for the Google corporation. “It confers equality on all interlocutors on the network (a supercomputer is treated as equal to a laptop from the protocol point of view).”

In 2005, Cerf and Kahn received the Presidential Medal of Freedom for the invention of TCP/IP. “The Internet is proving to be one of the most powerful amplifiers of speech ever invented,” Cerf has written about the technology he shaped. “It offers a global megaphone for voices that might otherwise be heard only feebly, if at all. It invites and facilitates multiple points of view and dialog in ways unimplementable by the traditional, one-way, mass media.”

The story of the Cold War began with the nuclear bomb and ended with the Internet. Both were military developments. Yet unlike the unfulfilled promise of peacetime nuclear energy, the Internet has quickly evolved from a tactical weapon to a strategic instrument of world-wide importance. By promoting the spread of democratic ideas across unregulated networks, the Internet is proving to be an even more effective weapon against totalitarianism than nuclear deterrence. This cultural potential is the reason “we must dedicate ourselves to keeping the network unrestricted, unfettered and unregulated,” argues Cerf, who has campaigned against giving the keys to the Internet over to foreign powers. “We must have the freedom to speak and the freedom to hear.”

The peacetime dividends of the Internet pay out as each new invention and each new network tie into it. In 1990, Tim Berners-Lee invented the World Wide Web, the public face of the Internet, and made the first successful communication between a server and a client running his research team’s new Hypertext Transfer Protocol (HTTP). Social media programs like Facebook and Twitter, search algorithms like Google, weblog interfaces like Blogger, e-commerce services like PayPal, Voice over Internet Protocol (VoIP) services like Skype, and video-streaming sites like YouTube all emerged thanks to HTTP, which in turn operates through TCP/IP.
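The server-and-client exchange that HTTP defines on top of TCP/IP can itself be sketched with Python’s standard library; the page served below is invented for illustration, and the sketch runs entirely over loopback.

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    # Answer every GET request with a small hypertext document.
    def do_GET(self):
        page = b"<p>Hello from a tiny HTTP server.</p>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(page)))
        self.end_headers()
        self.wfile.write(page)

    def log_message(self, *args):  # silence per-request logging
        pass

# Port 0 asks the OS for any free port; HTTP rides on TCP/IP underneath.
server = HTTPServer(("127.0.0.1", 0), HelloHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/"
with urllib.request.urlopen(url) as response:
    status, body = response.status, response.read().decode()
print(status, body)
server.shutdown()
```

Every service named above elaborates this same request-and-response pattern, however much machinery sits behind the server.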

With each new stage of copy culture, the ease of duplication is countered by the increasingly complex technology required to produce and use the copies it creates. Just as Twain wrote that the bad of the printing press was “overshadowed a thousand times by the good,” the Internet age presents its own problems even as it solves countless others.

Through inks, pens, archival writing surfaces, and the required literacy of both the writer and the reader, manuscript culture replaced the simplicity of oral culture. Print culture turned the reproduction of the word into an even more specialized field, yet the information in printed books could still be accessed by any literate person with nothing more than the light to read by.

Not so for digital information. While the Internet has leveled the relationship between producer and consumer—publication is no more difficult than acquisition—both tasks now employ a host of technologies to support them. Access to Internet-based information requires personal computer interfaces, routers, digital storage devices, broadband connections, and electricity. If any one of these technologies fails, the Internet becomes useless. An old book can be as readable as the day it was printed, but digital media from a mere decade ago can become unusable, with unreadable formats and corrupted data.

To be sure, the Internet has given us access to literature as never before. To lament the decline of printed text as a more rarefied medium in a digital age mimics the complaints of those Renaissance elites who favored manuscripts and turned their noses up at middle-class print culture. As digital interfaces improve, it becomes harder to argue that print is an altogether preferable medium to Internet-based text. Printed books are unclean technologies. They are heavy and flammable. The paperback edition was certainly a great invention of early twentieth-century printing. Yet is the tiny text of a paperback, sandwiched between flimsy covers, preferable to a book downloaded for free from an initiative like Project Gutenberg (the oldest digital library of out-of-copyright books, founded by Michael Hart in 1971) and read on a digital tablet or mobile phone?

The rise of the Internet will no more destroy literature than did the invention of the printing press. On the contrary, the Internet’s new copy culture will almost certainly increase literacy and the spread of democratic ideals, furthering the legacy of the printing press. Currently 2.4 billion people worldwide use the Internet across an estimated 2 to 3 billion connected devices. Seventy-eight percent of the U.S. population is now connected, which is not surprising considering the U.S. origins of the Internet, but half a billion people in mainland China are connected as well, albeit with government interference (known as “The Great Firewall of China”). Just as printed books gave rise to new libraries, and new libraries became new universities, the Internet also has the power to transform education and end-run corrupt institutions by delivering the library, the newspaper, and the classroom to any corner of the world.

Yet a great challenge still exists in the way the Internet records and stores information. A published book is a fixed and polished record of a moment in time. The Internet always operates in the present. Aside from web portals like the “Wayback Machine,” which can provide historical snapshots of webpages, the Internet has no past. With “time stamps” and footnoted “corrections,” web culture has attempted to import the rules of fixed publication, but the Internet still treats all information the same. Any information on the Internet can be updated, changed, or erased at any moment. On the plus side, the mutable quality of Internet-based information has permitted the rise of great user-maintained databases such as Wikipedia. In this way the Internet mimics scribal culture more than print culture: New readers add new insights, and the information the Internet contains is forever evolving.

On the downside, Internet-based information is infinitely more fugitive than printed matter. In order to eliminate the information in a book, each copy must be rounded up and destroyed. For Internet-based information to go down, only the data hosts need be eliminated. Unlike letters sent in the mail, emails are often poorly archived, challenging our ability to preserve important correspondence. As more and more data enters what is known as the Internet cloud and no longer sits on personal storage devices, a centralized loss could be catastrophic.

Such concerns are among the reasons why we should not rejoice in the demise of print culture even as we embrace the Internet’s possibilities. Online resources like Google Art Project reveal the Internet’s great cultural potential by giving us access to visual artifacts as never before. Google’s high-resolution scan of The Harvesters by Pieter Bruegel reveals nuances in the painting that even a close physical inspection cannot show. Still, few would claim that the Metropolitan Museum should move this painting to storage. Likewise with the integration of digital maps, satellite imagery, 3-D rendering, and Street View photography, which can provide an unparalleled overview of a landscape: no one would pretend an Internet-based map is a substitute for visiting a real place and time.

So, too, for printed matter. In the coming years, the ease of duplication, storage, and transmission will tempt institutions to economize their use of printed books. In “Reproductions for the Plebes,” an essay published in these pages in June 1984, Hilton Kramer warned about a proposal put forward by Edward C. Banfield, a professor at Harvard University, for museums to sell off their original works and replace them with passable facsimiles. Calling it “ghastly,” Kramer attacked Banfield’s idea as “antidemocratic as well as anti-aesthetic.”

A version of Banfield’s theoretical notion is now a real idea, called the “Central Library Plan,” being put forward by the New York Public Library. At stake is the future of the library’s 42nd Street headquarters, completed in 1911 by the firm Carrère and Hastings and recently renamed the “Stephen A. Schwarzman Building” in exchange for a $100 million donation from the private equity investor. As one of the world’s flagship institutions, the NYPL is moving in a direction that is sure to influence all of library culture.

Claiming to appeal to a democratic mandate, the library will seek to “open this iconic building to millions more users—scholars, students, families, job seekers and more.” This will be done by removing much of its non-circulating book collection to a storage location in New Jersey, demolishing the seven floors of stacks that support its main Reading Room, and replacing this area of the building with social space and computer terminals. At the same time, two significant branch libraries nearby will be closed and integrated into the main building’s newly hollowed-out core.

The New York Public Library has a history of undertaking egregious action in order to supplement its bottom line. In 2005, the library sold a painting that was a significant part of its patrimony, Kindred Spirits by Asher B. Durand (1849), for a $35-million payout. From the top down, the NYPL behaves like a government bureaucracy, not the guardian of one of the most important archives in the world.

It is therefore hard to fathom the true motivations behind the Central Library Plan. Talk of access and progressive thinking may just be cover for mere cost savings and the freeing up of valuable real estate for its developer board members to pursue. Yet supposing the library has the public’s best interests at heart, such a move still demonstrates little understanding of the power of print or how libraries should transition into the Internet age.

The digitization of books, a great undertaking, argues against new public space, as Internet-based research can be done from anyplace with a connection. As the culture of the copy shifts away from print media, the preservation and accessibility of printed artifacts becomes an even more vital and pressing concern, just as the rise of print culture did not make ancient manuscripts any less important. The New York Public Library is already the most democratic of institutions, because anyone can access its singular print resources regardless of academic accreditation. Although pursued in the name of democracy, to make these resources any less accessible would be “antidemocratic as well as anti-aesthetic,” as Kramer rightly labeled the Banfield proposal years ago.

Libraries should lead the charge in advancing the values of print culture, just as they need to consider the ways that Internet-based information should be archived and preserved. Print media fills in for the vast limitations of Internet media, serving as its ultimate backup and giving fixity to information. As we drive technology forward, an equally important task is to preserve the best of what’s left behind. We are living in the Internet’s revolutionary generation. The decisions we make now will affect culture for many years to come.
