Lobotomy’s back – controversial procedure is making a comeback
Frank T. Vertosick
In 1949 lobotomy was hailed as a medical miracle.
But images of zombielike patients and surgeons with ice picks soon put an end to the practice.
Now, however, the practitioners have refined their tools.
Last year a team of Harvard investigators headed by neurosurgeon G. Rees Cosgrove published a technical report bearing the ponderous title "Magnetic Resonance Image-Guided Stereotactic Cingulotomy for Intractable Psychiatric Disease." Although steeped in medical jargon, the report's central thesis — that psychiatric diseases can be treated by the selective destruction of healthy brain tissue — dates back to a much earlier, less sophisticated age, when the search for a surgical cure for mental illness spawned an entire medical specialty known as psychosurgery.
Psychosurgery enjoyed a brief period of global acceptance around the time of World War II but was quickly driven from the medical mainstream with the advent of better, nonsurgical methods of treating the mentally ill. Now, almost half a century after psychosurgery’s demise, the Harvard Medical School and a handful of other centers are hoping that new and improved surgical techniques can revive it. Today’s neurosurgeons are also trying to rename the field “psychiatric surgery,” presumably to avoid the Hitchcockian overtones of the older moniker. But, as rock star Prince discovered, shedding the name that made you famous isn’t easy.
In their 1996 paper, which appeared in the respected journal Neurosurgery, Cosgrove and his co-workers described a brain operation designed to relieve emotional distress and reduce abnormal behavior. Between 1991 and 1995, they performed cingulotomies — which means, essentially, that they burned dime-size holes in the frontal lobes of the brain — on 34 patients suffering from one of the following afflictions: severe depression; bipolar disorder, or manic-depression; obsessive-compulsive disorder (OCD); and generalized anxiety disorder. The target of their operations, the cingulate gyrus, is a thin ribbon of gray matter believed to play a role in human emotional states. The authors used a computer-guided technique known as stereotaxis to advance an electrode into the cingulate gyrus, then cooked the tissue with electric current.
Cingulotomy produced major clinical improvement, as judged by psychiatrists, in a little over a third of the patients; another quarter of them had a "possible response" to surgery. Not stellar results, to be sure, but the Harvard patients all had severe disease that had proved resistant to all other available therapies. Any good outcomes in this population might be significant, and the investigators believed that their results were good enough to warrant a larger trial of cingulotomy.
Despite its high-tech approach, however, the Harvard paper still looks anachronistic, to say the least. Finding a paper extolling the virtues of psychosurgery in today’s medical literature is rather like finding one advocating blood-letting. Modern neurosurgeons destroying normal brain to treat mental illness? To borrow from Samuel Johnson, this is akin to a dog walking on its hind legs — the question is not how well the act can be done but why it’s even attempted.
In spite of its elevated reputation, neurosurgery is a crude business, even — or especially — to a neurosurgeon, and I’ve been in practice for ten years. When confronted with an exposed brain at the operating table, I feel as if I’m about to repair a computer with a chain saw. The living brain has a surreal fragility; its porcelain surface is laced with delicate arteries that begin as thick cords but quickly branch into finer and finer threads. Looking at the surface of the brain is like looking at a satellite photo of a large city — one immediately senses a function far more complex than what is visible.
The idea that a sophisticated derangement in brain function, like OCD, can be cured by frying holes in the frontal lobe looks as patently absurd as recovering a lost file from a floppy disk by burning it with a curling iron. But experience suggests that such lesions can work, if they are done correctly and on the right patients.
Psychosurgery got its start back in 1890, when the Swiss psychiatrist and surgeon Gottlieb Burckhardt tried removing portions of the cerebral cortex from schizophrenic brains. His victims, previously agitated and tormented by violent hallucinations, became more "peaceful" after the operation. Burckhardt's operation didn't impress his colleagues, though, and an angry outcry from the European medical community prevented its further use.
Psychosurgery surfaced again with a vengeance in Portugal, during the mid-1930s; shortly thereafter, neurologist Walter Freeman enthusiastically imported it to the United States. Psychiatrists started to believe Freeman’s proselytizing hype, and desperate families of the mentally ill began seeking surgery for their loved ones. During World War II the United States saw an increased demand for mental health care as thousands of combat-fatigued veterans crowded already overburdened hospitals. In this setting, psychosurgery became established as a standard therapy. Over the 20-odd years that psychosurgery held the attention of the medical mainstream, perhaps as many as 35,000 patients underwent psychiatric operations of one form or another.
But as Burckhardt had discovered decades earlier, the medical community could not long ignore the ethical quagmire surrounding psychiatric brain operations. In the 1950s the rising use of psychosurgery ignited a national debate over the morality of inflicting irreversible brain injuries on the most emotionally vulnerable patients. While this debate smoldered among academics right up to the 1970s, the introduction of the tranquilizer chlorpromazine in 1954 rendered many of the concerns about psychosurgery moot.
Armed with effective chemical therapies, psychiatrists soon turned to pills instead of the knife and quit referring their patients for surgery. A few centers continued to use modified forms of psychosurgery on very small numbers of patients, both here and in Europe, well into the 1980s, so psychosurgery as a specialty never died — although psychosurgery as an industry did.
Should psychosurgery be brought back from the realm of the experimental and made a mainstream treatment once again? Should we reopen this ethical can of worms? As Cosgrove’s report shows, there are those who think we should. Hundreds of severely incapacitated people fail all other treatments, including drugs, electroshock, and psychotherapy, leaving surgery their only option. The illness most helped by cingulotomy — major depression — can be life-threatening. If psychosurgery works, shouldn’t it be used?
The successful resurrection of extinct brain operations has a recent precedent: pallidotomy for parkinsonism. In this procedure, parts of the globus pallidus, a clump of tissue in the core of the brain controlling limb coordination, are surgically destroyed. The operation is technically similar to cingulotomy, and in the past few years it has enjoyed a renaissance. Before the discovery of L-dopa — a chemical substitute for the brain chemical dopamine — surgeons carried out pallidotomies and a number of other destructive procedures to ease the tremor and rigidity of Parkinson’s disease. After the introduction of L-dopa, the role of the surgeon in the treatment of Parkinson’s lessened, and the operations soon fell into relative disuse.
While L-dopa did revolutionize the treatment of Parkinson's, the drug proved ineffective in a small number of patients. Still others responded to medical therapy only to become resistant to it months or years later. As neurologists accumulated more experience with drug treatments for Parkinson's, they realized that medical therapy alone could not keep the disease at bay. A growing demand for alternative treatments renewed interest in pallidotomy, and several medical centers began trying it again. Since today's image-guided pallidotomy can be done with far greater accuracy than was ever possible before, modern surgical results have been excellent, and pallidotomy is currently available nationwide.
But bringing back pallidotomy, an operation with no historical baggage, was a piece of cake. To achieve a similar comeback in their own field, modern neurosurgeons must overcome psychosurgery’s dark past — a considerably more difficult task.
Looking back today, we see psychosurgery as nothing short of a mental health holocaust perpetrated by mind-stealing hacks in the dimly lit clinics of public psychiatric hospitals. It will always be synonymous with the flagship operation of its heyday, the dreaded prefrontal lobotomy. In the conventional form of the operation, a neurosurgeon poked holes in the patient's skull just above and in front of the ear canals on both sides of the head and plunged a flat knife, called a leucotome, into the frontal lobes to a depth of about two inches. By sweeping the leucotome up and down within the brain, the surgeon amputated the anterior tips of the frontal lobes, the so-called prefrontal areas, from the rest of the brain. In contrast to the half-inch lesions of pallidotomy and cingulotomy, the lobotomist sliced an area of brain equal to the cross section of an orange.
This technique soon gave way to a quicker, albeit somewhat grislier, version of prefrontal lobe destruction. Before World War II, brain surgeons — not exactly a dime a dozen even today — were quite scarce; this lack of surgical expertise hindered the wider use of psychosurgery. To rid himself of the need for a surgeon, Freeman began tinkering with the transorbital approach invented by Amarro Fiamberti in Italy. (At this point, James Watts, Freeman’s surgical colleague in conventional lobotomies, ended their collaboration, saying the transorbital procedure was too risky.)
In Freeman's modification of the procedure, the lobotomist inserted an ice pick (yes, an ice pick) under the upper eyelid and drove it upward into the frontal lobe with a few sharp raps of a mallet. The pick was then twisted and jiggled about, thus scrambling the anterior frontal lobes. The ice-pick lobotomy could be done by anyone with a strong stomach, and, even better, it could be done anywhere. Freeman carried his ice pick in his pocket, using it on one occasion to perform a lobotomy in a motel room. A cheap outpatient procedure, the ice-pick lobotomy became a common psychosurgical choice in state hospitals across the country.
In the late 1950s lobotomy’s popularity waned, and no one has done a true lobotomy in this country since Freeman performed his last transorbital operation in 1967. (It ended in the patient’s death.) But the mythology surrounding lobotomies still permeates our culture. Just last year the operation surfaced on the television show Chicago Hope. Few of us have ever met a lobotomized patient, but we all know what to expect — or at least we think we do. Who can forget the vacant stare of the freshly knifed Jack Nicholson in One Flew Over the Cuckoo’s Nest? At best, according to the popular conception, the luckier victims recovered enough to wander about like incontinent zombies.
Although some patients ended up this way, or worse, the zombie stereotype derives more from Hollywood fiction than from medical reality. Lobotomy peaked in the 1950s, not during the Middle Ages. While we may have been a little more bioethically challenged back then, we weren’t Neanderthals either. Lobotomy could never have survived for 20 years if it yielded a lot of cretins. In fact, intelligence, in those cases where it was measured pre- and postoperatively by formal testing, remained unaffected by a competent lobotomy and in some cases it even improved.
Not surprisingly, the operation did have disturbing side effects. Patients often suffered major personality changes and became apathetic, prone to inappropriate social behavior, and infatuated with their own toilet habits. They told pointless jokes and exhibited poor hygiene. Postoperative deaths, although uncommon, occurred and could be gruesome. But all these problems must be put into the context of the era: in the 1940s brain surgery for any disease was very risky.
It's easy for us to forget that the media first hailed psychosurgery as a medical miracle. Lobotomy's reputation once ran so high that the Nobel committee awarded the prize in Physiology or Medicine to its inventor, the Portuguese neurologist Egas Moniz, in 1949. But less than a decade after this endorsement, lobotomy was dead and its memory vilified.
The operation’s descent into disgrace had many causes. For one thing, lobotomy never had a scientific basis. Moniz got the idea for it in a flash after hearing a presentation by Fulton and Jacobsen, two Yale physicians, during a 1935 neurological conference in London. The Americans described two chimpanzees, Becky and Lucy, that had become remarkably calm after frontal lobe ablation.
This single, almost casual observation prompted Moniz to return home and begin human trials immediately. Further animal work would not be useful, he argued, since no animal models of mental illness existed. Why he rejected the thought of further animal experimentation while still viewing Fulton and Jacobsen's tiny report as a virtual epiphany remains a mystery. Moniz, who had just endured a nasty priority fight concerning his invention of cerebral angiography, may have rushed into human trials in order to stake the earliest claim to lobotomy.
The association of the frontal lobes with emotional and intellectual dysfunction was hardly a radical idea, even in 1935. The frontal lobes of lower mammals are vanishingly tiny; even chimps and apes have fairly small ones. In humans, on the other hand, the frontal lobes make up nearly two-thirds of the cerebrum, or higher brain. Since mental illnesses are uniquely human afflictions, a therapeutic surgical assault on the frontal lobes seemed quite plausible.
Moniz subsequently created a fanciful theory of "abnormally stabilized pathways" in the brain to justify his operation. He reasoned that cutting brain fibers might interrupt the abnormal brain circuitry of psychiatric patients, freeing them from a cycle of endless rumination. Since then, no better rationale for lobotomy has been advanced. Nevertheless, a lack of scientific justification doesn't doom an operation as long as the operation works. Many good operations, pallidotomy included, can trace their origins to pseudoscience or serendipity. But was lobotomy ever a good operation? We've had half a century to study it, and we're still not sure.
Unfortunately, lobotomists showed no great talent for comprehensive, long-term analysis of their data. The esteemed Moniz often followed his patients for only a few weeks after their surgery. The peripatetic Freeman drove about the country doing hundreds of ice-pick procedures, but only near the end of his life did he find out how the majority of them fared. Even then, his assessments proved vague and unconvincing.
Only a single certain conclusion emerged from the dozens of lobotomy studies that have appeared over the years: schizophrenics don’t get better after surgery. This is ironic, given that they were the first to undergo psychosurgery. We now have an inkling as to why the treatment doesn’t work. Unlike depression and mania, which are disorders of mood, schizophrenia is a disorder of thought. And what a lobotomy alters is emotional state, not cognitive abilities.
Most lobotomists had vague and paternalistic ideas of what constituted a “good” result. Results were typically judged by psychiatrists, families, or institutional custodians; detailed surveys of what the patients thought rarely appear in the psychosurgery literature. This seems strange, since a cure, as judged by outsiders, may not be viewed that way by the patient. Is the patient, although inwardly miserable, cured because he no longer assaults the nursing staff, or because he can now sit quietly for hours without screaming? A careful reading of Freeman’s more detailed case histories shows that a few patients didn’t even see themselves as ill in the first place, although they realized that their behavior disturbed others.
Probably the most important factor in lobotomy's demise was its deep physical and metaphysical ugliness. More than one seasoned professional vomited or passed out while watching Freeman crack through a patient's orbital bone with his ice pick. Moreover, prospective patients often had to be dragged to an operating room or clinic. In Psychosurgery, the textbook he coauthored with Watts, Freeman frankly describes his unorthodox methods of obtaining "consent" for lobotomy. Occasionally, forcible sedation was needed to keep the patient from backing out at the last minute.
Freeman's landmark treatise also notes that if the patient was "too disturbed" to sign a consent, a close relative could give permission instead. He didn't elaborate on how disturbed a person needed to be to abdicate his right to refuse lobotomy. Freeman never considered the possibility that relatives might have less than honorable motives for agreeing to the dissection of their loved one's frontal lobes. Tennessee Williams, however, had no trouble envisioning such a nasty scenario. In his play Suddenly Last Summer, Mrs. Venable orders her young niece, Catharine, to be lobotomized. Catharine knows a little too much about the deviate practices of Mrs. Venable's late son, Sebastian. Who would believe the poor child after she had the appropriate "therapy" at Lion's View asylum?
It’s doubtful that many real families ever had such fanciful motives behind their surrogate assents for lobotomy, although even mundane motives can be illegitimate. Was it right to authorize a lobotomy to make an argumentative person a quiet one? Or to stop behaviors repugnant to everyone — everyone, that is, except the patient?
In retrospect, the real question isn’t why lobotomy died, but why it survived for so long. The answer is simple: Walter Freeman. Lobotomy became his career, his crusade, and he spread psychosurgery’s gospel with boundless enthusiasm. His elegant bearing and Freudian goatee gave him the look of a world-renowned healer of minds. In the end, his force of will could no longer counter lobotomy’s growing ethical opposition and pharmaceutical competition. Freeman did his best to carry on, but it was no use.
Modern psychosurgery has no evangelist equal to Freeman to spread its message, and so it must survive only on its merits. Time will tell whether it can.
There are good reasons to think the field can be revived. For starters, modern procedures like magnetic resonance-guided cingulotomy bear little resemblance to the ugly lobotomies of the past. Computer-guided electrodes the thickness of pencil lead that can inflict minute injuries with millimeter precision have replaced ice picks and leucotomes. Procedures now take place only in sophisticated operating theaters, not in motel rooms or in the back rooms of county hospitals.
Modern neurosurgeons like Cosgrove approach their operations not as true believers but as skeptical scientists. Freeman’s arm-twisting consents are also gone; today multidisciplinary committees review each patient on a rigorous case-by-case basis. And no one but the patient can give consent for cingulotomy — there were no Mrs. Venables involved in the Harvard study. Unlike the itinerant lobotomists of Freeman’s time, modern psychosurgeons follow their patients closely for years and test them exhaustively.
But two problems remain. First, Cosgrove’s report, like earlier psychosurgery studies, makes no mention of the patients’ perception of their operations; it details only what their psychiatrists thought. Patients can’t even request this surgery on their own; an operation is offered only if the psychiatrist agrees. In other “quality of life” operations — face-lifts, surgical removal of herniated spinal disks, elective joint replacements — the patient approaches the surgeon directly, requests surgery, and then personally decides if the postoperative outcome is satisfactory. An orthopedic surgeon doesn’t ask an internist if a knee replacement has alleviated a patient’s pain. So why must we rely on psychiatrists to tell us if a patient no longer feels depressed after cingulotomy?
Second, cingulotomy rests on no firmer scientific foundation than lobotomy did. First performed in 1952 as a modified version of the lobotomy, cingulotomy was based on Freeman's observations that lobotomy patients seemed to have less "psychological tension" when fibers near the cingulate gyrus were severed. This ribbon of brain tissue is thought to be a conduit between the limbic region, a primitive area involved in emotional behavior, and the frontal lobes, the seat of reason and judgment. But we lack any more detailed understanding of how the cingulate gyrus functions. As such, cingulotomy can trace its intellectual heritage right back to the chimps Becky and Lucy.
Psychosurgery will never become as routine as it was in the 1940s and 1950s. The most refractory of the chronically disabling mental illnesses, schizophrenia, can’t be treated surgically. Depression, while quite common, usually responds to one of the many excellent medical therapies that must be tried first, leaving few patients as candidates for surgery. And patients with OCD often respond to nonsurgical treatments. Thus, the pool of patients likely to benefit from cingulotomy will always be fairly small. In addition, few major medical centers can muster the psychiatric, bioethical, and surgical resources to perform and evaluate the procedure correctly.
Then there is that sticky public relations problem. No matter how refined their surgeries, modern psychosurgeons will still be perceived as lobotomists. An unfair label, perhaps, but one that will prove difficult to shed.
A greater concern may be that the public won’t care at all. In Freeman’s day, society paid to house and care for great numbers of the mentally infirm, making psychiatric disease a public health problem of the first order. This may be why no one bothered to ask the patients what they thought of surgery — the lobotomists weren’t treating patients, they were treating a national crisis. Since lobotomy did make patients easier to care for, and even got many out of institutions and off the public dole, psychosurgeons served the national interest well. Freeman acknowledged that the lobotomist often put the needs of society over those of the individual, arguing that it was better for a patient “to have a simplified intellect capable of elementary acts than an intellect where reigns disorder of subtle synthesis. Society can accommodate itself to the humble laborer, but it justifiably mistrusts the mad thinker.”
The goal of lobotomy wasn’t to control disease but to control patients. Some would argue that our present heavy use of psychotropic drugs is just as flawed, in that we don’t make the patients better — we just succeed in preventing them from bothering us.
As a nation, we could seriously question all our recent efforts in the mental health arena. During the last three decades, mental illness has been literally cast into the streets. Asylums have vanished and many private health plans now refuse to pay for psychiatric treatment. Before we judge the lobotomists of old too severely, we should go to the nearest street grate and see how we are dealing with our mental health crisis today. High-profile diseases like AIDS and breast cancer dominate the headlines and the federal research budgets, leaving many victims of mental illness to suffer in silent solitude.
Modern psychosurgeons are thus courageous in seeking to address a difficult problem. By trying to bring the best neurosurgical technologies to a group of patients who have run out of hope, they risk the scorn of those who see only what psychosurgery was and not what it can be. I wish them luck. Given the lessons of history, they’ll surely need it.
COPYRIGHT 1997 Discover
COPYRIGHT 2004 Gale Group