Tuesday, April 25, 2017

Different side of London

Different angle #London #visitbritain

A post shared by Maria Dellaporta (@dellaportamaria)


Sunday, April 23, 2017

An Extra Plate


My grandfather died two weeks ago. After the funeral, my family sat around my grandmother's living room, talking about the nice memories we had with him. All of us grandchildren mentioned how he always spoke in different voices when reading, even if it was just the newspaper. We spoke of the stories he used to tell us about his childhood.

My grandfather was born in Poland in 1929. He was 14 when his family was captured by the Nazis and taken to Birkenau.

He never told us grandchildren what it was like in the camp. We never pushed him to. From what I understood, he had once had 2 younger sisters and a younger brother, as well as 3 older sisters and 2 older brothers. The 3 younger children were killed upon arrival at the camp. During their time there, his family was torn apart. He would never see two of his older sisters again; they died within a couple of weeks of arriving. His mother also died very soon after they arrived.

My grandfather and his remaining family were liberated in 1945 when he was 16 years old. He suffered from major PTSD for the rest of his life. He met my grandmother when he was in his late 20s after moving to America and they soon got married and had a family.

My grandfather seemed like a very normal man. He never became senile, not even in his last years. By the time he was in his 60s, he no longer suffered from major PTSD, and by that I mean he no longer woke up every night screaming, or feared small spaces, or became debilitated with fear any time he saw a Nazi flag. He was the strongest man I ever knew, mentally and emotionally.

However, there was something that my grandfather did that always seemed very odd to me. Any time we had a meal with him, he would serve an extra plate. Even in restaurants, he would order two meals. But he never ate the food on the other plate, and he never let anyone else eat from it.

I remember asking him why he did that when I was a kid, and he would always say what he said any time anyone asked: "Old Polish tradition." I believed that until I went to college and began studying my background and reading all about Polish culture. I kept what my grandfather always said in mind and tried to research this odd tradition, but never found anything.

So, as we were recounting our favorite grandfather memories, I decided now was as good a time as any to try to find out why he really did that. I brought it up, and everyone began to discuss it, and we all began asking my grandmother if she knew why.

She hesitated for a moment, but then she decided to tell us.

"Birkenau was a very harsh place. They never had enough clothing to shield them from the terrible winters. Everyone was sick. Everyone had fleas. Your grandfather was a smart man, his father before him had served in the first World War and had taught the boys all he could about survival in a tough environment. The inmates were not given very much food at all. Your grandfather had an idea to form a pact with a young girl his age, so that each day, they would take turns getting the food that belonged to both of them, so that instead of everyday not getting enough to eat to even make a slight difference in their hunger, they would be full one day and then they'd fast the next.

This worked for a little while, but the longer they stayed in there, the sicker they got. Eventually the girl contracted typhus. She was too weak to stop him. He was blinded by his hunger. He would tell her he'd feed her, but he never did. He kept her food for himself. She was too sick to comprehend what was happening, and shortly after, she died. The camp was liberated soon after that."

We sat there dumbfounded before she began to speak again.

"After the camp was liberated, he couldn't tell anyone what he had done. He was so ashamed. He tried to live his life like normal, but his life was no longer normal. He broke down and told me once that every time he would sit down to a meal, he would hear weeping in his head. That's how it started. But eventually he would see her, sitting at the table with him.

I thought that this was just his post traumatic stress disorder messing with his mind, but then I saw her too. She sat at the end of that table over there," she pointed to her long dining room table, "she wouldn't speak, she would only weep. We could never eat together without her being there. Eventually the house began to smell of death and sickness, perpetually. A coldness took over the entire house, even in summer. We'd find dead animals in the house, with bite marks taken out of them. This had to stop. So one day, Abraham got the idea to set a plate out for her at dinner. Conditions got better as he did this every day, every meal, and even every snack he had. Eventually the house no longer smelled, the cold went away, and the animals stopped appearing. Also I stopped seeing her. We never actually saw her again."

None of us knew what to say. Grandfather was a skeptic of everything. He didn't even believe in God, despite attending temple every Friday of his life. I guess we all sort of chalked it up to his PTSD getting the best of him.

That is, until we all heard the weeping.

Tuesday, April 18, 2017

Explaining Britain’s immigration paradox

Migration is good for the economy. So why are the places with the biggest influxes doing so badly?


“The Golden Cross Welcomes you to Redditch!” The greeting, on the wall of a pub outside the town’s railway station, is valiant. But the dingy wire fence and mossy concrete beneath it let down the enthusiasm of the sign’s welcome. Redditch is struggling. In recent years, wages have fallen. It has also seen a rapid rise in the number of migrants, in particular those from eastern Europe. Perhaps linking these two phenomena, the people of Redditch voted 62:38 to leave the European Union in the referendum last June.

Immigration is a boon for Britain. The 9m-odd foreign-born people living there bring with them skills and attitudes that make the country more productive. Younger and better educated than natives, immigrants pay more in tax than they use in the way of public services. For some institutions they are indispensable: perhaps 30% of doctors in Britain are non-British.

Even so, Britain is unenthusiastic about immigration. Surveys find that roughly half of people would like it reduced “a lot” and fewer than 5% want it to go up. Many politicians interpret the vote for Brexit as a plea to reduce the number of new arrivals. Although the government has recently hinted that net migration may not fall by much after Britain leaves the EU, a group called Leave Means Leave, backed by two-dozen MPs, is calling for it to be slashed to a sixth of its current level.

To understand this antipathy to immigration, we examined the ten local authorities that saw the largest proportional increase in foreign-born folk in the ten years from 2005 to 2015 (we excluded Northern Ireland, because of differences in its data). Whereas big cities such as London have the greatest share of immigrants among their populations, the places that have experienced the sharpest rises are mostly smaller towns, which until recently had seen little immigration.

Top of the list is Boston, in Lincolnshire, where in 2005-15 the number of foreign-born residents rose from about 1,000 to 16,000. In 2005 immigrants were about one in 50 of the local population. They are now one in four. All ten areas we looked at saw at least a doubling in the share of the population that was born outside Britain.

These ten areas—call them Migrantland—voted about 60:40 in favour of leaving the EU, compared with 52:48 across Britain. Boston went for Brexit by 76:24, the highest margin of any local authority. And whereas it has often been noted that there was no link between the size of a place’s migrant population and local enthusiasm for Brexit (consider London, both cosmopolitan and heavily for Remain), we found some link between the increase in the number of migrants and the likelihood to vote Leave (see chart). London boroughs such as Hackney and Newham have welcomed large numbers of foreigners for centuries. People in those places have got used to newcomers, suggests Tony Travers of the London School of Economics. “But when your local population of migrants goes from 10% to 15% in a decade, that’s where you get the bite.”

Jacqui Smith, a former MP for Redditch and Labour home secretary in 2007-09, sees his point. “I know there’s racism in London, but people have largely become used to diverse communities...The transitional impact in Redditch is much greater,” she says. Redditch has in recent years acquired a couple of Polish supermarkets. Those who are well-off, mobile and confident find those sorts of developments interesting—“You think, ‘I’ll be able to get some Polish sausage’,” says Ms Smith. But those who lack housing or work worry about what such changes represent. The staff at an employment agency in Redditch attest to such fears. Most of the workers they place in jobs are from eastern Europe. “They’re brilliant, we love them,” smiles one member of staff. But when locals come looking for work and see how many foreign names are on the agency’s register, there is some resentment, she says.

The wrong place at the wrong time

It is tempting to conclude that such attitudes are motivated by prejudice. Yet a closer look at the economy and public services in Migrantland makes clear that its residents have plenty to be angry about—even if the migrants are not the culprits.

Places where living is cheap and jobs plentiful are attractive to newcomers. In 2005 the average house in Migrantland cost around £140,000 (then $255,000), compared with more than £150,000 across Britain. Unemployment was lower than average. Low-skill jobs blossomed. Migrantland seems to be more dependent on agriculture than the rest of the country. The big change in Boston, says Paul Gleeson, a local Labour councillor, is that previously-seasonal work, such as fruit- and veg-picking, has become permanent as technology and new crop varieties have lengthened the agricultural season. This means the people doing that work now live there permanently, too. Manufacturing centres are nearby: food processing, for instance, is a big employer in Boston and Mansfield.

Given the nature of the jobs on offer, it is unsurprising that the new arrivals are often young and not particularly well educated or Anglophone. We estimate that whereas over 40% of the Poles living in London have a higher-education qualification, only about a quarter do in the East Midlands, where three of our ten areas are. One in 20 people in Boston cannot speak English well or at all, according to the 2011 census. Small wonder that integration is hard. Many landlords do not allow tenants to drink or smoke inside, so people sit out on benches, having a drink and a cigarette. “Because they’re young, not because they’re foreign, they might not put their tins in the bin,” says Mr Gleeson.

What’s more, the places that have seen the greatest surges in migration have become poorer. In 2005-15 real wages in Migrantland fell by a tenth, much faster than the decline in the rest of Britain. On an “index of multiple deprivation”, a government measure that takes into account factors such as income, health and education, the area appears to have become relatively poorer over the past decade.

Are the newcomers to blame? Immigration may have heightened competition for some jobs, pushing pay down. But the effect is small. A House of Lords report in 2008 suggested that every 1% increase in the ratio of immigrants to natives in the working-age population leads to a 0.5% fall in wages for the lowest 10% of earners (and a similar rise for the top 10%). Since Migrantland relies on low-paid work, it probably suffered more than most.

But more powerful factors are at play. Because the area is disproportionately dependent on manufacturing, it has suffered from the industry’s decline. And since 2010 Conservative-led governments have slashed the number of civil servants, in a bid to right the public finances. The axe has fallen hard on the administrative jobs that are prevalent in unglamorous parts of the country. Migrantland’s public-sector jobs have disappeared 50% faster than those in Britain as a whole. In the Forest of Dean they have dropped by over a third. Meanwhile, cuts to working-age benefits have sucked away spending power.

Even before austerity, it had long been the case that poor places had the most threadbare public services. Medical staff, for instance, prefer to live in prosperous areas. Our analysis suggests that Migrantland is relatively deprived of general practitioners. Doctors for the East Midlands are trained in Nottingham and Leicester, but fewer people want to study there than in London, for instance. After training there, half go elsewhere. In 2014 there were 12 places for trainee doctors in Boston; only four were filled.

Follow the money

What can be done? In places where public spending has not yet caught up with a rapidly enlarged population, the government could target extra funding in the short term. The previous Labour government ran a “migration impacts fund”, introduced by Ms Smith. She acknowledges that the amounts involved were small (the budget was just £35m per year) but argues that the point was to reassure people that the government understood fears that immigration can make things tough for a time. The current government has launched a similar initiative, though it is no better funded.

And although Britons dislike immigration, they do not feel the same resentment towards immigrants themselves. Once they have been placed in jobs alongside each other, locals and migrants tend to rub along, says the Redditch recruitment agency. A music festival was recently held in the town to raise money for children’s hospital wards in Poland. Local Poles took part in the Holocaust commemoration this year, says Bill Hartnett, leader of the council.

All that may be encouraging, but it does not provide a way to improve conditions in the left-behind places to which migrants have rushed. To many people, Brexit may appear to be just such a policy. They have been told a story that leaving the EU will make things better in their area, says Mr Gleeson. “It won’t.”


The Economist

Sunday, April 16, 2017

When is it OK to shoot a child soldier?


Canada writes rules for troops who face armed nine-year-olds

One of the worst dilemmas soldiers face is what to do when they confront armed children. International law and most military codes treat underage combatants mainly as innocent victims. They offer guidance on their legal rights and on how to interrogate and demobilise them. They have little to say about a soul-destroying question, which must typically be answered in a split second: when a kid points a Kalashnikov at you, do you shoot him? Last month Canada became the first country to incorporate a detailed answer into its military doctrine. If you must, it says, shoot first.

Such encounters are not rare. Child soldiers fight in at least 17 conflicts, including in Mali, Iraq and the Philippines. Soldiers in Western armies, sometimes acting as peacekeepers, have encountered fighters as young as six on land and at sea. More than 115,000 young combatants have been demobilised since 2000, according to the UN. For the warlords who employ them, children offer many advantages: they are cheap, obedient, expendable, fearless when drugged and put opponents at a moral disadvantage. Some rebel armies are mostly underage.

In 2000 a group of British peacekeepers in Sierra Leone who refused to fire on children armed with AK-47s were taken hostage by them. One paratrooper died and 11 others were injured in their rescue. Soldiers who have shot children sometimes suffer from crippling psychological wounds. A Canadian who protected convoys in Afghanistan from attack by young suicide-bombers has not been able to hug his own children since he came home four years ago. Some soldiers have committed suicide. “We always thought it was the ambush or the accident that was the hardest point” of a war, said Roméo Dallaire, a retired Canadian general, in testimony before a parliamentary hearing on military suicides in March. In fact, the “hardest one is the moral dilemma and the moral destruction of having to face children.”

The Geneva Convention and other international accords prohibit attacking schools, abducting children and other practices that harm them. But they do not tell soldiers what to do when they confront children as combatants, making self-defence feel like a war crime. On March 2nd Canada adopted a military doctrine that explicitly acknowledges soldiers’ right to use force to protect themselves, even when the threat comes from children. “A child soldier with a rifle or grenade launcher can present as much of a threat as an adult soldier carrying the same armament,” it says. It is based in part on research by the Child Soldiers Initiative, an institute founded by Mr Dallaire that works towards ending the use of children as fighters.

The new doctrine goes well beyond the moment of confrontation. Intelligence officers, it says, should report on the presence of child soldiers and how they are being used. Soldiers deployed in areas with child fighters should be prepared psychologically, trained to handle confrontations with kids and assessed by psychologists when they return. The instruction suggests ways to ensure that killing children is a last resort. It recommends shooting their adult commanders to shatter discipline and prompt the youngsters to flee or surrender. It warns against the use of lightly armed units, which are vulnerable to “human-wave” attacks by children.

The authors of the new directive seem to be aware that a policy to shoot child soldiers even in self-defence could provoke outrage. So far, human-rights groups have expressed understanding. Canada is trying to strike a balance between treating children as innocents and recognising them as battlefield threats, says Jo Becker, a children’s-rights specialist at Human Rights Watch in New York. Britain is considering guidelines of its own, and other countries may follow. Canada may soon put its doctrine to the test. Its government has promised to send 600 troops on a three-year peace mission to Africa. It has not revealed yet where exactly they will go. Wherever it is, they are likely to meet gun-toting children. By acknowledging their right to defend themselves, Canada’s government may lessen the trauma of those forced to fight the youngest warriors.

The Economist

Tuesday, April 11, 2017

The voices in our heads

 

"Talking to your yogurt again," my wife, Pam, said. "And what does the yogurt say?"

She had caught me silently talking to myself as we ate breakfast. A conversation was playing in my mind, with a research colleague who questioned whether we had sufficient data to go ahead and publish. Did the experiments in the second graph need to be repeated? The results were already solid, I answered. But then, on reflection, I agreed that repetition could make the statistics more compelling.

I often have discussions with myself—tilting my head, raising my eyebrows, pursing my lips—and not only about my work. I converse with friends and family members, tell myself jokes, replay dialogue from the past. I’ve never considered why I talk to myself, and I’ve never mentioned it to anyone, except Pam. She very rarely has inner conversations; the one instance is when she reminds herself to do something, like change her e-mail password. She deliberately translates the thought into an external command, saying out loud, “Remember, change your password today.”

Verbal rehearsal of material—the shopping list you recite as you walk the aisles of a supermarket—is part of our working memory system. But for some of us talking to ourselves goes much further: it’s an essential part of the way we think. Others experience auditory hallucinations, verbal promptings from voices that are not theirs but those of loved ones, long-departed mentors, unidentified influencers, their conscience, or even God.

Charles Fernyhough, a British professor of psychology at Durham University, in England, studies such “inner speech.” At the start of “The Voices Within” (Basic), he also identifies himself as a voluble self-speaker, relating an incident where, in a crowded train on the London Underground, he suddenly became self-conscious at having just laughed out loud at a nonsensical sentence that was playing in his mind. He goes through life hearing a wide variety of voices: “My ‘voices’ often have accent and pitch; they are private and only audible to me, and yet they frequently sound like real people.”

Fernyhough has based his research on the hunch that talking to ourselves and hearing voices—phenomena that he sees as related—are not mere quirks, and that they have a deeper function. His book offers a chatty, somewhat inconclusive tour of the subject, making a case for the role of inner speech in memory, sports performance, religious revelation, psychotherapy, and literary fiction. He even coins a term, “dialogic thinking,” to describe his belief that thought itself may be considered “a voice, or voices, in the head.”

Discussing experimental work on voice-hearing, Fernyhough describes a protocol devised by Russell Hurlburt, a psychologist at the University of Nevada, Las Vegas. A subject wears an earpiece and a beeper sounds at random intervals. As soon as the person hears the beep, she jots notes about what was in her mind at that moment. People in a variety of studies have reported a range of perceptions: many have experienced “inner speech,” though Fernyhough doesn’t specify what proportion. For some, it was a full back-and-forth conversation, for others a more condensed script of short phrases or keywords. The results of another study suggest that, on average, about twenty to twenty-five per cent of the waking day is spent in self-talk. But some people never experienced inner speech at all.

In his work at Durham, Fernyhough participated in an experiment in which he had an inner conversation with an old teacher of his while his brain was imaged by fMRI scanning. Naturally, the scan showed activity in parts of the left hemisphere associated with language. Among the other brain regions that were activated, however, were some associated with our interactions with other people. Fernyhough concludes that “dialogic inner speech must therefore involve some capacity to represent the thoughts, feelings, and attitudes of the people with whom we share our world.” This raises the fascinating possibility that when we talk to ourselves a kind of split takes place, and we become in some sense multiple: it’s not a monologue but a real dialogue.

Early in Fernyhough’s career, his mentors told him that studying inner speech would be fruitless. Experimental psychology focusses on things that can be studied in laboratory situations and can yield clear, reproducible results. Our perceptions of what goes on in our heads are too subjective to quantify, and experimental psychologists tend to steer clear of the area.

Fernyhough’s protocols go some way toward working around this difficulty, though the results can’t be considered dispositive. Being prompted to enter into an inner dialogue in an fMRI machine is not the same as spontaneously debating with oneself at the kitchen table. And, given that subjects in the beeper protocol could express their experience only in words, it’s not surprising that many of them ascribed a linguistic quality to their thinking. Fernyhough acknowledges this; in a paper published last year in Psychological Bulletin, he wrote that the interview process may both “shape and change the experiences participants report.”

More fundamentally, neither experiment can do more than provide a rough phenomenology of inner speech—a sense of where we experience inner speech neurologically and how it may operate. The experiments don’t tell us what it is. This hard truth harks back to William James, who concluded that such “introspective analysis” was like “trying to turn up the gas quickly enough to see how the darkness looks.”

Nonetheless, Fernyhough has built up an interesting picture of inner speech and its functions. It certainly seems to be important in memory, and not merely the mnemonic recitation of lists, to which my wife and many others resort. I sometimes replay childhood conversations with my father, long deceased. I conjure his voice and respond to it, preserving his presence in my life. Inner speech may participate in reasoning about right and wrong by constructing point-counterpoint situations in our minds. Fernyhough writes that his most elaborate inner conversations occur when he is dealing with an ethical dilemma.

Inner speech could also serve as a safety mechanism. Negative emotions may be easier to cope with when channelled into words spoken to ourselves. In the case of people who hear alien voices, Fernyhough links the phenomenon to past trauma; people who live through horrific events often describe themselves “dissociating” during the episodes. “Splitting itself into separate parts is one of the most powerful of the mind’s defense mechanisms,” he writes. Given that his fMRI study suggested that some kind of split occurred during self-speech, the idea of a connection between these two mental processes doesn’t seem implausible. Indeed, a mainstream strategy in cognitive behavioral therapy involves purposefully articulating thoughts to oneself in order to diminish pernicious habits of mind. There is robust scientific evidence demonstrating the value of the method in coping with O.C.D., phobias, and other anxiety disorders.

Cognitive behavioral therapy also harnesses the effectiveness of verbalizing positive thoughts. Many athletes talk to themselves as a way of enhancing performance; Andy Murray yells at himself during tennis matches. The potential benefits of this have some experimental support. In 2008, Greek researchers randomly assigned tennis players to one of two groups. The first was trained in motivational and instructional self-talk (for instance, “Go,” “I can,” “Shoulder, low”). The second group got a tactical lecture on the use of particular shots. The group trained to use self-talk showed improved play and reported increased self-confidence and decreased anxiety, whereas no significant improvements were seen in the other group.

Sometimes the voices people hear are not their own, and instead are attributed to a celestial source. God’s voice figures prominently early in the Hebrew Bible. He speaks individually to Adam, Eve, Cain, Noah, and Abraham. At Mt. Sinai, God’s voice, in midrash, was heard communally, but was so overwhelming that only the first letter, aleph, was sounded. But in later prophetic books the divine voice grows quieter. Elijah, on Mt. Horeb, is addressed by God (after a whirlwind, a fire, and an earthquake) in what the King James Bible called a “still small voice,” and which, in the original Hebrew (kol demamah dakah), is even more suggestive—literally, “the sound of a slender silence.” By the time we reach the Book of Esther, God’s voice is absent.

In Christianity, however, divine speech continues through the Gospels—the apostle Paul converts after hearing Jesus admonish him. Especially in evangelical traditions, it has persisted. Martin Luther King, Jr., recounted an experience of it in the early days of the bus boycott in Montgomery, in 1956. After receiving a threatening anonymous phone call, he went in despair into his kitchen and prayed. He became aware of “the quiet assurance of an inner voice” and “heard the voice of Jesus saying still to fight on.”

Fernyhough relates some arresting instances of conversations with God and other celestial powers that occurred during the Middle Ages. In fifteenth-century France, Joan of Arc testified to hearing angels and saints tell her to lead the French Army in rescuing her country from English domination. A more intimate example is that of the famous mystic Margery Kempe, a well-to-do Englishwoman with a husband and family, who, in the early fifteenth century, reported that Christ spoke to her from a short distance, in a “sweet and gentle” voice. In “The Book of Margery Kempe,” a narrative she dictated, which is often considered the first autobiography in English, she relates how a series of domestic crises, including an episode of what she describes as madness, led her to embark on a life of pilgrimage, celibacy, and extreme fasting. The voice of Jesus gave her advice for negotiating a deal with her frustrated and worried husband. (She agreed to eat; he accepted her chastity.) Fernyhough writes imaginatively about the various registers of voice she hears. “One kind of sound she hears is like a pair of bellows blowing in her ear: it is the susurrus of the Holy Spirit. When He chooses, our Lord changes that sound into the voice of a dove, and then into a robin redbreast, tweeting merrily in her ear.”

Forty years ago, Julian Jaynes, a psychologist at Princeton, published a landmark book, “The Origin of Consciousness in the Breakdown of the Bicameral Mind,” in which he proposed a biological basis for the hearing of divine voices. He argued that several thousand years ago, at the time the Iliad was written, our brains were “bicameral,” composed of two distinct chambers. The left hemisphere contained language areas, just as it does now, but the right hemisphere contributed a unique function, recruiting language-making structures that “spoke” in times of stress. People perceived the utterances of the right hemisphere as being external to them and attributed them to gods. In the tumult of attacking Troy, Jaynes believed, Achilles would have heard speech from his right hemisphere and attributed it to voices from Mt. Olympus:

The characters of the Iliad do not sit down and think out what to do. They have no conscious minds such as we say we have, and certainly no introspections. When Agamemnon, king of men, robs Achilles of his mistress, it is a god that grabs Achilles by his yellow hair and warns him not to strike Agamemnon. It is a god who then rises out of the gray sea and consoles him in his tears of wrath on the beach by his black ships. . . . It is one god who makes Achilles promise not to go into battle, another who urges him to go, and another who then clothes him in a golden fire reaching up to heaven and screams through his throat across the bloodied trench at the Trojans, rousing in them ungovernable panic. In fact, the gods take the place of consciousness.

Jaynes believed that the development of nerve fibres connecting the two hemispheres gradually integrated brain function. Following a theory of Homeric authorship that assumed the Odyssey to have been composed at least a century after the Iliad, he pointed out that Odysseus, who is constantly reflecting and planning, manifests a self-consciousness of mind. The poem’s emphasis on Odysseus’ cunning starts to seem like the celebration of the emergence of a new kind of consciousness. For Jaynes, hearing the voice of God was a vestige of our past neuroanatomy.

Jaynes’s book was hugely influential in its day, one of those rare specialist works whose ideas enter the culture at large. (Bicamerality is an important plot point in HBO’s “Westworld”: Dolores, an android played by Evan Rachel Wood, is led to understand that a voice she hears, which has urged her to kill other android “hosts” at the park, comes from her own head.) But Jaynes’s thesis does not stand up to what we now know about the development of our species. In evolutionary time, the few thousand years that separate us from Achilles are a blink of an eye, far too short to allow for such radical structural changes in the brain. Contemporary neurologists offer alternative explanations for hearing celestial speech. Some speculate that it represents temporal-lobe epilepsy, others schizophrenia; auditory hallucinations are common in both conditions. They are also a feature of degenerative neurological diseases. An elderly relative with Alzheimer’s recently told me that God talks to her. “Do you actually hear His voice?” I asked. She said that she does, and knows it is God because He said so.

Remarkably, Fernyhough is reluctant to call such voices hallucinations. He views the term as pejorative, and he is notably skeptical about the value of psychiatric diagnosis in voice-hearing cases:

It is no more meaningful to attempt to diagnose . . . English mystics (nor others, like Joan, from the tradition to which they belong) than it is to call Socrates a schizophrenic. . . . If Joan wasn’t schizophrenic, she had “idiopathic partial epilepsy with auditory features.” Margery’s compulsive weeping and roaring, combined with her voice-hearing, might also have been signs of temporal lobe epilepsy. The white spots that flew around her vision (and were interpreted by her as sightings of angels) could have been symptoms of migraine. . . . The medieval literary scholar Corinne Saunders points out that Margery’s experiences were strange then, in the early fifteenth century, and they seem even stranger now, when we are so distant from the interpretive framework in which Margery received them. That doesn’t make them signs of madness or neurological disease any more than similar experiences in the modern era should be automatically pathologized.

In his unwillingness to draw a clear line between normal perceptions and delusions, Fernyhough follows ideas popularized by a range of groups that have emerged in the past three decades known as the Hearing Voices Movement. In 1987, a Dutch psychiatrist, Marius Romme, was treating a patient named Patsy Hage, who heard malign voices. Romme’s initial diagnosis was that the voices were symptoms of a biomedical illness. But Hage insisted that her voice-hearing was a valid mode of thought. Not coincidentally, she was familiar with the work of Julian Jaynes. “I’m not a schizophrenic,” she told Romme. “I’m an ancient Greek!”

Romme came to sympathize with her point of view, and decided that it was vital to engage seriously with the actual content of what patients’ voices said. The pair started to publicize the condition, asking other voice-hearers to be in touch. The movement grew from there. It currently has networks in twenty-four countries, with more than a hundred and eighty groups in the United Kingdom alone, and its membership is growing in the United States. It holds meetings and conferences in which voice-hearers discuss their experiences, and it campaigns to increase public awareness of the phenomenon.

The movement’s followers reject the idea that hearing voices is a sign of mental illness. They want it to be seen as a normal variation in human nature. Their arguments are in part about who controls the interpretation of such experiences. Fernyhough quotes an advocate who says, “It is about power, and it’s about who’s got the expertise, and the authority.” The advocate characterizes cognitive behavioral therapy as “an expert doing something to” a patient, whereas the movement’s approach disrupts that hierarchy. “People with lived experience have a lot to say about it, know a lot about what it’s like to experience it, to live with it, to cope with it,” she says. “If we want to learn anything about extreme human experience, we have to listen to the people who experience it.”

Like other movements that seek to challenge the authority of psychiatry’s diagnostic categories, the Hearing Voices Movement is controversial. Critics point out that, while depathologizing voice-hearing may feel liberating for some, it entails a risk that people with serious mental illnesses will not receive appropriate care. Fernyhough does not spend much time on these criticisms, though in a footnote he does concede the scant evidentiary basis of the movement’s claims. He mentions a psychotherapist sympathetic to the Hearing Voices Movement who says that, in contrast to the ample experimental evidence for the efficacy of cognitive behavioral therapy, “the organic nature of hearing voices groups” makes it hard to conduct randomized controlled trials.

Fernyhough is not only a psychologist; he also writes fiction, and in describing this work he emphasizes the role of hearing voices. “I never mistake these fictional characters for real people, but I do hear them speaking,” he writes in “The Voices Within.” “I have to get their voices right—transcribe them accurately—or they will not seem real to the people who are reading their stories.” He notes that this kind of conjuring is widespread among novelists, and cites examples including Charles Dickens, Joseph Conrad, Virginia Woolf, and Hilary Mantel.

Fernyhough and his colleagues have tried to quantify this phenomenon. Ninety-one writers attending the 2014 Edinburgh International Book Festival responded to a questionnaire; seventy per cent said that they heard characters speak. Several writers linked the speech of their characters to inner dialogues even when they are not actively writing. As for plot, some writers asserted that their characters “don’t agree with me, sometimes demand that I change things in the story arc of whatever I’m writing.”

The importance of voice-hearing to many writers might seem to validate the Hearing Voices Movement’s approach. If the result is great literature, it would be perverse to judge hearing voices an aberration requiring treatment rather than a precious gift. It’s not that simple, however. As Fernyhough writes, “Studies have shown a particularly high prevalence of psychiatric disorders (particularly mood disorders) in those of proven creativity.” Even leaving aside the fact that most people with mood disorders are not creative geniuses, many writers find their creative talent psychologically troublesome, and even prize an idea of themselves as, in some sense, abnormal. The novelist Jeanette Winterson has heard voices that she says put her “in the crazy category,” and the idea has a long history: Plato’s “mad poet,” Aristotle’s “melancholic genius,” and John Dryden’s dictum that “great wits are sure to madness near allied.” But, in cases where talent is accompanied by real psychological disturbance, do the creative benefits really outweigh the costs to the individual?

On a frigid night in January, 1977, while working as a young resident at Massachusetts General Hospital, I was paged to the emergency room. A patient had arrived by ambulance from McLean Hospital, a famous psychiatric institution in nearby Belmont. Sitting bolt upright, laboring to breathe, was the poet Robert Lowell. I introduced myself and performed a physical examination. Lowell was in congestive heart failure, his lungs filling with fluid. I administered diuretics and fitted an oxygen tube to his nostrils. Soon he was breathing comfortably. He seemed sullen and, to distract him from his predicament, I asked about a medallion that hung from a chain around his neck. “Achilles,” he replied, with a fleeting smile.

I’ve no idea if Lowell knew of Jaynes’s book, which had come out the year before, but Achilles was a figure of lifelong importance to him, one of many historical and mythical figures—Alexander the Great, Dante, T. S. Eliot, Christ—with whom he identified in moments of delusional grandiosity. In Achilles, Lowell seemed to find a heroic reflection of his own mental volatility. Achilles’ defining attribute—it’s the first word of the Iliad—is mēnin, usually translated as “wrath” or “rage.” But in a forthcoming book, “Robert Lowell, Setting the River on Fire: A Study of Genius, Mania, and Character,” the psychiatry professor Kay Redfield Jamison points out that Lowell’s translation of the passage renders mēnin as “mania.” As it happens, mania was Lowell’s most enduring diagnosis in his many years as a psychiatric patient.

In her account of Lowell’s hospitalization, Jamison cites my case notes and those of his cardiologist in the Phillips House, a wing of Mass General where wealthy Boston Brahmin patients were typically housed. Lowell wrote a poem about his stay, “Phillips House Revisited,” in which he overlays impressions of the medical crisis I had witnessed (“I cannot entirely get my breath, / as if I were muffled in snow”) with memories of his grandfather, who had died in the same hospital, forty years earlier.

There was a long history of mental illness in Lowell’s family. Jamison digs up the records of his great-great-grandmother, who was admitted to McLean in 1845, and who, doctors noted, was “afflicted with false hearing.” Lowell, too, suffered from auditory hallucinations. Sometimes, before sleep, he would talk to the heroes from Hawthorne’s “Greek Myths.” During a hospitalization in 1954, he often chatted to Ezra Pound, who was a friend—but not actually there. Among his contemporaries, recognition of Lowell’s mental instability was inextricably bound up with awe of his talent. The intertwining of madness and genius remains an essential part of his posthumous legend, and Lowell himself saw the two as related. Jamison quotes a report by one of his doctors:

Patient’s strong emotional ties with his manic phase were very evident. Besides the feeling of well-being which was present at that time, patient felt that, “my senses were more keen than they had ever been before, and that’s what a writer needs.”

But Jamison also shows that Lowell sometimes saw his episodes of manic inspiration in a more coldly medical light. After a period of intense religious revelation, he wrote, “The mystical experiences and explosions turned out to be pathological.” Splitting the difference, Jamison suggests that his mania and his imagination were welded into great art by the discipline he exerted between his manic episodes.

Lowell was discharged from Mass General on February 9th. Jamison quotes a note that one of my colleagues wrote to the doctors at McLean: “Thank you for referring Mr. Lowell to me. He proved to be just as interesting a person and a patient as you suggested he might be.” Later that month, Lowell had recovered sufficiently to travel to New York and do a reading with Allen Ginsberg. He read “Phillips House Revisited.” That September, he died.


The New Yorker

Sunday, April 09, 2017

Patchwork politics


Tony Judt decided to write “Postwar” while changing trains at Vienna's Westbahnhof terminus in December 1989. One historical era was ending and another was about to begin. Mr Judt, a Londoner who was educated at Cambridge and in Paris, and who is now professor of European Studies at New York University, believes that 1989 marked the end of the legacy of the second world war. The 44 years that followed were, in a sense, an “interim age: a post-war parenthesis, the unfinished business of a conflict that ended in 1945 but whose epilogue had lasted for another half century.”

If the first world war destroyed old Europe, the second, Mr Judt believes, created the conditions for a new, non-ideological Europe. The grand ideas which had shaken the continent since the French Revolution were now dead. All that was left was “the promise of liberty”, a promise fulfilled in western Europe in 1945, but which the rest of Europe had to wait for until 1989.

Europe had proved unable to liberate itself from National Socialism; nor could it keep Communism at bay unaided. It relied for its freedom and security upon the benevolence and goodwill of America. The movement for European unity, Mr Judt believes, “was grounded in weakness, not strength”. It was because Europe's influence was declining that it began to unite; and it was because Britain did not, in the 1950s, see its position in that light, that it did not join the European movement.

General de Gaulle famously had a “certain idea of France”. Mr Judt has a “certain idea of Europe”, a community of values whose system of inter-state relations is a model to be copied, rather than, as in the past, a warning to be avoided. For Europe shows the world that nationalism is obsolete. Mr Judt does not face up to the problem that Europe's virtue is bought largely at the price of loss of influence. How many divisions has the pope, Stalin once asked. The same question may be asked of Europe.

Mr Judt also argues that the new Europe, with the significant exceptions of the Soviet Union and Yugoslavia, was ethnically homogeneous. The peace settlement after the first world war, based as it was on Woodrow Wilson's principle of self-determination, had created states in central and eastern Europe with large minority populations. Post-second world war Europe was built out of the rubble of Nazism and Communism. “Hitler and Stalin between them had blasted flat the demographic heath upon which the foundations of a new and less complicated continent were then laid.”

Europe is not the same pre-war melting pot, but is composed instead of "hermetic national enclaves". Minorities have been either expelled, as with the Germans from Poland and Czechoslovakia after 1945, or, as with the Jews, murdered. Mr Judt is particularly good on the centrality of the Holocaust to the new Europe. In a moving epilogue, entitled "From the House of the Dead", he declares that "the recovered memory of Europe's dead Jews has become the very definition and guarantee of the continent's recovered humanity." The new Europe remains mortgaged to its terrible past. That is why, he concludes, the European Union "may be an answer to history, but it can never be a substitute."

Yet, as Mr Judt himself shows, the nations of western Europe have become far less hermetically sealed or ethnically homogeneous than in 1945. Immigration and asylum have given rise to new and acute cultural cleavages; and it is precisely because the nations of Europe have failed to become genuine melting pots that so much of European politics now revolves around issues of multiculturalism.

Mr Judt deals with grand and important themes. But, after announcing them in a powerful introduction, he proceeds to tell us at great length mainly what we know already. His discussion is chronological, not thematic, and the main ideas get lost in what is now a familiar story, although the story is told with some skill. Nevertheless, few books of nearly 1,000 pages justify their length, and "Postwar" is no exception. When Lord Beaverbrook was sent a 700-page biography of his fellow press magnate, Lord Northcliffe, he dispatched it unread to the University of New Brunswick, saying, "It weighs too much." Sadly, "Postwar" is likely to suffer the same fate.

The Economist

Monday, April 03, 2017