Saturday, February 26, 2011

'The King's Speech' - A Review by Paul Schlieben


Many will see ‘The King’s Speech’ and ‘The Social Network’ through the prism of class and privilege.  (Some see all films as political propaganda.)  In one film, you have a Prince afflicted with a debilitating stammer, who reluctantly becomes King when his older brother abdicates the English throne.  In the other, you have a privileged, ambitious, brilliant Harvard student who goes on to create Facebook and become the world’s youngest billionaire.  No question about it, historian Howard Zinn would have abhorred them equally.
I admit to having grown weary of British period dramas — all dressed up with suitably starched upper lips and sphincter-choked accents — so I was not predisposed to like ‘The King’s Speech’ and was reluctant to see it.  But Colin Firth’s reputation, combined with widespread praise for the film, persuaded me to go.  My prejudice evaporated in the very first scene, as a fearful Prince (Colin Firth), attending some grand event being broadcast live on radio, approached a microphone as though being led to the gallows, and … was unable to speak.  As they say, you could feel his pain.
This week I saw 'The King's Speech' for a second time.  I can't explain why I find it so entertaining – maybe I'm getting old and sentimental – but at its heart, stripped of its pomp, I believe it can be explained simply like this: it’s a film about two men who, during the course of the film, become friends.  Neither one, certainly not the Prince, would have described it that way at the time, but both might well have agreed years later.  That this friendship was improbable and true (or, at the least, "based on" a true story), and the historical context extraordinary, heightens the dramatic tension and makes us sit up and pay attention.   The Duke of York, Bertie, needs a voice coach to help him overcome a terrible stammer.  Ultimately, what will cure his stammer is to unburden himself to a friend, one who helps him identify the source of the stammer and overcome its effects.
Lionel Logue (Geoffrey Rush), a self-taught, unaccredited, and unorthodox voice coach and part-time actor, with a family to feed, needs clients.  While not the primary focus of the movie, one can imagine that Logue craves professional validation, as an actor and as a coach, and is equally in need of a friend.  Both have the love and support of their families, which seems to me an important prerequisite to their forming their friendship.
That they were brought together out of necessity should not detract from the bond that emerges.  Friendships have been built on weaker foundations.  Some might say that Lionel’s desire for the King’s success is evidence of his living vicariously, but perhaps the best friendships are between people who, motivated by mutual affection, live vicariously through each other and celebrate each other’s successes.   In fact, that might serve as a definition of a good marriage.
Successful drama must be emotionally engaging.  ‘The King’s Speech’ is successful because we feel every moment as if we were Bertie, one moment, and Lionel Logue, the next.   Much of the credit for this film’s success must go to Colin Firth and Geoffrey Rush, who deliver performances so convincing I would not be surprised to learn they became lifelong friends in the course of shooting this film (and would be disappointed if they had not).   While I am happy to see Colin Firth as the likely recipient of the Academy Award for Best Actor, I was more pleased to see Geoffrey Rush nominated for Best Supporting Actor.  He would get my vote.
Sometimes a simple story of friendship, told well, trumps a more complex tale, no matter how brilliantly conceived.  I admired 'The Social Network' and believe Aaron Sorkin to be one of the most brilliant screenwriters alive today, but 'The King's Speech' broke through to my heart.  I'd vote with my heart.
And, to give ‘The King’s Speech’ screenwriter David Seidler his due, his script was also brilliant.  Take, for instance, this dialogue:
Duke of York (Bertie): “I’ve had the very best voice coaches.”
Lionel Logue: “They’re idiots.”
Prince: “They’ve all been knighted.”
Logue: “Makes it official then.”
Or, when asked whether he knows any jokes, Bertie stutters this sardonic reply: “… T… T…Timing isn’t one of my strengths.”
Many believe that ‘Winter’s Bone’, that dark Ozark tale of America’s desperate, poor, meth-addicted underbelly, better represents the reality with which many Americans contend.  I agree.  Others will insist that ‘The Social Network,’ with its clever plotting and dialogue, is more deserving of an Oscar.  In some respects, they would be right.  But all these movies could be from different planets—isn’t there something absurd about having to pick one over the other?  Each film is original, brilliant and unique.   Throw in ‘True Grit’ as well.  That film was truly gritty, from beginning to end.  To pick one film over the others is a fruitless exercise.  We don’t need a ‘Best Picture’ category.  The nominations are sufficient.  Stop there.  Tune out the Academy Awards and just go see them all.

Sunday, February 13, 2011

'The Great Influenza' by John M. Barry


Imagine half a dozen ocean waves, emanating from different parts of the globe, converging on a coastal community in summer, a community whose residents are unaware of the impending danger.  Some waves are visible on the horizon; others are but deep swells, typical of the season.  What happens when all of the waves combine their amplitudes the moment they reach shore and form a giant tidal wave, a tsunami of historic proportions?  This, or something like it, is what happened in 1918, when the influenza pandemic enveloped the globe in just a few months, resulting in the deaths of, by some estimates, one hundred million people.  Of course, one of the biggest waves was the war in Europe, then in its fourth year.  But this was not the deadliest.  There was also the long swell of medical history, only recently jolted out of a Hippocratic stupor lasting over two thousand years.  Then there was the wave set in motion by President Wilson, who unleashed a powerful political force determined to whip a fractious country to war.  This force, armed with a hastily passed new Sedition Act and the Espionage Act of 1917, and backed by a propaganda machine that brooked no dissent, made reporting actual conditions nearly impossible, criminal even.  Then there was the biggest wave of all, the influenza virus itself.  Capable of slipping through all the body’s defenses and adapting to its hosts with increasing effectiveness – a virus too small to see by conventional means – it infected and killed young adults to a disproportionate degree.   Compounding the devastation, most of the lifeguards – the doctors and nurses – were off tending to the troops in Europe, leaving communities begging for volunteers to nurse the sick and dying, and those few doctors and nurses left behind were overwhelmed and decimated by the disease.  Worse still, public health officials, military and civilian, were overruled by the supreme urgency of war and rendered powerless to limit the influenza’s devastating effects.

War

For the first few years of the war in Europe, the United States tried to maintain its neutrality.  President Wilson himself was extremely reluctant to enter the fray.  However, in 1917, Germany outraged the nation when it announced unrestricted submarine warfare and tried to persuade Mexico to its side.  The President was forced to act.  As reluctant as he was, once the decision was made, Wilson pursued war with incredible single-mindedness, an almost religious fervor.  “To Wilson, this war was a crusade; he intended to wage total war.”
“To fight,” Wilson declared, “you must be brutal and ruthless, and the spirit of ruthless brutality will enter into the very fibre of our national life….”  And it did.
“The government compelled conformity, controlled speech in ways… not known in America before or since....”  Wilson pushed the Espionage Act and a new Sedition Act through Congress, relied on the Bureau of Investigation (forerunner of the FBI) and a volunteer group called the American Protective League to enforce these new laws, initiated a “voluntary” Liberty Bond drive, and took other measures besides.  He created, by executive order, the Committee on Public Information, headed by George Creel, who went on to produce “tens of thousands of press releases and feature stories that were routinely run unedited by newspapers.”
In many ways, as is the case with the influenza epidemic itself, our nation’s memory of the “Great War” has been eclipsed by the Depression, WW II and the wars fought since.  John Barry’s excellent account reminds us of the draconian measures begun under the guise of war.  (The 2001 USA Patriot Act seems mild in comparison.)  Regardless of the motive or justification, these two statements sum up conditions leading up to the outbreak of the influenza epidemic:
“… Columbia University President Nicholas Murray Butler said, ‘What had been folly was now treason.’”
And Barry himself says, “As an unintended consequence, the nation became a tinderbox for epidemic disease….”

Epidemic

In the summer of 1918, the influenza crashed along the coasts of all the continents of the world, working its way inland along the rivers and roads of commerce, spreading suffering and death in its wake, then receding as it ran out of hosts to infect.   It started at an Army base in Kansas as “la grippe,” quickly mutating to its most lethal form and “swarming” through the population, through the port cities of Boston, Philadelphia, New York, New Orleans, Chicago and on and on, relentlessly infecting even remote inland outposts, then just as quickly running out of hosts and mutating again into a less dangerous form, infecting fewer and fewer as it ebbed, but not before President Wilson was caught in its undertow while attending peace talks in Paris.  His encounter with influenza very likely resulted in a bout of depression that affected the course of history.  This strain of influenza would never disappear completely; it would just lie in wait for a new mutation or for new hosts whose lack of immunity would provide the opportunity to do it all again.
“It was influenza, only influenza,” yet it had mutated, exploded and “swarmed” into a disease more deadly than the Bubonic Plague, or any other in human memory. 
So widespread was the influenza that this morbid little ditty, sung by schoolgirls as they jumped rope in schoolyards across the country, spread like a virus too.
“I had a little bird,
Her name was Enza
Opened the window
And In-flu-enza”
My advice for those who think the H1N1 influenza scare a few years ago was overblown is to read this book.  The reason the medical community was so concerned about that particular virus in 2009 is that H1N1 is the same flu virus that afflicted millions in 1918[1].  John Barry’s well-researched, comprehensive book tells the story of this devastating pandemic; of the men and women who worked to contain and defeat it; of its spread from Kansas to virtually every corner of the globe; of our state of war, which placed the need to mobilize forces above everything else, ignoring even the Army Surgeon General’s pleas and suppressing news reports that might have saved lives—reports about the virus itself.  Barry takes it even further, describing, in elegant detail, how the virus worked in the body and why it became so lethal; how the epidemic spread; and how this pandemic accelerated scientific research to an unprecedented degree, eventually leading, in one instance, to an understanding of DNA, that most essential building block of life.
Why was this influenza so much worse than ordinary flu?  It affected people in two ways.  For some, the luckier ones, it acted like regular flu, from which most people recovered.  However, as is frequently the case with influenza, as symptoms subside and the patient starts to feel better, secondary infections take hold, often resulting in pneumonia.  Think of it this way:  The flu breaks down the body’s defenses—the natural mechanisms that work to keep the lungs sterile.  Enter pneumococcus, streptococcus and other bacterial pathogens.  These are the sources of bacterial pneumonia.  Today, as these secondary symptoms emerge, doctors typically prescribe antibiotics.  Antibiotics are effective against bacterial infections, but not viral infections.  In 1918, antibiotics had not yet been discovered, and pneumonia frequently resulted in death.
With the so-called Spanish Flu[2], however, the disease frequently took a more lethal turn.  After ravaging the respiratory tract and defeating the body’s normal defenses, it penetrated the deepest recesses of the lungs, infecting those tiny cells responsible for oxygenating the blood and, literally, choking them off by filling them with fluids[3].   This was the course the flu took in many, if not most, of the young adults who died quickly[4].   Often, symptoms progressed so rapidly that a person could wake up in the morning feeling fine and be dead within twelve hours.   Symptoms included intense headaches, bones that felt like they would break, hallucinations, high fever – all typical of flu, but much more intense.  What was new this time was that blood literally poured from eye sockets, nose, ears and mouth, and victims coughed up blood, even, as was frequently reported, projecting a stream of blood across the room, and, in the final stages, “cyanosis”—victims turning such a dark shade of blue from lack of oxygen that it was hard to tell “Caucasians from Negroes.”  So fast did influenza spread, and so overwhelmed were the few medical staff available, that “…nurses wrapped more than one living patient in winding sheets and put toe tags on the boys’ left big toe.  It saved time…”
This was no ordinary “grippe”; this mutation managed to break down all the body’s defenses and confound public health officials.  Facemasks, which became ubiquitous, were as useless as a window screen in a dust storm.  People were advised to avoid crowds (virtually impossible) and, as one health board advised, “…stay warm, keep the feet dry and the bowels open”—this last piece of advice a remnant of the Hippocratic tradition.  The problem was, and still is today, that “men could appear healthy while incubating influenza themselves, and they could also infect others before symptoms appeared.”  One patient could infect thousands without knowing it.
So devastating was the disease, and so quickly did it spread, that there was a breakdown in civil society.  Indeed, at one point, so dire did the situation appear that Victor Vaughan, the acting Army Surgeon General, wrote, “If the epidemic continues its mathematical rate of acceleration, civilization could easily disappear … from the face of the earth within a matter of a few more weeks.”
People avoided people, many refusing to go to work or even to the store.  People were dying at such a rate that caskets were in short supply.  Sometimes entire families were infected, with nobody left to dispose of the bodies.  Some were so ill they were forced to sleep in the same bed as the dead.  Unable to buy and prepare food, many adults and children starved; orphans roamed the streets; people collapsed and died in the street.  Horse carts made the rounds, collecting bodies and stacking them like cordwood.  But gravediggers were in short supply and, inevitably, after weeks of paralysis, officials dealt with the crisis by digging mass graves.  Public officials were powerless to prevent or slow the disease’s progress; chaos reigned.
In the midst of all this, newspapers were reporting, “This is only the grip, nothing to be concerned about,” or printing complete fictions such as this: “Scientific Nursing Halting Epidemic.”   “On a single day of October 10,” Barry tells us, “the epidemic alone killed 759 people in Philadelphia” and, “During the week of October 16 alone, 4,597 Philadelphians died….”  And this was in just one city.  Yet, referring to people not yet infected, a public official is quoted as saying, “There is no question that by a right attitude of the mind these people have kept themselves from illness.  I have no doubt that many persons have contracted the disease through fear… Fear is the first thing to be overcome, the first step in conquering this epidemic,” and “The weak and timid succumb first.”  These sentiments, propagated by Washington in the midst of war, appeared in papers across the country.  Of course, people could see what was happening all around them, so these admonitions had just the opposite effect to the one intended, magnifying fear and distrust.

Medical History

I was most impressed with how well Barry puts the medical history in context.  In just a few chapters, Barry covers a sweep of history from Hippocrates (460-370 BC) (and those who wrote under his name) up to the outbreak of the influenza, and beyond, rendered so expertly that you quickly understand the frustrations and challenges with which the medical community had to contend as the disease spread.
In the course of telling this story, Barry writes about dozens of scientists and health professionals.  I’ll just mention a few.  Most notable was William Henry Welch, the “impresario” who “intended to precipitate a revolution” in medicine, and did.  He is described as “the glue that cemented together the entire American Medical establishment,” as he dragged the science and practice of medicine out of the dark ages and into the 20th Century.  He was instrumental in establishing Johns Hopkins as a world-class medical institution, starting in the 1880s, and, from his position there, permanently altered medical research, education and practice throughout the United States.  Along with his protégé Simon Flexner, who would lead the Rockefeller Institute, and Simon’s brother, Abraham, and dozens of others he inspired, Welch completed “the reform of all medical education” in the US and directed “the flow of tens of millions of dollars into laboratory research.”
To me, the most surprising revelation was the state of medical education and practice before 1900.  Even schools like Harvard, Penn and Columbia did not require students to have a college degree and, what’s worse, many schools admitted anyone who could pay.  Some could hardly read or write!  “The whole system of medical education … is something horrible to contemplate,” complained Harvard’s president Eliot in 1869.  When he urged the adoption of written exams, Harvard’s Professor of Surgery, Henry Bigelow, complained, “…[Eliot] actually proposes to have written examinations for a degree of doctor of medicine.  I had to tell him that he knows nothing about the quality of the Harvard medical students.   More than half of them can barely write.” (Judging from the handwriting on prescription forms, this still may be the case.)
Simon Flexner’s story is illustrative of the state of medical education in the late-19th Century.   It begins with “his growing up the black sheep in an immigrant Jewish family in Louisville, Kentucky.  Older and younger brothers were brilliant students, but he quit school in the sixth grade.”   Described as “sullen and flirting with delinquency,” he worked at and was fired from several jobs before getting a job at nineteen with a druggist who had a microscope.  While forbidden to use it, he did anyway and, “Abruptly his mind was engaged.”  He attended the Louisville College of Pharmacy, graduated at the top of his class, and attended medical school, at night, while working in a brother’s pharmacy.  About his medical school experience, “Flexner later recalled, ‘I never made a physical examination.  I never heard a lung sound.’”  Nevertheless, he was then free to hang his shingle and practice medicine.  This is where his story takes another turn.  Flexner was of exceptional intelligence, and it was obvious to him just how ill prepared he was.  “His younger brother Abraham had graduated from the Hopkins. … Simon sent some of his microscopic observations to Welch.  Soon Simon was studying at the Hopkins himself.”  Welch was so impressed with Flexner that he “arranged a fellowship for him in Germany.”  Four years later, he returned to become a professor of pathology at Hopkins.  A true autodidact, Flexner made up for the gaps in his education by reading and studying widely.  Not only was he well prepared to become a professor; soon afterwards, with Welch’s whole-hearted endorsement, he became the head of the new Rockefeller Institute, which he led with distinction for many years.  Flexner’s story is just one of dozens of compelling stories John Barry tells.
Even as late as 1870, at a time when European schools taught the use of microscopes, stethoscopes, ophthalmoscopes and thermometers, doctors in the United States seldom used them.  Indeed, few American medical schools had them available.  While there were “two hundred endowed chairs on Theology at American colleges, there were only five endowed chairs in Medicine.”  Several states didn’t even license doctors, and “the titles ‘Professor’ and ‘Doctor’ went to anyone who claimed them.”  By and large, nurses were more knowledgeable and better trained than many of the doctors with whom they worked.  This infuriated some doctors, who resorted to using numeric codes when prescribing medicine so that nurses could not tell what the doctors were prescribing and object.  Traditional “heroic measures” such as bleeding, cupping, blistering and purging (with caustic purgatives) had been the methods employed for hundreds of years, and, even though many doctors were aware of advances in medical knowledge and knew these techniques did little good, they were frustrated, since “little of this new science could be translated into curing or preventing disease.”
For a time, doing nothing beyond comforting the afflicted was the best medicine.  More often than not, “Do no harm” meant “do nothing.”  Not until Welch and his generation of European-trained doctors began to address the inadequacies of research and education in the United States would things change.  (Europe was decades ahead of the US until around 1910 or so.)
And change it did, first with the establishment of the Johns Hopkins Medical School in 1893, followed by the Rockefeller Institute and, quickly thereafter, by other universities and institutions across the country.  An important development that served to precipitate this change was what became known as the “Flexner Report” – a comprehensive survey of medical education in the United States conducted by Simon Flexner’s brother, Abraham.  This study brought to light the sorry state of medical education throughout the country.  Of more than one hundred and fifty medical schools nationwide, one hundred and twenty were judged substandard, in fact abysmal.  With the publicity that followed – publicity very much resisted by the American Medical Association – most schools closed, and those that survived now had a clear set of standards to meet; Abraham Flexner provided the model.  For over 2000 years, medical understanding and practice had been frozen in Hippocratic stasis—in theories based on the “four humours[5]” and in practices that included bleeding, cupping, purging and so on, still performed by country doctors even as late as 1918.  But, thanks to Welch and his protégés, medicine in the U.S. was turned on its head in just a few decades.  By 1918, there were vaccines to prevent diseases such as smallpox, cholera, typhoid, diphtheria and tetanus, and even cures for diphtheria, the first “cure … entirely the result of laboratory work.”
There is much more here about medical history and science, and about the men and women who pursued it, that is fascinating to read.  The lives of men and women like Paul Lewis, William Park and Anna Wessels Williams, to name just a few, are probably deserving of books of their own, but Oswald Avery’s story deserves special mention.  Initially in pursuit of the cause and prevention of influenza during the pandemic, Avery spent decades in his laboratory, not emerging until 1943, at the age of 67, with a paper describing the function of DNA.  As luck would have it, that year, as he was being considered for a Nobel Prize for an earlier discovery, this new paper (the first he had published in a dozen years) was so revolutionary and startling that the Nobel committee hesitated.  Even though Avery was first to publish, he never did get the recognition from the Nobel committee that his brilliant work on DNA deserved.  “Tenacious and persistent” don’t even come close to describing Avery.  My favorite Avery quote is, “Disappointment is my daily bread.  I thrive on it.”  Until his death ten years or so later, he never let up.
Researching and writing ‘The Great Influenza’ took John Barry seven years.  Perhaps he was inspired by Avery’s example.  It is surprising to me that, outside of some fictional accounts and memoirs, so little had been written about this pandemic.  It’s almost as though the country developed collective amnesia about an event that cost more lives than all the wars of the 20th Century.   All I can say is, “It’s about time!”  This book is truly a monumental achievement, one that deserves to sit alongside ‘Microbe Hunters’ and other classics of science.  It should be required reading for every medical student.
During a recent appointment with my doctor, I mentioned that I was reading ‘The Great Influenza’ and asked if he had heard of it.   I was delighted to hear him say he had read it last year, and he expressed enthusiasm for the book.   I mentioned how surprised I was that even in the last half of the 19th Century most doctors didn’t use stethoscopes and other instruments we take for granted, and then pointed to the computer attached to the wall (which, I hasten to say, my doctor used) and asked, “In fifty years, will we be as surprised and horrified to learn that most doctors didn’t use computers in 2010, and failed to understand the accumulated benefit they provide, for example, in epidemiology and the reduction of medical errors?”  He laughed and said to me, “As I was reading the book, I wondered how many of the things I’m doing now I’ll be embarrassed about in fifty years.”
I thought, “Now there’s a healthy, self-critical way to look at it—no matter what we’re doing now, how will we view it fifty years later?”  We could apply that to life as well as medicine.  In an instant, my respect for my doctor grew.
“I had a little bird,
Her name was Enza
Opened the window
And In-flu-enza”

[1] Thanks to the Army, which preserved lung tissue from autopsies performed in 1918, this was confirmed by RNA analysis in the 1990s.
[2] Spain was one of the few nations not at war and therefore free to report the spread of the disease without interference by censors.  Since Spain reported it first, people assumed that was where it started; hence the misnomer, Spanish Flu.  Kansas flu would have been more accurate.
[3] What’s going on here is a bit more complicated.  In some ways, the younger and healthier the victim, the greater the danger, since a healthy immune system mounts a more robust counteroffensive.  Ultimately, what killed many patients was the immune response, which flooded the lungs with white blood cells and antigens, which, together with a stew of dead lung cells, clogged up the lungs so completely that the body could not eliminate the congestion fast enough.  Some victims coughed so hard they tore cartilage and broke ribs trying to clear respiratory passages.  Many died and many others suffered brain damage, depression and other long-term disabilities.
[4] One theory for why this virus affected a higher proportion of young adults, as compared to people over forty-five, besides the close quarters imposed by war, is that older people who lived through the “Russian flu” of 1889-90 were more likely to have some immunity.  That flu was similar enough to offer some protection against this newer strain of the disease.  Also see previous note.
[5] Blood, black bile, yellow bile, phlegm.