Tuesday, December 13, 2011

Merry (not yet ready to declare war on) Christmas

Sad News and Glad News

First, Joan’s mother, Florence Parrella, died in June after a very long illness.   She had suffered, spending the better part of fifteen years in bed and a wheelchair.  We will hold onto memories of her making cookies and Italian meatballs with her grandchildren.  She was a patient, loving grandmother. She would have been ninety this month.

Then, in early October, my father, Ernest Schlieben, died.  He departed as he had lived his 96 years, on his own terms.  In late August, after an operation to repair a perforated ulcer, he came down with pneumonia and, knowing the end was near, insisted on being home.  We arranged for hospice care.  He voluntarily stopped eating and drinking and, remarkably, survived 26 days.  My brother Dan and I spent the entire time with him in Ewing, NJ (I made two short visits home) and brothers George and Brooks, who lived nearby, relieved us almost daily.  The month of September gave the four of us a chance to spend time together as adults (that doesn’t happen often these days) and it was time well spent.  I like to think that that was part of Ernie’s plan; that and teaching us that, like living, dying can be faced fearlessly.  Characteristically, right up to the end he was reminding us to empty the dehumidifier in the basement, and, oh yes, don’t forget there are quarters in a bowl in the kitchen.

In between these two events, on August 17th, our granddaughter, Madeleine Paige Schlieben, was born to Roy and Jenn in Bangkok.  Of course, without prejudice, I can say that she is the most beautiful grandchild ever.  Jenn and Roy (mostly Jenn) have been amazingly good at posting blogs and photos of Maddie’s first four months.  We look forward to seeing her in person when Jenn, Roy and Maddie arrive home on December 16th.  They will be home for three weeks.  I was able to share photos and a short video with Ernie before he died.  It might have been the last time he was able to smile; “Life goes on…” he whispered.

More glad tidings:  I am delighted to report that Jess and Brendan Haley will be married on July 14th.  We had an engagement party here in June, at which time I was tempted to call Judge Runyon down the street to come by and save us a bundle by marrying them on the spot!   (Like all good thoughts, this one occurred to me about six hours after the party was over.) Brendan’s Design/Build business is going very well and Jess will finish up her master’s thesis in May at BAC.  Hooray!

This year marks the fifth year of our retirement.  We had hoped to travel more this year but events forced us to postpone travel until this coming spring.  We did manage a long weekend in Cooperstown in April, but that was about it.

Joan and I are well, although it would be disingenuous to imply that Joan’s Parkinson’s hasn’t made her life difficult.  She goes to Tai Chi for Balance once a week and to a monthly support group, and she takes long walks almost daily, so she gets high marks for soldiering on.  She does not recommend it, though.

We have been taking care of Suki, Jenn and Roy’s dog, since May.  The heat and humidity of Bangkok weren’t Suki’s thing and she is much more at home hunting chipmunks at Camp New Hampshire.  We managed to keep a tradition alive by vacationing at Silver Bay again this year.  We have gone to Silver Bay every year since Roy was a year old, so it has much the feel of home away from home.

Well, that about does it for another year.  We wish you all a very Happy New Year!

Monday, June 13, 2011

Nursing Home (a poem)


Notice to my progeny: When the time comes, my answer should be easily inferred from even a casual reading of the following.  (DNR; shoot me first!)
Nursing Home
by Paul Schlieben

She speaks to me
as if I were a petulant child she was putting to bed
too loudly, with imperious inflection --
and most maddening of all --
punctuating each sentence with “Dearie?” or “Hon?”
as if talking to a deaf imbecile.
And, as if I had a choice,
I’d answer, as surly as I could,
“yes, nursey”, “no, nursey,”
myself,
trying to gain the upper hand,
taking charge,
issuing orders of my own,
but, too weak to be heard,
and, literally, without the heart to pull it off,
and deep inside, to be honest,
only wanting to be looked after lovingly,
like a child,
held warmly, against a caring breast,
to be gently mothered, not nursed.

As she brings the spoon closer to my lips,
I half expect a singsong coaxing,
“here comes the airplane, dearie” or “open wide, hon”
to entice me to eat my oatmeal or applesauce.
At first, I tried to blow it back in her face,
but, unable to muster the breath
to cause so much as a flutter on the spoon,
I’d resign and eat,
bemused at my own peevishness.

Doctor says they are well-intentioned, these nurses;
that their efficiency is neither good nor bad,
their intention simply to get through the day
with as few diapers and soiled sheets as they can manage
and escape home without disrupting their routine,
and who can blame them?
Here, the measure of a good day
is one without a death,
a disruption from which we are all rarely spared.

No, their intentions are neither good nor particularly bad
and, upon reflection,
as sad witnesses to this endless procession of the inevitable,
who can blame them for quietly
whispering in the hall,
“I wish he would just get on with it”?

At first, from the moment they lifted me from the cab
into that blue vinyl and steel wheelchair,
prized for portability rather than comfort,
and wheeled me briskly past supermarket doors
that swooshed open into that putty and green lobby,
I fought against the sudden assault on my dignity.
Not since boot camp, sixty years ago,
had I felt so anonymous, so undifferentiated,
so regimented – “Here I am,” I mused,
“fodder for this ‘cash cow of death’!”

It shocked me.
But now, the personality that
recoiled into its shell,
cautiously peers out,
with dawning comprehension. 
This, now, is the permanent state,
or as permanent as it can be at 88 --
more endgame, than state.
So I hunker down and tell myself,
“How many years can this stage last?
I’ve endured longer hardship
and humiliation in my time.”

So, gradually, I think --
exercising the only faculty left me --
and understand the true nature of this place.
Resigned, I simply do what I’m told,
or rather, not fight what’s done to me,
waiting quietly, hoping for the opportunity,
on a slow day, when the pace has slackened,
to ask quietly about a dad or mom,
a husband or child, a friend,
to glimpse what humanity lies
in the shade of this relentless efficiency,
to find a crack and gain purchase
on the well-defended souls entrusted with my care,
and engage Elma, Maria and Bethany,
so they, in turn, can acknowledge my humanity,
however fleetingly,
and we can become connected,
however tenuously,
so I won’t have to die alone.

© 2011 by Paul Schlieben
 

Sunday, May 8, 2011

A Poem for Mother's Day


A Mother’s Love
By Paul Schlieben

She loves simply and quietly.
She keeps love in check,  
lest it burn her heart to ash. 
Where stress is the third rail of my existence,
Love is hers.
Her love is trapped in a cage of fear. 
Mostly, it sits quietly.
Sometimes it crashes against the bars,
demanding to be heard,
or it watches mutely, fearfully,
informed by the wisdom of years,
suffering in our foolishness,
afraid giving voice to her fears
will drive us away,
hoping for an improbable outcome
as we crash into obstacles
only she was able to see. 
Softly she cries out, anguished,
fearing her cries be misinterpreted as anger,
but is that so unusual?
A mother’s love is often foolishly misread,
except by her.

Thursday, May 5, 2011

Paul Ryan's Medicare Plan

While there are signs that the Republican Party is backing away from its plan to eliminate Medicare by substituting vouchers, starting with people who are now 54 and under, I couldn't resist posting this cartoon.  The idea that you would try to assuage the concerns of seniors -- in effect, buy their support -- by exempting them from the effects of the proposed changes smacks too much of how Washington works: playing one side off against the other, rather than persuading everyone to come together to solve national problems.  In short, I find such "divide and conquer" strategies offensive.
Does something need to be done?  Sure.  But it has to start with addressing the costs of medical care, not by handing over even more of our financial resources to for-profit insurance companies.  The Affordable Care Act is just getting off the ground.  Yes, there's room for improvement.  We need to move away from fee-for-service towards a more rational and affordable system of best practices and prevention.  To repeal the Affordable Care Act now and then turn around and dump seniors into an unregulated system, one that would allow insurance companies to deny coverage and call all the shots, just like the good ol' days, is madness.

Thursday, March 31, 2011

‘Sunset Park’ by Paul Auster, Plus One


Paul Auster is one of my literary heroes and remains so even after this novel.  He is a writer whose every sentence drives the narrative forward, who delivers up vivid characters in just a few sentences, who can write convincing dialogue without quotation marks or “he saids,” “she saids,” and who writes with purpose.  Every book is masterful and worthy of careful study. There’s a puzzle in each one; each is as enigmatic as ‘Book of Illusions.’[1]  Mastery, I guess, is what happens when you’ve been at it for four decades.
I think ‘Sunset Park’ is not his best work, but that’s not to say it’s not good.  I’ve learned that there’s always much more than meets the eye in an Auster novel and I’d fault myself for being obtuse before I spoke ill of his work.  His dissection of the movie, ‘The Best Years of Our Lives,’ the classic 1946 story of three men returning home from war, is brilliant and insightful. That alone is good reason to read ‘Sunset Park.’  By coincidence, I had just seen this classic film, so it was fresh in my mind.
‘Sunset Park’ opens with Miles Heller, a 27-year-old man working in Florida on a ‘trash-out’ crew—men hired to clear foreclosed, abandoned homes of whatever their former occupants left behind.  Mostly, it’s broken toys, trash bags and burned-out pots left on the stove; but occasionally it’s computers, DVD players and flat-screen TVs.  Sometimes the houses look like the occupants just walked away from a half-eaten breakfast; more often they have been trashed by the owners—stoves and sinks missing, wiring and copper pipes stripped out.  Miles takes lots of pictures of what he finds, although he can’t say exactly why.  Maybe they hold the key to the lives lived there, a symbolic connection to the life he left behind in New York City seven years earlier.
Reading an old copy of ‘The Great Gatsby’ in the park on his day off, he meets and falls in love with a young Cuban-American high school student, Pilar Sanchez. Sitting on a blanket nearby, Pilar catches his eye and laughs, pointing at her book jacket, gesturing that they are both reading the same book.  And, so, a relationship begins.  We soon learn that Pilar’s parents are dead, killed in a car crash, and she lives with three older sisters.  As we’ll soon discover, the oldest, Angela, is trouble.  Eventually, Pilar moves into Miles’s apartment and Miles, realizing how incredibly smart Pilar is, tutors her and encourages her to apply to several northeastern colleges.  He is confident she could win a scholarship.
There are just two problems: Pilar is underage and her oldest sister, Angela, dislikes Miles or, at least, takes a predatory interest in him.  Angela works as a cocktail waitress and, according to Pilar, “sometimes sleeps with customers for money.”  Sensing an opportunity to blackmail Miles, Angela pulls him aside after a dinner with the family and confronts him with the fact that Pilar is underage – “one call to the cops and you’re toast, my friend” – and demands he deliver the trash-out plunder to her and her associates to fence.  At first, feeling trapped, Miles complies, delivering a flat-screen TV and a few other things, but eventually he refuses.  One morning, as he is leaving for work, Angela’s friends corner him and punch him, “a cannonball of a punch,” hard in the stomach to make clear they will be less gentle if he continues to refuse.  Miles decides his only recourse is to leave Florida until Pilar turns eighteen, about five months off.  He gives Pilar most of his savings to cover her expenses so she can remain in the apartment until she turns eighteen and graduates from high school, at which time it will be safe for Miles to return.  Miles retreats to Brooklyn.
Back-story.  Miles Heller is the son of a New York publisher, Morris.  Morris Heller, now in his early 60s, started Heller Publishing at a time when it was possible to discover and publish unknown writers.  Morris owes much of his success to his father, who put up the money to start his business, and to those few writers he discovered years earlier—writers whose most productive years now are behind them, not necessarily because of diminished talent, just the inevitable consequence of growing old.  (Does Auster identify with these men?) 
Miles’s mother, Mary-Lee Swann, having sensed that motherhood would be the end of a promising acting career, left Miles and his father shortly after he was born.  Since then, she has achieved fame on stage and film.  Contact is intermittent but not embittered.  Two years after she abandoned them, Miles’s father married Willa Parks, an English professor.  Willa had been married before and had a son, Bobby, about Miles’s age.  When they were in high school, Bobby was hit and killed by a car while walking on a mountain road.  Bobby, happy-go-lucky and careless, had run out of gas.  The boys argued and Miles, exasperated, pushed Bobby.  The circumstances of Bobby’s death led, circuitously, to Miles’s flight four years later at the end of his third year at Brown.  Miles “…still can’t decide if he is guilty of a crime or not.”  (Auster’s ambiguous framing of Bobby’s death – Bobby’s lackadaisical attitude, the polar opposite of Miles’s; a typical step-brotherly love-hate relationship; Miles’s irritation leading up to the death; the coincidence of a car barreling down a mountain road at just the wrong instant; and for Miles, “… what is important … is to know if he heard the car coming toward them or not …” – is all pure Paul Auster.  I can’t imagine a book of his that didn’t place the reader on the knife-edge of ambiguity.)   However, as much as it affected him, it wasn’t the accident itself that sent Miles wandering; it was overhearing, years afterwards, his parents’ fraught conversation about him and the guilt this evinced.
Leaving no word of his whereabouts and, now, gone for more than seven years, Miles maintains a correspondence only with an old New York high school friend, Bing Nathan.  Miles travels to the ski slopes of New Hampshire, to Chicago, to California and eventually to Florida, where we first meet him working on the trash-out team. 
The trash-out theme re-emerges later in the book in a more brutal form, but not before we meet several interesting characters, each deserving one or more chapters of their own. 
There’s Bing Nathan, “the only person who has known [Miles’s] various addresses over the years…,” who, without Miles knowing it, shared the letters with Miles’s father, Morris.  Oversized and flabby, an anarchist and sometime member of a band called ‘Mob Rule,’ Bing has been, for the past three years, the proprietor of a tiny fix-it shop in Brooklyn called “The Hospital for Broken Things.”  Bing abhors modern technology, and among the things he fixes are the old manual typewriters favored by a few writers who live nearby.
Then there’s Ellen Brice, an artist who, during the course of the novel, gravitates towards drawing highly erotic images.  Temporarily at least, Ellen is seriously miscast in life as a Brooklyn real estate agent who, while showing Bing cheap Brooklyn apartments, steers him to an abandoned house – a dilapidated shack really – on a street facing Green-Wood Cemetery (later referred to as a “vast necropolis”) in the Sunset Park section of Brooklyn.  As if to confirm her own disaffection with real estate, when Bing decides to take over the abandoned house, “… like no other house he has seen in New York,” Ellen becomes one of Bing’s three housemates.  Ellen has suffered emotional instability, but “doesn’t want to go back on medication.  Taking one of the pills is like swallowing a small dose of death…”
Then there’s Alice Bergstrom, a doctoral student who recently left a job as an adjunct at Queens College “teaching remedial and freshman English” at lower wages than if she worked at a car wash.  Now, living rent-free in the Sunset Park squat and working just fifteen hours a week for a non-profit called PEN (more about that later), Alice is able to devote more time to her thesis on “…the relations and conflicts between men and women as shown in books and films from 1945-1947…”   (It is at this point that Auster works in his analysis of the film ‘The Best Years of Our Lives.’)  Alice is visited intermittently, and at lengthening intervals, by her occasional, self-absorbed boyfriend, Jake Baum, an unappreciated writer of short stories who is drifting towards the realization that it isn’t women who interest him most.
Millie Grant, housemate number four, has a relationship of sorts with Bing and then, inexplicably, departs, thus making way for Miles, who, responding to Bing’s entreaties, joins the Sunset Park squat, which he views as an inexpensive, temporary alternative to paying New York rents or getting beaten up or murdered by Angela’s friends in Florida.
But, there’s a flaw in their thinking.  They all are certain that the overworked staff of the city housing department, which acquired the house after its owner defaulted on taxes, are stretched thin and have forgotten about a worthless, rundown house in Sunset Park.  What Bing and company didn’t count on is just how far a senseless spirit of vindictiveness will carry even the most overworked city agency when abetted by two violence-prone policemen.
Miles hasn’t contacted his father or mother for seven years.  Transformed and emboldened by his love for Pilar, Miles decides to come to terms with the past and contact his parents: his mother, temporarily in New York from California, appearing in an off-Broadway play, Samuel Beckett’s ‘Happy Days’[2]; his father at home in New York, but in and out, making frequent, unplanned trips to London, where his wife, Willa, who has become very ill, has been teaching a semester-long class.
Of course, the central event here – the one that sets everything else in motion – is Bobby’s death and Miles’s unresolved guilt.  That the timing of Bobby’s death coincides roughly with 9/11 is interesting, but the events are not easily paired.  That the time frame of the story roughly parallels the Great Recession, bookended as it is between Miles trashing-out abandoned houses in Florida and the final scenes of the book, appears to be intentional.  One might even go so far as to suggest that, allegorically, Miles represents, with Bobby’s death, the national trauma that was 9/11 (did our actions trigger the attack somehow?); our collective ignorance of whatever deeper meaning is rooted there; the wildly irresponsible, go-out-and-shop orgy of house-flipping that overtook the country; the subprime crash resulting in ‘trashing-out’ the homes of millions of Americans; and, just when recovery seemed possible and things looked like they were getting back to normal, another crash.  Yes, that double-dip hasn’t happened yet, but many people think that the political drift of the country all but ensures more trouble ahead.  While that certainly describes the arc of Miles’s experience, I am far from certain this is what Auster intended.  Another possibility just occurred to me.  Miles, young and feeling guilty and confused, is living the only life he could during these seven years.  He’s caught in a vortex of events he doesn’t understand, including his confusion about his culpability for a death.  He naively works to rekindle optimism about his future, finds love, reestablishes normalcy, then rudely, crushingly, realizes that he has miscalculated once again.   What could be a better description of the confused lives Americans have lived these last ten years?  What could be a better prognosis of the hardships to come?
‘Sunset Park’ is the most topical and contemporary of Auster’s works in that it reflects and relies on recent and current events more than any other.  For instance, one of his characters, Alice Bergstrom, is working for an organization called the ‘PEN Freedom to Write Program[3]’ and Auster devotes several pages to PEN’s mission.  He mentions Salman Rushdie, the death of a Norwegian publisher, Article 301 of the Turkish penal code, Burmese writers, the Patriot Act, the Campaign for Core Freedoms, Cuban writers, and, of course, Chinese writers such as Liu Xiaobo, the jailed Chinese democracy advocate and co-writer of something called Charter 08, and PEN’s cause célèbre.  While I wholeheartedly support PEN’s mission, I’m not sure it serves his narrative well.   But maybe that’s the price he was willing to pay in support of this worthy and, as Auster points out, grossly underfunded organization.
Then there are the frequent references to baseball, a passion that historically ties Miles to his father and grandfather, a passionate interest in players whose lives have taken unexpected, often tragic, turns.  Names like Boots Poffenburger, Herbert Jude Score and Lucky Lohrke.  If I followed the game more closely, this might have drawn me in more than it did, but I was struck by this sentence: “…baseball is a universe as large as life itself, and therefore all things in life, whether good or bad, whether tragic or comic, fall within its domain.”  I might add that man’s longing for certainty, for universes that can be comprehended and shared, is itself a universal longing.  Baseball is just one example.
There are also the obvious references to the sub-prime mortgage crisis and the effects it has had not only on the poor but on the nation’s psyche; to the shabby treatment of adjunct professors; and to the state of publishing today and publishers’ struggle to stay alive.  It occurred to me that Heller Publishing might be a surrogate for Auster’s long-time publisher, which is probably struggling.  Maybe Auster, trading on his reputation and the all-but-certain sales his books generate, wrote this book to help his publisher get through the recession.  As a reader who frequents only my local ‘independent’ bookstore, I for one am more than willing to oblige.
I’ll close with this quote from one of Auster’s characters, Renzo, a writer and lifelong friend of Miles’s father: “The interview is a debased literary form that serves no purpose except to simplify that which should never be simplified…”  I guess he might say the same about a book review.



‘Three Stations’ by Martin Cruz Smith

Martin Cruz Smith’s latest book does not measure up to his first big success, ‘Gorky Park’.  He tries to squeeze just one more story out of Arkady Renko, and it probably won’t be his last.  In this instance, Renko is a Moscow police detective on the verge of losing his job; in fact, the order is out to can him, so he is avoiding contact with his corrupt boss.  Renko pursues the murder case of a woman, presumed to be a prostitute, who was found in a seedy trailer at the point where three railroad stations terminate in Moscow.  But the evidence doesn’t add up, and Renko’s pursuit leads him, not very convincingly, through a maze of corruption, including attempts on his life.  While there are lots of street-level Moscow atmospherics, there are also abrupt cutaways and plot shifts that are less than satisfactory, as if someone else edited this novel for length and left a few too many clues on the cutting room floor.
Sometimes you get the feeling that a writer and his publisher just need to boost their revenues by riding on the reputation of earlier successes.  They both knew Smith didn’t have to try too hard to make some serious dough.  I know, this sounds terribly cynical, but, hey!  On that score, ‘Three Stations’ succeeds beautifully.

[1] This is the title of an earlier book.  See my earlier post on Paul Auster’s book ‘Invisible.’  'Invisible' Review
[2] I’m not familiar with the play but suspect there’s a thematic connection here.
 


Monday, March 28, 2011

Mal-Distribution: Reaching the Boiling Point?


Are we misreading what’s going on in the Mideast?  Or, to put it another way, are we missing its broader implications?  Is what we're seeing there a precursor of what we're likely to see here and in other countries in the future?   Is the spread of Wisconsin-inspired demonstrations related to the unrest we’re seeing elsewhere?  Are these revolutions the natural consequence of the mal-distribution of wealth?
We're a country caught between two virtues: the virtue of Personal Responsibility – the idea that each person should make his own way in the world and control his own fortunes – and the virtue of Social Responsibility – the notion that we are all in this together and the wealth of the nation should devolve to the benefit of all, that we'd all be better off (even the wealthiest among us) if the wealth were distributed more equitably.
The challenge is to reconcile these two seemingly opposing virtues. 
I believe that the fundamental problem is that there just isn't enough work and that this will only get worse.  As the use of technology expands, jobs slowly disappear; work gradually becomes obsolete.[1]  Today, most workers are At-Will employees, with the employer holding all the cards.  Purchase new technology to displace personnel?  Great—Fewer people to feed and the government will pick up the tab.  Contrast this with the 1950s, when 28% of our workforce belonged to unions,[2] virtually all employed in the private sector.  But, such is the wealth of the nation that even the poorest will survive, somehow.  Look at Egypt where the majority live on two dollars a day.  Even they get by. 
As the divide between the rich and the poor grows, and it becomes harder for the rich to hide their fortunes, the unemployed and poor get angry.  They feel cheated.  Sometimes, as in Saudi Arabia, the government tries to buy off its people, but usually, by the time anger has boiled over, it’s too late.  You can put a lid on it, but it only boils all the harder.  As long as there have been revolutions, it has ever been so.[3]
Our system requires that we work or accept being poor.  But in the USA, there are five applicants for every job opening, and this is not likely to improve soon, if at all.  A college education is no longer a guarantee of employment.[4]  Time will tell whether today’s high unemployment is cyclical.  Evidence suggests it’s not.
The prevailing fiction is that the wealthy earn their money by the sweat of their brow and deserve to keep every penny.  However they gained their advantage, they're now in a position to leverage their power and resources to acquire even more of both without much personal effort, at a cost to our nation’s well-being.  The trend of the past ten years is indisputable; the rich have become much richer – 50% richer – while the rest of the population has lost ground; many are far worse off.  Most would agree that, whatever the cause, for the good of the nation, this trend must be reversed.  But how?
The challenge is to get our leaders, most of whom are part of the privileged classes, to talk honestly about our problems without being drowned out by the chattering classes and a well-financed opposition.  Corporate influence in Washington and its ownership of the media ensure that the ideology that favors wealth is in ascendance.  This must change.  We need to achieve a balance.
Our political leaders hide behind the popular illusion of American Exceptionalism, a fiction they contradict at their peril.  But what if it turns out that we’re not exceptional; that we’re just the same as everyone else, and what we're witnessing is a worldwide phenomenon, a quake with its epicenter at a fruit-stand in a small village in northern Tunisia that set off a tsunami that just hasn't reached our shores yet?  What then?
The only thing that prevents us from talking about these things is ideological stasis, and the fear of being wrong.  Stipulated: Democrats are wrong 80% of the time … and so are Republicans.  Now, let’s talk.

[1]  See my previous blog post: In the Shrinking of a Pie      Further evidence: Census figures show that from April 1, 2000 to April 1, 2010 US population grew from 281,422,000 to 308,745,000, an increase of 27,323,000. In contrast, Bureau of Labor Statistics figures show that from April, 2000 to April, 2010 non-farm payrolls decreased from 131,660,000 to 130,162,000.  (No, everyone didn't suddenly decide to go into farming...)
[2] In 2003, 11.5% of workers were union members, three-fourths of them working in the public sector.  In 1954, virtually no public workers were union members. 
[3] For a brief historical perspective, see NYTimes: Every Revolution Is Revolution in Its Own Way

Monday, March 14, 2011

'In the Shrinking of a Pie' by Paul Schlieben


The Articles First

Here are links to three very interesting articles that relate, in a roundabout way, to the effects of technology on employment and education.  I’ll try to tie them together later.  Even if you have read them already, they are worth reading again.  (Also take the time to read the Reader’s Comments.  Often they are as interesting as the articles.) 
Last Two Jobs in America
The first article, published in the NY Times on March 4, 2011, addresses the effects of technology on high-level jobs.  Apparently, the performance of the computer known as ‘Watson’ on the quiz show ‘Jeopardy’ caused more than a few people to wonder, “What’s next?”  The first example used in this article refers to something called 'e-discovery' (law firms using software to examine thousands of legal documents that might take a team of lawyers weeks to research); the second example refers to the use of software to do computer chip design.
The second article, a NY Times op-ed by economist Paul Krugman, discusses the “hollowing out” of the middle class:
Op-Ed Columnist: Degrees and Dollars, subtitled 'The hollow promise of good jobs for highly educated workers.'
And that leads me to the third article, which ties into the overall effects of these technological "advances" on education—the inevitable negative feedback loop.  That’s not the point raised in the article, but the inference is hard to ignore.  I would have titled this piece, "Preparing A Nation for Walmart," but Bob Herbert opted for:
Colleges deliver the education that students demand and, absent a vision for their own futures (the essential 'spark' that ignites a student’s ambition), that’s not saying much.   Students opt for fun.  Colleges, competing to fill seats, are all too willing to oblige.  The inevitable result is that academic standards have eroded and most students who graduate lack, as Bob Herbert says, “critical thinking, complex reasoning and writing” skills.  The trend is clear: fun for all, no heavy lifting please, and college degrees to nowhere.
Paul Krugman rightly points out that the “idea that modern technology eliminates only menial jobs, that well-educated workers are clear winners, may dominate popular discussion, but it’s actually decades out of date.”

Getting At The Bigger Questions This Raises

I remember back when computer technology was emerging in the early '60s.  A common topic was to what degree, and how quickly, computers and robots would displace workers.  As it turned out, these worries were premature.  As more and more people were employed in the computer industry, these concerns faded.  For decades, there was a net increase in employment.  Even unskilled workers could find a job in IT.
Well, it turns out our worries were justified; we just had the timeframe wrong.  The efficiencies promised by technology took a lot longer to take root, and not until the recessions of the past decade enabled companies to lay off workers did it become clear that corporate America could shed jobs without adverse impact on profits.  In fact, profits in many industries increased dramatically.  As the economy recovers (driven more by foreign markets than our own), companies opt to invest in technology to forestall the need to rehire workers.
What has happened, in fact, is that computers, whose effects have been accelerated by high-speed communications, perform higher-level tasks formerly thought to be beyond their capabilities.  The result is fewer and fewer jobs, even for those with advanced degrees.  Call it the ‘Watson Effect.’
Think about this:  In 1997, an IBM computer called ‘Deep Blue’ beat the world chess champion, Garry Kasparov.  In the finite world of a two dimensional chessboard, this was relatively easy.  Fourteen years later, an IBM computer called ‘Watson’ achieved a far more ambitious task by beating, to an overwhelming degree, the two most successful ‘Jeopardy’ players ever.
‘Watson’ consists of 2500 ‘cores’ and fills a small room, but don’t let size fool you.   How long will it take for something that powerful to fit on a desktop?  If ‘Moore’s Law’ applies, the answer is about ten or fifteen years.  If you add the collaborative, parallel processing[1] potential of the Internet, the timeframe may be even shorter.  “Yes,” you say, “but that was just a parlor trick—a quiz show.”  No, it was a powerful demonstration of the ability of computers to understand language and interpret complex, tricky questions.  For those of you who missed it, here’s a typical question[2]:
Q. “Kathleen Kenyon’s excavation of this city mentioned in Joshua showed the walls had been repaired 17 times.”
Watson’s answer (in less than 3 seconds): “Jericho.”
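To put that “ten or fifteen years” guess on paper, here is a minimal back-of-the-envelope sketch of the Moore’s Law arithmetic.  The two-year doubling period and the four-core 2011 desktop baseline are illustrative assumptions, not figures from IBM or from the articles:

```python
import math

# Assumptions for illustration only: Watson's core count comes from the text;
# the desktop baseline and the doubling period are hypothetical.
watson_cores = 2500        # the room-sized Watson cluster (2011)
desktop_cores_2011 = 4     # a typical desktop CPU of the day (assumed)
years_per_doubling = 2     # a common reading of Moore's Law (assumed)

# How many doublings until a desktop packs Watson-scale parallelism?
doublings = math.log2(watson_cores / desktop_cores_2011)
years = doublings * years_per_doubling
print(f"{doublings:.1f} doublings, roughly {years:.0f} years")
```

With a two-year doubling period the sketch lands at roughly 19 years; shorten the period to 18 months and it drops to about 14, in the same ballpark as the estimate above.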
In the past, computers and robots were employed to perform routine computations and data-processing tasks.  Today they are able to do much more.  Think Google on steroids.  Then, think of that version of Google on steroids.

Workers Twitter while Rome Burns

We keep hearing the term "worker productivity" as if this were a measure of human output—as if people were actually working harder or smarter.  Politicians often applaud improvements in ‘worker productivity’ and brag about how productive American workers are.  (In politics, one must always pay homage to the fiction of American Exceptionalism, no matter how out of date that notion may be.)  How many times have you heard that “American workers are the most productive in the world?” 
What is this statistic really telling us?

The Technology Productivity Index

I’m sure there are some people who are working harder, especially in organizations that have suffered drastic staffing cuts, but I can assure you that many more spend a significant part of their workday surfing the web, updating their Facebook pages, reading newsfeeds and Twittering, (“In my cubie … this job sux.  Tx God for Angry Birds!”).  
No, what the ‘worker productivity index’ really measures is the degree to which technology has supplanted people.  Or, as Paul Krugman puts it, “technological progress is actually reducing the demand for highly educated workers.”  It’s hollowing out job opportunities for the majority of the middleclass. 
Did you realize that personnel costs are now just 12% of the cost of manufacturing a car?  And that’s not just on the manufacturing floor.  I don’t have the percentage from 10 years ago, but I wouldn’t be surprised if it was close to 50%. (I’m looking for it.  Any help here?)  Check any manufacturing plant today.  What you see are robots, not people.  The few people still employed are in engineering or behind glass partitions, monitoring the robots[3].  Personnel costs are no longer a factor in determining where to site an auto plant; markets are.  We don’t outsource jobs because labor costs are too high; we outsource to build products closer to those who will buy them, or to avoid tariffs, or for myriad other reasons. (Only industries that still rely on thousands of skilled workers, like clothing and shoes, outsource jobs because of labor costs.)
Instead of a worker productivity index, a more accurate description would be "technology productivity index”. This would drive home the reality of most industries today—the degree to which technology is elbowing people out of their jobs.
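As a toy illustration of that point, with entirely made-up numbers: hold a plant’s output constant, cut the headcount, and the measured output per labor hour – the “worker productivity” statistic – goes up even though no individual is working any harder.

```python
# Hypothetical numbers; "productivity" here simply means output per labor hour.
def labor_productivity(output_units, workers, hours_per_worker=2000):
    return output_units / (workers * hours_per_worker)

before = labor_productivity(1_000_000, workers=500)  # before automation
after = labor_productivity(1_000_000, workers=350)   # same output, fewer workers
print(f"before: {before:.2f} units/hour, after: {after:.2f} units/hour, "
      f"up {(after / before - 1) * 100:.0f}%")
```

The 43% “productivity gain” in this sketch comes entirely from shedding workers, which is exactly why a name like technology productivity index would be more honest.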

How Does This Relate to the Quality of Education?

This part demands that you pretend you are a high school or college student with no experience and your whole life in front of you.  (For the time-being, pretend also that you couldn’t find Chicago on a map.  Now you got empathy!)  How would you view your prospects in a country that has lost more than 6.5 million manufacturing jobs since 2000?  Oh sure, you’re an exceptional student – you even know where Afghanistan is – and you intend to go into medicine or finance (neither one of which actually produces anything useful.  From a business perspective, they’re expenses, not revenue.)  Bear with me.  I’m talking about the average student, the fat belly of the bell-shaped curve.  Maybe your father or uncle worked in construction or as a machinist.  Today, it is likely they are unemployed or working a job that pays much less.  But somehow, your parents saved enough for you to attend a state college, or their low income entitled you to student loans[4] and Pell Grants.  But, remember—essentially, you’re clueless.  You don’t know what you’re going to do and you don’t have much ambition.  You’ve been told that your lifetime earnings will be much greater with a college degree, and your loving parents want to see you succeed, so off you go.   “I just need that degree,” you think, “That’s my ticket to the good life.  And college will be a blast.  And if I don’t go, then what?  But, just don’t make it too hard.”  Not to worry.
Students may not be sophisticated, but they can’t miss obvious signs of a declining job market.  They are swimming against the tide of demotivation, spinning in the vortex of “No Help Wanted” signs.  They see unemployed parents and neighbors; they see the underemployed now working part-time for $10 an hour, displaced from jobs that paid three or four times that much a decade ago.  They see jobs going offshore or disappearing into thin air. 
Really good students (like you) will do just fine, but the fat of the bell-curve will have a hard time visualizing a rosy future for themselves.  You’ll opt for the gut courses, just to get that degree. 
Why is it that today only 75% of students graduate from high school?   Why do more girls graduate from college than boys?  Sloth, you say?  Grand Theft Auto?  Drugs and alcohol?  Yes, they contribute.  But stop.  What are the best motivators?  Is it the prospect of a rich and rewarding life, and a belief that it is attainable if you work hard?  That’s when distractions hold less sway.  An imagined life, one that marries aptitude, opportunity and prestige, is what makes the difference between being easily distracted and being motivated to learn.  Without such a vision, you have today’s high college dropout rates.  Less than 50% of male college freshmen graduate.
Could a factor be that jobs traditionally dominated by men, like those in manufacturing, have disappeared?   Gender is less a determinant of success for today’s jobs.  Women and men are equally adept at most jobs.  In fact, I can think of many reasons why men may not perform as well, like stereotype-conflicted self-esteem, masculine expectations and testosterone (borne out by my own observations in business).
So, here you are; your male mentors are shuffling around complaining life is unfair and it’s all those Republicans’… Democrats’… Bankers’ … Socialists’ … Corporate elites’ fault, but, at the same time, your parents expect you to ignite that spark of ambition that will launch you into a really high-paying career.  “Doing what?” you ask.  “Well, something will come along… some exciting new technology like the Internet or renewable energy will light your fire.”  Trouble is, on a macroeconomic level, those new technologies are likely to hollow out more of the middle class by elbowing aside more workers.  Think about the effects of the Internet on retail, where a website can eliminate the need for retail outlets and the function of bricks and mortar is reduced to marketing, useful for driving people to a website.  Don’t think Apple builds stores to sell computers.  They build them to sell image.  Most people buy technology on the web.
While I agree with Paul Krugman's analysis, I’m not ready to accept his solutions.  I think expanding educational opportunity, collective bargaining and tax reform (where the rich pay a greater share) will only take you so far.  They’re stopgaps, necessary, perhaps, but not farsighted and, in today’s political climate, not very realistic.  Looking further into the future, it's hard to imagine how 7-9 billion people will be productively employed.
In the '50s, the question about where the explosion of computer technology might lead was academic; today it’s anything but.  Is the workforce doomed to be unemployed or underemployed?  If you look at the trends of the last 60 years, and in particular the last decade, it’s hard to conclude otherwise.  We read about Americans losing high-paying jobs who are now either unemployed or working for $10 an hour, without benefits.  The underemployment[5] rate in the US is about 18%[6].  We read that real wages of middleclass workers (adjusted for inflation) have been stagnant for the past 10 years.  Is work becoming obsolete?   More and more of our nation’s resources are being directed towards projects that will increase the wealth of a few, with the unintended consequence of impoverishing the many.

What’s the Endgame? 

Nice Ride, but where are we headed?
The pie is shrinking.  What’s left of the pie is being consumed by those who own a seat at the table.  The rest are left fighting over the crumbs.  I don’t intend this to be a political diatribe railing against wealth or dividing the world between the haves and have-nots; circumstances are accomplishing that all on their own, and people are just doing what people do.  This is an attempt to understand the factors at work: to understand trends over a span of decades and what they tell us about the future.  I don’t blame those who have been successful and find themselves on the winning side.  I’ve been more fortunate than I could have ever imagined.  It’s just that the long-term consequences are likely to be very ugly if we don’t figure out where we’re headed.
The stark reality is this: If you are not in a position to control the production of wealth, you will be out of a job or minimally employed to the degree that you make just enough to consume what’s necessary to keep the engine running—for most of the middleclass, that means running on idle.

Saturday, February 26, 2011

'The King's Speech' - A Review by Paul Schlieben


Many will see ‘The King’s Speech’ and ‘Social Network’ through the prism of class and privilege.  (Some see all films as political propaganda.)  In one film, you have a Prince afflicted with a debilitating stammer, who reluctantly becomes King when his older brother abdicates the English throne.  In the other, you have a privileged, ambitious, brilliant Harvard student who goes on to create Facebook and become the world’s youngest billionaire.  No question about it, historian Howard Zinn would have abhorred them equally.
I admit to having grown weary of British period dramas — all dressed up with suitably starched upper lips and sphincter-choked accents — so I was not predisposed to like ‘The King’s Speech’ and was reluctant to see it.  But Colin Firth’s reputation, combined with widespread praise for the film, persuaded me to go.  My prejudice evaporated in the very first scene, as a fearful Prince (Colin Firth), attending some grand event being broadcast live on radio, approaches the microphone as though being led to the gallows, and … is unable to speak.  As they say, you could feel his pain.
This week I saw 'The King's Speech' for a second time.  I can't explain why I find it so entertaining – maybe I'm getting old and sentimental – but at its heart, and stripped of its pomp, I believe it can be explained simply like this: it’s a film about two men who, during the course of the film, become friends.  Neither one, certainly not the Prince, would have described it that way, but inevitably, both might well have agreed years later.  That this friendship was improbable and true (or, at the least, "based on" a true story), and the historical context extraordinary, heightens the dramatic tension and makes us sit up and pay attention.   The Prince of Wales, Bertie, needs a voice coach to help him overcome a terrible stammer.  Ultimately, what will cure his stammer is to unburden himself to a friend, one who helps him identify the source of the stammer and overcome its effects.
Lionel Logue (Geoffrey Rush), a self-taught, unaccredited, and unorthodox voice coach and part-time actor with a family to feed, needs clients.  While not the primary focus of the movie, one can imagine that Logue craves professional validation, as an actor and as a coach, and is equally in need of a friend.  Both men have the love and support of their families, which seems to me an important prerequisite to the friendship they form.
That they were brought together out of necessity should not detract from the bond that emerges.  Friendships have been built on weaker foundations.  Some might say that Lionel’s desire for the King’s success is evidence of his living vicariously, but perhaps the best friendships are between people who, motivated by mutual affection, live vicariously through each other and celebrate each other’s successes.   In fact, that might serve as a definition of a good marriage. 
Successful drama must be emotionally engaging.  ‘The King’s Speech’ is successful because we feel every moment as if we were Bertie one moment and Lionel Logue the next.   Much of the credit for this film’s success must go to Colin Firth and Geoffrey Rush, who deliver performances so convincing I would not be surprised to learn they had become lifelong friends in the course of shooting this film (and disappointed if they had not).   While I am happy to see Colin Firth as the likely recipient of the Academy Award for Best Actor, I was more pleased to see Geoffrey Rush nominated for Best Supporting Actor.  He would get my vote. 
Sometimes a simple story of friendship, told well, trumps a more complex tale, no matter how brilliantly conceived.  I admired 'Social Network' and believe Aaron Sorkin to be one of the most brilliant screenwriters alive today, but 'The King's Speech' broke through to my heart.  I'd vote with my heart. 
And, to give ‘The King’s Speech’ screenwriter David Seidler his due, his script was also brilliant.  Take, for instance, this dialogue:
Prince of Wales (Bertie): “I’ve had the very best voice coaches.”
Lionel Logue: “They’re idiots.”
Prince: “They’ve all been knighted.”
Logue: “Makes it official then.”
Or, when asked whether he knows any jokes, Bertie stutters this sardonic reply: “… T… T…Timing isn’t one of my strengths.”
Many believe that ‘Winter's Bone’, that dark Appalachian tale of America’s desperate, poor, meth-addicted underbelly, better represents the reality with which many Americans contend.  I agree.  Others will insist that ‘Social Network,’ with its clever plotting and dialogue, is more deserving of an Oscar.  In some respects, they would be right.  But all these movies could be from different planets—isn’t there something absurd about having to pick one over the other?  Each film is original, brilliant and unique.   Throw in ‘True Grit’ as well.  That film was truly gritty, from beginning to end.  To pick one film over the others is a fruitless exercise.  We don’t need a ‘Best Picture’ category.  The nominations are sufficient.  Stop there.  Tune out the Academy Awards and just go see them all.

Sunday, February 13, 2011

'The Great Influenza' by John M. Barry


Imagine half a dozen ocean waves, emanating from different parts of the globe, converging on a coastal community in summer, a community whose residents are unaware of the impending danger.  Some waves are visible on the horizon; others are but deep swells, typical of the season.  What happens when all of the waves combine their amplitude the moment they reach shore and form a giant tidal wave, a tsunami of historic proportions?  This, or something like it, is what happened in 1918, when the influenza pandemic enveloped the globe in just a few months, resulting in the deaths of, by some estimates, one hundred million people.  Of course, one of the biggest waves was the war in Europe, then in its fourth year.  But this was not the deadliest.  There was also the long swell of medical history, only recently jolted out of a Hippocratic stupor lasting over two thousand years.  Then there was the wave set in motion by President Wilson, who unleashed a powerful political force determined to whip a fractious country to war.  This, along with a hastily passed new Sedition Act and the Espionage Act of 1917, combined with a propaganda machine that brooked no dissent, made reporting actual conditions nearly impossible, criminal even.  Then there was the biggest wave of all, the influenza virus itself.  Capable of slipping through all the body’s defenses and adapting to its hosts with increasing effectiveness – a virus too small to see by conventional means – it infected and killed young adults to a disproportionate degree.   Compounding the devastation, most of the lifeguards – the doctors and nurses – were off tending to the troops in Europe, leaving communities begging for volunteers to nurse the sick and dying, and those few doctors and nurses left behind were overwhelmed and decimated by the disease.  And worse, public health officials, military and civilian, were overruled by the supreme urgency of war and made powerless to limit the influenza’s devastating effects.

War

For the first few years of the war in Europe, the United States tried to maintain its neutrality.  President Wilson himself was extremely reluctant to enter the fray.  However, in 1917, Germany outraged the nation when it announced unrestricted submarine warfare and tried to persuade Mexico to its side.  The President was forced to act.  As reluctant as he was, once the decision was made, Wilson pursued war with incredible single-mindedness, an almost religious fervor.  “To Wilson, this war was a crusade, he intended to wage total war.” 
“To fight,” Wilson declared, “you must be brutal and ruthless, and the spirit of ruthless brutality will enter into the very fibre of our national life….”  And it did.
“The government compelled conformity, controlled speech in ways… not known in America before or since....”  Wilson pushed the Espionage Act and a new Sedition Act through Congress, established the FBI and a volunteer group called the American Protective League (destined to become the Secret Service) to enforce these new laws, initiated a “voluntary” Liberty Bond drive, and took other measures.  He created, by executive order, the Committee on Public Information, headed by George Creel, who went on to produce “tens of thousands of press releases and feature stories that were routinely run unedited by newspapers.” 
In many ways, as is the case with the influenza epidemic itself, our nation’s memory of the “Great War” has been eclipsed by the Depression, WWII and the wars fought since.  John Barry’s excellent account reminds us of the draconian measures begun under the guise of war.  (The 2001 USA Patriot Act seems mild in comparison.)  Regardless of the motive or justification, these two statements sum up conditions leading up to the outbreak of the influenza epidemic:
“… Columbia University President Nicholas Murray Butler said, ‘What had been folly was now treason.’”
And Barry himself says, “As an unintended consequence, the nation became a tinderbox for epidemic disease….”

Epidemic

In the summer of 1918, the influenza crashed along the coasts of all the continents of the world, working its way inland along the rivers and roads of commerce, spreading suffering and death in its wake, then receding as it ran out of hosts to infect.   It started at an Army base in Kansas as “la grippe,” quickly mutating to its most lethal form and “swarming” through the population, through the port cities of Boston, Philadelphia, New York, New Orleans, Chicago and on and on, relentlessly infecting even remote inland outposts, then just as quickly running out of hosts and mutating again into a less dangerous form, infecting fewer and fewer as it ebbed, but not before President Wilson was caught in its undertow while attending peace talks in Paris.  His encounter with influenza very likely resulted in a bout of depression that affected the course of history.  This strain of influenza would never disappear completely; it would just lie in wait for a new mutation or for new hosts whose lack of immunity would provide the opportunity to do it all again.
“It was influenza, only influenza,” yet it had mutated, exploded and “swarmed” into a disease more deadly than the Bubonic Plague, or any other in human memory. 
So widespread was the influenza that this morbid little ditty, sung by schoolgirls as they jumped rope in schoolyards across the country, spread like a virus too.
“I had a little bird,
Her name was Enza
Opened the window
And In-flu-enza”
My advice for those who think the H1N1 influenza scare a few years ago was overblown is to read this book.  The reason the medical community was so concerned about that particular virus in 2009 is that H1N1 is the same flu virus that affected millions in 1918[1].  John Barry’s well-researched, comprehensive book tells the story of this devastating pandemic; about the men and women who worked to contain and defeat it; about its spread from Kansas to virtually every corner of the globe; about our state of war, which placed the need to mobilize forces above everything else, ignoring even the Army Surgeon General’s pleas and suppressing news reports that might have saved lives—reports about the virus itself.  Barry takes it even further, describing in elegant detail how the virus worked in the body and why it became so lethal; how the epidemic spread; and how this pandemic accelerated scientific research to an unprecedented degree, eventually leading, in one instance, to an understanding of DNA, that most essential building block of life.
Why was this influenza so much worse than ordinary flu?  It affected people in two ways.  For some, the luckier ones, it did act like regular flu from which most people recovered.  However, as is frequently the case with influenza, as symptoms subside and the patient starts to feel better, secondary infections take hold, often resulting in pneumonia.  Think of it this way:  The flu breaks down the body’s defenses—the natural mechanisms that work to keep the lungs sterile.  Enter pneumococcus, streptococcus and other bacterial pathogens.  These are the sources of bacterial pneumonia.  Today, as these secondary symptoms emerge, doctors typically prescribe antibiotics.  Antibiotics are effective against bacterial infections, but not viral infections.  In 1918, antibiotics had not been invented and pneumonia frequently resulted in death.
With the so-called Spanish Flu[2], however, the disease frequently took a more lethal turn.  After ravaging the respiratory tract and defeating the body’s normal defenses, it penetrated the deepest recesses of the lungs, infecting those tiny cells responsible for oxygenating the blood and, literally, choking them off by filling them with fluids[3].   This was the course the flu took in many, if not most, of the young adults who died quickly[4].   Often, symptoms progressed so rapidly that a person could wake up in the morning feeling fine and be dead within twelve hours.   Symptoms included intense headaches, bones that felt like they would break, hallucinations, high fever – all typical of flu, but much more intense.  What was new this time was that blood literally poured from eye sockets, nose, ears, mouth, and victims coughed up blood, even, as was frequently reported, projecting a stream of blood across the room, and, in the final stages, “cyanosis”—victims turning such a dark shade of blue from lack of oxygen that it was hard to tell “Caucasians from Negroes.”  So fast did influenza spread, and so overwhelmed were the few medical staff available, that “…nurses wrapped more than one living patient in winding sheets and put toe tags on the boys’ left big toe.  It saved time…” 
This was no ordinary “grippe”; this mutation managed to break down all the body’s defenses and confound public health officials.  Facemasks, which became ubiquitous, were as useless as a window screen in a dust storm.  People were advised to avoid crowds (virtually impossible) and, as one health board advised, “…stay warm, keep the feet dry and the bowels open”—this last piece of advice a remnant of the Hippocratic tradition.  The problem was, and still is today, that “men could appear healthy while incubating influenza themselves, and they could also infect others before symptoms appeared.”  One patient could infect thousands without knowing it.
So devastating was the disease, and so quickly did it spread, that civil society began to break down.  Indeed, at one point the situation appeared so dire that Victor Vaughan, the acting Army Surgeon General, wrote, “If the epidemic continues its mathematical rate of acceleration, civilization could easily disappear … from the face of the earth within a matter of a few more weeks.”
People avoided people, many refusing to go to work or even to the store.  The dying came so fast that caskets ran short.  Sometimes entire families were infected, with nobody left even to dispose of the bodies; some were so ill they were forced to sleep in the same bed with the dead.  Unable to buy and prepare food, many adults and children starved; orphans roamed the streets; people collapsed and died in the street.  Horse carts made the rounds, collecting bodies and stacking them like cordwood.  Gravediggers were in short supply and, inevitably, after weeks of paralysis, officials organized to deal with the crisis by digging mass graves.  Public officials were powerless to prevent or slow the disease’s progress; chaos reigned.
In the midst of all this, newspapers were reporting, “This is only the grip, nothing to be concerned about,” or complete fictions such as “Scientific Nursing Halting Epidemic.”   “On a single day of October 10,” Barry tells us, “the epidemic alone killed 759 people in Philadelphia” and, “During the week of October 16 alone, 4,597 Philadelphians died….” And this was in just one city.  Yet, referring to people not yet infected, a public official is quoted as saying, “There is no question that by a right attitude of the mind these people have kept themselves from illness.  I have no doubt that many persons have contracted the disease through fear… Fear is the first thing to be overcome, the first step in conquering this epidemic,” and “The weak and timid succumb first.”  These sentiments, propagated by Washington in the midst of war, appeared in papers across the country.  Of course, people could see what was happening all around them, so these admonitions had just the opposite of their intended effect, magnifying fear and distrust.

Medical History

I was most impressed with how well Barry puts the medical history in context.  In just a few chapters, he covers a sweep of history from Hippocrates (460-370 BC) (and those who wrote under his name) up to the outbreak of the influenza and beyond, rendered so expertly that you quickly understand the frustrations and challenges with which the medical community had to contend as the disease spread.
In the course of telling this story, Barry writes about dozens of scientists and health professionals.  I’ll just mention a few.  Most notable was William Henry Welch, the “impresario” who “intended to precipitate a revolution” in medicine, and did.  He is described as “the glue that cemented together the entire American Medical establishment,” as he dragged the science and practice of medicine out of the dark ages and into the 20th Century.  He was instrumental in establishing Johns Hopkins as a world-class medical institution, starting in the 1880s, and, from his position there, permanently altered medical research, education and practice throughout the United States.  Along with his protégé Simon Flexner, who would lead the Rockefeller Institute, and Simon’s brother, Abraham, and dozens of others whom he inspired, Welch completed “the reform of all medical education” in the US and directed “the flow of tens of millions of dollars into laboratory research.”
To me, the most surprising revelation was the state of medical education and practice before 1900.  Even schools like Harvard, Penn and Columbia did not require students to have a college degree and, what’s worse, many schools admitted anyone who could pay.  Some could hardly read or write!  “The whole system of medical education … is something horrible to contemplate,” complained Harvard’s president Eliot in 1869.  When he urged the adoption of written exams, Harvard’s Professor of Surgery, Henry Bigelow, complained, “…[Eliot] actually proposes to have written examinations for a degree of doctor of medicine.  I had to tell him that he knows nothing about the quality of the Harvard medical students.   More than half of them can barely write.” (Judging from the handwriting on prescription forms, this still may be the case.)
Simon Flexner’s story is illustrative of the state of medical education in the late 19th Century.   It begins with “his growing up the black sheep in an immigrant Jewish family in Louisville, Kentucky.  Older and younger brothers were brilliant students, but he quit school in the sixth grade.”   Described as “sullen and flirting with delinquency,” he worked at and was fired from several jobs before landing one, at nineteen, with a druggist who had a microscope.  Though forbidden to use it, he did anyway and, “Abruptly his mind was engaged.”  He attended the Louisville College of Pharmacy, graduated at the top of his class, and then went to medical school at night while working in a brother’s pharmacy.  About his medical school experience, “Flexner later recalled, ‘I never made a physical examination.  I never heard a lung sound.’”  Nevertheless, he was then free to hang out his shingle and practice medicine.  This is where his story takes another turn.  Flexner was of exceptional intelligence, and it was obvious to him just how ill-prepared he was.  “His younger brother Abraham had graduated from the Hopkins. … Simon sent some of his microscopic observations to Welch.  Soon Simon was studying at the Hopkins himself.”  Welch was so impressed with Flexner that he “arranged a fellowship for him in Germany.”  Four years later, he returned to become a professor of pathology at Hopkins.  A true autodidact, Flexner made up for the gaps in his education by reading and studying widely.  Not only was he well prepared to become a professor; soon afterwards, with Welch’s whole-hearted endorsement, he became head of the new Rockefeller Institute, which he led with distinction for many years.  Flexner’s story is just one of dozens of compelling stories John Barry tells.
Even as late as 1870, at a time when European schools taught the use of microscopes, stethoscopes, ophthalmoscopes and thermometers, doctors in the United States seldom used them.  Indeed, few American medical schools had them available.  While there were “two hundred endowed chairs of Theology at American colleges, there were only five endowed chairs in Medicine.”  Several states didn’t even license doctors, and “the titles ‘Professor’ and ‘Doctor’ went to anyone who claimed them.”  By and large, nurses were more knowledgeable and better trained than many of the doctors with whom they worked.  This infuriated some doctors, who resorted to using numeric codes when prescribing medicine so that nurses could not tell what they were prescribing and object.  Traditional “heroic measures” such as bleeding, cupping, blistering and purging (with caustic purgatives) had been the methods employed for hundreds of years and, even though many doctors were aware of advances in medical knowledge and knew these techniques did little good, they were frustrated because “little of this new science could be translated into curing or preventing disease.”
For a time, doing nothing beyond comforting the afflicted was the best medicine.  More often than not, “Do no harm” meant “do nothing.”  Not until Welch and his generation of European-trained doctors began to address the inadequacies of research and education in the United States would things change.  (Europe remained decades ahead of the US until around 1910 or so.)
And change it did: first with the establishment of the Johns Hopkins Medical School in 1893, followed by the Rockefeller Institute and, quickly thereafter, other universities and institutions across the country.  An important development that helped precipitate this change was what became known as the “Flexner Report” – a comprehensive survey of medical education in the United States conducted by Simon Flexner’s brother, Abraham.  The study brought to light the sorry state of medical education throughout the country.  Of more than one hundred and fifty medical schools nationwide, one hundred and twenty were judged substandard, in fact abysmal.  With the publicity that followed – publicity very much resisted by the American Medical Association – most schools closed, and those that survived now had a clear set of standards to meet; Abraham Flexner provided the model.  For over 2000 years, medical understanding and practice had been frozen in Hippocratic stasis—in theories based on the “four humours[5]” and in practices that included bleeding, cupping and purging, still performed by country doctors even as late as 1918.  But, thanks to Welch and his protégés, medicine in the U.S. was turned on its head in just a few decades.  By 1918, there were vaccines against diseases such as smallpox, cholera, typhoid, diphtheria and tetanus, and even a cure for diphtheria, the first “cure … entirely the result of laboratory work.”
There is much more here about medical history and science, and about the men and women who pursued it, that is fascinating to read.  The lives of people like Paul Lewis, William Park and Anna Wessel Williams, to name just a few, probably deserve books of their own, but Oswald Avery’s story deserves special mention.  Initially in pursuit of the cause and prevention of influenza during the pandemic, Avery spent decades in his laboratory, not emerging until 1943, at the age of 67, with a paper describing the function of DNA.  As luck would have it, that year, as he was being considered for a Nobel Prize for an earlier discovery, this new paper (the first he had published in a dozen years) was so revolutionary and startling that the Nobel committee hesitated.  Even though Avery was the first to publish, he never did receive from the Nobel committee the recognition his brilliant work on DNA deserved.  “Tenacious and persistent” don’t even come close to describing Avery.  My favorite Avery quote is, “Disappointment is my daily bread.  I thrive on it.”  Until his death ten years or so later, he never let up.
Researching and writing ‘The Great Influenza’ took John Barry seven years.  Perhaps he was inspired by Avery’s example.  It is surprising to me that, outside of some fictional accounts and memoirs, so little had been written about this pandemic.  It’s almost as though the country developed collective amnesia about an event that cost more lives than all the wars in the 20th Century.   All I can say is, “It’s about time!”  This book is truly a monumental achievement, one that deserves to sit alongside ‘The Microbe Hunters’ and other classics of science.  It should be required reading for every medical student.
During a recent appointment with my doctor, I mentioned that I was reading ‘The Great Influenza’ and asked if he had heard of it.   I was delighted to hear him say he had read it last year, and he expressed enthusiasm for the book.   I mentioned how surprised I was that even in the last half of the 19th Century most doctors didn’t use stethoscopes and other instruments we take for granted, then pointed to the computer attached to the wall (which, I hasten to say, my doctor used) and asked, “In fifty years, will we be as surprised and horrified to learn that most doctors didn’t use computers in 2010, and failed to understand the accumulated benefits they provide, for example, in epidemiology and the reduction of medical errors?”  He laughed and said to me, “As I was reading the book, I wondered how many of the things I’m doing now I’ll be embarrassed about in fifty years.”
I thought, “Now there’s a healthy, self-critical way to look at it—no matter what we’re doing now, how will we view it fifty years later?”  We could apply that to life as well as medicine.  In an instant, my respect for my doctor grew.
“I had a little bird,
Her name was Enza
Opened the window
And In-flu-enza”

[1] Thanks to the Army, which preserved lung tissue from autopsies performed in 1918, this was confirmed by RNA analysis in the 1990s.
[2] Spain was one of the few nations not at war and therefore free to report the spread of the disease without interference from censors.  Since Spain reported it first, people assumed that that was where it started, hence the misnomer “Spanish Flu.”  “Kansas flu” would have been more accurate.
[3] What’s going on here is a bit more complicated.  In some ways, the younger and healthier the victim, the greater the danger, since a healthy immune system mounts a more robust counteroffensive.  Ultimately, what killed many patients was the immune response itself, which flooded the lungs with white blood cells and antibodies; these, together with a stew of dead lung cells, clogged the lungs so completely that the body could not clear the congestion fast enough.  Some victims coughed so hard they tore cartilage and broke ribs trying to clear their respiratory passages.  Many died, and many others suffered brain damage, depression and other long-term disabilities.
[4] One theory for why this virus affected a higher proportion of young adults than of people over forty-five, besides the close quarters imposed by the war, is that older people who had lived through the “Russian flu” of 1889-90 were more likely to have some immunity.  That earlier flu was similar enough to offer some protection against the newer strain.  Also see the previous note.
[5] Blood, black bile, yellow bile, phlegm.