News from the lake…

August has slipped by with a kind of Parisian ennui, and great decisions loom. Will we heat the cabin over the winter? Will this be the year when the deer finally do in the holly bushes, or should we go back to protecting them once again?

These are lake decisions and now with September upon us and closing time coming they assume great importance.

But I can defer them for a bit, and I would rather think back over the departing summer. There’s good news. Two years ago, the homeowners’ association shut down the lodge operation —no more paying guests who “catch and keep,” taking fish home for dinner. With their departure, our fishing has improved dramatically: nice-size bass, carefully caught and released.

Insect variety seems to be on the uptick, too. White Flower Farm, of Litchfield, CT supplies wonderful plants, although my skills are largely limited to day lilies in sun and hostas for shade. This spring, we put in their “Hummingbird-Butterfly Garden for Sun” in a well-lit corner of the front yard. As a consequence, Monarchs have rewarded us by visiting the flowers all summer. White Admirals, a butterfly we have not seen before, appeared in substantial numbers. Down by the boathouse, a few mayflies fluttered out of the lake in July.

Hummingbirds seemed a little late in coming, although they arrived in good numbers in late June. Hairy and downy woodpeckers have been busy since mid-August. Our local loons seemed to have a rough summer; we think they lost this year’s chicks.

We had visitors, of course, over the summer. Everyone who came pitched in helping with the cooking and, by importing young people, broadening our perspectives. This is what summer can still be about: noticing the butterflies, catching a fish or two, and absorbing the excitement of a young friend or relative heading off to his or her freshman year in college.


Jews burned during the bubonic plague, accused of contaminating Christian wells (woodcut illustration from the Nuremberg Chronicle, 1493)

A couple of weeks ago, one of my Amherst classmates sent me a thoughtful comment that I thought deserved more readers. I asked if I could post it as a “guest blog,” and he kindly agreed. Here it is.



Through the end of May, only 21 weeks of 2018 had elapsed, yet 23 school shootings resulting in either death or injury had occurred: more than one school shooting a week so far this year.

Apparently, our elected representatives, for whatever reasons, are not going to tackle this plague of mass school shootings, accidents involving firearms, and the resolution of disagreements by violence. Many possible avenues are open to them: licensing; nationwide data banks of purchases and sales of firearms; background checks; restriction of various firearm accessories; more in-school personnel protections; and early detection of likely troubled minds. But they gag when asked if the scourge might have something to do with firearms. Instead, they argue that the gun accidents and massacres are a function of our nation’s abortion policy, or of too much pornography, or of various mental health issues, and then cut the budget for mental health care.

To a lawyer, which I am, the current interpretation of the Second Amendment to our US Constitution, in vogue for only the last ten years, is contrary to our interpretation of it for the preceding 200 years. This current interpretation clearly frustrates any common-sense remedies to our current plague. I agree with former US Supreme Court Justice John Paul Stevens, who wrote in March 2018:

“Concern that a national standing army might pose a threat to the security of the separate states led to the adoption of that amendment, which provides that ‘a well regulated militia, being necessary to the security of a free state, the right of the people to keep and bear arms, shall not be infringed.’ In 1939 the Supreme Court unanimously held that Congress could prohibit the possession of a sawed-off shotgun because that weapon had no reasonable relation to the preservation or efficiency of a ‘well regulated militia.’ During the years when Warren Burger was our Chief Justice, from 1969 to 1986, no judge, federal or state, as far as I am aware, expressed any doubt as to the limited coverage of that amendment.”

“In 2008, the Supreme Court overturned Chief Justice Burger’s and others’ long-settled understanding of the Second Amendment’s limited reach by ruling, in District of Columbia v. Heller, that there was an individual right to bear arms. I was among the four dissenters. That decision — which I remain convinced was wrong and certainly was debatable — has provided the N.R.A. with a propaganda weapon of immense power.”

According to Dennis Baron, a professor of English and linguistics at the University of Illinois, Justice Scalia, a strict constructionist writing the opinion, focused on the words “bear arms” and concluded that, in the 18th century, the phrase need not have been used in a military context. He thought that “bear arms” simply referred to carrying a weapon and had nothing to do with armies. Justice Scalia wrote,

 “Although [‘bear arms’] implies that the carrying of the weapon is for the purpose of ‘offensive or defensive action,’ it in no way connotes participation in a structured military organization.  From our review of founding-era sources … [i]n numerous instances, ‘bear arms’ was unambiguously used to refer to the carrying of weapons outside of an organized militia.” 

But Professor Baron says that Justice Scalia is dead wrong. Baron points out that databases of English writing from the founding era confirm “bear arms” as a military term; uses of that phrase in the non-military sense are almost nonexistent.  Among 1,500 separate occurrences of “bear arms” in 17th and 18th century writings, only a handful don’t refer to war, soldiering or organized, armed action.

More than 150 years ago, Tennessee Supreme Court Judge Nathan Green wrote,

“A man in the pursuit of deer, elk and buffaloes, might carry his rifle every day, for forty years, and yet, it would never be said of him, that he had borne arms….”

And in 1995, historian Garry Wills put it more succinctly:

“One does not bear arms against a rabbit.”

Despite the flawed reasoning in the 2008 Heller case, its decision is still binding. And even if the Second Amendment were correctly interpreted, as I believe Justice Stevens, Professor Baron, and the Warren and preceding Courts read it, and even if all of the proposed regulatory approaches were enacted, somehow, somewhere, someone would likely slip through the cracks. Another mass shooting, or one toddler accidentally killing another, or worse, would still occur.


Policy-makers have available two broad types of instruments for changing the various habits and activities in society: traditional regulatory approaches that set specific standards and expectations, and economic incentives, market-based policies that rely on market forces to correct or modify societal behavior. Throughout the history of the United States, most activities that we as a society choose to encourage or discourage have been addressed through the use of “sticks,” that is, regulation, or through the use of “carrots,” or incentives. To enhance charitable giving, we allow a tax deduction for qualifying gifts. To prevent excess cultivation, we reduce real estate taxes via a “green acres” program. To encourage people to “go green,” society offers incentives, including tax credits, low-interest loans, and property tax abatements. And to steer people toward the right choice rather than the wrong one, there are disincentives, usually couched in terms of loss of money or loss of freedom.

When the legislative “stick” fails, as it has in the case of gun legislation, because policy makers are at loggerheads, flummoxed, addicted to the contributions of arms manufacturers, or just lacking in political courage to act, then the “carrots” of incentives and disincentives offer an alternative approach.

So, let’s consider the concept of “strict liability.” Strict liability, sometimes referred to as “absolute liability,” exists in both civil and criminal law. It refers to holding an individual liable for damages or losses without having to prove either fault or negligence. Generally, in tort law, an aggrieved person has to prove that his injury was sustained on account of another person’s fault, whether by negligence or intent. The law, however, recognizes that certain circumstances are so inherently dangerous or hazardous that there is no need for the aggrieved to prove direct fault or negligence. Instead there is strict liability.

Some years ago, I bred and exhibited Morgan horses throughout the Midwest. I kept a stallion on my property. I knew that if my stallion broke out and kicked or bit someone, behaviors for which stallions are noted, I would be “strictly liable” for the resulting damages. Consequently, I built a corral with a six-foot-high fence made of sturdy two-inch-thick oak planking supported by posts almost a foot in diameter. I did this for self-protection: I would have been strictly liable if that stud horse got out, and my lack of negligence or malfeasance would have been irrelevant. Other examples of strict liability include product liability, the harboring of wild animals either in zoos or privately, the disposal of hazardous chemical wastes, and the storage of explosives.

In 2012, Adam Lanza used weapons that his mother owned, apparently legally, to kill 26 children and adults at Sandy Hook Elementary School in Newtown, CT. Just recently, a 17-year-old loner slaughtered 10 at a high school in Santa Fe, Texas, using a pistol and a shotgun, weapons legally owned by his father. I don’t know where Nikolas Cruz got his weaponry to terrorize the Parkland, Florida high school, or where James Holmes got his to terrorize Aurora, Colorado, or where Devin Kelley or Stephen Paddock got theirs to kill people in Sutherland Springs, Texas and Las Vegas, Nevada, respectively. But someone owned those guns, probably legally. Additionally, there are many, many incidents of children injured or killed by other children who are playing with guns. And just last month, a man was shot by his own dog; the dog jumped on a gun lying on a sofa and somehow managed to discharge it. All of these incidents raise the issue of whether there is simply too much carelessness about the storage and protection of firearms.

Courts and legislators could extend the concept of strict liability to the housing or storage of firearms, just as they have to the keeping of stallions, wild animals, or explosives. I suggest that legislators might more easily focus on the back end of the issue with a “disincentive” approach to our current plague, as opposed to the front-end “regulatory” approach that has them so flummoxed. If an individual wishes to own firearms, and those firearms “get loose” and cause harm, perhaps the owner should be strictly liable, at least civilly, without any burden thrust upon the aggrieved to prove negligence.

Placing that burden on a gun owner is certainly not any more onerous than was my effort to build a corral that would have probably restrained an elephant, maybe even a rhinoceros or two.


Stephen E. Smith II, BA (Amherst College), JD (University of Minnesota). June 2018

Bad Blood. A review.

My blog, The Weekly Packet, of April 30, 2016 ran under the title “The Theranos evidence, waiting for a story.”

In the text, I said, “… looking at the facts is like looking at individual organs at an autopsy, after the diener has washed them, weighed them, and put them in clean pans. What we need now is the pathologist to come in, and with knowledge and experience, he or she will tell the story that puts the facts together into a coherent narrative. At some point, the narrative may well make an instructive case study.”

John Carreyrou’s new book, Bad Blood: Secrets and Lies in a Silicon Valley Startup, tells the story with spellbinding skill and detail. Those of us who had some background, however minimal, in laboratory work knew that Theranos had almost certainly over-promised and under-delivered. None of us knew the details.

Carreyrou’s reporting reveals the astonishing extent of the process in a marvelously “coherent narrative.” He details the scope of the deceptions involved, including misleading deals with Walgreens and Safeway and even an attempt to involve the Army, the vast sums of money lost, and the lives disrupted.

First, no spoilers. Read the book! Then, after the mesmerizing read comes the hard part: what can we learn from this story?

One obvious lesson relates to corporate culture. A corporate culture that encourages tough questioning across disciplines and insists on facts may sometimes seem harsh. In such an environment, though, mutual respect and civility make the system work. What we see in Bad Blood, by contrast, is the destructive effect of siloing and secrecy. And we see that destructive process emanating from the top levels of management.

There’s another lesson, too, a very old one: the error of hubris provokes the retribution of nemesis. David Ronfeldt explained it in his excellent essay for the RAND Corporation, “Beware the Hubris-Nemesis Complex: A Concept for Leadership Analysis.”

“In Greek literature, hubris often afflicted rulers and conquerors who, though endowed with great leadership abilities, abused their power and authority and challenged the divine balance of nature to gratify their own vanity and ambition. Thus hubris was no common evil: It led people to presume that they were above ordinary laws…”

He continued on,

“Hubris above all is what attracted Nemesis, who then retaliated to humiliate and destroy the pretender, often through terror and devastation. Thus she [Nemesis] was an agent of destruction. The battle won, she did not turn to constructive tasks of renewal and redemption—that was for others to do. Yet her behavior was never a matter of pure angry revenge. There were high, righteous purposes behind her acts, for she intervened in human affairs primarily to restore equilibrium when it was badly disturbed, usually by figures who attained excessive power and prosperity.”

With Bad Blood, John Carreyrou has written not just a stunning piece of non-fiction reporting, but a cautionary tale for our times.

Ink still damp from the press

Back in the 19th century, there must have been times when a weekly packet boat had just started to slip its moorings when a young boy came running down to the wharf and tossed the very latest newspaper on board, ink still damp from the press.

That scene describes how I feel watching the evolution of articles in the staid pages of JAMA, the Journal of the American Medical Association, and other AMA publications.

For decades, from the 1970s well into the 2000s, the AMA seemed to have a multiple personality disorder. Politically, the organization represented the views of conservative, private-practice-oriented, white male physicians, while the editorial views of JAMA were blatantly ultra-liberal and anti-pharmaceutical-industry. Now, the journal and the organization’s other media seem to be making a real effort to deal with issues at the interface of society and organized medicine: opioids, income inequality, and environmental and public health concerns including air pollution and water quality.

My experience as the medical director of a heart transplant program taught me firsthand that public health and prevention are far more effective for the population at large than high-technology approaches to end-stage disease. Now, from the AMA of all places, we seem to be witnessing the same realization taking hold. So, I feel like the youngster tossing the fresh newspaper aboard when I tell you about a really worthwhile read in the April 24, 2018 issue of JAMA titled “Health, Faith, and Science on a Warming Planet.” (JAMA. 2018;319(16):1651-1652.) The authors are Monsignor Marcelo Sánchez Sorondo PhD, of the Vatican’s Pontifical Academy of Sciences, Howard Frumkin MD DrPH, from Environmental and Occupational Health Sciences, University of Washington School of Public Health, and Veerabhadran Ramanathan PhD, Atmospheric and Climate Sciences, Scripps Institution of Oceanography.

This astounding trio opened their article by stating, “Climate change, altered natural cycles, and pollution of air, water, and biota threaten the very conditions on which human civilization has depended for the last 12,000 years. While human health is better now than ever before in human history, climate change is undermining many public health advances of the last century and ultimately may be associated with the unprecedented extinction of species. The increasing gap between the wealthy and poor—already unconscionable, and the cause of profound preventable morbidity and mortality—amplifies the effects of climate change on health and deepens health disparities.”

The news that has me running to the dock is that an establishment voice —JAMA, for heaven’s sake— is announcing a new paradigm that aligns public health, environmental issues, and economics in a new view of civic and social engagement. This is big news.

Here are the six main points from the paper:
1. Disciplined, critical thinking, and an unfailing commitment to distinguish what is verifiable from what is not, characterize the best of the health, science, and faith communities.
2. Scientific evidence is a primary basis for distinguishing what is verifiable from what is not.
3. With unchecked climate change and air pollution, the very fabric of life on Earth, including that of humans, is at grave risk.
4. There is a role for reverence and awe.
5. There is a moral obligation to safeguard the earth for future generations.
6. There is a moral obligation to care for the most vulnerable.

Take a deep breath, and resolve to become, or to remain, engaged.




The photo is the interior of the Metropolitan Cathedral in Buenos Aires, where Pope Francis, as Archbishop Jorge Bergoglio, used to celebrate Mass. (KZM photo, Mar. 14, 2014. Sony DSC-HX300. 1/25th at f2.8, ISO 800)

Perspectives on my 50th medical school reunion.

In 1682, William Penn, an English Quaker, founded the city of Philadelphia as the capital of his Pennsylvania Colony. Eighty-three years later, in 1765, Drs. John Morgan and William Shippen convinced the Trustees of the College, Academy, and Charity School of Philadelphia to found the first medical school in the then colonies on the eastern seaboard of North America, modeled on the University of Edinburgh, where they had trained. With the founding of the Medical School in 1765, the College became a university, although the term “university” was not added to the institution’s official title until 1779.

At the first commencement, June 21, 1768, ten medical students received their M.B. degrees. (The College granted the first M.D. degrees to four of these men in 1771.)

Two hundred years later, in 1968, my class graduated from that same medical school at the University of Pennsylvania, and last weekend, 250 years later, we celebrated our fiftieth reunion. Perspective comes from such an event.

Individual lives, and deaths, occur. The patients, the teachers, and some of the classmates from whom and with whom we learned so much are gone. Soon we—with all the fragile neuronal connections that we treasure as our professional knowledge and skills—will also return to the dust. But the institution continues. What most concerns our class now is what that institution will look like after another half-century.

Philadelphia and its 1.5 million inhabitants could be vaporized in a thermonuclear holocaust, could be flooded by warm rising seas, or could fall victim to some other unpleasant end. But for the foreseeable future, our civilization, the city, and the university will most likely muddle on.

What can’t muddle on is our way of doing health care. From my experience in the US Navy and from visits to institutions all across the country representing Janssen Pharmaceuticals, I know that good medical care does not require the opulence, the cavernous spaces and grand edifices, that we see in major hospitals today.

What do we need to really do our work as doctors? Accessible, simple, sturdy, well-lighted buildings with good heating and cooling; easily cleaned surfaces; basic hematology, chemistry, and microbiology labs; a couple of reliable X-ray machines; a delivery room and an operating room; a few rooms for overnight stays; a basic food-service operation; a functional record system; a well-stocked pharmacy; and high-speed internet. That’s all. Oh yes, and a place to send the patients when that simple facility can’t handle their problems. No questions asked.

What I’m describing could exist. Should exist. If we had a system where every person in the country had low-cost insurance to cover basic immunization and preventive care, maternity care, trauma, and out-patient primary care, then such places would exist. They would provide the network of referrals that research universities and their academic medical arms must have to do their work.

Every single one of us has benefited from what Penn, Harvard, UCSF, Stanford, and the other 137 accredited MD-granting institutions and 31 accredited DO-granting institutions in the United States do. We should help to fund them through taxes, grants, gifts, and health-care insurance that helps to pay for their services.

But we would also all benefit from extending basic care to our whole population. Make no mistake, breakthroughs in science will come; brilliant younger people are hard at work in their labs. My hope, as I left Philadelphia, was for progress in how we deliver care.


I Love It When a Plan Comes Together


Imagine a half-dozen or so college students in hip boots, brandishing wide nets and paper punches, invading your quiet, secure home. Fortunately, fish don’t think much at all, so the fish in the woodland pond that my ecology class visited back in 1963 probably don’t recall our visit to their ancestors, but I still do.

We set out to calculate the number of fish in the pond. The method required catching and counting a sample of fish, hence the nets. Then we marked them with a small, neatly punched hole in the thin membrane of their tails and carefully released the known number of marked fish (= M1) with detailed instructions to go and mix-and-mingle with their companions for a week. (Indicator dilution, for you purists.)

One week later, back in the hip boots, we netted a new sample and counted the fish with (= M2) and without (= M0) tail-punch marks. With this much information, we could estimate the number of fish in the pond:
If X = number of fish in the pond,
then M1/X = M2/(M2 + M0), and the rest is algebra.
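That calculation is the classic mark-recapture (Lincoln-Petersen) estimate, and it fits in a few lines of Python. The fish counts below are hypothetical, chosen only to illustrate the arithmetic, not taken from the 1963 class outing:

```python
def estimate_population(marked_released, recaptured_marked, recaptured_unmarked):
    """Mark-recapture (Lincoln-Petersen) estimate of a pond's population.

    Assumes the marked fish mix evenly and the second sample is random:
        M1 / X = M2 / (M2 + M0)  =>  X = M1 * (M2 + M0) / M2
    where M1 = fish marked and released, M2 = marked fish recaptured,
    and M0 = unmarked fish in the second sample.
    """
    if recaptured_marked == 0:
        raise ValueError("No marked fish recaptured; the estimate is undefined.")
    total_second_sample = recaptured_marked + recaptured_unmarked
    return marked_released * total_second_sample / recaptured_marked

# Hypothetical example: mark and release 50 fish; a week later net 40 fish,
# 8 of which carry punch marks.
print(estimate_population(50, 8, 32))  # → 250.0
```

With 50 fish marked and 8 of the 40 recaptured fish carrying marks, the marked fraction (8/40) suggests the 50 marked fish are one-fifth of the pond, hence an estimate of 250.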
This was my favorite experiment in all my academic experience. Imagine getting OUTSIDE in HIP BOOTS and doing something scientific. Like a stonefly emerging from the depths of the library, I turned into an environmentalist.

Stay with me for just a few more minutes. As an almost-ten-year-old growing up in northeast Ohio, I remember the November 1952 picture of the Cuyahoga River on fire that ended up in Time magazine a month later, a truly arresting image showing flames leaping up from the water, completely engulfing a ship. Over the years, as a physician, I’ve followed the stories of various health problems that seem to have had roots in the environment; Dan Fagin’s Toms River is one of the best. A few years ago, I first read Steven Johnson’s marvelous book, The Ghost Map, the story of John Snow and the London cholera epidemic of 1854.

Then, just this morning, I had a real “Ah-ha!” moment. I read Margaret Talbot’s New Yorker article (The New Yorker, April 2, 2018), “Scott Pruitt’s Dirty Politics,” and my son David, an environmental economist, sent me a piece from the American Public Health Association on environmental health.
“Many communities lack access to nutritious, affordable food; are denied safe places to walk and exercise; or live near polluting factories. The health risks for these families are greater. We support research and action to help ensure healthy environments for all.” –APHA Executive Director Georges Benjamin

All of these issues are related. It all comes together!

We are not separate from the environment. In populated areas, we ARE the environment, or at least, the environment is largely man-made.

Some individuals with political power do not seem to understand the connection between environmental health —clean air, clean water, open spaces— and human health. Those individuals will not be swayed by facts. In fact, they actively reject science as a basis for public policy.

For now, we can support the public organizations that do battle on behalf of the environment, particularly those that wage their battles in the courts. And soon, we can, we should, we must…VOTE.

PS: The photo this month is an outhouse in the Chinese section of Arrowtown, New Zealand. The Chinese, who came to New Zealand as gold miners, were keenly aware of the importance of sanitation.

Polarization, 1900

“How can I maintain a balanced, centrist, well-informed point of view in such a polarized society?” This was the major problem as the United States headed into the election of 1900. William McKinley was nominated by acclamation at the Republican convention. After a brass-knuckle, back-room fight, McKinley had to accept Teddy Roosevelt as his running mate, largely as a concession to keep TR away from New York state office. As in 1896, McKinley faced William Jennings Bryan, this time with Adlai Stevenson as Bryan’s vice-presidential hopeful. The front page of the Akron Daily Democrat of Friday, July 6, 1900 covered the three polarizing issues. The Democrats had pinned their hopes on the free-silver plank and the “overthrow of imperialism and trusts.”


The free-silver issue rapidly lost steam as improvements in mining technology expanded production and filled the stores of the US Mint. In his first term, McKinley had put the United States firmly on the gold standard, and there it would stay. The trust issue also slipped to the back burner for the time being. McKinley, the prototype incrementalist, viewed trusts largely as a problem for the states.


Meanwhile, imperialism came to the fore. The United States, under McKinley, had acquired Puerto Rico, Guam, and the Philippines as the spoils of the Spanish American War and had added Hawaii as a territory by annexation. Not all Americans celebrated these island acquisitions. The “robust opposition movement included former presidents Harrison and Cleveland, …college presidents and academics, labor leaders, prominent clergymen, and famous writers….” There was no shortage of opposition to, and polarization about, American “imperialism.”


With the parties firmly at odds, the New York Times described Bryan’s Democrats as “the army of the discontented.” Bryan proposed changes in “currency, banks, bonds, taxes, trusts, wages, and labor law.” He faced McKinley, “who spoke to the sober-minded, conservative, property-owning Americans.” (Merry, p. 445)


Into this fray rode Teddy Roosevelt, the former “Rough Rider” hero of the Spanish-American War. “In eight weeks of campaigning, he traveled 21,209 miles in delivering 673 speeches to an estimated three million people in 24 states.” (Merry, p. 447) How important was TR to McKinley’s election victory? How much did his image as an outdoorsman and his popularity with the troops he had commanded help to swing a large segment of the populist vote to the Republicans? These are still issues for historians to debate.


The lessons of 1900, however, are more straightforward. The 1900 electorate was highly polarized between populist Democrats and middle-of-the-road, incrementalist Republicans. Today, the labels are reversed, but the divisions are not all that different. Who brought the country back together? A somewhat manic Harvard graduate whom no one wanted on the ticket. No one, that is, except the voters.


High marks for Robert W. Merry’s book, President McKinley: Architect of the American Century (Simon & Schuster, New York, 2017).

How Often Do You Pull Out the Map?


The photo above, taken on our recent visit to New Zealand, illustrates an irrevocable commitment. We did not explore this particular activity, bungee jumping, at any level deeper than photography!

This week, The Packet has arrived with two items, both well-seasoned, about leadership and learning.

Here’s some news from 1898. At the time, William McKinley was the President of the United States, the last president to have served in the Civil War.

On May 1, 1898, Commodore George Dewey led a United States Navy fleet to victory in the Battle of Manila Bay, in the Spanish-American War. In the words of Robert Merry, author of a recent, well-received biography of McKinley, Dewey’s victory “brought forth a kind of serendipitous imperialism.”

As a result of a very short conflict, the United States had gained control of Puerto Rico in the Caribbean, and Guam and the Philippines in the Pacific, and annexed Hawaii as well.

To continue Merry’s story, “The president, it was said, began his education on the Philippines by tearing a small map from a schoolbook, and when a government official arrived with more detailed charts he received them avidly while acknowledging his limited knowledge. ‘It is evident,’ he said, ‘that I must learn a great deal of geography in this war.’”

My point? McKinley had enlisted in the Union Army as a private; he attained the rank of brevet major. Before he was elected president, he had served in Congress, and as the governor of Ohio. Faced with the sudden, unexpected turn of events that transpired in the Pacific, what did he do? He immediately set out to educate himself.

That’s what leaders do; they look for information.

Fast forward to the New York Times Magazine of October 12, 2010. In a piece titled “Education of a President,” Peter Baker writes, “To better understand history, and his role in it, Obama invited a group of presidential scholars to dinner in May in the living quarters of the White House. Obama was curious about, among other things, the Tea Party movement. Were there precedents for this sort of backlash against the establishment? What sparked them and how did they shape American politics? The historians recalled the Know-Nothings in the 1850s, the Populists in the 1890s and Father Charles Coughlin in the 1930s. ‘He listened,’ the historian H. W. Brands told me. ‘What he concluded, I don’t know.’”

The two items span 112 years. More recent examples are harder to come by.

Come back soon for more photos of New Zealand.



I have just carefully read through several editorials in the January 16, 2018 issue of JAMA, the Journal of the American Medical Association. This issue reflects a growing enthusiasm among medical editors for what’s generally called a “focus issue.”

Since I’m not an editor, I have only a vague idea of how this works. In my mind’s eye, I see the editor sitting at a very large, cluttered desk with accepted but unpublished manuscripts piled by subject. As a pile grows to a threshold size, say seven and three-quarters inches, the editor says, “Aha! A focus issue.” Then he or she gathers up the pile and says to the staff, “Print all these together in 6 or 8 weeks, and we’ll be done with them.”

Readers know that I have no intention of writing about the focus of the issue, which happened to be obesity. What I do want to reflect upon are the social changes that have moved obesity from a straightforward statement about body composition to a medical problem worthy of a focus issue of JAMA, arguably one of the most prestigious medical journals in the world.

When I was a kid, turning to a dictionary definition of the assigned subject provided an easy “out” for starting an essay. I hope I’ve become a bit more sophisticated; now, I’ll turn to Harvard Magazine instead. I quote from the issue of April 23, 2009: “There are perhaps few academic topics of equal interest to scholars of history, law, anthropology, neuroscience, and literature. But this was part of the point when scholars of these disciplines gathered on April 22 for a symposium on medicalization—a phenomenon, they argued, that has infiltrated nearly every facet of modern life.” Not exactly stirring prose, but I’m sure you see the point. Or do you?

Beginning roughly in the mid-1970s, when faced with really tough social-behavioral problems, particularly those that have serious health consequences like alcohol abuse, drug addiction, and obesity, Western society has declared them medical problems.

This process, “medicalization,” relieves broad swaths of professionals from dealing with insoluble problems. Physicians, however, seem to willingly accept the process. We seem to say, “Give us your obese, your addicted, your anxiety-ridden… Send these to my clinic, to my hospital, I lift my stethoscope beside the golden door.” Not only do we engage in this altruism, we campaign to make their diagnoses “official” and billable, and then try to find treatments.

Lest this sound a bit negative, the Harvard conference attendees catalogued the forces that help to drive the trend toward medicalization:

  • “the very existence of health insurance (costs are only reimbursable when associated with a definable medical condition)
  • death certificates (the need to give a name to what caused a person’s death)
  • research funding (funding is more likely for problems defined as diseases)
  • drug trials and approval
  • and even a desire to wash one’s hands of blame for one’s condition (for instance, by considering obesity a disease that assails people rather than the result, at least in part, of one’s own actions and lifestyle).”

As I become more senior in the medical community, my awareness of the importance of communication, both among doctors and between doctors and patients, continues to grow. How long will the medical community continue to accept the process of medicalization before we say, “Look, we can help to manage the physical consequences of behavioral problems. If you are too heavy, we can replace your worn-out joints, get your cholesterol and blood sugar down, and help with your blood pressure. But we can’t modify your behavior; you have to decide to do that.”

Sending off the final draft



This is Monday, January 8, 2018. About 3 PM on Friday the 5th, I sent my (new) publisher the final draft of a new book, a memoir-conversation with my long-time friend and double-sculls partner, Bernie Witholt. It’s like sending a youngster off to prep school. Someone else will look at her and correct a word here, or delete a comma there. She’s not ready to graduate yet. She needs proofing, formatting, and a pretty cover. But she’s largely out of my hands.

Over the weekend, I have thought a lot about her. I would like to introduce her. How about just three paragraphs in which Bernie describes his first symptoms?

I was one of some 150 people at a Zurich Stock Exchange Symposium on new high-tech startups emanating from the ETH and other Swiss universities. I came in late and, finding no seat, stood leaning against a steel railing. I did not feel good and I wondered why a few of my students, seated only a few meters away, did not jump up to offer me a seat.

About an hour after the meeting started, after the first two or three talks, I felt a strange rattling in my chest. This was new and quite unsettling. Was this a heart attack? What else might it be? I felt weak, eager to sit down, but there were no empty seats. The next speaker had just started. I looked around; no one else seemed to hear the rattling. I leaned harder into the railing. After ten or fifteen minutes the rattling subsided; the speaker finished his presentation, and I felt better.

With the rattle gone there seemed to be no reason to leave immediately, and so I stayed. The meeting ended a few hours later. I had no interest in chatting with other participants, but had some juice and left. This meant negotiating stairs from the auditorium level up to the higher street level. This was unpleasant. When I reached my car, I sat contemplating what had happened. Whatever it was, it did not seem trivial. I decided to go home this time, but to go to the emergency room of the University of Zurich hospital if it happened again.

Her name, unless the publisher decides to change it, is “240 Beats per Minute. Life with an Unruly Heart.”

Sending her off has left me with what people in my generation called “post-exam letdown.” As we advanced academically, end-of-semester exams became more and more important. At least, we thought they did. Performance on exams determined where we went to graduate school, medical school, or law school. And then, once in a graduate or professional school, better exam performance would mean a “better” post-doc, internship, or clerkship.

As a result, we studied intensely, and after all the bubbles were filled in with #2 lead pencil and the blue books turned in, we crashed. Or, at least, we deflated. So it is with sending off the final draft, even though now it’s just a matter of hoping that people will like her.

I have neglected my blogging in the push to get her ready. Now that she’s out the door, I’ll take a deep breath and get back at it. Next week.