Where Did All Those Statues Come From?

In my last post, I proposed a look at the present-day conflict over removal of Confederate monuments from public spaces as a way to explore both the objective history of, and the widespread beliefs about, the American Civil War in the context of today’s culture. A little homework has convinced me that, like some other projects recently, this is no small endeavor. In fact, the smart thing to do would be to admit pre-emptive defeat and move on to explaining something relatively simple like quantum computing.

But, having started on some research, let’s go with the monument issue for a while. There’s a lot to learn. Let’s start with three questions:
1. When were the public monuments built? We need to explore the social and political context that fostered a flurry of Confederate monuments in public spaces across the South.
2. Who built them? With this question, we want to understand who organized the community, who obtained the land and the permits, and who paid for the materials and labor.
3. And finally, why?

After Lee surrendered to U.S. Grant on April 9, 1865, and Johnston surrendered to Sherman on April 26, the Civil War effectively ended. The confusion of Reconstruction consumed another dozen years, only to be followed by almost a century of Jim Crow laws, mandated segregation, and the “separate but equal” travesty of Plessy v. Ferguson.

Against that political backdrop, the United Daughters of the Confederacy (UDC) and Confederate veterans’ organizations took up the burden of vindicating the defeated South through the conscious, purposeful promotion of the “Lost Cause” myth. (The UDC itself grew from at most 4,000 to almost 30,000 women between 1894 and 1904.)

The Lost Cause had three central themes. First, the “War Between the States” occurred when patriotic Southern states took up arms to defend states’ rights. Second, in the ante-bellum South with its “peculiar institution,” Black slaves had been happy and content with life under benevolent, patrician masters. And third, after a gallant, chivalrous conflict the forces of the South were overwhelmed by the vast armies and brutal strategies of Grant, Sherman and Lincoln.

From roughly 1880 to 1910, construction of grand monuments in public spaces played a central role in making the Lost Cause myth the primary “true history” of the war in the South; it was taught in the public schools from “approved” textbooks, related to the public by Memorial Day speeches, celebrated with the birthdays of Robert E. Lee, “Stonewall” Jackson, and Jefferson Davis, and symbolized by the unveiling of statues of these and other Confederate heroes. Looking back on the development, spread and acceptance of the Lost Cause mythology, it’s difficult to see it as anything other than a vindication of whites, by whites and for whites. There were, after all, laws to enforce segregation and to disenfranchise Black voters; if the laws failed, there was the Ku Klux Klan and the lynch mob.

As the Confederate veterans and subsequently the first generation of the Lost Cause proponents passed away, the Lost Cause and the dramatic Lee, Jackson, and Davis monuments became part of accepted white culture. Never mind that the memories of moonlight and magnolias, banjos and happy darkies and gallant patrician horsemen had been fabricated to justify a pro-slavery armed rebellion against the United States costing 600,000 lives, led by scores of West Point graduates who broke their vows and betrayed their country. Public attention, white public attention, turned to the Spanish-American War and then to Europe and the horrors of World War I.

But what did those monuments symbolize to the Black generations who grew up with them? Come back for another post, and watch for news about the reorganization of rmillsmd.com and theweeklypacket.com.

Karen L. Cox. Dixie’s Daughters: The United Daughters of the Confederacy and the Preservation of Confederate Culture. Gainesville: University Press of Florida, 2003.
Mitch Landrieu. In the Shadow of Statues: A White Southerner Confronts History. New York: Penguin Books, 2018.

Flawless Execution

Sitting at the keyboard, I try to visualize my readers. An audience of casually, but neatly, attired adults from early 30s to late 50s, probably a bit more than half female, most with an interest in health care. Everyone out there knows that older people faced with physical tasks move more slowly than those middle-aged and younger. And I suspect that, like me when I was a mid-career physician, you attribute this slowing to the physical impact of increasing years on one’s strength, flexibility, and fitness. But there’s another factor, not widely appreciated, that gains in magnitude as one ages: the requirement for flawless execution.

Let’s take a task as simple as preparing the breakfast coffee. A mid-30s mother of two can do this without really focusing on it, multi-tasking while making sure two youngsters are dressed, fed, and on their way to school with everything necessary for the day. In contrast, a retired physician in his mid-70s knows that if he were to accidentally swing a glass coffee pot full of water against the edge of a granite counter, the impact would result in shards of glass and puddles of water all over the kitchen floor. The dog would want to inspect it all, with great hazards to her paws. The results of the error would require getting down to the floor, no longer a quick or easy process. Glass shards are hard to see. Would he find them all? What if he needed more paper towels — another up and down?

Or suppose, just suppose, in the confusion the bag of ground coffee were to fall over on the counter? Another clean-up! And if some of the coffee has fallen off the countertop onto the floor, then there’s another up-and-down with its inevitable question, “Did I get it all?”

With advancing years, one has the advantage of having repeated the simple tasks of daily life thousands of times. We seniors have our habits and routines. But daily life becomes a constant struggle with gravity. Earth wants to draw not just our artifacts but our bodies themselves ever nearer. Wading a trout stream, walking the dog, climbing the stairs, all these activities require flawless execution. Otherwise, gravity will extract the inevitable penalty for error.

So, younger colleagues, the lesson for the day is patience. The older people among you may want a bit more time. Watch them carefully, though, and you may well get to see flawless execution.

Government in Your Daily Life


Much of the current controversy in American public life seems to reflect dramatic differences of opinion about the role of government in our society, particularly our public life. The debate is often disturbing, largely for the neglect of historical data on both sides.

If we are asked to name the government agencies that we all interact with in our daily lives, the Internal Revenue Service (IRS) would probably top the list. After all, it’s hard to escape the tax man. But what agencies would come next? Those who travel a lot might push for the Federal Aviation Administration (FAA), and investors might list the SEC or the Federal Reserve System. My candidate might not be one you would immediately think of — the FDA. Yes, the Food and Drug Administration.

Why? Well, you may think about drugs first. A few years ago, a Mayo Clinic study[1] documented that almost 70 percent of Americans take at least one prescription drug; more than half take two. Here are more recent, and more granular, data from the CDC website:

  • Percent of persons using at least one prescription drug in the past 30 days: 48.9% (2011-2014)
  • Percent of persons using three or more prescription drugs in the past 30 days: 23.1% (2011-2014)
  • Percent of persons using five or more prescription drugs in the past 30 days: 11.9% (2011-2014).

But you are even more likely to interact with the FDA at the grocery store. Who sets the standards for food labeling? Who’s in charge of health and safety in the food industry? Who investigates outbreaks of food-borne illness? Answer: the FDA, for all of the above.

In her new book[2], The Poison Squad, Deborah Blum tells the story of Dr. Harvey Wiley, chief chemist of the US Department of Agriculture, set in the context of America in the late nineteenth and early twentieth centuries: a rich tapestry of unscrupulous businessmen, reformers, activists, muckrakers, and a few dedicated public servants.

Blum recounts in detail food adulteration with toxic “preservatives” and coloring agents, and adds in the fascinating conflicts over whiskey labeling. The primary theme is Wiley’s lifelong dedication to the scientific study of chemical food additives and to accurate labeling, but the story includes the colorful figures of the times including journalist/writer Upton Sinclair and President Teddy Roosevelt.

The separation of food and drug regulation from the Department of Agriculture and the birth of the FDA represented the culmination of decades of conflict. The price paid by countless numbers of our ancestors, in mortality and morbidity from food-borne illness and from toxic, unregulated “medicines,” will never be known.

The Poison Squad will give readers a good story, well told, about the forces that brought about the FDA. In a larger sense, it provides a case study of the question of the role of government in our society, set at a time when America was growing more complex and industry more powerful. The narrative will surely open some readers’ eyes to the positive aspects of regulation.

As Alan Wolfe writes, “The idea that liberalism comes in two forms [“classical” and “modern” or “social” liberalism] assumes that the most fundamental question facing mankind is how much government intervenes into the economy… When instead we discuss human purpose and the meaning of life, Adam Smith and John Maynard Keynes are on the same side. Both of them possessed an expansive sense of what we are put on this earth to accomplish. […] For Smith, mercantilism was the enemy of human liberty. For Keynes, monopolies were. It makes perfect sense for an eighteenth-century thinker to conclude that humanity would flourish under the market. For a twentieth century thinker committed to the same ideal, government was an essential tool to the same end.”



[1] Wenjun Zhong, Hilal Maradit-Kremers, Jennifer L. St. Sauver, Barbara P. Yawn, Jon O. Ebbert, Véronique L. Roger, Debra J. Jacobson, Michaela E. McGree, et al. Age and Sex Patterns of Drug Prescribing in a Defined American Population. Mayo Clinic Proceedings, Vol. 88, Issue 7, pp. 697–707.

[2] Penguin Press, New York, 2018.


News from the lake…


August has slipped by with a kind of Parisian ennui, and great decisions loom. Will we heat the cabin over the winter? Will this be the year when the deer finally do in the holly bushes, or should we go back to protecting them once again?

These are lake decisions and now with September upon us and closing time coming they assume great importance.

But I can defer them for a bit, and I would rather think back over the departing summer. There’s good news. Two years ago, the homeowners’ association shut down the lodge operation —no more paying guests who “catch and keep,” taking fish home for dinner. With their departure, our fishing has improved dramatically: nice-size bass, carefully caught and released.

Insect variety seems to be on the uptick, too. White Flower Farm, of Litchfield, CT supplies wonderful plants, although my skills are largely limited to day lilies in sun and hostas for shade. This spring, we put in their “Hummingbird-Butterfly Garden for Sun” in a well-lit corner of the front yard. As a consequence, Monarchs have rewarded us by visiting the flowers all summer. White Admirals, a butterfly we have not seen before, appeared in substantial numbers. Down by the boathouse, a few mayflies fluttered out of the lake in July.

Hummingbirds seemed a little late in coming, although they arrived in good numbers in late June. Hairy and downy woodpeckers have been busy since mid-August. Our local loons seemed to have a rough summer; we think they lost this year’s chicks.

We had visitors, of course, over the summer. Everyone who came pitched in helping with the cooking and, by importing young people, broadening our perspectives. This is what summer can still be about: noticing the butterflies, catching a fish or two, and absorbing the excitement of a young friend or relative heading off to his or her freshman year in college.


Jews burned during the bubonic plague, accused of contaminating Christian wells (woodcut illustration from the Nuremberg Chronicle, 1493)

A couple of weeks ago, one of my Amherst classmates sent me a thoughtful comment that I thought deserved more readers. I asked if I could post it as a “guest blog,” and he kindly agreed. Here it is.



Through the end of May, only 21 weeks of 2018 had elapsed, yet 23 school shootings resulting in death or injury had already occurred: more than one school shooting a week so far this year.

Apparently, our elected representatives, for whatever reasons, are not going to tackle this plague of mass school shootings, accidents involving firearms, and the resolution of disagreements by violence. Many possible avenues are open to them: licensing; nationwide data banks of purchases and sales of firearms; background checks; restriction of various firearm accessories; more in-school personnel protections; and early detection of likely troubled minds. But they gag when asked if the scourge might have something to do with firearms. Instead, they argue that the gun accidents and massacres are a function of our nation’s abortion policy, or of too much pornography, or of various mental health issues, and then cut the budget for mental health care.

To a lawyer, which I am, the current interpretation of the Second Amendment to our US Constitution, in vogue for only the last ten years, is contrary to our interpretation of it for the preceding 200 years. This current interpretation clearly frustrates any common-sense remedies to our current plague. I agree with former US Supreme Court Justice John Paul Stevens, who wrote in March 2018:

“Concern that a national standing army might pose a threat to the security of the separate states led to the adoption of that amendment, which provides that ‘a well regulated militia, being necessary to the security of a free state, the right of the people to keep and bear arms, shall not be infringed.’ In 1939 the Supreme Court unanimously held that Congress could prohibit the possession of a sawed-off shotgun because that weapon had no reasonable relation to the preservation or efficiency of a ‘well regulated militia.’ During the years when Warren Burger was our Chief Justice, from 1969 to 1986, no judge, federal or state, as far as I am aware, expressed any doubt as to the limited coverage of that amendment.”

“In 2008, the Supreme Court overturned Chief Justice Burger’s and others’ long-settled understanding of the Second Amendment’s limited reach by ruling, in District of Columbia v. Heller, that there was an individual right to bear arms. I was among the four dissenters. That decision — which I remain convinced was wrong and certainly was debatable — has provided the N.R.A. with a propaganda weapon of immense power.”

According to Dennis Baron, a professor of English and linguistics at the University of Illinois, Justice Scalia, a strict constructionist writing the opinion, focused on the words “bear arms” and concluded that in the 18th century the phrase simply referred to carrying a weapon and had nothing to do with armies. Justice Scalia wrote,

 “Although [‘bear arms’] implies that the carrying of the weapon is for the purpose of ‘offensive or defensive action,’ it in no way connotes participation in a structured military organization.  From our review of founding-era sources … [i]n numerous instances, ‘bear arms’ was unambiguously used to refer to the carrying of weapons outside of an organized militia.” 

But Professor Baron says that Justice Scalia is dead wrong. Baron points out that databases of English writing from the founding era confirm “bear arms” as a military term; uses of that phrase in the non-military sense are almost nonexistent.  Among 1,500 separate occurrences of “bear arms” in 17th and 18th century writings, only a handful don’t refer to war, soldiering or organized, armed action.

More than 150 years ago, Tennessee Supreme Court Judge Nathan Green wrote,

“A man in the pursuit of deer, elk and buffaloes, might carry his rifle every day, for forty years, and yet, it would never be said of him, that he had borne arms….”

And in 1995, historian Garry Wills put it more succinctly:

“One does not bear arms against a rabbit.”

Despite the flawed reasoning in the 2008 Heller case, its decision is still binding. I agree with Justice Stevens and Professor Baron, and with the interpretation of the Warren and preceding Courts. But even if the Second Amendment were correctly interpreted and all of the proposed regulatory approaches were enacted, somehow, somewhere, someone would likely slip through the cracks. Another mass shooting, or one toddler accidentally killing another, or worse, would still occur.


Policy-makers have available two broad types of instruments for changing the various habits and activities of society: traditional regulatory approaches that set specific standards and expectations, or economic incentives, market-based policies that rely on market forces to correct or modify societal behavior. Throughout the history of the United States, most activities that we as a society choose to encourage or discourage are addressed through the use of “sticks,” that is, regulation, or through the use of “carrots,” or incentives. To enhance charitable giving, we allow a tax deduction for qualifying gifts. To prevent excess cultivation, we reduce real estate taxes via a “green acres” program. To encourage people to “go green,” society offers incentives, including tax credits, low-interest loans, property tax abatements, and others. And to encourage people to make the right choice as opposed to the wrong one, there are disincentives, usually couched in terms of loss of money or loss of freedom.

When the legislative “stick” fails, as it has in the case of gun legislation, because policy makers are at loggerheads, flummoxed, addicted to the contributions of arms manufacturers, or just lacking in political courage to act, then the “carrots” of incentives and disincentives offer an alternative approach.

So, let’s consider the concept of “strict liability”.  Strict liability, sometimes referred to as “absolute liability” exists in both civil and criminal law.  It refers to holding an individual liable for damages or losses without having to prove either fault or negligence.  Generally, in tort law, an aggrieved person has to prove that his grief was sustained on account of another person’s fault, whether by negligence or intent.  The law, however, recognizes that there are certain circumstances that are so inherently dangerous or hazardous, that there is no need for the aggrieved to prove direct fault or negligence.  Instead there is strict liability.

Some years ago, I bred and exhibited Morgan horses throughout the Midwest. I kept a stallion on my property. I knew that if my stallion broke out and kicked or bit someone, behaviors for which stallions are noted, I would be “strictly liable” for the resulting damages. Consequently, I built a corral with a six-foot-high fence made of sturdy two-inch-thick oak planking supported by posts almost a foot in diameter. I did this for self-protection: I would have been strictly liable if that stud horse got out, and my lack of negligence or malfeasance would have been irrelevant. Other examples of strict liability include product liability, the harboring of wild animals either in zoos or privately, the disposal of hazardous chemical wastes, and the storage of explosives.

In 2012, Adam Lanza used weapons that his mother owned, apparently legally, to kill 26 children and adults at Sandy Hook Elementary School in Newtown, CT. Just recently a 17-year-old loner slaughtered 10 at a high school in Santa Fe, Texas, using a pistol and a shotgun, weapons legally owned by his father. I don’t know where Nikolas Cruz got his weaponry to terrorize the Parkland, Florida high school, or where James Holmes got his to terrorize Aurora, Colorado, or where Devin Kelley or Stephen Paddock got theirs to kill people in Sutherland Springs, Texas, and Las Vegas, Nevada, respectively. But someone owned those guns, probably legally. Additionally, there are many, many incidents of children injured or killed by other children who are playing with guns. And just last month, a man was shot by his own dog; the dog jumped on a gun lying on a sofa and somehow managed to discharge it. All of these incidents raise the issue of whether there is simply too much carelessness about the storage and protection of firearms.

Courts and legislators could extend the concept of strict liability to the housing or storage of firearms, just as they have to keeping stallions, wild animals or explosives.  I suggest that legislators might more easily focus on the back-end of the issue with a “disincentive” approach to our current plague, as opposed to the front-end “regulatory” approach that has them so flummoxed.  If an individual wishes to own firearms, then if those firearms “get loose” and cause harm perhaps the owner should be strictly liable, at least civilly, without any burden thrust upon the aggrieved to prove negligence.

Placing that burden on a gun owner is certainly not any more onerous than was my effort to build a corral that would have probably restrained an elephant, maybe even a rhinoceros or two.


Stephen E. Smith II, BA (Amherst College), JD (University of Minnesota). June 2018

Bad Blood. A review.

My blog, The Weekly Packet (https://theweeklypacket.com/category/health-care/), ran a post on April 30, 2016 under the title “The Theranos evidence, waiting for a story.”

In the text, I said, “… looking at the facts is like looking at individual organs at an autopsy, after the diener has washed them, weighed them, and put them in clean pans. What we need now is the pathologist to come in, and with knowledge and experience, he or she will tell the story that puts the facts together into a coherent narrative. At some point, the narrative may well make an instructive case study.”

John Carreyrou’s new book, Bad Blood: Secrets and Lies in a Silicon Valley Startup, tells the story with spellbinding skill and detail. Those of us who had some background, however minimal, in laboratory work knew that Theranos had almost certainly over-promised and under-delivered. None of us knew the details.

Carreyrou’s reporting reveals the astonishing extent of the process in a marvelously “coherent narrative.” He details the scope of the deceptions involved, including misleading deals with Walgreens and Safeway and even an attempt to involve the Army, the vast sums of money lost, and the lives disrupted.

First, no spoilers. Read the book! Then, after the mesmerizing read comes the hard part: what can we learn from this story?

One obvious lesson relates to corporate culture. A corporate culture that encourages tough questioning across disciplines and insists on facts may sometimes seem harsh. In such an environment, though, mutual respect and civility make the system work. What we see in Bad Blood, by contrast, is the destructive effect of siloing and secrecy. And we see that destructive process emanating from the top levels of management.

There’s another lesson, too, a very old one: the error of hubris provokes the outcome of nemesis. David Ronfeldt explained it in his excellent essay for the RAND Corporation, “Beware the Hubris-Nemesis Complex: A Concept for Leadership Analysis.”

“In Greek literature, hubris often afflicted rulers and conquerors who, though endowed with great leadership abilities, abused their power and authority and challenged the divine balance of nature to gratify their own vanity and ambition. Thus hubris was no common evil: It led people to presume that they were above ordinary laws…”

He continued,

“Hubris above all is what attracted Nemesis, who then retaliated to humiliate and destroy the pretender, often through terror and devastation. Thus she [Nemesis] was an agent of destruction. The battle won, she did not turn to constructive tasks of renewal and redemption—that was for others to do. Yet her behavior was never a matter of pure angry revenge. There were high, righteous purposes behind her acts, for she intervened in human affairs primarily to restore equilibrium when it was badly disturbed, usually by figures who attained excessive power and prosperity.”

With Bad Blood, John Carreyrou has written not just a stunning piece of non-fiction reporting, but a cautionary tale for our times.

Ink still damp from the press

Back in the 19th century, there must have been times when a weekly packet boat had just started to slip its moorings when a young boy came running down to the wharf and tossed the very latest newspaper on board, ink still damp from the press.

That scene describes how I feel watching the evolution of articles in the staid pages of JAMA, the Journal of the American Medical Association, and other AMA publications.

For decades, from the 1970s well into the 2000s, the AMA seemed to have a multiple personality disorder. Politically, the organization represented the views of conservative, private-practice-oriented white male physicians, while the editorial views of JAMA were blatantly ultra-liberal and anti-pharmaceutical-industry. Now, the journal and the other organizational media seem to be making a real effort to deal with issues that occupy the interface of society and organized medicine: opioids, income inequality, and environmental and public health concerns including air pollution and water quality.

My experience as the medical director of a heart transplant program taught me firsthand that public health and prevention are far more effective for the population at large than high-technology approaches to end-stage disease. Now, from the AMA of all places, we seem to be witnessing the same realization taking hold. So, I feel like the youngster tossing the fresh newspaper aboard when I tell you about a really worthwhile read in the April 24, 2018 issue of JAMA titled “Health, Faith, and Science on a Warming Planet.” (JAMA. 2018;319(16):1651-1652.) The authors are Monsignor Marcelo Sánchez Sorondo PhD, of the Vatican’s Pontifical Academy of Sciences, Howard Frumkin MD DrPH, from Environmental and Occupational Health Sciences, University of Washington School of Public Health, and Veerabhadran Ramanathan PhD, Atmospheric and Climate Sciences, Scripps Institution of Oceanography.

This astounding trio opened their article by stating, “Climate change, altered natural cycles, and pollution of air, water, and biota threaten the very conditions on which human civilization has depended for the last 12,000 years. While human health is better now than ever before in human history, climate change is undermining many public health advances of the last century and ultimately may be associated with the unprecedented extinction of species. The increasing gap between the wealthy and poor—already unconscionable, and the cause of profound preventable morbidity and mortality—amplifies the effects of climate change on health and deepens health disparities.”

This message, the news that has me running to the dock, is that an establishment voice —JAMA, for heaven’s sake— is announcing a new paradigm that aligns public health, environmental issues, and economics in a new view of civic and social engagement. This is big news.

Here are the six main points from the paper:
1. Disciplined, critical thinking, and an unfailing commitment to distinguish what is verifiable from what is not, characterize the best of the health, science, and faith communities.
2. Scientific evidence is a primary basis for distinguishing what is verifiable from what is not.
3. With unchecked climate change and air pollution, the very fabric of life on Earth, including that of humans, is at grave risk.
4. There is a role for reverence and awe.
5. There is a moral obligation to safeguard the earth for future generations.
6. There is a moral obligation to care for the most vulnerable.

Take a deep breath, and resolve to become, or to remain, engaged.




Photo is the interior of the Metropolitan Cathedral in Buenos Aires, where Pope Francis, as Archbishop Jorge Bergoglio, used to celebrate Mass. (KZM photo, Mar. 14, 2014. Sony DSC-HX300. 1/25th at f2.8, ISO 800)

Perspectives on my 50th medical school reunion.

In 1682, William Penn, an English Quaker, founded the city of Philadelphia as the capital of his Pennsylvania Colony. Eighty-three years later, in 1765, Drs. John Morgan and William Shippen convinced the Trustees of the College, Academy, and Charity School of Philadelphia to found the first medical school in the then-colonies on the eastern seaboard of North America, modeled on the University of Edinburgh, where they had trained. With the founding of the Medical School in 1765, the College became a university, although the term “university” was not added to the institution’s official title until 1779.

At the first commencement, June 21, 1768, ten medical students received their M.B. degrees. (The College granted the first M.D. degrees to four of these men in 1771.)

Two hundred years later, in 1968, my class graduated from that same medical school at the University of Pennsylvania, and last weekend, 250 years later, we celebrated our fiftieth reunion. Perspective comes from such an event.

Individual lives, and deaths, occur. The patients, the teachers, and some of the classmates from whom and with whom we learned so much are gone. Soon we—with all the fragile neuronal connections that we treasure as our professional knowledge and skills—will also return to the dust. But the institution continues. What most concerns our class now is what that institution will look like after another half-century.

Philadelphia and its 1.5 million inhabitants could be vaporized in a thermonuclear holocaust, could be flooded by warm rising seas, or could fall victim to some other unpleasant end. But for the foreseeable future, our civilization, the city, and the university will most likely muddle on.

What can’t muddle on is our way of doing health care. From my experience in the US Navy and from visits to institutions all across the country representing Janssen Pharmaceuticals, I know that good medical care does not require the opulence, the cavernous spaces and grand edifices, that we see in major hospitals today.

What do we need to really do our work as doctors? Accessible, simple, sturdy, well-lighted buildings with good heating and cooling; easily cleaned surfaces; basic hematology, chemistry, and microbiology labs; a couple of reliable X-ray machines; a delivery room and an operating room; a few rooms for overnight stays; a basic food-service operation; a functional record system; a well-stocked pharmacy; high-speed internet. That’s all. Oh yes, and a place to send the patients when that simple facility can’t handle their problems. No questions asked.

What I’m describing could exist. Should exist. If we had a system where every person in the country had low-cost insurance to cover basic immunization and preventive care, maternity care, trauma, and out-patient primary care, then such places would exist. They would provide the network of referrals that research universities and their academic medical arms must have to do their work.

Every single one of us has benefited from what Penn, Harvard, UCSF, Stanford, and the other 137 accredited MD-granting institutions and 31 accredited DO-granting institutions in the United States do. We should help to fund them through taxes, grants, gifts, and health-care insurance that helps to pay for their services.

But we would also all benefit from extending basic care to our whole population. Make no mistake, breakthroughs in science will come; brilliant younger people are hard at work in their labs. My hope, as I left Philadelphia, was for progress in how we deliver care.


I Love It When a Plan Comes Together


Imagine a half-dozen or so college students in hip boots, brandishing wide nets and paper punches, invading your quiet, secure home. Fortunately, fish don't think much at all, so the descendants of the fish in the woodland pond that my ecology class visited back in 1963 surely don't recall our visit, but I still do.

We set out to calculate the number of fish in the pond. The method required catching and counting a sample of fish, hence the nets. Then we marked them with a small, neatly punched hole in the thin membrane of their tails and carefully released the known number of marked fish (= M1) with detailed instructions to go and mix-and-mingle with their companions for a week. (Indicator dilution, for you purists.)

One week later, back in the hip boots, we netted a new sample and counted those with (=M2) and without (=M0) tail punch marks. With this much information, we could calculate the number of fish in the pond:
If X = the number of fish in the pond,
then M1/X = M2/(M2 + M0), and the rest is algebra.
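This is the classic Lincoln–Petersen mark-and-recapture estimate: the fraction of marked fish in the second sample should match the fraction of marked fish in the whole pond. A minimal sketch, with made-up counts (not our class's actual numbers):

```python
def estimate_pond_population(marked_released, marked_recaptured, unmarked_in_sample):
    """Lincoln-Petersen estimate.

    M1 / X = M2 / (M2 + M0), solved for X:
        X = M1 * (M2 + M0) / M2
    where M1 = fish marked and released, M2 = marked fish in the
    second sample, M0 = unmarked fish in the second sample.
    """
    second_sample_size = marked_recaptured + unmarked_in_sample
    return marked_released * second_sample_size / marked_recaptured

# Illustrative numbers: mark and release 50 fish; a week later,
# net 40 fish, 8 of them bearing tail-punch marks.
print(estimate_pond_population(50, 8, 32))  # 50 * 40 / 8 = 250.0
```

With an eighth of the second sample marked, the 50 marked fish must be about an eighth of the pond's population, hence roughly 250 fish.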
This was my favorite experiment in all my academic experience. Imagine getting OUTSIDE in HIP BOOTS and doing something scientific. Like a stonefly emerging from the depths of the library, I turned into an environmentalist.

Stay with me for just a few more minutes. As an almost-ten-year-old growing up in northeast Ohio, I remember the November 1952 picture of the Cuyahoga River on fire – a truly arresting image showing flames leaping up from the water, completely engulfing a ship – the same photograph Time Magazine would make famous years later, after the 1969 fire. Over the years, as a physician, I've followed the stories of various health problems that seem to have had roots in the environment; Dan Fagin's Toms River is one of the best. A few years ago, I first read Steven Johnson's marvelous book, The Ghost Map, the story of John Snow and the London cholera epidemic of 1854.

Then, just this morning, I had a real “Ah-ha!” moment. I read Margaret Talbot’s New Yorker article (The New Yorker, April 2, 2018), “Scott Pruitt’s Dirty Politics,” and my son David, an environmental economist, sent me a piece from the American Public Health Association on environmental health.
“Many communities lack access to nutritious, affordable food; are denied safe places to walk and exercise; or live near polluting factories. The health risks for these families are greater. We support research and action to help ensure healthy environments for all.” – APHA Executive Director Georges Benjamin

All of these issues are related. It all comes together!

We are not separate from the environment. In populated areas, we ARE the environment, or at least, the environment is largely man-made.

Some individuals with political power do not seem to understand the connection between environmental health —clean air, clean water, open spaces— and human health. Those individuals will not be swayed by facts. In fact, they actively reject science as a basis for public policy.

For now, we can support the public organizations that do battle on behalf of the environment, particularly those that wage their battles in the courts. And soon, we can, we should, we must…VOTE.

PS: The photo this month is an outhouse in the Chinese section of Arrowtown, New Zealand. The Chinese, who came to New Zealand as gold miners, were keenly aware of the importance of sanitation.

Polarization, 1900

“How can I maintain a balanced, centrist, well-informed point of view in such a polarized society?” This was the major problem as the United States headed into the election of 1900. William McKinley was nominated by acclamation at the Republican convention. After a brass-knuckle, back-room fight, McKinley had to accept Teddy Roosevelt as his running mate, largely as a concession to New York party bosses eager to move TR out of the governor's office. As in 1896, McKinley faced William Jennings Bryan, this time with Adlai Stevenson as Bryan's vice-presidential hopeful. The front page of the Akron Daily Democrat of Friday, July 6, 1900 covered the three polarizing issues. The Democrats had pinned their hopes on the free-silver plank and the “overthrow of imperialism and trusts.”


The free-silver issue rapidly lost steam as improvements in mining technology expanded production and filled the stores of the US Mint. In his first term, McKinley had put the United States firmly on the gold standard, and there it would stay. The trust issue also slipped to the back burner for the time being. McKinley, the prototypical incrementalist, viewed trusts largely as a problem for the states.


Meanwhile, imperialism came to the fore. The United States, under McKinley, had acquired Puerto Rico, Guam, and the Philippines as the spoils of the Spanish American War and had added Hawaii as a territory by annexation. Not all Americans celebrated these island acquisitions. The “robust opposition movement included former presidents Harrison and Cleveland, …college presidents and academics, labor leaders, prominent clergymen, and famous writers….” There was no shortage of opposition to, and polarization about, American “imperialism.”


With the parties firmly at odds, the New York Times described Bryan's Democrats as “the army of the discontented.” Bryan proposed changes in “currency, banks, bonds, taxes, trusts, wages, and labor law.” He faced McKinley, “who spoke to the sober-minded, conservative, property-owning Americans.” (Merry, p. 445)


Into this fray rode Teddy Roosevelt, the former “Rough Rider” hero of the Spanish American War. “In eight weeks of campaigning, he traveled 21,209 miles in delivering 673 speeches to an estimated three million people in 24 states.” (Merry, p. 447) How important was TR to McKinley’s election victory? How much did his image as an outdoorsman and his popularity with the troops he had commanded help to swing a large segment of the populist vote to the Republicans? These are still issues for historians to debate.


The lessons of 1900, however, are more straightforward. The 1900 electorate was highly polarized between populist Democrats and middle-of-the-road, incrementalist Republicans. Today the labels are reversed, but the divisions are not all that different. Who brought the country back together? A somewhat manic Harvard graduate whom no one wanted on the ticket – no one, that is, except the voters.


High marks for Robert W. Merry’s book, President McKinley: Architect of the American Century (Simon & Schuster, New York, 2017).