Blog by Novelist William S. Frankl, MD

Archive for the ‘History’ Category

A Letter on Justice and Open Debate

Sunday, August 2nd, 2020

This letter should be of great importance whether you are a Republican, a Democrat, or an independent. It goes to the heart of who we are and how we must get along with each other, or else we will dissolve into a quagmire of hatred.

 

Harper’s Magazine

A Letter on Justice and Open Debate

July 7, 2020
The below letter will be appearing in the Letters section of the magazine’s October issue. We welcome responses at letters@harpers.org

Our cultural institutions are facing a moment of trial. Powerful protests for racial and social justice are leading to overdue demands for police reform, along with wider calls for greater equality and inclusion across our society, not least in higher education, journalism, philanthropy, and the arts. But this needed reckoning has also intensified a new set of moral attitudes and political commitments that tend to weaken our norms of open debate and toleration of differences in favor of ideological conformity. As we applaud the first development, we also raise our voices against the second. The forces of illiberalism are gaining strength throughout the world and have a powerful ally in Donald Trump, who represents a real threat to democracy. But resistance must not be allowed to harden into its own brand of dogma or coercion—which right-wing demagogues are already exploiting. The democratic inclusion we want can be achieved only if we speak out against the intolerant climate that has set in on all sides.

The free exchange of information and ideas, the lifeblood of a liberal society, is daily becoming more constricted. While we have come to expect this on the radical right, censoriousness is also spreading more widely in our culture: an intolerance of opposing views, a vogue for public shaming and ostracism, and the tendency to dissolve complex policy issues in a blinding moral certainty. We uphold the value of robust and even caustic counter-speech from all quarters. But it is now all too common to hear calls for swift and severe retribution in response to perceived transgressions of speech and thought. More troubling still, institutional leaders, in a spirit of panicked damage control, are delivering hasty and disproportionate punishments instead of considered reforms. Editors are fired for running controversial pieces; books are withdrawn for alleged inauthenticity; journalists are barred from writing on certain topics; professors are investigated for quoting works of literature in class; a researcher is fired for circulating a peer-reviewed academic study; and the heads of organizations are ousted for what are sometimes just clumsy mistakes. Whatever the arguments around each particular incident, the result has been to steadily narrow the boundaries of what can be said without the threat of reprisal. We are already paying the price in greater risk aversion among writers, artists, and journalists who fear for their livelihoods if they depart from the consensus, or even lack sufficient zeal in agreement.

This stifling atmosphere will ultimately harm the most vital causes of our time. The restriction of debate, whether by a repressive government or an intolerant society, invariably hurts those who lack power and makes everyone less capable of democratic participation. The way to defeat bad ideas is by exposure, argument, and persuasion, not by trying to silence or wish them away. We refuse any false choice between justice and freedom, which cannot exist without each other. As writers we need a culture that leaves us room for experimentation, risk taking, and even mistakes. We need to preserve the possibility of good-faith disagreement without dire professional consequences. If we won’t defend the very thing on which our work depends, we shouldn’t expect the public or the state to defend it for us.

Elliot Ackerman
Saladin Ambar, Rutgers University
Martin Amis
Anne Applebaum
Marie Arana, author
Margaret Atwood
John Banville
Mia Bay, historian
Louis Begley, writer
Roger Berkowitz, Bard College
Paul Berman, writer
Sheri Berman, Barnard College
Reginald Dwayne Betts, poet
Neil Blair, agent
David W. Blight, Yale University
Jennifer Finney Boylan, author
David Bromwich
David Brooks, columnist
Ian Buruma, Bard College
Lea Carpenter
Noam Chomsky, MIT (emeritus)
Nicholas A. Christakis, Yale University
Roger Cohen, writer
Ambassador Frances D. Cook, ret.
Drucilla Cornell, Founder, uBuntu Project
Kamel Daoud
Meghan Daum, writer
Gerald Early, Washington University-St. Louis
Jeffrey Eugenides, writer
Dexter Filkins
Federico Finchelstein, The New School
Caitlin Flanagan
Richard T. Ford, Stanford Law School
Kmele Foster
David Frum, journalist
Francis Fukuyama, Stanford University
Atul Gawande, Harvard University
Todd Gitlin, Columbia University
Kim Ghattas
Malcolm Gladwell
Michelle Goldberg, columnist
Rebecca Goldstein, writer
Anthony Grafton, Princeton University
David Greenberg, Rutgers University
Linda Greenhouse
Rinne B. Groff, playwright
Sarah Haider, activist
Jonathan Haidt, NYU-Stern
Roya Hakakian, writer
Shadi Hamid, Brookings Institution
Jeet Heer, The Nation
Katie Herzog, podcast host
Susannah Heschel, Dartmouth College
Adam Hochschild, author
Arlie Russell Hochschild, author
Eva Hoffman, writer
Coleman Hughes, writer/Manhattan Institute
Hussein Ibish, Arab Gulf States Institute
Michael Ignatieff
Zaid Jilani, journalist
Bill T. Jones, New York Live Arts
Wendy Kaminer, writer
Matthew Karp, Princeton University
Garry Kasparov, Renew Democracy Initiative
Daniel Kehlmann, writer
Randall Kennedy
Khaled Khalifa, writer
Parag Khanna, author
Laura Kipnis, Northwestern University
Frances Kissling, Center for Health, Ethics, Social Policy
Enrique Krauze, historian
Anthony Kronman, Yale University
Joy Ladin, Yeshiva University
Nicholas Lemann, Columbia University
Mark Lilla, Columbia University
Susie Linfield, New York University
Damon Linker, writer
Dahlia Lithwick, Slate
Steven Lukes, New York University
John R. MacArthur, publisher, writer
Susan Madrak, writer
Phoebe Maltz Bovy, writer
Greil Marcus
Wynton Marsalis, Jazz at Lincoln Center
Kati Marton, author
Debra Mashek, scholar
Deirdre McCloskey, University of Illinois at Chicago
John McWhorter, Columbia University
Uday Mehta, City University of New York
Andrew Moravcsik, Princeton University
Yascha Mounk, Persuasion
Samuel Moyn, Yale University
Meera Nanda, writer and teacher
Cary Nelson, University of Illinois at Urbana-Champaign
Olivia Nuzzi, New York Magazine
Mark Oppenheimer, Yale University
Dael Orlandersmith, writer/performer
George Packer
Nell Irvin Painter, Princeton University (emerita)
Greg Pardlo, Rutgers University – Camden
Orlando Patterson, Harvard University
Steven Pinker, Harvard University
Letty Cottin Pogrebin
Katha Pollitt, writer
Claire Bond Potter, The New School
Taufiq Rahim
Zia Haider Rahman, writer
Jennifer Ratner-Rosenhagen, University of Wisconsin
Jonathan Rauch, Brookings Institution/The Atlantic
Neil Roberts, political theorist
Melvin Rogers, Brown University
Kat Rosenfield, writer
Loretta J. Ross, Smith College
J.K. Rowling
Salman Rushdie, New York University
Karim Sadjadpour, Carnegie Endowment
Daryl Michael Scott, Howard University
Diana Senechal, teacher and writer
Jennifer Senior, columnist
Judith Shulevitz, writer
Jesse Singal, journalist
Anne-Marie Slaughter
Andrew Solomon, writer
Deborah Solomon, critic and biographer
Allison Stanger, Middlebury College
Paul Starr, American Prospect/Princeton University
Wendell Steavenson, writer
Gloria Steinem, writer and activist
Nadine Strossen, New York Law School
Ronald S. Sullivan Jr., Harvard Law School
Kian Tajbakhsh, Columbia University
Zephyr Teachout, Fordham University
Cynthia Tucker, University of South Alabama
Adaner Usmani, Harvard University
Chloe Valdary
Helen Vendler, Harvard University
Judy B. Walzer
Michael Walzer
Eric K. Washington, historian
Caroline Weber, historian
Randi Weingarten, American Federation of Teachers
Bari Weiss
Sean Wilentz, Princeton University
Garry Wills
Thomas Chatterton Williams, writer
Robert F. Worth, journalist and author
Molly Worthen, University of North Carolina at Chapel Hill
Matthew Yglesias
Emily Yoffe, journalist
Cathy Young, journalist
Fareed Zakaria

Institutions are listed for identification purposes only.

 

Final Forms/Counting Death

Wednesday, July 29th, 2020

Given the terrible times we are living through, as COVID-19 ravages the Earth, the death rate keeps climbing. It is important to know whether a death is due to the virus or to one of the other maladies that plague mankind. I thought that the following article would be of interest.

Dept. of Public Health

April 7, 2014 Issue of The New Yorker

Final Forms

What death certificates can tell us, and what they can’t

By Kathryn Schulz

March 31, 2014

We have developed a vast, macabre bureaucracy to answer the question of why we die.

Illustration from Oxford Science / Getty

It starts with a dead body, as so many mysteries do. A middle-aged man is found unconscious and rushed to a hospital. For four days, he lingers in a coma; on the fifth, he dies. The clues are few and dark and point in different directions. The man was a drug addict. He was diabetic. Some of his family members say that he acted strangely the last time they saw him conscious. Others disagree. The lab tests are inconclusive. Everything is inconclusive. If this were a mystery of the Conan Doyle kind, there would be a detective, and there would be a solution. In the event, there is neither. Instead, there is a young doctor, in her first year on the job, and there is a single piece of paper: a death certificate, on which she is meant to record, precisely and for posterity, why this person died.

Not every mystery involves a dead body, but every dead body is a mystery. Death is an assassin with infinite aliases, and the question of what kills us is tremendously complex. It is also tremendously labile. We ask it with clinical curiosity and keen it in private grief; we pose it rhetorically and inquire specifically; we address it to everyone from physicians to philosophers to priests. It is as bare as bone and as reverberant as bell metal: Why do we die?

For millennia, our answers to that question were sharply constrained. Lacking any real understanding of the physiological causes of death, we pointed instead to the entities we knew could make things happen: conscious (or putatively conscious) agents. Sometimes that agent was us. We killed one another, obviously; we hexed one another, allegedly; we brought about mortality in general—from Prometheus to Eve, through hubris and through sin. Alternatively, sometimes the agent was death itself. Many early cosmologies include a Grim Reaper, give or take a costume change: Thanatos in Greek mythology, the Hindu Yama, the Angel of Death in the Bible. As a rule, these were agents in the other sense as well: mere instruments of a higher power. For most of history, no matter how we died, we did so at the bidding of God or of the gods.

A correlative of all this agency was passivity. If an omnipotent being wants to kill you, there’s not much you can do about it except beg for mercy—a popular strategy even today. Only after we started looking to the physical world to determine why we die did premodern fatalism begin to fade. Sentient agents yielded to disease agents, divine intercession to medical intervention. Today, “Why do we die?” is one of the fundamental questions of epidemiology, and we have developed a vast and macabre bureaucracy to answer it.

The atomic unit of that bureaucracy is the death certificate. Of all the ways we have ever devised to grapple with our mortality, it is the strangest, least elegiac, and by far the most ambitious. It emerged by an accident of history and evolved to serve two different masters. In part, it is a public-health measure—though even the doctors who deal with death certificates often forget that, regarding them instead as one more piece of paperwork. In part, it is a form of personal identification: the saddest of diplomas, the most mysterious of passports.

And, in part, it is a clue. Of the roughly fifty million people who will die this year, approximately half will get a death certificate. That figure includes every fatality in every developed nation on earth: man, woman, child, infant. The other half, death’s dark matter, expire in the world’s poorest places, which lack the medical and bureaucratic infrastructures for end-of-life documentation. Yet, even with so many people unaccounted for, this number represents the spread of a remarkable idea: that death should be accounted for—that by documenting every single decedent and every possible cause we can solve its mystery.

The antecedent of the modern death certificate emerged in early-sixteenth-century England, in a form known as a Bill of Mortality. The antecedent of the Bill of Mortality does not exist. No earlier civilization we know of kept systematic track of its dead: not ancient Egyptians, for all their elaborate funerary customs; not the Greeks; not the Romans, those otherwise assiduous centralized bookkeepers.

Even Christianity, one of the world’s most successful purveyors of ideas about death, seldom attended to the specifics of why we die. Churches did traditionally keep records of baptisms and burials—and, practically speaking, those serve as a good proxy for births and deaths. But, as a philosophical matter, they are tellingly different: the church was interested in the fate of the soul, not the body. If the goal of life is to gain access to heaven, and death is in God’s hands, there’s no point, and no grace, in dwelling on the particulars of how we die.

That cosmological indifference coincided with scientific ignorance. Early medicine relied more on folklore than on physiology, and its practitioners were not in the habit of examining bodies, living or dead. Well into the nineteenth century, the limits of medical knowledge were such that doctors sometimes didn’t even know if someone had died, let alone how. The widespread terror of being buried alive, which today seems like a dark little wiggle of the id, once reflected a genuine possibility. In the absence of any scientific way to confirm the end of life, it sometimes happened that those consigned to coffins were only mostly dead.

Compounding all this was political irrelevance. Early states had neither the means nor the motive to track individual deaths—or, for that matter, individual anything. Low literacy rates made individual documentation on a broad scale impractical, and reigning administrative practices made it unnecessary. You don’t need a tax I.D. number if taxes are levied on your entire town, and you don’t need a draft card if conscription is collective. Only in exceptional cases did everyday people need to be able to identify themselves—there was, for instance, the vexing premodern problem of how to tell true messengers from false ones—and, accordingly, individual documentation was rare.

The modern death certificate owes its existence to the cosmological, scientific, and political revolutions that eventually overturned this entire world order. But its prototype emerged in response to something else: death itself, on an epic and horrifying scale. In 1347, the Black Death broke out in Europe. By 1351, a third or more of all Europeans were dead. With a huge percentage of the remaining population infectious and the rest of it terrified, the plague turned the formerly private experience of death into a matter of (extreme) public concern. Italy responded by passing the first modern quarantine laws, tracking the living. England took a different route, and began tracking the dead.

Thus the Bills of Mortality: weekly lists of the plague dead, broken down by parish. The earliest known bill is a single handwritten document, thought to date from 1512, which states that in the city of London, between the sixteenth and the twenty-third of November, thirty-four people died of “the plague” and thirty-two of unspecified “oder dyseases.” No information about the dead appeared on early bills, not even their names. And the bills themselves appeared only sporadically: cropping up when the plague did, fading away again when the crisis passed. Their intended purpose seems to have been to help the healthy steer clear of the most infectious parts of town.

Many of history’s great inventions are really great appropriations—middling ideas if used as intended, brilliant when reoriented or co-opted. In their original form, Bills of Mortality were not a particularly powerful or inspired device. But, in the hundred years after their introduction, two modifications altered both the function of the bills and the future of public health. In 1603, the bills began appearing weekly rather than episodically, and did so continuously for the next two hundred and thirty-three years. In 1629, during a lull in the plague, the court of King James I ordered parish clerks to begin listing deaths from other causes as well. The first change turned the Bills of Mortality into one of history’s richest data sets. The second turned them into a global first: a state-mandated system for recording why we die.

It was both a short step and a long time from there to the modern death certificate. As an epidemiological document, the death certificate would have to wait for disease to more fully migrate from the metaphysical to the material realm. As personal identification, it would have to wait for the political revolutions of the eighteenth century, which, by reconfiguring the relationship between the individual and the state, made documenting the lives and deaths of every citizen newly desirable. The flip side of democracy is bureaucracy: if everyone counts, everyone must be counted. The flip side of equality is equality: the pauper gets a driver’s license, the President needs one, and you wait in line at the D.M.V. And the flip side of representation is surveillance: by 1851, the French political theorist Pierre-Joseph Proudhon could observe that “to be governed is to be noted, registered, enumerated, accounted for, stamped, measured, classified, audited, patented, licensed, authorized, endorsed, reprimanded, prevented, reformed, rectified, and corrected, in every operation, every transaction, every movement.”

By the time Proudhon wrote those words, the Bills of Mortality were all but extinct. A numeric tally of the anonymous dead had evolved into a list of the named dead, one person per line, and then into a dedicated form: one decedent per page. This was the death certificate, the grave end of cradle-to-grave documentation. Bureaucratically speaking, that ex-post personal identification represented the death certificate’s ultimate end. But, for public-health purposes, the name of the dead didn’t matter. What mattered, and what had evolved as well, was the cause of death.

In the Unetanneh Tokef, a Jewish liturgical poem thought to have been composed in the eleventh century A.D., the poet notes that, at the beginning of every year, it will be determined who shall perish in the coming months, and how:

who by water and who by fire

who by sword and who by beast

who by famine and who by thirst

who by earthquake and who by plague

who by strangling and who by stoning.

That poem nicely captures the way the premodern world parsed death: into a few coarse causes, all reducible to God’s will. “And You shall apportion the destinies of all Your creatures,” the poet writes.

By the time of the Bills of Mortality, the list of things we thought could kill us had expanded dramatically. Yet, reading those bills today, you could be forgiven for failing to recognize them as an advance in public health. It was possible, in seventeenth- and eighteenth-century England, to die of Bleach and of Blasted, of Cramp and of Itch, of Sciatica and of Lethargy. You could be carried off by Cut of the Stone, or King’s Evil, or Planet-struck, or Rising of the Lights. You could succumb to Overjoy, which sounds like a decent way to go, or be Devoured by Lice, which does not. You could die of Stopping of the Stomach, or Head-Ach, or Chin-cough, or Teeth. You could die of Horseshoehead, though don’t ask me how. You could die of being a Lunatick. You could die of, basically, death: “Suddenly”; “Killed by several Accidents”; “Found dead in the Streets.” You could die of Frighted, and of Grief.

If what we are after is a revolution in our understanding of death, this does not seem like an entirely promising start. But in the mid-seventeenth century a haberdasher named John Graunt got interested in the question of why we die. That interest was neither medical nor philosophical but actuarial. Like many successful shopkeepers, Graunt was a meticulous accountant, and he realized that he could use the Bills of Mortality to crunch the numbers on death. By trawling through twenty years of those bills, Graunt compiled a list of eighty-one causes of death, which he divided into four main categories: chronic diseases, epidemic diseases, conditions that killed children, and “outward griefs”—that is, injuries. With information like that available for the first time, “it becomes necessary to discuss the problem—can lifetime be prolonged by a knowledge of the causes that cut it short?”

The man who asked that question was Graunt’s most important successor, William Farr, one of the founders of epidemiology. In 1836, when Farr was twenty-eight, England replaced the Bills of Mortality with what would become the global prototype of a modern death-registration system, and created the General Register Office to manage it. The office opened in 1837, and Farr became its first Compiler of Abstracts.

Unlike Graunt, Farr was in the game not to keep books but to save lives, and he realized that vital statistics were the language in which public-health questions could be asked and answered—and, crucially, changed. In 1853, at the behest of the newly formed International Statistical Congress, he helped compile a comprehensive list of causes of death, for use in standardizing mortality data worldwide. The resulting classification contained a hundred and thirty-nine ways to die, divided into seven categories, from “Deaths from accident or violence” to “Deaths from old age.” There were still a few ringers on this list—you could die of laryngitis, and of teething—but it was a long way from Blasted and Itch.

The definitive advance, however, came forty years later, when the classification was revised by a committee headed by the French statistician and demographer Jacques Bertillon. Bertillon doubled the categories of the earlier list from seven to fourteen, expanded causes of death from a hundred and thirty-nine to a hundred and sixty-one, and organized them, as we still do today, by anatomical systems: “Diseases of circulatory system,” “Diseases of respiratory system,” and so forth. The result was published in 1893, as the International List of Causes of Death.

A hundred and twenty years later, that list is still with us. Today, it is managed by the World Health Organization, and is known as the International Statistical Classification of Diseases and Related Health Problems—or, more commonly, the ICD-10. The ICD still reflects Bertillon’s original structure, but it has expanded prodigiously in the course of ten revisions. As its new name suggests, that is partly because it now includes entries for nonfatal diseases. (And much more besides. Beginning after the Second World War, the W.H.O. bowed to the desire of hospitals and insurance companies to use the ICD for billing purposes; as a result, it now contains entries for every imaginable health-care interaction, from well visits to warts. That shift displeases some epidemiologists, since, as a report by the Centers for Disease Control has pointed out, public-health priorities no longer drive the management of the list.)

But, even if you strip the classification of everything that can’t kill you, you are left with a staggering number of things that can. The ICD-10 comes in three forest-green volumes (or as a download, or on CD-ROM), can be purchased for $562.82 through Barnes & Noble, and runs to twenty-two hundred pages. The first cause of death that it lists is A00.0, “Cholera due to Vibrio cholerae 01.” The last is Y89.9, “Sequelae of unspecified external cause.” Arrayed between them are more than eight thousand other officially sanctioned ways to die. Taken together, those ICD entries are used to code and standardize the causes of death on death certificates.

Contemplating all this, one suspects that we have got about as far as possible from the premodern relationship to death. A single reason for death, divine will, has mutated into ever more numerous and narrow causes; sixty-six anonymous deaths in sixteenth-century London have grown to twenty-five million death certificates per year. Yet the why of death remains elusive—practically, philosophically, above all emotionally. And, the more extensively we attempt to document it through death certificates, the stranger and more troubled that project comes to seem.

Cede any part of your life to the state, no matter how profound, and soon enough it will hold its own in the bureaucratic triathlon of tedium, arcana, and complexity. Consider: a death certificate is a single piece of paper, one-sided. The official instructions for how to fill it out include the “Physicians’ Handbook on Medical Certification of Death” (fifty-seven pages), the “Funeral Directors’ Handbook on Death Registration and Fetal Death Reporting” (sixty pages), and the “Medical Examiners’ and Coroners’ Handbook on Death Registration and Fetal Death Reporting” (a hundred and thirty pages). This is to say nothing of various supplementary guidelines, such as “Instructions for Completion of Death Certificates in the Aftermath of a Hurricane” and “Completing the Cause-of-Death Section of the Death Certificate for Injury and Poisoning.”

Why does a one-page document require two hundred and fifty pages of instructions? The most generous answer is that death certificates are legitimately difficult to fill out. In the United States, the task of doing so often falls to interns or residents—newly minted M.D.s, in their first year or two on the job. (Death certificates, like all paperwork, obey the law of occupational gravity, and residents are on the bottom.) You have already met one of those M.D.s. Sasha Swartzman, a resident in internal medicine at the Oregon Health and Science University, was the doctor on duty when the man at the beginning of this story met his mysterious end. She describes the process of filling out death certificates as “sort of like doing your own taxes. Shouldn’t I be smart enough to know how to do this?”

A nine-year-old is smart enough to fill out ninety per cent of a death certificate. The difficulties arise almost exclusively in the cause-of-death section, which consists of just four lines. On the first, doctors are instructed to enter the “immediate cause” of death, defined on the form as the “final disease or condition resulting in death.” (If you are already pausing to consider the relationship between “immediate” and “final”: let it go.) On the second line, doctors enter whatever caused the condition on the first line, and on the third line they enter whatever caused the condition on the second line. The last line is reserved for the “underlying cause of death”: “the disease or injury that initiated the chain of morbid events that led directly and inevitably to death.” It is this line that will get turned into an ICD code and identified as the thing that killed you.

The National Center for Health Statistics provides this example of how to correctly complete those lines:

Rupture of myocardium (the immediate cause)

Acute myocardial infarction

Coronary artery thrombosis

Atherosclerotic coronary artery disease (the underlying cause).

Clear enough, even if you don’t know a thrombosis from a bass drum. But real death, like real life, is complicated, as Swartzman’s experience with the diabetic drug user shows. In that case, the immediate cause was obvious: the man died of anoxia, lack of oxygen to the brain. But why? He could have gone hypoglycemic. He could have had a seizure. He could have suffered sudden cardiac arrest. He could have overdosed, accidentally or on purpose. “At some point,” Swartzman says, “you just have to make an educated guess as to what might have happened and go with it.”

As that suggests, death certificates, again like tax returns, do not always scrupulously reflect the truth. From the beginning, they have been compromised both by the limits of medical knowledge and by dodgy reporting practices. In 1662, John Graunt complained that syphilis was underreported as a cause of death because medical investigators failed to recognize it “after the mist of a Cup of Ale, and the bribe of a two-groat fee.” Similar treatment befell other causes of death viewed as morally damning or unmentionable in polite company: tuberculosis, breast cancer, alcoholism, AIDS, suicide. To protect the reputations of the deceased and the sensibilities of survivors, doctors sometimes upgraded those socially awkward deaths to more acceptable ones—issuing, in essence, vanity death certificates. That practice was sufficiently common in nineteen-thirties New York that the city began issuing a confidential medical report of death: a second, separate document stating the real cause of death.

The practice of bowdlerizing death certificates has faded (although not disappeared), but other reporting problems persist. In 2010, researchers from St. Luke’s–Roosevelt Hospital Center and Columbia University surveyed five hundred and twenty-one doctors in thirty-eight residency programs across New York City. Only a third believed death certificates to be accurate. Nearly half reported knowingly listing an inaccurate cause of death, and that number rose to almost sixty per cent among residents with the most experience. Those who intentionally list inaccurate causes typically choose familiar ones, with the result that common causes of death appear even more common, and rare ones more rare. The Framingham Heart Study, an ongoing longitudinal study in Massachusetts, found that death certificates overstate coronary-heart disease as a cause of death by as much as twenty-four per cent in the general population and by a far greater percentage in the elderly.

Why do residents fudge these forms? Part of the problem is inadequate training; in the New York study, only two in five reported receiving any instruction in how to fill out a death certificate, and only one in five had taken the city’s ostensibly mandatory training module. But, when asked, they also pointed to other issues. Sometimes the death-registration system would not accept the cause they felt was correct. Sometimes a hospital administrator overrode them. Sometimes they had never met the patient. Under three per cent reported ever correcting a death certificate in light of new information. Reading about all this, I recalled how a doctor friend of mine had responded when I told her I was interested in death certificates and found myself thinking of them partly as a genre. “Yes,” she snorted. “Fiction.”

The errors that creep into death certificates from inadequate training and other systemic issues are troubling. They overstate leading causes of death, obscure emerging ones, and distort the data we use to allocate funds for research, education, prevention, and treatment. But bad answers are only part of the issue. A more interesting and difficult problem is how we decide what counts as a good answer.

That problem is wonderfully illustrated by a passage from “Huckleberry Finn,” which enlivens an otherwise arid report by the C.D.C. One afternoon, while chatting with the Wilks sisters, the ever-inventive Huck spontaneously invents a new disease—a form of mumps so virulent that, he claims, a neighbor is in danger of dying from it. But mumps can’t kill you, Susan Wilks protests. Oh, yes, this kind can, Huck insists, because it is all “mixed up with other things,” from “yaller janders” to “brain-fever.” Fine, Susan retorts, but in that case it’s not the mumps that will kill the neighbor: “A body might stump his toe, and take pison, and fall down the well, and break his neck, and bust his brains out, and somebody come along and ask what killed him, and some numskull up and say, ‘Why, he stumped his toe.’ ”

This is precisely the problem posed by death certificates: when filling them out, how far back should we chase the causal chain? If a stubbed toe initiates a fatal sequence of events, is it the underlying cause? Alternatively, how far forward should we chase it? If we are someday able to parse “rupture of myocardium” into its sequential parts, will it cease to be a final cause? And how many causal chains should we chase? To the annoyance of statisticians, it is perfectly possible to die of multiple causes; indeed, as more people live into extreme old age, multifactorial deaths might well become the norm. But multiple causes of death do messy things to mortality data—reporting that one person died of three causes makes it look like three hundred per cent of your population died—and death certificates are not optimized for that kind of recording.

Problems like these have troubled philosophers for centuries. It is formidably difficult to distinguish beyond doubt a cause from a non-cause, or a proximal cause from a distal cause, or which of six rock-throwing hoodlums smashed your picture window. Yet in everyday life we draw such distinctions constantly. That is not imprudent. It is expedient. Causal reasoning is motivated reasoning; we do it not to discover the fundamental make-it-happen mechanisms of the world but to achieve some ends. And that is why the stumped-toe problem matters. We identify the causes we care about—and, conversely, we care about the causes we identify.

On death certificates, the causes we identify are constrained in one specific way: to the immediate physical breakdown that triggered the events that killed you. “If someone dies of a heart attack,” Harvey Fineberg, the president of the Institute of Medicine, says, “you don’t say he died of high cholesterol, sedentary life style, and a forty-pack-year history of smoking.” For that matter, he notes, we no longer say that “you died of despair, you died of poverty, you died of heartbreak. But certainly those are all pretty clear risks for premature death.”

That point has been made, and contested, many times before. In a now famous 1993 paper called “Actual Causes of Death in the United States,” the epidemiologists William Foege and Michael McGinnis showed that roughly half of all deaths in the United States in 1990 could be attributed to nine factors not listed on death certificates: tobacco, diet and physical activity, alcohol, microbial agents, toxic agents, firearms, sexual behavior, motor vehicles, and illicit use of drugs. Omitting such causes mattered, they argued, because the conditions listed on death certificates get the lion’s share of U.S. health-care allocations. Yet the non-listed causes might make better investments; the earlier you intervene on a causal chain the easier and cheaper the intervention tends to be. Consider the relative costs, literal and figurative, of anti-smoking campaigns versus smoking-cessation programs versus lung-cancer treatment.

We could, in theory, redesign death certificates to capture more distant links in the causal chain. But it is not clear that we should. For one thing, a harried young doctor completing a death certificate is unlikely to have access to the desired information. For another, there is an inherent trade-off to adding more fields to any form. Thomas Frieden, the director of the C.D.C., puts it concisely: “The quality of the data you collect is inversely proportional to the amount of data you collect from each reporter.” That is, if you increase the number of questions you ask on a death certificate, you decrease the accuracy of the answers. “There’s lots more information, different information, better information I’d love to have,” Frieden acknowledges. “But whether the juice is worth the squeeze is the question.”

In Bernard Malamud’s short story “Take Pity,” a census-taker named Davidov asks a man named Rosen how an acquaintance of his died. When Rosen shrugs off the question, the census-taker grows irritable:

“How did he die?” Davidov spoke impatiently. “Say in one word.”

“From what he died?—he died, that’s all.”

“Answer, please, this question.”

“Broke in him something. That’s how.”

“Broke what?”

“Broke what breaks.”

Thus does the mandate of data collection—say it in one word—meet the mystery of dying. That encounter is improbable, uncomfortable, and, as exemplified by death certificates, one of the most felicitous in history. In the past two centuries, global life expectancy has more than doubled, from twenty-eight years to seventy-one. In the United States, the infant-mortality rate in 1900 hovered around one in three; today, it is barely six in a thousand. Death certificates did not bring all this about unilaterally, of course. But it is a measure of their importance that, without them, we wouldn’t even know these numbers.

Still, that importance, like a life, has a limit. The C.D.C. will tell you that a death certificate, in addition to its primary functions, “provides family members closure, peace of mind, and documentation of the cause of death.” But death certificates and family members are like Davidov and Rosen. Both may ask why someone died, but the causes that count as good answers are irreconcilably different. As the bereaved, we ask because we want to know if a loved one suffered or was at peace, or if her death was meaningful, or whether we could have prevented it, or how the universe could have permitted it.

On all those questions, a death certificate is mute. Instead, it provides the pathological basis of death, determined by some combination of fact, convention, and guesswork, and described in terms that most non-doctors struggle to understand. That is the kind of answer it should give; a death certificate is not Auden’s elegy for Yeats, meant to both solemnize and lift our grief.

Nor is a death certificate likely to provide peace of mind in its other capacity. Among forms of personal identification, the death certificate is the one that undoes the work of all the rest, removing someone we love from Social Security rolls and voting registers and all the other ranks of the living. That process might be necessary, but it is hardly soothing. The bureaucratization of death that began with the Bills of Mortality has evolved over time into a massively complex checkpoint at the border between the living and the dead: Charon’s T.S.A. At its behest, we supply death certificates to cell-phone companies to induce them to terminate contracts; to airlines to release frequent-flier miles; to Netflix, no kidding, to cancel accounts. We track down fax machines to send copies to six separate government offices, and send another to an attorney via registered mail. In short, we spend vast amounts of energy using death certificates to convince various entities of what is, to us, the most devastatingly obvious fact in the world: that someone we love is no more.

The primary purpose of a death certificate is to explain why we die. But when we are in the pitch of grief—or, for that matter, in the full sunshine of joy—what form, what blank, what cause, whether final, immediate, or underlying, could possibly answer that question to anyone’s satisfaction? Why do we die? For all the medical advances of modernity, there is a sense in which the ancient fatalists had it right. Broke what breaks. We die because we were born; because we are mortal; because that is, after all, life. ♦

Published in the print edition of the April 7, 2014, issue of The New Yorker.

 

US-China Confrontation Will Define Global Order

Friday, May 8th, 2020

China is the source of the COVID-19 virus that is presently destroying our world. China is America’s most serious enemy, both economically and militarily. Somehow, after the virus is gone, we will need to deal with the Chinese. Victor Davis Hanson lays out some interesting ideas.

Victor Davis Hanson: US-China Confrontation Will Define Global Order

Monday, May 20, 2019

Hoover Institution, Stanford University

The United States is at a crossroads with an increasingly aggressive China, which could define America’s security and the international order for decades to come, Hoover scholar Victor Davis Hanson says.

Hanson, the Martin and Illie Anderson Senior Fellow at the Hoover Institution, studies military history and the classics. Last year, Hanson won the Edmund Burke Award, which honors people who have made major contributions to the defense of Western civilization. He is the author of the 2019 book The Case for Trump, and 2017’s The Second World Wars. He was recently interviewed on US policy toward China:

What is the Trump strategy behind these tariffs, short term and long term?

Hanson: Short term, Trump feels that he can take the hit of reciprocal Chinese tariffs, given that quietly his opposition, the Democrats, have been raging about Chinese cheating for decades, and, second, that the US economy is so huge and diverse that China simply cannot cause serious damage.

Remember, the United States is a country one-third the size of China that produces over double China’s annual gross domestic product and fields a military far more formidable with far more allies—while enjoying a far more influential global culture and a far more sophisticated system of higher education and technological innovation. China’s Asian neighbors and our own European Union allies quietly are hoping Trump can check and roll back Chinese mercantilism, while publicly and pro forma chiding or even condemning Trump’s brinksmanship and his resort to fossilized strategies such as tariffs and loud jawboning.

Long term, Trump believes that if present trends are not reversed, China could in theory catch and surpass the US. And as an authoritarian, anti-democratic superpower, China’s global dominance would not be analogous to the American-led postwar order, but would be one in which China follows one set of rules and imposes a quite different set on everyone else—perhaps one day similar to the system imposed on its own people within China.

Is China a more formidable rival now than Russia was during the Cold War, and if so, why?

Hanson: Yes. Its population is five times greater than even that of the old Soviet Empire. Its economy is well over twenty times larger, and over a million Chinese students and business people are in European and American universities and colleges or posted abroad with Chinese companies. So, unlike the old Soviet Union, China is integrated within the West, culturally, economically, and politically. The Soviets—like Maoist China—never leased Western ports, or battled Hollywood over unflattering pictures, or posed as credible defenders of Asian values, or owned large shares of Western companies, or piled up huge trade surpluses with Western nations. Soviet propaganda and espionage were crude compared to current Chinese efforts.

What is China doing in terms of cheating on trade and intellectual property as the Trump administration says, and how can the United States stop this behavior? 

Hanson: China does not honor patents and copyright laws. It still exports knock-off and counterfeit products. It steals research and development investment through a vast array of espionage rings. It manipulates its currency.

Its government companies export goods at below the cost of production to grab market share. It requires foreign companies to hand over technology as a price of doing business in China. And, most importantly, it assumes, even demands, that Western nations do not emulate its own international roguery—or else.

The result is a strange paradox in which the United States and Europe assume that China is an international commercial outlaw, but the remedy is deemed worse than the disease. So many Western firms make enormous profits in China through joint projects, so many academic institutions depend on Chinese students, and so many financial institutions are invested in China, that to question its mercantilism is to be derided as a quaint nationalist, or a dangerous protectionist, or a veritable racist. China is an astute student of the Western science of victimology and always poses as a target of Western vindictiveness, racism, or puerile jealousy.

Remedies? First, we must give up the 40-year fantasies that the richer China gets, the more Western and liberal it will become; or that the more China becomes familiar with the West, the greater its admiration and respect for Western values; or that China has so many internal problems that it cannot possibly pose a threat to the West; or that Western magnanimity in foreign policy and trade relations will be appreciated and returned in kind. Instead, the better paradigm is imperial Japan between 1930 and 1941, when Tokyo absorbed Asian allies; had sent a quarter-million students and attachés to the West to learn or steal technology and doctrine; rapidly Westernized; declared Western colonial powers and the US as tired and spent, and without any legitimate business in the Pacific; and considered its own authoritarianism a far better partner to free market capitalism than the supposedly messy and clumsy democracies of the West.

How is China able now to leverage its arguably less powerful military to confront the United States globally?

Hanson: Global naval dominance is not in the Chinese near future. Its naval strategy is more reminiscent of the German Kriegsmarine of 1939 to 1941, which sought to deny the vastly superior Royal Navy access at strategic points without matching its global reach. China is carving out areas where shore batteries and coastal fleets can send showers of missiles to take out a multibillion-dollar American carrier. And its leasing of 50 or more strategically located ports might serve in times of global tensions as transit foci for armed merchant ships. But for now China does not have the capabilities of the American carrier or submarine fleet or expeditionary Marine forces—so the point is to deny America reach, not to emulate its extent.

Why are the current administration’s policies for confronting China, on many different fronts and levels, different from those of past administrations?

Hanson: Trump believes that economic power is the key to global influence and clout. Without it, a military wilts on the vine. A country with GDP growth at a 3 percent annual clip, energy independence, full employment, and increasing labor productivity and trade symmetry can renegotiate Chinese mercantilism and reassure China’s Asian neighbors that they need not appease its aggression. Past administrations might have agreed that China violated copyright and patent laws, dumped subsidized goods, appropriated technology, and ran a massive global espionage apparatus, but they considered remedies either impossible or dangerous and so essentially negotiated a slowing of the supposed predestined Chinese global hegemony. Trump was willing to confront China to achieve fair rather than free trade and take the ensuing heat that he was some sort of tariff-slapping Neanderthal.

Any other thoughts?

Hanson: I think Secretary of State Mike Pompeo’s State Department is the first to openly question the idea that China will eventually rule the world and has offered a strategic plan to check its trade and political agendas. In this regard, a number of Hoover Institution scholars, currently working with Hoover fellow Kiron Skinner, director of policy planning at the US Department of State, are offering alternatives to orthodox American approaches of the past, with the caveat that the most dangerous era in interstate relations is the transition from de facto appeasement to symmetry—given that the abnormalities of the past had become considered “normal,” and the quite normal efforts of a nation to recalibrate to a balanced relationship are damned as dangerously “abnormal.”

Victor Davis Hanson is also the chairman of the Role of Military History in Contemporary Conflict Working Group at the Hoover Institution. 

 

The “New Normal”: Thoughts about the Shape of Things to Come in the Post-Pandemic World

Sunday, May 3rd, 2020

Here is a very interesting and insightful essay on what the future may hold after COVID-19. It is disturbing, but certainly quite possible.

 

The “New Normal”: Thoughts about the Shape of Things to Come in the Post-Pandemic World

by Nicholas Eberstadt

April 18, 2020

Nicholas Eberstadt holds the Henry Wendt Chair at the American Enterprise Institute in Washington, D.C., and is a Senior Advisor to the National Bureau of Asian Research. Here he offers insights into the challenges to U.S. leadership in a post-pandemic world. This is the inaugural essay in the series “The New Normal in Asia,” which explores ways in which the Covid-19 pandemic might adjust, shape, or reorder the world across multiple dimensions.

Though we are as yet barely weeks into the Covid-19 pandemic, what should already be apparent is that it has precipitated the deepest and most fundamental crisis for Pax Americana that this set of global economic and security arrangements has faced in the past three postwar generations.

We are still very much in the “fog of war” phase of the calamity. The novel coronavirus and its worldwide carnage have come as a strategic surprise to thought leaders and political decision-makers alike. Indeed, it appears to be the intellectual equivalent of an unexpected asteroid strike for almost all who must cope in these unfamiliar new surroundings. Few had seriously considered the contingency that the world economy might be shaken to its foundations by a communicable disease. And even now that this has happened, many remain trapped in the mental coordinates of a world that no longer exists.

Such “prewar” thinking is evident everywhere right now in the earliest phase of what may turn out to be a grave and protracted crisis. Here in the United States, we watch, week by week, as highly regarded financial analysts from Wall Street and economists from the academy misestimate the depths of the damage we can expect—always erring on the side of optimism.

After the March lockdown of the country to “flatten the curve,” the boldest voices dared to venture that the United States might hit 10% unemployment before the worst was over. Four weekly jobless claims reports and 22 million unemployment insurance applications later, U.S. unemployment is already above the 15% mark: north of 1931 levels, in other words. By the end of April, we could well reach or break the 20% threshold, bringing us to 1935 levels, and 1933 levels (25%) no longer sound fantastical. Even so, political and financial leaders talk of a rapid “V-shaped recovery” commencing in the summer, bringing us back to economic normalcy within months. This is prewar thinking, and it is looking increasingly like the economic equivalent of talk in earlier times about how “the boys will be home by Christmas.”

This is moreover a global crisis, and vision has not yet focused on the new realities in other leading powers and major economies. If we try to take an unflinching measure of the impact globally, we can see both good news and bad news—although the two are by no means equally balanced.

The good news is that policymakers the world over have learned from the prewar Great Depression and are unlikely to repeat its exact mistakes. Instead of reducing the money supply and forcing bank collapses, the U.S. Federal Reserve this time is flooding the world with liquidity. Likewise, U.S. fiscal policy, far from attempting to impose further austerity on an already imploding economy through balancing budgets, is embracing Keynesianism with an abandon that might have startled Keynes himself. Given the “stimulus” packages already passed in the last month, this year’s U.S. budget deficit to GDP ratio is already certain to be of World War II scale. And, at least so far, no emanations of Smoot-Hawley-like impulses are on the policy horizon. Last time around, protectionism had devastating reverberations on an already severely stressed international trade and financial system. Confidence in U.S. and international economic management of the current crisis, at least for the time being, is reflected inter alia in the surprisingly sanguine valuations of the stock indices both in the United States and abroad.

The bad news, on the other hand, lies in the nature of the virus itself and in its implications for human life and socioeconomic arrangements. Covid-19 is an extremely contagious virus with high lethality for those exposed to it, and it can be transmitted by asymptomatic “super spreaders.” Further, since this disease is zoonotic (contracted from another species) and novel (our species has no preexisting immunity), the pandemic will roam the world in search of human quarry until an effective vaccine is invented and mass-produced—or until so many people are infected that herd immunity is conferred.

A Darwinian experiment to invite global herd immunity is unthinkable because it could entail untold millions of deaths. New vaccines, for their part, typically take many years to develop. Barring some miracle, even a crash program to perfect a vaccine is currently expected to take at least a year, and it could be a year and a half or longer before a serviceable serum is generally available to the public. Reports now emanating from South Korea, moreover, suggest that survivors might also be susceptible to reinfection. If so, the quest to come up with a lasting inoculation against Covid-19 may be all that much more daunting.

Consequently, societies the world over face the prospect of rolling lockdowns and quarantines until such time as a technological breakthrough rescues them from this condition. This would seem to mean that not just a single national lockdown of a country’s population and economy is in store to fend off mass contagion but rather quite possibly a succession of them—not just one mother-of-all-economic-shocks but an ongoing crisis that presses economic performance severely in countries all around the world simultaneously.

The potential downside of this crisis looks dire enough for affluent societies: even with excellent economic management, they may be in for gruesome recessions, both painful and prolonged. But the situation for the populations of low-income countries—and for least-developed, fragile states—could prove positively catastrophic. Not only are governments in these locales much less capable of responding to pandemics, but malnourished and health-compromised people are much more likely to succumb to them. Even apart from the humanitarian disasters that may result directly from raging outbreaks in poor countries, terrible indirect consequences may also lie in wait for these vulnerable societies. The collapse of economic activity, including demand for commodities, such as minerals and energy, will mean that export earnings and international remittances to poor countries are set to crash in the months ahead and remain low for an indefinite period. Entirely apart from contagion and lockdowns, this can only mean an unavoidable explosion of desperate need—and under governments least equipped to deal with this. While we can hope for the best, the worst could be much, much worse than most observers currently imagine.

Eventually, of course, we will emerge from the current crisis. Envisioning the post-crisis “new normal” is extraordinarily difficult at this early juncture—not that much less demanding, perhaps, than imagining what the postwar world would look like from the vantage point of, say, autumn 1939. Lacking clairvoyance, we can only peer through the glass darkly at what may be the shape of things to come in the post-pandemic order. Yet it is not too soon to offer one safe prediction about that coming order, and to identify three critical but as yet unanswerable questions, the answers to which promise to shape it decisively.

The safe prediction is that the Indo-Pacific, then as now, will be the locus of global economic, political, and military power—and will remain so for at least the coming generation, possibly much longer. Currently, countries belonging to the Asia-Pacific Economic Cooperation (APEC) forum account for as much as 60% of the world’s estimated GDP and close to half of global trade. If we add India, which is not an APEC member, to that roster, the economic predominance of the region looks even more overwhelming. APEC plus India likewise accounts for much—perhaps most—of the ongoing knowledge production in the world today. By such necessarily imprecise measures as publications in peer-reviewed scientific journals, authors from the APEC-plus-India region are responsible for about three-fifths of current global output. The only state with truly global military capabilities (the United States) is part of this region, as are the only two other governments entertaining global strategic ambitions (China and Russia). In addition to these countries, India and (alas) North Korea are nuclear weapons states. For the moment, the combined nuclear potential of all nuclear powers outside the APEC-plus-India region (France, Britain, Pakistan, and Israel) is dwarfed by the atomic arsenals within it.

Barring a catastrophe of truly biblical proportions (a formulation that may admittedly seem to be tempting fate, given current circumstances), it is impossible to see what configuration of states or regions could displace the Indo-Pacific as the epicenter of world power anytime soon. Someday Africa might in theory become a contender for geopolitical dominance, but that date looks so distant that such scenarios are, for now, perhaps best narrated by science fiction writers.

As for the questions that stand decisively to shape the coming global order, the first concerns the scope and character of what we have been calling “globalization” in the years and decades ahead. Will the Covid-19 pandemic bring a brutal end to the second age of globalization that began in 1945, just as World War I heralded the cataclysmic death of the first globalization (1870–1914)?

At this early point in the crisis, it would take a brave (or foolish) soul to assert confidently that an end to our current far-reaching arrangements for world economic integration simply could not happen. That said, at least for now, it looks as if a great many things that have not yet gone wrong would have to go wrong, all at once, to sweep away the foundations of the networks of trade, finance, communications, technology, culture, and more that have come to deeply connect societies all around the world today. Nothing less than a continuing, cascading, and unabated series of worldwide political blunders—not excluding military adventures—would be required to burn this edifice to the ground.

On the other hand, it is also hard to see how a post-pandemic world will pick itself up and carry on with commerce, finance, and global governance as if nothing much happened around the year 2020. Even under optimistic assumptions—i.e., assumptions wherein the second age of globalization survives Covid-19’s heavy blow—much will need to be dramatically different. Until the advent of some biometric, post-privacy future, the more or less free movement of peoples across national borders will be a nonstarter. “Davos” stands to become a quaint word, somewhat like “Esperanto,” as national interests and economic nationalism come roaring back. International supply chains will tend to be re-sourced domestically, notwithstanding the immediate apparent cost in production and profits. At the same time, today’s crisis may explode and wipe out old, inefficient business models that had already outlived their usefulness: the “big box” store and the retail mall, the unproductive (but sociologically alluring) office, the law firm (with its Soviet-style valuation of services on the basis of inputs rather than outputs), perhaps the cartelized, price-fixing university as well, and more.

On the positive side, the creative destruction the crisis will unleash will eventually offer immense opportunities for innovation and dynamic improvements in productivity, so long as resources from inefficient or bankrupt undertakings are reallocated to more promising new purposes. To give just one example, the returns on remote communications will likely be high, incentivizing impressive breakthroughs. Post-pandemic economies around the world will need all the productivity surges they can squeeze out of technological and organizational innovation, too—for they will almost certainly be saddled with a far higher burden of public debt than today. Moreover, given current demographic trends and the prospect of significantly less immigration, the shrinking of labor forces and the pronounced aging of national populations may be characteristic of a growing number of economies in the APEC-plus-India region and the rest of the world, and not just in high-income settings. Japan may become a model here, but not in a good way: avoiding “Japanification” could become a preoccupation of policymakers, pundits, and populaces in an epoch of diminished expectations for globalization.

A second huge question for the post-pandemic world concerns China: more specifically, how will the rest of the international community treat this increasingly powerful but intrinsically problematic state?

The world has yet to conduct the authoritative blue-ribbon scientific inquiry into the origins of the coronavirus pandemic that is obviously and urgently needed. However, there is little doubt that heavy responsibility for the global health and economic crisis we are now coping with falls on the Chinese Communist Party (CCP)—and to a lesser but by no means negligible degree, on China’s collaborators within the World Health Organization. Had the CCP placed its population’s health above its own interests—had it behaved like an open society or followed international transparency norms—there is no question that the global toll from the Covid-19 pandemic would be only a fraction of what has been exacted to date. Epidemiologists from the University of Southampton in the United Kingdom have suggested that the damage might have been contained to just 5% of what we have thus far suffered had there been an expeditious (and honest) response to the Wuhan outbreak. Even if that estimate is overly precise, it gives a sense of the price the world has paid for the CCP’s priorities and standard operating procedure. We also already know of the complicity of the World Health Organization, at its highest levels, in buying time for Beijing as the regime figured out how to spin the story of what happened in Hubei Province.

It would be one thing if this crisis were a one-off—dreadful as the tragedy would be. The problem, unfortunately, is that it is not a one-off, and in fact cannot be. At the heart of the tragedy is an uncomfortable but unavoidable truth: the CCP simply does not share the same interests and norms as the international community into which it has been so momentously and thoroughly integrated. Moreover, there is scant evidence that integration into the world economy and global governance has been “reforming” the Chinese regime, in the sense of bringing its politics and behavior into closer alignment with those acceptable to Western populations. Quite the contrary: in the Xi Jinping era, China’s politics have manifestly been moving away from convergence as the regime has concentrated on perfecting a surveillance state policed by “market totalitarianism” (a social credit system powered by big data, artificial intelligence, and more).

Thus, the post-pandemic world will have no choice but to contend at last with a problem long in the making: the awful dilemma of global integration without solidarity. China is deeply interlinked with every APEC-plus-India economy and with those of the rest of the world as well. Chinese interests are likewise deeply embedded in much of the institutional apparatus that has evolved to facilitate international cooperation. How will the rest of the countries in the international community manage to protect their interests (including health security interests, but by no means limited to this alone) in such a world? Will it be possible to accurately identify and carefully isolate all the areas in which win-win transactions with the CCP are genuinely possible and cordon off everything else? Or will the CCP’s authoritarian influence compromise, corrupt, and degrade these same institutions, and likewise constrain or poison opportunities for truly free international economic cooperation and development after the Covid-19 pandemic?

Last, but by no means least important, there is the question of the United States’ disposition in a post-pandemic world.

Even before the Covid-19 crisis, it was not exactly a secret that the United States—which is to say, Americans—was becoming increasingly reluctant to shoulder responsibility for world leadership in the global order that Washington had been instrumental in creating and that U.S. power was indispensable in supporting. The skepticism and disfavor with which American proponents of internationalism were increasingly greeted at home, however, were not entirely explained by the deep historical roots of isolationism in our country. Nor can they be dismissively described as yet another paroxysm of paranoia and anti-intellectualism on the part of the yahoos, as would-be Hofstadters from today’s chattering classes would have it.

Such discontent with our nation’s considerable international obligations correlates strongly with socioeconomic status. For those in the bottom half of the country, grievances with the status quo (which, not so incidentally, includes a strong political commitment to Pax Americana) are by no means delusional. Over the past two generations, the American escalator has broken down for many. Just before the Covid-19 crisis, at the supposed peak of a business cycle, work rates for prime-age American men (the 25–54 age group) were slightly lower than they had been in 1939, near the end of the Great Depression. It is hardly reassuring that this alarming situation has attracted relatively little attention from the talking and deciding classes (many of whom are shielded from personal familiarity with how the other half lives by Charles Murray’s famous bubble).

Scarcely less disconcerting than the work rates for American men are the dismal trends in wealth formation for the less well-to-do. According to estimates by the Federal Reserve, the mean real net worth of the bottom half of households in the United States was lower in 2019 than it had been in 1989, when the Berlin Wall fell. By these estimates, in fact, the net worth of such households was about a third lower in 2019 than it had been three decades before. Voters from these households might be excused if they were prompted to ask what the fabled “end of the Cold War” had done for them. Recall that these same Americans witnessed a decline in net household worth over a period when overall nominal net worth in the United States soared by almost $80 trillion—an average of almost $250,000 for every man, woman, and child in our country today. Since the arrival of Covid-19 on our shores, the net worth of the bottom half of Americans has dropped still further, as their indebtedness has risen and the value of their assets (mainly homes) has declined. It could be quite some time before the balance sheets of those households look as “favorable” as they did in 2019.

In the United States, the constitutional requirement to obtain the consent of the governed applies to the little people, too, even if they happen to comprise a majority of voters. And in a post-pandemic world, it may be even more difficult to convince a working majority that the globalized economy and other international entanglements actually work in their favor.

If U.S. leaders want to generate broad-based domestic support for Pax Americana, they need to devise a formula for delivering prosperity for all. Such an agenda, of course, would win on its own merits, with or without an eye toward international security. Absent such a credible agenda, popular support for U.S. international leadership could prove increasingly open to question in the post-pandemic United States. The peril that declining domestic U.S. support poses to the current global order should not be minimized. If or when Pax Americana is destroyed, its demise may be due not to threats from without but rather to pressures from within.

Nicholas Eberstadt holds the Henry Wendt Chair at the American Enterprise Institute in Washington, D.C., and is a Senior Advisor to the National Bureau of Asian Research.

Trump Derangement Syndrome

Monday, September 23rd, 2019

As always, Victor Davis Hanson has produced a superb essay, which in this case helps explain much of why Trump is so intensely hated.

The Daily Signal

The Real Reason for Trump Derangement Syndrome

Victor Davis Hanson

September 19, 2019

Donald Trump is waging a nonstop, all-encompassing war against progressive culture, in magnitude analogous to what 19th-century Germans once called a Kulturkampf.

As a result, not even former President George W. Bush has incurred the degree of hatred from the left that is now directed at Trump. For most of his time in office, Trump, his family, his friends, and his businesses have been investigated, probed, dissected, and constantly attacked.

In 2016 and early 2017, Barack Obama appointees in the FBI, CIA, and Department of Justice tried to subvert the Trump campaign, interfere with his transition, and, ultimately, abort his presidency. Now, congressional Democrats promise impeachment before the 2020 election.

The usual reason for such hatred is said to be Trump’s unorthodox and combative take-no-prisoners style. Critics detest his crude and unfettered assertions, his lack of prior military or political experience, his attacks on the so-called bipartisan administrative state, and his intent to roll back the entire Obama-era effort of “fundamentally transforming” the country leftward.

Certainly, Trump’s agenda of closing the border, using tariffs to overturn a half-century of Chinese mercantilism, and pulling back from optional overseas military interventions variously offends both Democrats and establishment Republicans.

Trump periodically and mercurially fires his top officials. He apparently does not care whether the departed write damning memoirs or join his opposition. He will soon appoint his fourth national security adviser within just three years.

To make things worse for his critics, Trump’s economy is booming as never before in the 21st century: near record-low unemployment, a record number of Americans working, increases in workers’ wages and family incomes, low interest rates, low inflation, steady GDP growth, and a strong stock market.

Yet the real source of Trump derangement syndrome is his desire to wage a multifront pushback—politically, socially, economically, and culturally—against what might be called the elite postmodern progressive world.

Contemporary elites increasingly see nationalism and patriotism as passé. Borders are 19th-century holdovers.

The European Union, not the U.S. Constitution, is seen as the preferable model to run a nation. Transnational and global organizations are wiser on environmental and diplomatic matters than is the U.S. government.

The media can no longer afford to be nonpartisan and impartial in its effort to rid America of a reactionary such as Trump, given his danger to the progressive future.

America’s ancient sins can never really be forgiven. In a new spirit of iconoclasm, thousands of buildings, monuments, and statues dedicated to American sinners of the past must be destroyed, removed, or renamed.

A new America supposedly is marching forward under the banner of ending fossil fuels, curbing the Second Amendment, redistributing income, promoting identity politics and open borders, and providing free college, free health care, and abortion on demand.

An insomniac Trump fights all of the above nonstop and everywhere. In the past, Republican presidents sought to slow the progressive transformation of America but despaired of ever stopping it.

No slugfest is too off-topic or trivial for Trump. Sometimes that means calling out former NFL quarterback Colin Kaepernick for persuading NFL stars to kneel during the national anthem. Huge, monopolistic Silicon Valley companies are special Trump targets. Sometimes Trump enters cul-de-sac Twitter wars with Hollywood has-beens who have attacked him and his policies.

Trump variously goes after Antifa, political correctness on campus, the NATO hierarchy, the radical green movement, Planned Parenthood, American universities, and, above all, the media—especially CNN, The Washington Post, and The New York Times.

For all the acrimony and chaos—and prognostications of Trump’s certain failure—a bloodied Trump wins more than he loses. NATO members may hate Trump, but more are finally paying their promised defense contributions.

In retrospect, many Americans concede that the Iran deal was flawed and that the Paris climate accord was mere virtue-signaling. China was long due for a reckoning.

Special counsel Robert Mueller’s investigation proved fruitless and was further diminished by Mueller’s bizarrely incoherent congressional testimony.

Some of the most prominent Trump haters—Michael Avenatti, James Comey, Andrew McCabe, Anthony Scaramucci, and Rep. Adam Schiff—either have been discredited or have become increasingly irrelevant.

Trump has so enraged his Democratic adversaries that the candidates to replace him have moved farther to the left than any primary field in memory. They loathe Trump, but in their abject hatred they have let him goad them into revealing their support for the crazy Green New Deal, reparations for slavery, relaxed immigration policies, and trillions of dollars in new free stuff.

In a way, the left-wing Democratic presidential candidates understand Trump best. If he wins his one-man crusade to stop the progressive project, they are finished, and their own party will make the necessary adjustments and then sheepishly drift back toward the center.

© 2019 Tribune Content Agency, LLC.

Commentary By

Victor Davis Hanson @VDHanson

Victor Davis Hanson is a classicist and historian at the Hoover Institution at Stanford University, and author of the book “The Second World Wars: How the First Global Conflict Was Fought and Won.” You can reach him by e-mailing authorvdh@gmail.com.

William S. Frankl, MD, All Rights Reserved