Blog by Novelist William S. Frankl, MD

Archive for the ‘Culture and Religion’ Category

A Letter on Justice and Open Debate

Sunday, August 2nd, 2020

This letter should be of great importance whether you are a Republican, a Democrat, or an independent. It goes to the heart of who we are and how we must get along with each other, or else we will dissolve into a quagmire of hatred.


Harper’s Magazine

A Letter on Justice and Open Debate

July 7, 2020
The below letter will be appearing in the Letters section of the magazine’s October issue. We welcome responses at letters@harpers.org

Our cultural institutions are facing a moment of trial. Powerful protests for racial and social justice are leading to overdue demands for police reform, along with wider calls for greater equality and inclusion across our society, not least in higher education, journalism, philanthropy, and the arts. But this needed reckoning has also intensified a new set of moral attitudes and political commitments that tend to weaken our norms of open debate and toleration of differences in favor of ideological conformity. As we applaud the first development, we also raise our voices against the second. The forces of illiberalism are gaining strength throughout the world and have a powerful ally in Donald Trump, who represents a real threat to democracy. But resistance must not be allowed to harden into its own brand of dogma or coercion—which right-wing demagogues are already exploiting. The democratic inclusion we want can be achieved only if we speak out against the intolerant climate that has set in on all sides.

The free exchange of information and ideas, the lifeblood of a liberal society, is daily becoming more constricted. While we have come to expect this on the radical right, censoriousness is also spreading more widely in our culture: an intolerance of opposing views, a vogue for public shaming and ostracism, and the tendency to dissolve complex policy issues in a blinding moral certainty. We uphold the value of robust and even caustic counter-speech from all quarters. But it is now all too common to hear calls for swift and severe retribution in response to perceived transgressions of speech and thought. More troubling still, institutional leaders, in a spirit of panicked damage control, are delivering hasty and disproportionate punishments instead of considered reforms. Editors are fired for running controversial pieces; books are withdrawn for alleged inauthenticity; journalists are barred from writing on certain topics; professors are investigated for quoting works of literature in class; a researcher is fired for circulating a peer-reviewed academic study; and the heads of organizations are ousted for what are sometimes just clumsy mistakes. Whatever the arguments around each particular incident, the result has been to steadily narrow the boundaries of what can be said without the threat of reprisal. We are already paying the price in greater risk aversion among writers, artists, and journalists who fear for their livelihoods if they depart from the consensus, or even lack sufficient zeal in agreement.

This stifling atmosphere will ultimately harm the most vital causes of our time. The restriction of debate, whether by a repressive government or an intolerant society, invariably hurts those who lack power and makes everyone less capable of democratic participation. The way to defeat bad ideas is by exposure, argument, and persuasion, not by trying to silence or wish them away. We refuse any false choice between justice and freedom, which cannot exist without each other. As writers we need a culture that leaves us room for experimentation, risk taking, and even mistakes. We need to preserve the possibility of good-faith disagreement without dire professional consequences. If we won’t defend the very thing on which our work depends, we shouldn’t expect the public or the state to defend it for us.

Elliot Ackerman
Saladin Ambar, Rutgers University
Martin Amis
Anne Applebaum
Marie Arana, author
Margaret Atwood
John Banville
Mia Bay, historian
Louis Begley, writer
Roger Berkowitz, Bard College
Paul Berman, writer
Sheri Berman, Barnard College
Reginald Dwayne Betts, poet
Neil Blair, agent
David W. Blight, Yale University
Jennifer Finney Boylan, author
David Bromwich
David Brooks, columnist
Ian Buruma, Bard College
Lea Carpenter
Noam Chomsky, MIT (emeritus)
Nicholas A. Christakis, Yale University
Roger Cohen, writer
Ambassador Frances D. Cook, ret.
Drucilla Cornell, Founder, uBuntu Project
Kamel Daoud
Meghan Daum, writer
Gerald Early, Washington University-St. Louis
Jeffrey Eugenides, writer
Dexter Filkins
Federico Finchelstein, The New School
Caitlin Flanagan
Richard T. Ford, Stanford Law School
Kmele Foster
David Frum, journalist
Francis Fukuyama, Stanford University
Atul Gawande, Harvard University
Todd Gitlin, Columbia University
Kim Ghattas
Malcolm Gladwell
Michelle Goldberg, columnist
Rebecca Goldstein, writer
Anthony Grafton, Princeton University
David Greenberg, Rutgers University
Linda Greenhouse
Rinne B. Groff, playwright
Sarah Haider, activist
Jonathan Haidt, NYU-Stern
Roya Hakakian, writer
Shadi Hamid, Brookings Institution
Jeet Heer, The Nation
Katie Herzog, podcast host
Susannah Heschel, Dartmouth College
Adam Hochschild, author
Arlie Russell Hochschild, author
Eva Hoffman, writer
Coleman Hughes, writer/Manhattan Institute
Hussein Ibish, Arab Gulf States Institute
Michael Ignatieff
Zaid Jilani, journalist
Bill T. Jones, New York Live Arts
Wendy Kaminer, writer
Matthew Karp, Princeton University
Garry Kasparov, Renew Democracy Initiative
Daniel Kehlmann, writer
Randall Kennedy
Khaled Khalifa, writer
Parag Khanna, author
Laura Kipnis, Northwestern University
Frances Kissling, Center for Health, Ethics, Social Policy
Enrique Krauze, historian
Anthony Kronman, Yale University
Joy Ladin, Yeshiva University
Nicholas Lemann, Columbia University
Mark Lilla, Columbia University
Susie Linfield, New York University
Damon Linker, writer
Dahlia Lithwick, Slate
Steven Lukes, New York University
John R. MacArthur, publisher, writer
Susan Madrak, writer
Phoebe Maltz Bovy, writer
Greil Marcus
Wynton Marsalis, Jazz at Lincoln Center
Kati Marton, author
Debra Mashek, scholar
Deirdre McCloskey, University of Illinois at Chicago
John McWhorter, Columbia University
Uday Mehta, City University of New York
Andrew Moravcsik, Princeton University
Yascha Mounk, Persuasion
Samuel Moyn, Yale University
Meera Nanda, writer and teacher
Cary Nelson, University of Illinois at Urbana-Champaign
Olivia Nuzzi, New York Magazine
Mark Oppenheimer, Yale University
Dael Orlandersmith, writer/performer
George Packer
Nell Irvin Painter, Princeton University (emerita)
Greg Pardlo, Rutgers University – Camden
Orlando Patterson, Harvard University
Steven Pinker, Harvard University
Letty Cottin Pogrebin
Katha Pollitt, writer
Claire Bond Potter, The New School
Taufiq Rahim
Zia Haider Rahman, writer
Jennifer Ratner-Rosenhagen, University of Wisconsin
Jonathan Rauch, Brookings Institution/The Atlantic
Neil Roberts, political theorist
Melvin Rogers, Brown University
Kat Rosenfield, writer
Loretta J. Ross, Smith College
J.K. Rowling
Salman Rushdie, New York University
Karim Sadjadpour, Carnegie Endowment
Daryl Michael Scott, Howard University
Diana Senechal, teacher and writer
Jennifer Senior, columnist
Judith Shulevitz, writer
Jesse Singal, journalist
Anne-Marie Slaughter
Andrew Solomon, writer
Deborah Solomon, critic and biographer
Allison Stanger, Middlebury College
Paul Starr, American Prospect/Princeton University
Wendell Steavenson, writer
Gloria Steinem, writer and activist
Nadine Strossen, New York Law School
Ronald S. Sullivan Jr., Harvard Law School
Kian Tajbakhsh, Columbia University
Zephyr Teachout, Fordham University
Cynthia Tucker, University of South Alabama
Adaner Usmani, Harvard University
Chloe Valdary
Helen Vendler, Harvard University
Judy B. Walzer
Michael Walzer
Eric K. Washington, historian
Caroline Weber, historian
Randi Weingarten, American Federation of Teachers
Bari Weiss
Sean Wilentz, Princeton University
Garry Wills
Thomas Chatterton Williams, writer
Robert F. Worth, journalist and author
Molly Worthen, University of North Carolina at Chapel Hill
Matthew Yglesias
Emily Yoffe, journalist
Cathy Young, journalist
Fareed Zakaria

Institutions are listed for identification purposes only.


Final Forms/Counting Death

Wednesday, July 29th, 2020

Given the terrible times we are having today, with COVID-19 ravaging the Earth, the death rate keeps increasing. It is important to know whether a death is due to the virus or to one of the other maladies that plague mankind. I thought that the following article would be of interest.

Dept. of Public Health

April 7, 2014 Issue of The New Yorker

Final Forms

What death certificates can tell us, and what they can’t

By Kathryn Schulz

March 31, 2014

We have developed a vast, macabre bureaucracy to answer the question of why we die.

Illustration from Oxford Science / Getty

It starts with a dead body, as so many mysteries do. A middle-aged man is found unconscious and rushed to a hospital. For four days, he lingers in a coma; on the fifth, he dies. The clues are few and dark and point in different directions. The man was a drug addict. He was diabetic. Some of his family members say that he acted strangely the last time they saw him conscious. Others disagree. The lab tests are inconclusive. Everything is inconclusive. If this were a mystery of the Conan Doyle kind, there would be a detective, and there would be a solution. In the event, there is neither. Instead, there is a young doctor, in her first year on the job, and there is a single piece of paper: a death certificate, on which she is meant to record, precisely and for posterity, why this person died.

Not every mystery involves a dead body, but every dead body is a mystery. Death is an assassin with infinite aliases, and the question of what kills us is tremendously complex. It is also tremendously labile. We ask it with clinical curiosity and keen it in private grief; we pose it rhetorically and inquire specifically; we address it to everyone from physicians to philosophers to priests. It is as bare as bone and as reverberant as bell metal: Why do we die?

For millennia, our answers to that question were sharply constrained. Lacking any real understanding of the physiological causes of death, we pointed instead to the entities we knew could make things happen: conscious (or putatively conscious) agents. Sometimes that agent was us. We killed one another, obviously; we hexed one another, allegedly; we brought about mortality in general—from Prometheus to Eve, through hubris and through sin. Alternatively, sometimes the agent was death itself. Many early cosmologies include a Grim Reaper, give or take a costume change: Thanatos in Greek mythology, the Hindu Yama, the Angel of Death in the Bible. As a rule, these were agents in the other sense as well: mere instruments of a higher power. For most of history, no matter how we died, we did so at the bidding of God or of the gods.

A correlative of all this agency was passivity. If an omnipotent being wants to kill you, there’s not much you can do about it except beg for mercy—a popular strategy even today. Only after we started looking to the physical world to determine why we die did premodern fatalism begin to fade. Sentient agents yielded to disease agents, divine intercession to medical intervention. Today, “Why do we die?” is one of the fundamental questions of epidemiology, and we have developed a vast and macabre bureaucracy to answer it.

The atomic unit of that bureaucracy is the death certificate. Of all the ways we have ever devised to grapple with our mortality, it is the strangest, least elegiac, and by far the most ambitious. It emerged by an accident of history and evolved to serve two different masters. In part, it is a public-health measure—though even the doctors who deal with death certificates often forget that, regarding them instead as one more piece of paperwork. In part, it is a form of personal identification: the saddest of diplomas, the most mysterious of passports.

And, in part, it is a clue. Of the roughly fifty million people who will die this year, approximately half will get a death certificate. That figure includes every fatality in every developed nation on earth: man, woman, child, infant. The other half, death’s dark matter, expire in the world’s poorest places, which lack the medical and bureaucratic infrastructures for end-of-life documentation. Yet, even with so many people unaccounted for, this number represents the spread of a remarkable idea: that death should be accounted for—that by documenting every single decedent and every possible cause we can solve its mystery.

The antecedent of the modern death certificate emerged in early-sixteenth-century England, in a form known as a Bill of Mortality. The antecedent of the Bill of Mortality does not exist. No earlier civilization we know of kept systematic track of its dead: not ancient Egyptians, for all their elaborate funerary customs; not the Greeks; not the Romans, those otherwise assiduous centralized bookkeepers.

Even Christianity, one of the world’s most successful purveyors of ideas about death, seldom attended to the specifics of why we die. Churches did traditionally keep records of baptisms and burials—and, practically speaking, those serve as a good proxy for births and deaths. But, as a philosophical matter, they are tellingly different: the church was interested in the fate of the soul, not the body. If the goal of life is to gain access to heaven, and death is in God’s hands, there’s no point, and no grace, in dwelling on the particulars of how we die.

That cosmological indifference coincided with scientific ignorance. Early medicine relied more on folklore than on physiology, and its practitioners were not in the habit of examining bodies, living or dead. Well into the nineteenth century, the limits of medical knowledge were such that doctors sometimes didn’t even know if someone had died, let alone how. The widespread terror of being buried alive, which today seems like a dark little wiggle of the id, once reflected a genuine possibility. In the absence of any scientific way to confirm the end of life, it sometimes happened that those consigned to coffins were only mostly dead.

Compounding all this was political irrelevance. Early states had neither the means nor the motive to track individual deaths—or, for that matter, individual anything. Low literacy rates made individual documentation on a broad scale impractical, and reigning administrative practices made it unnecessary. You don’t need a tax I.D. number if taxes are levied on your entire town, and you don’t need a draft card if conscription is collective. Only in exceptional cases did everyday people need to be able to identify themselves—there was, for instance, the vexing premodern problem of how to tell true messengers from false ones—and, accordingly, individual documentation was rare.

The modern death certificate owes its existence to the cosmological, scientific, and political revolutions that eventually overturned this entire world order. But its prototype emerged in response to something else: death itself, on an epic and horrifying scale. In 1347, the Black Death broke out in Europe. By 1351, a third or more of all Europeans were dead. With a huge percentage of the remaining population infectious and the rest of it terrified, the plague turned the formerly private experience of death into a matter of (extreme) public concern. Italy responded by passing the first modern quarantine laws, tracking the living. England took a different route, and began tracking the dead.

Thus the Bills of Mortality: weekly lists of the plague dead, broken down by parish. The earliest known bill is a single handwritten document, thought to date from 1512, which states that in the city of London, between the sixteenth and the twenty-third of November, thirty-four people died of “the plague” and thirty-two of unspecified “oder dyseases.” No information about the dead appeared on early bills, not even their names. And the bills themselves appeared only sporadically: cropping up when the plague did, fading away again when the crisis passed. Their intended purpose seems to have been to help the healthy steer clear of the most infectious parts of town.

Many of history’s great inventions are really great appropriations—middling ideas if used as intended, brilliant when reoriented or co-opted. In their original form, Bills of Mortality were not a particularly powerful or inspired device. But, in the hundred years after their introduction, two modifications altered both the function of the bills and the future of public health. In 1603, the bills began appearing weekly rather than episodically, and did so continuously for the next two hundred and thirty-three years. In 1629, during a lull in the plague, the court of King James I ordered parish clerks to begin listing deaths from other causes as well. The first change turned the Bills of Mortality into one of history’s richest data sets. The second turned them into a global first: a state-mandated system for recording why we die.

It was both a short step and a long time from there to the modern death certificate. As an epidemiological document, the death certificate would have to wait for disease to more fully migrate from the metaphysical to the material realm. As personal identification, it would have to wait for the political revolutions of the eighteenth century, which, by reconfiguring the relationship between the individual and the state, made documenting the lives and deaths of every citizen newly desirable. The flip side of democracy is bureaucracy: if everyone counts, everyone must be counted. The flip side of equality is equality: the pauper gets a driver’s license, the President needs one, and you wait in line at the D.M.V. And the flip side of representation is surveillance: by 1851, the French political theorist Pierre-Joseph Proudhon could observe that “to be governed is to be noted, registered, enumerated, accounted for, stamped, measured, classified, audited, patented, licensed, authorized, endorsed, reprimanded, prevented, reformed, rectified, and corrected, in every operation, every transaction, every movement.”

By the time Proudhon wrote those words, the Bills of Mortality were all but extinct. A numeric tally of the anonymous dead had evolved into a list of the named dead, one person per line, and then into a dedicated form: one decedent per page. This was the death certificate, the grave end of cradle-to-grave documentation. Bureaucratically speaking, that ex-post personal identification represented the death certificate’s ultimate end. But, for public-health purposes, the name of the dead didn’t matter. What mattered, and what had evolved as well, was the cause of death.

In the Unetanneh Tokef, a Jewish liturgical poem thought to have been composed in the eleventh century A.D., the poet notes that, at the beginning of every year, it will be determined who shall perish in the coming months, and how:

who by water and who by fire

who by sword and who by beast

who by famine and who by thirst

who by earthquake and who by plague

who by strangling and who by stoning.

That poem nicely captures the way the premodern world parsed death: into a few coarse causes, all reducible to God’s will. “And You shall apportion the destinies of all Your creatures,” the poet writes.

By the time of the Bills of Mortality, the list of things we thought could kill us had expanded dramatically. Yet, reading those bills today, you could be forgiven for failing to recognize them as an advance in public health. It was possible, in seventeenth- and eighteenth-century England, to die of Bleach and of Blasted, of Cramp and of Itch, of Sciatica and of Lethargy. You could be carried off by Cut of the Stone, or King’s Evil, or Planet-struck, or Rising of the Lights. You could succumb to Overjoy, which sounds like a decent way to go, or be Devoured by Lice, which does not. You could die of Stopping of the Stomach, or Head-Ach, or Chin-cough, or Teeth. You could die of Horseshoehead, though don’t ask me how. You could die of being a Lunatick. You could die of, basically, death: “Suddenly”; “Killed by several Accidents”; “Found dead in the Streets.” You could die of Frighted, and of Grief.

If what we are after is a revolution in our understanding of death, this does not seem like an entirely promising start. But in the mid-seventeenth century a haberdasher named John Graunt got interested in the question of why we die. That interest was neither medical nor philosophical but actuarial. Like many successful shopkeepers, Graunt was a meticulous accountant, and he realized that he could use the Bills of Mortality to crunch the numbers on death. By trawling through twenty years of those bills, Graunt compiled a list of eighty-one causes of death, which he divided into four main categories: chronic diseases, epidemic diseases, conditions that killed children, and “outward griefs”—that is, injuries. With information like that available for the first time, “it becomes necessary to discuss the problem—can lifetime be prolonged by a knowledge of the causes that cut it short?”

The man who asked that question was Graunt’s most important successor, William Farr, one of the founders of epidemiology. In 1836, when Farr was twenty-eight, England replaced the Bills of Mortality with what would become the global prototype of a modern death-registration system, and created the General Register Office to manage it. The office opened in 1837, and Farr became its first Compiler of Abstracts.

Unlike Graunt, Farr was in the game not to keep books but to save lives, and he realized that vital statistics were the language in which public-health questions could be asked and answered—and, crucially, changed. In 1853, at the behest of the newly formed International Statistical Congress, he helped compile a comprehensive list of causes of death, for use in standardizing mortality data worldwide. The resulting classification contained a hundred and thirty-nine ways to die, divided into seven categories, from “Deaths from accident or violence” to “Deaths from old age.” There were still a few ringers on this list—you could die of laryngitis, and of teething—but it was a long way from Blasted and Itch.

The definitive advance, however, came forty years later, when the classification was revised by a committee headed by the French statistician and demographer Jacques Bertillon. Bertillon doubled the categories of the earlier list from seven to fourteen, expanded causes of death from a hundred and thirty-nine to a hundred and sixty-one, and organized them, as we still do today, by anatomical systems: “Diseases of circulatory system,” “Diseases of respiratory system,” and so forth. The result was published in 1893, as the International List of Causes of Death.

A hundred and twenty years later, that list is still with us. Today, it is managed by the World Health Organization, and is known as the International Statistical Classification of Diseases and Related Health Problems—or, more commonly, the ICD-10. The ICD still reflects Bertillon’s original structure, but it has expanded prodigiously in the course of ten revisions. As its new name suggests, that is partly because it now includes entries for nonfatal diseases. (And much more besides. Beginning after the Second World War, the W.H.O. bowed to the desire of hospitals and insurance companies to use the ICD for billing purposes; as a result, it now contains entries for every imaginable health-care interaction, from well visits to warts. That shift displeases some epidemiologists, since, as a report by the Centers for Disease Control has pointed out, public-health priorities no longer drive the management of the list.)

But, even if you strip the classification of everything that can’t kill you, you are left with a staggering number of things that can. The ICD-10 comes in three forest-green volumes (or as a download, or on CD-ROM), can be purchased for $562.82 through Barnes & Noble, and runs to twenty-two hundred pages. The first cause of death that it lists is A00.0, “Cholera due to Vibrio cholerae 01.” The last is Y89.9, “Sequelae of unspecified external cause.” Arrayed between them are more than eight thousand other officially sanctioned ways to die. Taken together, those ICD entries are used to code and standardize the causes of death on death certificates.

Contemplating all this, one suspects that we have got about as far as possible from the premodern relationship to death. A single reason for death, divine will, has mutated into ever more numerous and narrow causes; sixty-six anonymous deaths in sixteenth-century London have grown to twenty-five million death certificates per year. Yet the why of death remains elusive—practically, philosophically, above all emotionally. And, the more extensively we attempt to document it through death certificates, the stranger and more troubled that project comes to seem.

Cede any part of your life to the state, no matter how profound, and soon enough it will hold its own in the bureaucratic triathlon of tedium, arcana, and complexity. Consider: a death certificate is a single piece of paper, one-sided. The official instructions for how to fill it out include the “Physicians’ Handbook on Medical Certification of Death” (fifty-seven pages), the “Funeral Directors’ Handbook on Death Registration and Fetal Death Reporting” (sixty pages), and the “Medical Examiners’ and Coroners’ Handbook on Death Registration and Fetal Death Reporting” (a hundred and thirty pages). This is to say nothing of various supplementary guidelines, such as “Instructions for Completion of Death Certificates in the Aftermath of a Hurricane” and “Completing the Cause-of-Death Section of the Death Certificate for Injury and Poisoning.”

Why does a one-page document require two hundred and fifty pages of instructions? The most generous answer is that death certificates are legitimately difficult to fill out. In the United States, the task of doing so often falls to interns or residents—newly minted M.D.s, in their first year or two on the job. (Death certificates, like all paperwork, obey the law of occupational gravity, and residents are on the bottom.) You have already met one of those M.D.s. Sasha Swartzman, a resident in internal medicine at the Oregon Health and Science University, was the doctor on duty when the man at the beginning of this story met his mysterious end. She describes the process of filling out death certificates as “sort of like doing your own taxes. Shouldn’t I be smart enough to know how to do this?”

A nine-year-old is smart enough to fill out ninety per cent of a death certificate. The difficulties arise almost exclusively in the cause-of-death section, which consists of just four lines. On the first, doctors are instructed to enter the “immediate cause” of death, defined on the form as the “final disease or condition resulting in death.” (If you are already pausing to consider the relationship between “immediate” and “final”: let it go.) On the second line, doctors enter whatever caused the condition on the first line, and on the third line they enter whatever caused the condition on the second line. The last line is reserved for the “underlying cause of death”: “the disease or injury that initiated the chain of morbid events that led directly and inevitably to death.” It is this line that will get turned into an ICD code and identified as the thing that killed you.

The National Center for Health Statistics provides this example of how to correctly complete those lines:

Rupture of myocardium (the immediate cause)

Acute myocardial infarction

Coronary artery thrombosis

Atherosclerotic coronary artery disease (the underlying cause).

Clear enough, even if you don’t know a thrombosis from a bass drum. But real death, like real life, is complicated, as Swartzman’s experience with the diabetic drug user shows. In that case, the immediate cause was obvious: the man died of anoxia, lack of oxygen to the brain. But why? He could have gone hypoglycemic. He could have had a seizure. He could have suffered sudden cardiac arrest. He could have overdosed, accidentally or on purpose. “At some point,” Swartzman says, “you just have to make an educated guess as to what might have happened and go with it.”

As that suggests, death certificates, again like tax returns, do not always scrupulously reflect the truth. From the beginning, they have been compromised both by the limits of medical knowledge and by dodgy reporting practices. In 1662, John Graunt complained that syphilis was underreported as a cause of death because medical investigators failed to recognize it “after the mist of a Cup of Ale, and the bribe of a two-groat fee.” Similar treatment befell other causes of death viewed as morally damning or unmentionable in polite company: tuberculosis, breast cancer, alcoholism, AIDS, suicide. To protect the reputations of the deceased and the sensibilities of survivors, doctors sometimes upgraded those socially awkward deaths to more acceptable ones—issuing, in essence, vanity death certificates. That practice was sufficiently common in nineteen-thirties New York that the city began issuing a confidential medical report of death: a second, separate document stating the real cause of death.

The practice of bowdlerizing death certificates has faded (although not disappeared), but other reporting problems persist. In 2010, researchers from St. Luke’s–Roosevelt Hospital Center and Columbia University surveyed five hundred and twenty-one doctors in thirty-eight residency programs across New York City. Only a third believed death certificates to be accurate. Nearly half reported knowingly listing an inaccurate cause of death, and that number rose to almost sixty per cent among residents with the most experience. Those who intentionally list inaccurate causes typically choose familiar ones, with the result that common causes of death appear even more common, and rare ones more rare. The Framingham Heart Study, an ongoing longitudinal study in Massachusetts, found that death certificates overstate coronary-heart disease as a cause of death by as much as twenty-four per cent in the general population and by a far greater percentage in the elderly.

Why do residents fudge these forms? Part of the problem is inadequate training; in the New York study, only two in five reported receiving any instruction in how to fill out a death certificate, and only one in five had taken the city’s ostensibly mandatory training module. But, when asked, they also pointed to other issues. Sometimes the death-registration system would not accept the cause they felt was correct. Sometimes a hospital administrator overrode them. Sometimes they had never met the patient. Under three per cent reported ever correcting a death certificate in light of new information. Reading about all this, I recalled how a doctor friend of mine had responded when I told her I was interested in death certificates and found myself thinking of them partly as a genre. “Yes,” she snorted. “Fiction.”

The errors that creep into death certificates from inadequate training and other systemic issues are troubling. They overstate leading causes of death, obscure emerging ones, and distort the data we use to allocate funds for research, education, prevention, and treatment. But bad answers are only part of the issue. A more interesting and difficult problem is how we decide what counts as a good answer.

That problem is wonderfully illustrated by a passage from “Huckleberry Finn,” which enlivens an otherwise arid report by the C.D.C. One afternoon, while chatting with the Wilks sisters, the ever-inventive Huck spontaneously invents a new disease—a form of mumps so virulent that, he claims, a neighbor is in danger of dying from it. But mumps can’t kill you, Susan Wilks protests. Oh, yes, this kind can, Huck insists, because it is all “mixed up with other things,” from “yaller janders” to “brain-fever.” Fine, Susan retorts, but in that case it’s not the mumps that will kill the neighbor: “A body might stump his toe, and take pison, and fall down the well, and break his neck, and bust his brains out, and somebody come along and ask what killed him, and some numskull up and say, ‘Why, he stumped his toe.’ ”

This is precisely the problem posed by death certificates: when filling them out, how far back should we chase the causal chain? If a stubbed toe initiates a fatal sequence of events, is it the underlying cause? Alternatively, how far forward should we chase it? If we are someday able to parse “rupture of myocardium” into its sequential parts, will it cease to be a final cause? And how many causal chains should we chase? To the annoyance of statisticians, it is perfectly possible to die of multiple causes; indeed, as more people live into extreme old age, multifactorial deaths might well become the norm. But multiple causes of death do messy things to mortality data—reporting that one person died of three causes makes it look like three hundred per cent of your population died—and death certificates are not optimized for that kind of recording.

Problems like these have troubled philosophers for centuries. It is formidably difficult to distinguish beyond doubt a cause from a non-cause, or a proximal cause from a distal cause, or which of six rock-throwing hoodlums smashed your picture window. Yet in everyday life we draw such distinctions constantly. That is not imprudent. It is expedient. Causal reasoning is motivated reasoning; we do it not to discover the fundamental make-it-happen mechanisms of the world but to achieve some ends. And that is why the stumped-toe problem matters. We identify the causes we care about—and, conversely, we care about the causes we identify.

On death certificates, the causes we identify are constrained in one specific way: to the immediate physical breakdown that triggered the events that killed you. “If someone dies of a heart attack,” Harvey Fineberg, the president of the Institute of Medicine, says, “you don’t say he died of high cholesterol, sedentary life style, and a forty-pack-year history of smoking.” For that matter, he notes, we no longer say that “you died of despair, you died of poverty, you died of heartbreak. But certainly those are all pretty clear risks for premature death.”

That point has been made, and contested, many times before. In a now famous 1993 paper called “Actual Causes of Death in the United States,” the epidemiologists William Foege and Michael McGinnis showed that roughly half of all deaths in the United States in 1990 could be attributed to nine factors not listed on death certificates: tobacco, diet and physical activity, alcohol, microbial agents, toxic agents, firearms, sexual behavior, motor vehicles, and illicit use of drugs. Omitting such causes mattered, they argued, because the conditions listed on death certificates get the lion’s share of U.S. health-care allocations. Yet the non-listed causes might make better investments; the earlier you intervene on a causal chain the easier and cheaper the intervention tends to be. Consider the relative costs, literal and figurative, of anti-smoking campaigns versus smoking-cessation programs versus lung-cancer treatment.

We could, in theory, redesign death certificates to capture more distant links in the causal chain. But it is not clear that we should. For one thing, a harried young doctor completing a death certificate is unlikely to have access to the desired information. For another, there is an inherent trade-off to adding more fields to any form. Thomas Frieden, the director of the C.D.C., puts it concisely: “The quality of the data you collect is inversely proportional to the amount of data you collect from each reporter.” That is, if you increase the number of questions you ask on a death certificate, you decrease the accuracy of the answers. “There’s lots more information, different information, better information I’d love to have,” Frieden acknowledges. “But whether the juice is worth the squeeze is the question.”

In Bernard Malamud’s short story “Take Pity,” a census-taker named Davidov asks a man named Rosen how an acquaintance of his died. When Rosen shrugs off the question, the census-taker grows irritable:

“How did he die?” Davidov spoke impatiently. “Say in one word.”

“From what he died?—he died, that’s all.”

“Answer, please, this question.”

“Broke in him something. That’s how.”

“Broke what?”

“Broke what breaks.”

Thus does the mandate of data collection—say it in one word—meet the mystery of dying. That encounter is improbable, uncomfortable, and, as exemplified by death certificates, one of the most felicitous in history. In the past two centuries, global life expectancy has more than doubled, from twenty-eight years to seventy-one. In the United States, the infant-mortality rate in 1900 hovered around one in three; today, it is barely six in a thousand. Death certificates did not bring all this about unilaterally, of course. But it is a measure of their importance that, without them, we wouldn’t even know these numbers.

Still, that importance, like a life, has a limit. The C.D.C. will tell you that a death certificate, in addition to its primary functions, “provides family members closure, peace of mind, and documentation of the cause of death.” But death certificates and family members are like Davidov and Rosen. Both may ask why someone died, but the causes that count as good answers are irreconcilably different. As the bereaved, we ask because we want to know if a loved one suffered or was at peace, or if her death was meaningful, or whether we could have prevented it, or how the universe could have permitted it.

On all those questions, a death certificate is mute. Instead, it provides the pathological basis of death, determined by some combination of fact, convention, and guesswork, and described in terms that most non-doctors struggle to understand. That is the kind of answer it should give; a death certificate is not Auden’s elegy for Yeats, meant to both solemnize and lift our grief.

Nor is a death certificate likely to provide peace of mind in its other capacity. Among forms of personal identification, the death certificate is the one that undoes the work of all the rest, removing someone we love from Social Security rolls and voting registers and all the other ranks of the living. That process might be necessary, but it is hardly soothing. The bureaucratization of death that began with the Bills of Mortality has evolved over time into a massively complex checkpoint at the border between the living and the dead: Charon’s T.S.A. At its behest, we supply death certificates to cell-phone companies to induce them to terminate contracts; to airlines to release frequent-flier miles; to Netflix, no kidding, to cancel accounts. We track down fax machines to send copies to six separate government offices, and send another to an attorney via registered mail. In short, we spend vast amounts of energy using death certificates to convince various entities of what is, to us, the most devastatingly obvious fact in the world: that someone we love is no more.

The primary purpose of a death certificate is to explain why we die. But when we are in the pitch of grief—or, for that matter, in the full sunshine of joy—what form, what blank, what cause, whether final, immediate, or underlying, could possibly answer that question to anyone’s satisfaction? Why do we die? For all the medical advances of modernity, there is a sense in which the ancient fatalists had it right. Broke what breaks. We die because we were born; because we are mortal; because that is, after all, life. ♦

Published in the print edition of the April 7, 2014, issue of The New Yorker.


US-China Confrontation Will Define Global Order

Friday, May 8th, 2020

China is the source of COVID-19, which is presently destroying our world. China is America’s most serious enemy, both economically and militarily. Somehow, after the virus is gone, we will need to deal with the Chinese. Victor Davis Hanson lays out some interesting ideas.

Victor Davis Hanson: US-China Confrontation Will Define Global Order

Monday, May 20, 2019

Hoover Institution, Stanford University

The United States is at a crossroads with an increasingly aggressive China, which could define America’s security and the international order for decades to come, Hoover scholar Victor Davis Hanson says.

Hanson, the Martin and Illie Anderson Senior Fellow at the Hoover Institution, studies military history and the classics. Last year, Hanson won the Edmund Burke Award, which honors people who have made major contributions to the defense of Western civilization. He is the author of the 2019 book The Case for Trump, and 2017’s The Second World Wars. He was recently interviewed on US policy toward China:

What is the Trump strategy behind these tariffs, short term and long term?

Hanson: Short term, Trump feels that he can take the hit of reciprocal Chinese tariffs, given, first, that quietly his opposition, the Democrats, have been raging about Chinese cheating for decades, and, second, that the US economy is so huge and diverse that China simply cannot cause serious damage.

Remember, the United States is a country one-third the size of China that produces over double China’s annual gross domestic product and fields a military far more formidable with far more allies—while enjoying a far more influential global culture and a far more sophisticated system of higher education and technological innovation. China’s Asian neighbors and our own European Union allies quietly are hoping Trump can check and roll back Chinese mercantilism, while publicly and pro forma chiding or even condemning Trump’s brinksmanship and his resort to fossilized strategies such as tariffs and loud jawboning.

Long term, Trump believes that if present trends are not reversed, China could in theory catch and surpass the US. And as an authoritarian, anti-democratic superpower, China’s global dominance would not be analogous to the American-led postwar order, but would be one in which China follows one set of rules and imposes a quite different set on everyone else—perhaps one day similar to the system imposed on its own people within China.

Is China a more formidable rival now than Russia was during the Cold War, and if so, why?

Hanson: Yes. Its population is five times greater than even the old Soviet Empire’s. Its economy is well over twenty times larger, and over a million Chinese students and business people are in European and American universities and colleges and posted abroad with Chinese companies. So, unlike the old Soviet Union, China is integrated within the West, culturally, economically, and politically. The Soviets—like Maoist China—never leased Western ports, or battled Hollywood over unflattering pictures, or posed as credible defenders of Asian values or owned large shares of Western companies or piled up huge trade surpluses with Western nations. Soviet propaganda and espionage were crude compared to current Chinese efforts.

What is China doing in terms of cheating on trade and intellectual property as the Trump administration says, and how can the United States stop this behavior? 

Hanson: China does not honor patents and copyright laws. It still exports knock-off and counterfeit products. It steals research and development investment through a vast array of espionage rings. It manipulates its currency.

Its government companies export goods at below the cost of production to grab market share. It requires foreign companies to hand over technology as a price of doing business in China. And, most importantly, it assumes, even demands, that Western nations do not emulate its own international roguery—or else.

The result is a strange paradox in which the United States and Europe assume that China is an international commercial outlaw, but the remedy is deemed worse than the disease. So many Western firms make enormous profits in China through joint projects, so many academic institutions depend on Chinese students, and so many financial institutions are invested in China, that to question its mercantilism is to be derided as a quaint nationalist, or a dangerous protectionist, or a veritable racist. China is an astute student of the Western science of victimology and always poses as a target of Western vindictiveness, racism, or puerile jealousy.

Remedies? First, we must give up the 40-year fantasies that the richer China gets, the more Western and liberal it will become; or that the more China becomes familiar with the West, the greater its admiration and respect for Western values; or that China has so many internal problems that it cannot possibly pose a threat to the West; or that Western magnanimity in foreign policy and trade relations will be appreciated and returned in kind. Instead, the better paradigm is imperial Japan between 1930 and 1941, when Tokyo absorbed Asian allies; sent a quarter-million students and attachés to the West to learn or steal technology and doctrine; rapidly Westernized; declared Western colonial powers and the US tired and spent, and without any legitimate business in the Pacific; and considered its own authoritarianism a far better partner to free market capitalism than the supposedly messy and clumsy democracies of the West.

How is China able now to leverage its arguably less powerful military to confront the United States globally?

Hanson: Global naval dominance is not in the Chinese near future. Its naval strategy is more reminiscent of the German Kriegsmarine of 1939 to 1941, which sought to deny the vastly superior Royal Navy access at strategic points without matching its global reach. China is carving out areas where shore batteries and coastal fleets can send showers of missiles to take out a multibillion-dollar American carrier. And its leasing of 50 or more strategically located ports might serve in times of global tensions as transit foci for armed merchant ships. But for now China does not have the capabilities of the American carrier or submarine fleet or expeditionary Marine forces—so the point is to deny America reach, not to emulate its extent.

Why are the current administration’s policies different from those of past administrations in confronting China on many different fronts and levels?

Hanson: Trump believes that economic power is the key to global influence and clout. Without it, a military wilts on the vine. A country with GDP growth at a 3 percent annual clip, energy independence, full employment, and increasing labor productivity and trade symmetry can renegotiate Chinese mercantilism and reassure China’s Asian neighbors that they need not appease its aggression. Past administrations might have agreed that China violated copyright and patent laws, dumped subsidized goods, appropriated technology, and ran a massive global espionage apparatus, but they considered remedies either impossible or dangerous and so essentially negotiated a slowing of the supposed predestined Chinese global hegemony. Trump was willing to confront China to achieve fair rather than free trade and take the ensuing heat that he was some sort of tariff-slapping Neanderthal.

Any other thoughts?

Hanson: I think Secretary of State Mike Pompeo’s State Department is the first to openly question the idea that China will eventually rule the world and has offered a strategic plan to check its trade and political agendas. In this regard, a number of Hoover Institution scholars, currently working with Hoover fellow Kiron Skinner, director of policy planning at the US Department of State, are offering alternatives to orthodox American approaches of the past, with the caveat that the most dangerous era in interstate relations is the transition from de facto appeasement to symmetry—given that the abnormalities of the past had become considered “normal,” and the quite normal efforts of a nation to recalibrate to a balanced relationship are damned as dangerously “abnormal.”

Victor Davis Hanson is also the chairman of the Role of Military History in Contemporary Conflict Working Group at the Hoover Institution. 


Return to the Blog

Sunday, May 3rd, 2020

After a four-month hiatus, I am coming home to my blog. The hiatus was an attempt to finish, to my own satisfaction, my latest novel, Donovan’s Run. Although my attempt was partially successful, there still remains more to be done. However, given the enormous number of changes occurring all over our world at this time, I thought it would be appropriate to return to the blog and inject it with some of the important elements now moving through our lives. So, I will thank in advance the few good friends who will continue to read this material and tell me when it is good, when it is bad, and when it is indifferent.

Abu Bakr al-Baghdadi is Dead

Monday, October 28th, 2019


President Trump was quite elated by the death of Abu Bakr al-Baghdadi. However, caution is necessary, and much more needs to be done, as clearly outlined in the latest issue of Richard Viguerie’s ConservativeHQ.com.

The Death Of Abu Bakr al-Baghdadi Was A Great Victory, But…

10/28/19

President Trump is to be commended for authorizing the successful operation to kill ISIS leader Abu Bakr al-Baghdadi, and he’s right that the American military personnel who carried out the mission “are the very best anywhere in the world.”

But now that al-Baghdadi is dead, is the world a safer place, as the President claimed? We’re skeptical, to say the least.

The problem is that we have been fighting the war Islam declared on the West by counting casualties and holding territory, while doing next to nothing to defeat the ideology of Islamism.

The truth is that the real enemy in the Near East is political Islam, and the only way to defeat it is to drop the fiction that “Islam is a religion of peace” and use all our national power to present an alternative worldview that undermines and eventually destroys Sharia-supremacism and Iranian “Absolute Wilayat al-Faqih” (Guardianship of the Jurist).

None of the generals who have been tasked with fighting and winning the wars in Syria and Iraq, and certainly none of the politicians who have advocated United States involvement in them, have been willing to accept and confront that truth, and, as a consequence, the war that was supposed to be a three-month intervention to defeat the “JV forces” of the Islamic State became a seven-year sinkhole of American lives and treasure.

One proof of this problem lies here: the United States designated Hezbollah as a terrorist organization way back in 1997, but we have done nothing to defeat its ideology.

Indeed, as David Daoud of the Foundation for Defense of Democracies noted, Hezbollah remains the most successful and most prominent Iranian revolutionary export. And Mr. Daoud is not the only one to hold that view.

Gilbert Achcar of the University of London has called Hezbollah “the most prestigious member of the regional family of Khomeinism.” The Lebanon-based terrorist group is cut from the same ideological cloth as the Islamic Republic, which, according to former CIA intelligence analyst Kenneth Pollack, is Hezbollah’s model and inspiration. Eitan Azani, the deputy executive director of the Institute for Counter-Terrorism at IDC Herzliya, has said that Khomeini and his successors serve as Hezbollah’s ultimate source of religious, political, and ideological guidance and authority. Hezbollah fully accepts the concept of Absolute Wilayat al-Faqih, and openly acknowledges Khomeini as its faqih, leading Augustus Richard Norton of Boston University to call Khomeini Hezbollah’s “undisputed, authoritative leader.”

Yet, when Iranians have protested the failures of Khomeinism and Absolute Wilayat al-Faqih, the United States has done little or nothing concrete to use that popular discontent to undermine the regime, once again substituting holding worthless real estate in the Middle East and killing a few thousand ignorant jihadis for fighting and winning the real war – which is the defeat of Sharia-supremacism and Iranian Absolute Wilayat al-Faqih.

Done right, U.S. intervention in the Syrian civil war might have offered the possibility of a strategic defeat of Iran. Had the United States acted to tip the balance of power in the civil war, Iran would have been weakened by the collapse of Bashar al-Assad’s regime, its single Arab ally and a vital link to its important client – Lebanon’s Hezbollah militia.

Isolated, Iran would have become more vulnerable to international pressure to limit its nuclear program. As Vali Nasr, dean of the Paul H. Nitze School of Advanced International Studies at Johns Hopkins University, observed for Bloomberg, if “Iran’s regional influence faded, those of its rivals — U.S. allies Turkey, Qatar and Saudi Arabia — would expand.”

However, America, through its generals and diplomats, never fought that war, because it never engaged the enemy on the most important battlefield – the battlefield of secularism versus Sharia-supremacism and Iranian Absolute Wilayat al-Faqih.

With Abu Bakr al-Baghdadi dead, it is probably safe to leave others to mop up the scattered remnants of ISIS and hold the territory once occupied by the Islamic State. So, let’s savor the victory, and praise our military, but let’s not fool ourselves into thinking that killing al-Baghdadi and a few thousand jihadis and retaking some desolate ground in the Near East ends the war that political Islam, and Iran in particular, have declared on the West.

George Rasley is editor of Richard Viguerie’s ConservativeHQ.com and is a veteran of over 300 political campaigns. A member of American MENSA, he served on the staff of Vice President Dan Quayle; as Director of Policy and Communication for then-Congressman Adam Putnam (FL-12), then Vice Chairman of the Oversight and Government Reform Committee’s Subcommittee on National Security and Foreign Affairs; and as spokesman for Rep. Mac Thornberry, former Chairman of the House Armed Services Committee.



William S. Frankl, MD, All Rights Reserved