January 27th, 2017
A most interesting article attempting to explain the destruction of the dinosaurs. If correct, we had better hurry up and build all those spaceships necessary to get us out of here, whether soon or in the far future, when the next asteroid comes our way.
Lights Out: Asteroid Triggered Freezing Darkness That Killed Dinos
By: Laura Geggel, Senior Writer
The study was published online Jan. 13 in the journal Geophysical Research Letters. Original article on Live Science.
When a giant asteroid careened into Earth about 66 million years ago, the enormous collision led to the formation of an airborne “curtain” of sulfate molecules that blocked the sun’s light and led to years of freezing cold and darkness, a new study finds.
The finding shows how these droplets, or aerosols, of sulfuric acid formed high in the atmosphere, and likely contributed to the deaths of 75 percent of all animals on Earth, including nonavian dinosaurs such as Tyrannosaurus rex and long-necked sauropods, the researchers said.
Earlier studies suggested that the dino-killing asteroid kicked up dust and debris that hung in the air and blocked sunlight in the short term. But by using computer simulations, the researchers of the new study showed how droplets of sulfuric acid contributed to long-term cooling. [Wipe Out: History’s Most Mysterious Extinctions]
Moreover, the sudden, drastic drop in temperature likely caused the surface of the oceans to cool, which would have massively disturbed the marine ecosystems, the researchers said.
“The big chill following the impact of the asteroid that formed the Chicxulub crater in Mexico is a turning point in Earth history,” the study’s lead researcher Julia Brugger, a climate scientist at the Potsdam Institute for Climate Impact Research (PIK) in Germany, said in a statement. “We can now contribute new insights for understanding the much debated ultimate cause for the demise of the dinosaurs at the end of the Cretaceous era.”
Brugger and her colleagues employed a type of computer simulation typically used for climate modeling. The model showed that gases containing sulfur evaporated during the violent impact. These sulfuric molecules were the main ingredients that blocked the sun’s light on Earth and led to plummeting temperatures, they said.
For instance, before the asteroid hit, the tropics were an average temperature of 81 degrees Fahrenheit (27 degrees Celsius). But after the massive impact, the average temperature was 41 F (5 C), the researchers said. “It became cold, I mean, really cold,” Brugger said. Globally, temperatures fell at least 47 F (26 C). For at least three years following the asteroid’s crash, the average annual temperature fell below freezing, and the polar ice caps grew in size.
“The long-term cooling caused by the sulfate aerosols was much more important for the mass extinction than the dust that stays in the atmosphere for only a relatively short time,” study co-researcher Georg Feulner, a climate scientist at PIK, said in the statement. “It was also more important than local events like the extreme heat close to the impact, wildfires or tsunamis.”
In all, it took 30 years for Earth’s climate to recover, the researchers said.
As the air cooled, so did the ocean’s surface waters. This cold water became denser and thus heavier, and sank into the depths of the ocean. Meanwhile, warmer water from the deeper ocean rose, bringing up nutrients that likely led to giant algal blooms, the researchers said.
It’s possible these algal blooms produced toxic substances that affected life along the coasts, the researchers said. But regardless of whether they were toxic or not, the ocean’s massive mixing would have disrupted the marine ecosystem, and likely contributed to the extinction of its species, including the ammonites and the reptilian sea beasts known as plesiosaurs.
The new research illustrates what might happen to Earth if another asteroid were to cross its path, the researchers said. “It is fascinating to see how evolution is partly driven by an accident like an asteroid’s impact — mass extinctions show that life on Earth is vulnerable,” Feulner said. “It also illustrates how important the climate is for all life-forms on our planet. Ironically today, the most immediate threat is not from natural cooling but from human-made global warming.”
January 17th, 2017
The following is a most interesting article about aging, immortality, and what the future might hold for humanity. It’s a must-read.
THE HUFFINGTON POST
On the Verge of Immortality, Or Are We Stuck with Death? A New Direction For Research Could Provide the Answers—and More
Bernard Starr, PhD
How long can human beings live? Is there an outside limit? Do we know enough about aging to break through possible biological barriers? Is the current approach to curing “age associated diseases” like Alzheimer’s flawed? Experts are sharply divided.
In 1962 eminent biologist Leonard Hayflick discovered that normal human fetal cells replicate a limited number of times. This phenomenon promptly acquired the moniker the “Hayflick Limit.” Later, biologists Calvin Harley and Carol Greider provided the molecular explanation for the Hayflick limit with their discovery that telomeres, the protective DNA caps at the ends of the chromosomes in every cell of our bodies, shorten each time cells divide.
In contrast, cancer cells, which are immortal, produce an enzyme called telomerase that maintains the length of telomeres and enables cancer cells to replicate without limit. The strategy of extending the life of normal cells by injecting telomerase has proven thorny, as reported by Dr. Elizabeth Blackburn, co-discoverer of telomerase: “too much telomerase can help confer immortality onto cancer cells and actually increase the likelihood of cancer, whereas too little telomerase can also increase cancer by depleting the healthy regenerative potential of the body … telomerase shots are not the magical anti-aging potion….”
The finite capacity of normal human fetal cells to divide (on average about 50 times) suggested to Hayflick that aging is responsible for the end of normal cell replication and eventually death. Other researchers translated Hayflick’s findings into a maximum human lifespan of 120 years.
A 2016 study at the Albert Einstein School of Medicine came up with a similar human lifespan limit of 115 years. The investigators drew their conclusion from surveys of longevity and mortality records in more than forty countries since 1900. While their findings showed an impressive increase in the number of people living beyond age 100 in recent decades, rarely did centenarians live longer than 115 years. One exception, Frenchwoman Jeanne Calment, died at age 122. She was a media sensation because she exceeded the traditional limit for longevity.
The dramatic increase in life expectancy from 18 years (at birth) in prehistoric times to an average of 79 in the U.S. today (and 1-4 years longer in more than 25 other countries) is not due to breakthroughs in our understanding of the biology of aging. Rather, it’s been achieved through the reduction in infant mortality, public health measures such as clean water, improved sanitation, better nutrition, healthy lifestyles, and the remarkable boost when antibiotics and vaccines were introduced.
But is the Hayflick Limit fixed, or is it a biological barrier that can be penetrated? Opinions vary.
At one extreme, Cambridge University-trained Dr. Aubrey de Grey, Chief Science Officer of the SENS Research Foundation for the study of aging, claims that emerging breakthroughs in the biology of aging have brought human lifespan to the verge of vastly extended longevity—and perhaps immortality. The first person to live to 1,000 years is likely walking the earth right now, he declares.
I met Aubrey de Grey several years ago at a screening of the film To Age or Not To Age, sponsored by the International Longevity Center. He was one of the researchers featured in the documentary. Afterwards I approached him with a question.
“Do you think civilization is ready for immortality?” I asked, since immortality has obvious implications for the social, economic, and political functioning of society.
De Grey didn’t like my question. He immediately launched into a lengthy rant. “Do you know how many people die each day, and that it’s not necessary?” he remarked. “We have the means and knowledge…” I quickly realized that de Grey champions another version of right to life. So sure is he that death is not inevitable that he recoils at the idea that we dare think otherwise.
Dr. Leonard Hayflick takes a strong stand against de Grey’s position on life extension. And he has little respect for those touting “cures for aging.” The “fountain of youth” business, he says, is the first or second oldest profession.
What, I wanted to know, does Hayflick think of the work of MIT biologist Dr. Leonard Guarente? In 2016 Guarente generated a lot of fanfare when his newly formed company, Elysium, introduced a nutritional supplement called Basis. The main ingredient of Basis, nicotinamide riboside (NR), raises the body’s levels of nicotinamide adenine dinucleotide (NAD), which in turn, Guarente claims, can slow the aging process by boosting mitochondria, the energy dynamos of cells, which diminish with age. While Guarente’s Basis and anti-aging products of other companies may improve some aspects of bodily functioning, do they put the brakes on aging? Hayflick is doubtful if not dismissive of that notion.
I interviewed Dr. Hayflick on the telephone on October 27th and 29th, 2016. He spoke from his home in Northern California. The strength of his voice, not to mention his convictions, belies his eighty-eight years. And he anticipates many productive years ahead, based on the principle that the best way to ensure longevity is to pick your parents carefully. His mother lived to 106.
While he agrees that biology plays a role in longevity, Hayflick rejects claims that a genetic aging code is about to be broken, thus opening the floodgates for unlimited lifespans. In stark contrast to those who argue that researchers have accumulated a trove of knowledge about aging, Hayflick insists that “We know very little if not zero about the fundamental cause of aging.”
He emphasizes that all the advances in average life expectancy that have been derived from prevention and cures for diseases have not told us anything about the fundamental etiology of aging. “We do not know why cells age,” Hayflick told me. And until we expand our knowledge of the fundamental cause of aging he does not foresee significantly extending average life expectancy; he is even less hopeful about extending human lifespan beyond the current limit.
Hayflick says that if cures are miraculously found for the leading causes of death, that will add about 13 years to average life expectancy. But, he points out, those cures will not increase the lifespan beyond the current limit. He warns: “People will continue to die as a result of aging.” The explanation for why they are dying, he insists, will only be found by unraveling the mystery of the cause of molecular and cellular aging.
“How likely is that to happen?” I asked him. “Very unlikely,” he admitted. Hayflick laments that two to three percent at most of the $1.27 billion that the National Institute on Aging (NIA) spends annually on aging research is allocated to fundamental biological research. That’s why “little work is being done on the basic understanding of aging—not only in this country but worldwide.”
According to Transparency Market Research, the anti-aging market is projected to reach $91.7 billion globally by 2019. Most of that money will be for anti-aging products and services with possibly only a tiny percentage for basic biological research.
Dr. Jan Vijg, Chair in Molecular Genetics at the Albert Einstein School of Medicine in New York City, and a lead researcher on the recent longevity study, confirmed in an interview on November 16, 2016, that a minuscule amount of funding goes to basic biological research, where many of the questions about aging are more likely to find answers. Vijg agrees with Hayflick about the dearth of knowledge about cellular aging. He says we know a lot about factors such as genomes (the DNA of genes) that affect cellular senescence, but the question of why cells age remains largely unanswered.
On the positive side, Vijg notes that scientists in the field of aging are increasingly focusing on the biology of aging, not just the cure of diseases. He told me that he has recently applied for a large grant for the study of drugs that target aging rather than specific diseases. Hayflick, he acknowledges, “was the original defender of this position to study aging per se and now he’s been proven correct.”
If that direction is endorsed by a growing consensus of scientists, why the dearth of funding? I asked.
Dr. Vijg points to an entrenched establishment driven by the public, special interests, and lobbyists who want immediate results. People accept aging and death as natural facts of life, Vijg says, but they don’t accept diseases as natural and thus they want cures for them. Basic research may seem abstract and remote. Few laypeople grasp that unraveling the underlying biology of aging could produce faster and more successful results.
Token funding for basic research on the biology of aging makes no sense, Hayflick argues, when it’s clear that aging is the condition that increases vulnerability to age-associated diseases. Physicians and other experts on aging talk glibly, he says, about age-associated diseases such as cancer, cardiovascular disease, Alzheimer’s, and other illnesses for which the elderly are at greater risk. And then they immediately utter the mantra that the greatest risk factor for age-associated diseases is aging. “But,” he adds, “they never ask themselves why all these major causes of death are occurring in older people.” If you try to answer that question logically, he continues, “you come to the conclusion that there must be something in old cells that provides the milieu or the opportunity for age-associated diseases that does not occur in young cells.” Isn’t it therefore highly probable, he conjectures, that “old cells may provide the condition that allows for the emergence of all age-associated diseases?”
If Hayflick’s analysis is correct, shouldn’t a significant part of the fifty percent of the NIA budget for aging research, which Hayflick says is designated for the treatment and cure of Alzheimer’s (Vijg estimates an even higher percentage), be shifted to research on molecular and cellular aging, where a cure may be found?
Hayflick gets emotional in his frustration that researchers are not aggressively pursuing a strategy to understand why old cells are different from young cells: “Why in the hell aren’t we studying the fundamental biology of aging if that is the major risk factor for age-associated diseases? Why are we ignoring it almost 100 percent?”
While unlocking the keys to cellular aging might enable vast numbers of people to live closer to the limit of life expectancy, Hayflick still cautions that it will not extend lifespan beyond its current limit. What then does he say about the limit? Is it fixed, or can it be extended? And if it is possible to increase it, by how much?
Here Hayflick’s analysis turns to an overarching law of nature. He explains that cells, like all things animate and inanimate, are subject to the second law of thermodynamics, which states that energy dissipates or spreads out when not constrained. Applied to aging, this means that entropy (energy dissipation) increases over time—and the increase in entropy forecasts the inevitability of death. Sounds pessimistic, but is that the end of the story? Maybe not.
Vijg acknowledges entropy as a limiting factor, but he believes it could be slowed if we had a better understanding of entropy at the cellular level. He also expresses great faith in science and therefore will not rule out future discoveries that could lead to a significant increase in human lifespan. Hayflick as well will not bet against science, but he adds this stern caveat: “First we must invest substantially in the study of the basic biology of aging.”
Note: The first and second laws of thermodynamics were introduced by Rudolf Clausius and William Thomson around 1850.
Bernard Starr, PhD, is Professor Emeritus at the City University of New York (Brooklyn College), where he directed a graduate program in gerontology. He is founder and editor of a number of publications in the field of aging: The Springer Publishing Company Series on Adulthood and Aging, the Springer Series on Lifestyles and Issues in Aging, and the cutting-edge Annual Review of Gerontology and Geriatrics. For seven years he was writer, producer and host of an award-winning radio commentary, The Longevity Report, on WEVD-AM Radio in NYC. During the same period, for three years, he wrote op-ed articles for the Scripps Howard News Service on healthcare, the “boomers,” and issues of an aging society.
Follow Bernard Starr on Twitter: www.twitter.com/starrprobe
Leonard Hayflick has studied the fundamental biology of aging for over 50 years. He discovered that cultured normal human cells are mortal and age, and that only cancer cells are immortal, thus upsetting a 60-year-old dogma.
Hayflick is a Fellow of the American Association for the Advancement of Science, an Honorary Member of the Tissue Culture Association, and a Life Member of the British Society for Research on Ageing. According to the Institute of Scientific Information, he is one of the most cited contemporary scientists in the world “in the fields of biochemistry, biophysics, cell biology, enzymology, genetics and molecular biology.” Dr. Hayflick is the author of over 280 scientific papers, book chapters and edited books, of which four papers are among the 100 most cited scientific papers of the two million papers published in the basic biomedical sciences from 1961 to 1978.
Dr. Hayflick is the author of the popular book, “How and Why We Age” published in August 1994 by Ballantine Books, NYC and available in 1996 as a paperback. This book has been translated into nine languages and is published in Japan, Brazil, Russia, Spain, Germany, the Czech Republic, Poland, Israel and Hungary. It was a selection of The Book-of-the-Month Club and has sold over 50,000 copies world-wide.
January 15th, 2017
Another unforgettable moment in the almost 8 years of President Barack Hussein Obama’s tenure
Vision to America News
Obama Awards Himself with Distinguished Public Service Medal
January 5, 2017
By: Joe Newby
On Wednesday, Warner Todd Huston reported at Breitbart.com that President Obama added another medal to his collection when he had Defense Secretary Ash Carter award him with the Department of Defense Medal for Distinguished Public Service.
According to Huston:
Secretary Carter awarded his boss with the medal on January 4 during the Armed Forces Full Honor Farewell Review for the President held at Conmy Hall, Joint Base Myer-Henderson Hall in Virginia.
Carter insisted that the medal was a token of appreciation for Obama’s service as commander in chief, the Associated Press reported.
After spending the last few weeks throwing roadblocks in the path of President-elect Donald Trump and his transition team, Obama nonetheless claimed in his remarks to the members of the military in attendance that “We’ve got to make sure that during this transition period that there is a seamless passing of the baton, that there’s continuity.”
“Thus, President Obama officially declares himself the greatest public servant during his own tenure,” a post at Conservative Treehouse said.
Defense Secy Carter presents Pres Obama with Dept of Defense Medal for Distinguished Public Service. — Mark Knoller (@markknoller) January 4, 2017
January 15th, 2017
A Remarkably Complete Analysis of the Recent, Unexpected, and Highly Anti-Israel Resolution That the Obama Administration Allowed to Pass.
Obama’s Betrayal of Israel
by Guy Millière
January 13, 2017
• President Obama’s decision not to use the US veto in the UN Security Council and to let Resolution 2334 pass effectively sets the boundaries of a future Palestinian state. The resolution declares all of Judea, Samaria and East Jerusalem — home to the Old City, the Western Wall and the Temple Mount, the most sacred place in Judaism — “occupied Palestinian territory,” and is a declaration of war against Israel.
• Resolution 2334 nullified any possibility of further negotiations by giving the Palestinians everything in exchange for nothing — not even an insincere promise of peace.
• The next act is the Orwellian-named “peace conference,” to be held in Paris on January 15. It has but one objective: to set the stage to eradicate Israel.
• In this new “Dreyfus trial,” the accused will be the only Jewish state and the accusers will be the OIC and officials from Islamized, dhimmified, anti-Israel Western states. As in the Dreyfus trial, the verdict has been decided before it even starts. Israel will be considered guilty of all charges and condemned. A draft of the declaration to be published at the end of the conference is already available.
• The declaration rejects any Jewish presence beyond the 1949 armistice lines — thereby instituting apartheid. It also praises the “Arab Peace Initiative,” which calls for returning millions of so-called “refugees” to Israel, thus transforming Israel into an Arab Muslim state where a massacre of Jews could conveniently be organized.
• The declaration is most likely meant to serve as the basis for a new Security Council resolution on January 17 that would recognize a Palestinian state inside the “1967 borders,” and be adopted, thanks to a second US abstention, three days before Obama leaves office. The betrayal of Israel by the Obama administration and by Obama himself would then be complete.
• The US Congress is already discussing bills to defund the UN and the Palestinian Authority. If Europeans think that the incoming Trump administration is as spineless as the Obama administration, they are in for a shock.
• Khaled Abu Toameh noted that the Palestinian Authority sees Resolution 2334 as a green light for more murders and violence.
• Daniel Pipes recently wrote that it is time to acknowledge the failure of a “peace process” that is really a war process. He stresses that peace can only come when an enemy is defeated.
• Resolution 2334 and the Paris conference, both promoted by Obama, are, as the great historian Bat Ye’or wrote, simply a victory for jihad.
The Middle East is in chaos. More than half a million people have been killed in the Syrian war and the number is rising. Bashar al-Assad’s army used chemical weapons and barrel bombs against civilians; Russia has bombed schools and hospitals.
Syrians, Christians, Yazidis, Libyans, Yemenis and Egyptians all face lethal threats. Iranian leaders still shout “Death to Israel” and “Death to America” while buying nuclear equipment with money from lifted sanctions. Turkey is sliding toward an Islamist dictatorship, and is unable to stem attacks against it.
The only democratic and stable country in the region is Israel, and that is the country U.S. President Barack Obama, in the final weeks of his term, chooses to incriminate. His decision not to use the US veto in the UN Security Council, to let pass Resolution 2334, effectively sets the boundaries of a future Palestinian state. The resolution also declares all of Judea, Samaria and East Jerusalem, home to the Old City, the Western Wall and the Temple Mount — the most sacred place in Judaism — “occupied Palestinian territory,” and is a declaration of war against Israel.
UNSC Resolution 2334 nullified any possibility of further negotiations, by giving the Palestinians everything in exchange for nothing — not even an insincere promise of peace. US Secretary of State John Kerry’s speech five days later confirmed Obama’s support for the resolution. Kerry, like US Ambassador to the UN Samantha Power, used the existence of Jewish towns and villages in Judea and Samaria as a pretext to endorse the position of Palestinian leaders, who want to ethnically cleanse Jews from these areas. But this was just a prelude.
The next act is the Orwellian-named “peace conference,” to be held in Paris on January 15. It has but one objective: to set the stage to eradicate Israel.
Organized by François Hollande, a failed French President who will leave power in four months, it was supported from the start by the Obama administration. Israeli Defense Minister Avigdor Lieberman called it “the new Dreyfus trial.” The accused will be the only Jewish state and the accusers will be the Organization of Islamic Cooperation (OIC) and officials from Islamized, dhimmified, anti-Israel Western states. As in the Dreyfus trial, the verdict is known before it starts. Israel will be considered guilty of all charges and condemned to what its accusers hope will be the beginning of its end.
Is Barack Obama planning another betrayal of Israel at next week’s Paris “peace conference,” organized by French President François Hollande? Pictured: Obama and Hollande in Washington, May 18, 2012. (Image source: White House)
Some commentators have compared what will happen in Paris to the 1942 Wannsee Conference in Nazi Germany, because the aim seems clearly to be the “final solution” of the “Jewish problem” in the Middle East. A draft of the declaration to be published at the end of the conference is already available. It affirms unreserved support for the “Palestinian Statehood strategy” and the principle of intangibility (that the borders cannot be modified) of the “1967 borders,” including East Jerusalem, the Old City and the Western Wall.
The draft declaration rejects any Jewish presence beyond these borders — thereby instituting apartheid — and praises the “Arab Peace Initiative,” which calls for returning millions of so-called “refugees” to Israel, and thus the transforming of Israel into an Arab Muslim state — where a massacre of the Jews could conveniently be organized.
The declaration is most likely meant to be the basis for a new UN Security Council resolution that would endorse the recognition of a Palestinian state in the “1967 borders” as defined in the declaration. The new resolution could be adopted by a second US abstention at the Security Council on January 17, three days before Obama leaves office. The betrayal of Israel by the Obama administration and by Obama himself would then be complete.
On January 20, however, Donald J. Trump is to take office as President of the United States. Trump sent a message on December 23: “Stay strong Israel, January 20th is fast approaching!” He added explicitly that the U.S. “cannot continue to let Israel be treated with such total disdain and disrespect.”
On January 5, the US House of Representatives approved a text harshly criticizing Resolution 2334. Congress is already discussing defunding the UN and the Palestinian Authority. If Europeans and members of the UN think the incoming Trump administration is as spineless as the Obama administration, they are in for a shock.
Wall Street Journal columnist Bret Stephens recently wondered if the creation of a Palestinian state would alleviate the current Middle East chaos. His answer was that it would not, and that the creation of a Palestinian state would be seen as a victory for jihadists. He also noted that the Palestinian Authority still behaves like a terrorist entity; that an Israeli withdrawal from Judea and Samaria would encourage Hamas and lead to the creation of another terrorist Islamic state in the West Bank, and that an Israeli withdrawal is something that most Palestinians do not even want:
“[A] telling figure came in a June 2015 poll conducted by the Palestinian Center for Public Opinion, which found that a majority of Arab residents in East Jerusalem would rather live as citizens with equal rights in Israel than in a Palestinian state.”
Khaled Abu Toameh, an Arab journalist who has never yet been wrong, noted that the Palestinian Authority sees Resolution 2334 as a green light for more violence, murders and confrontation. He added that if presidential elections by the PA were held today, Hamas leader Ismail Haniyeh would win by a comfortable margin.
In another important article, Middle East scholar Daniel Pipes writes that it is time to acknowledge the failure of a “peace process” that is really a war process. He stressed that peace can only come when an enemy is defeated. He predicts that for peace to come, Israel must win unambiguously, and the Palestinians pass through “the bitter crucible of defeat, with all its deprivation, destruction, and despair.”
Jihadi indoctrination, as well as the financial aid given to Palestinian terrorists, have been paid for by the United States, France, and other Western European nations. That too should stop.
Resolution 2334 and the Paris peace conference, both promoted by Obama, are, as the great historian Bat Ye’or wrote, simply victories for jihad.
Dr. Guy Millière, a professor at the University of Paris, is the author of 27 books on France and Europe.
January 15th, 2017
My friend, Sol Shalit, sent me a reminder that Jan 11, 2017 was the birthday of William James who was born 175 years ago in New York City. Sol included this wonderful tribute to James’ life and work. I think it is worth republishing for others to read.
It’s the birthday of William James, born in New York City (1842). As a young man, he studied art, then went on to Harvard University and earned a medical degree there. But he was never a practicing doctor — instead, he stayed on as a member of the Harvard faculty. He said: “I originally studied medicine in order to be a physiologist, but I drifted into psychology and philosophy from a sort of fatality. I never had any philosophic instruction, the first lecture on psychology I ever heard being the first I ever gave.”
In 1872, a group of Harvard intellectuals, including James, began a conversation group. Charles Sanders Peirce wrote: “It was in the earliest seventies that a knot of us young men in Old Cambridge, calling ourselves, half-ironically, half-defiantly, ‘The Metaphysical Club,’ — for agnosticism was then riding its high horse, and was frowning superbly upon all metaphysics, — used to meet, sometimes in my study, sometimes in that of William James.” Members came from various academic disciplines, including law, medicine, and philosophy. William’s younger brother, Henry James, wrote about Oliver Wendell Holmes: “He, my brother, and various other long-headed youths have combined to form a metaphysical club, where they wrangle grimly and stick to the question. It gives me a headache merely to know of it.”
William James’ most famous contribution to philosophy is an idea called pragmatism. Pragmatism was first conceived of by Charles Sanders Peirce, but it didn’t catch on. James himself had a hard time understanding Peirce. He wrote to his brother: “I am amused that you should have fallen into the arms of C.S. Peirce, whom I imagine you find a rather uncomfortable bedfellow, thorny and spinous, but the way to treat him is after the fabled ‘nettle’ receipt: grasp firmly, contradict, push hard, make fun of him, and he is as pleasant as anyone; but be overawed by his sententious manner and his paradoxical and obscure statements, wait upon them as it were, for light to dawn, and you will never get a feeling of ease with him any more than I did for years, until I changed my course and treated him more or less chaffingly. I confess I like him very much in spite of his peculiarities, for he is a man of genius and there’s always something in that to compel one’s sympathy.”
Unlike Peirce, William James was not a philosophical genius, and he didn’t see anything wrong with taking a complex concept and oversimplifying it for the sake of making it more accessible. The term “pragmatism” was first used in a lecture James gave at the University of California, Berkeley, in 1898. But James was quick to give the credit for the term to Peirce, who he said had thought of it about 20 years earlier.
According to James, pragmatism valued the practical outcome of an idea above the idea itself. He saw a huge divide in philosophy between what he called “tough-minded” and “tender-minded” ways of looking at the world. He associated a “tough-minded” view with science, empirical evidence, atheism, pessimism, skepticism, and materialism. “Tender-minded,” on the other hand, went along with idealism, optimism, religion, dogma, and free will. James thought that pragmatism was a way of getting beyond this divide, and plenty of other dualities that caused conflict.
The way that pragmatism bridged these divides was to ask, for every idea, what the practical outcome of each of two opposing sides would be. If there was no significant difference in practical outcomes, then there was no significant conflict between the two sides. One of James’ examples was the conflict that philosophers perceived between free will and determinism. James pointed out that believing in free will and believing in determinism made no clear practical difference in how a person lived; therefore, there was no fundamental conflict between them.
James also said that pragmatism was a philosophy of truth. He said, “The true is the name of whatever proves itself to be good in the way of belief, and good, too, for definite assignable reasons.” In James’ pragmatism, “truth” was a large concept — something could be true because it was actually experienced in a direct way, or it could be true because it contributed to overall happiness. So he allowed for a lot of religious and spiritual beliefs to coexist with empirical thinking, because religion was true in the sense that it added meaning to life. He said: “If theological ideas prove to have a value for concrete life, they will be true, for pragmatism, in the sense of being good for so much. How much more they are true, will depend entirely on their relations to the other truths that also have to be acknowledged.” That was another idea of his — that abstract ideas (like religious beliefs) were fine, and could coexist with empirical observations, as long as they did not “clash with other vital beliefs.” So until they started to get in the way, they were true enough.
James’ pragmatism was based in empiricism, in the sense that experience should be the ultimate context for everything. But unlike some of the more rigid empirical philosophers like David Hume, who thought experience was only what was experienced by the senses, James said that experience could also include metaphysical ideas, religion, or anything at all that was part of our experience as human beings.
In the aftermath of the Civil War, it made sense that Americans embraced pragmatism — a new approach, a practical approach, and an attempt to reconcile seemingly opposing sides. Pragmatism was popularized by James, Peirce, and John Dewey, one of Peirce’s students. Dewey lived until 1952, and he had a long and prolific career. By the time he died at the age of 92, he had published 40 books and hundreds of articles. Dewey called his philosophy “instrumentalism” rather than “pragmatism,” but he is generally considered the third major pragmatist. He helped make the philosophy seem even more relevant to Americans, writing about education, art, civic life, and government.
Even though it is such a complex philosophy, today we use the word pragmatism in an offhand way, to mean “practicality.”
Another term that James coined and popularized was “stream of consciousness,” which he meant as a psychological term. He said, “It is a fact that in each of us, when awake (and often when asleep), some kind of consciousness is always going on. There is a stream, a succession of states, or waves, or fields (or of whatever you please to call them), of knowledge, of feeling, of desire, of deliberation, etc., that constantly pass and repass, and that constitute our inner life. The existence of this stream is the primal fact, the nature and origin of it form the essential problem, of our science.” He eventually settled on “stream of consciousness,” an idea that other scholars lifted from psychology and used to talk about literature.