Blog by Novelist William S. Frankl, MD

Archive for May, 2018

The Hoover Institution/Victor Davis Hanson

Sunday, May 13th, 2018

Victor Davis Hanson was one of the top conservative thinkers of the late 20th century and remains so in our early 21st. He has just received a highly coveted award from the Hoover Institution at Stanford University.

Victor Davis Hanson Wins Edmund Burke Award
Wednesday, May 2, 2018
Hoover Institution, Stanford University

This week Victor Davis Hanson won the 2018 Edmund Burke Award, which honors people who have made major contributions to the defense of Western civilization.

The honor is given annually by The New Criterion, a monthly journal of the arts and intellectual life. Edmund Burke was an 18th-century Irish political philosopher who is credited with laying the foundations of modern conservatism.

Hanson, the Martin and Illie Anderson Senior Fellow at the Hoover Institution, studies and writes about the classics and military history. He received the sixth Edmund Burke Award for Service to Culture and Society at an April 26 dinner in New York City.

“I was honored to receive the award because Edmund Burke is often identified as both a defender of republican values and traditions and a foe of both autocracy and the radical mob rule of the French Revolution. I grew up on a farm and still live there most of the week. I’ve learned over a lifetime from rural neighbors and friends that agrarianism can inculcate a natural conservatism that I think Burke and others saw as an essential check on radicalism and an independence necessary to resist authoritarianism,” Hanson wrote in an email afterwards.

He noted that “candor, truth, and defiance in the face of historical and unfounded attacks on the West are essential.”

Western civilization has always been the only nexus where freedom, tolerance, constitutional government, human rights, economic prosperity, and security can be found together in their entirety, Hanson added.

“We can see that in the one-way nature of migrations from non-West to the West and in the alternatives on the world scene today. The great dangers to the West, ancient and modern, have always been its own successes, or rather the combination of the affluence that results from free-market capitalism and the entitlement accruing from consensual government. The result is that Westerners can become complacent, hypercritical of their own institutions, and convinced that they are not good if not perfect, or that the sins of mankind are the unique sins of the West,” he said.

This complacence, he said, and the idea that “utopia is attainable often results in amnesia” about the past and a sort of ignorance about the often brutal way the world works outside the West.

“Obviously if we do not defend our unique past and culture, who else will?” he said.

In his remarks on April 26, Roger Kimball, the editor and publisher of The New Criterion, said “Victor cuts across the chattering static of the ephemeral, bringing us back to a wisdom that is as clear-eyed and disabused as it is generous and serene.”

Hanson is also the chairman of the Role of Military History in Contemporary Conflict Working Group at the Hoover Institution.

Is Mount Mantap Just “Tired” or “Completely Exhausted”?

Sunday, May 13th, 2018

While President Trump and Secretary of State Pompeo are touting North Korea’s dismantling of its nuclear test site as a great step, it may not be due to a turn toward peace and a nuclear-free Korean Peninsula at all. Please read below what the real reason is likely to be.

Live Science
Why Is North Korea Shutting Down Its Nuclear Test Site?
by Yasemin Saplakoglu
April 27, 2018

A train of mining carts and a new structure are seen at the West Portal spoil pile within the Punggye-ri Nuclear Test Site in North Korea on April 20, 2018. The testing site sits on Mount Mantap, which seems to have “tired mountain syndrome.”
Credit: DigitalGlobe/38 North via Getty Images

Last week, North Korea announced that it will cease all nuclear testing and will shut down its main testing facility at Mount Mantap. Although some believe the decision came because of easing tensions between the country and the world, others think Mount Mantap may have come down with a bad case of “tired mountain syndrome.”

But what exactly is tired mountain syndrome, and how does a mountain “catch” it?

It turns out that repeated nuclear blasts can weaken the rock around underground nuclear test sites, eventually making them unsafe or unusable — which might have happened with North Korea’s preferred testing grounds.

Powerful explosions

The hermit country’s latest nuclear test, conducted in September 2017 at Punggye-ri, was at least 17 times more powerful than the bomb that was dropped on Hiroshima, Japan, in 1945, according to The Washington Post.

In fact, the explosion registered as a magnitude-6.3 earthquake, and before-and-after satellite shots showed visible movement at Mount Mantap — a 7,200-foot-high (2,200 meters) mountain under which deeply buried tunnels house most of the tests. Some geologists think that the mountain is cracking under the pressure.

“You can take a piece of rock and set it on the ground, take a hammer, tap it; nothing will happen,” said Dale Anderson, a seismologist at Los Alamos National Laboratory. You keep tapping it — and, say — the 21st time, “it will break and crack open.”

When a nuclear explosion goes off inside a mountain, it breaks the surrounding rock, and the energy propagates out like a wave (imagine throwing a pebble into a lake). But as more explosions go off around the same — but not exact — spot, rocks that are farther away also begin to crumble under repeated stress.

“The accumulated effect of these explosions that weaken rocks and create that fracturing [farther away from the point of explosion] is what we call tired mountain syndrome,” Anderson told Live Science.

Tired mountain syndrome can also stymie scientists trying to measure how strong an explosion is, he said. The propagating energy scatters around these fractured rocks before reaching the sensors, so the explosion registers as a lot weaker than it actually is, he added.

But this effect “has nothing to do with being able to use the facility,” Anderson said.

In fact, a country can keep using the site but must adjust the mathematical equations it uses so that the final magnitude of the explosion takes tired mountain syndrome into account.

Toxic seepage

If nuclear test sites are shut down, Anderson said, it’s usually a direct consequence of the syndrome. Mountains with this condition become much more permeable, meaning that more pathways open up for gas and liquid to travel through the rock. This means there’s a greater chance for radioactive gas — with the most concerning being xenon — to escape the rock and seep out to the surface, Anderson said.

“Mother nature has already fractured the rock,” Anderson said. “When an explosion goes off, sometimes damage [from it] will connect with natural fractures, and you can conceivably get a pathway up to the surface, and gases will seep out.”

The process by which gas could be pulled up and through the rock is called barometric pumping.

A group of Chinese geologists said on Wednesday (April 25) that they believe the nuclear test site had collapsed and that Mount Mantap was in “fragile fragments,” according to The Washington Post. But William Leith, the senior science adviser for earthquake and geologic hazards at the U.S. Geological Survey — who, with one other scientist, coined the term in 2001 to describe a Soviet nuclear testing site — doesn’t think so. In an interview with CBC Radio in October, when asked if the mountain in North Korea was tired, he said, “I would say, ‘not very tired.’ And that’s because they’ve only had, as far as we know, six underground nuclear explosions, and there’s a lot of mountain left there.”

In comparison, he and his colleagues first used the term to describe Degelen Mountain in the former Soviet Union (now Kazakhstan), which was battered by more than 200 explosions.

North Korea’s mountain may be tired — but whether it’s completely exhausted is difficult to say.

Quantum Mechanics

Sunday, May 13th, 2018

LIVE SCIENCE

What Is Quantum Mechanics?
By Robert Coolman
September 26, 2014

Quantum mechanics is the branch of physics relating to the very small.

It results in what may appear to be some very strange conclusions about the physical world. At the scale of atoms and electrons, many of the equations of classical mechanics, which describe how things move at everyday sizes and speeds, cease to be useful. In classical mechanics, objects exist in a specific place at a specific time. However, in quantum mechanics, objects instead exist in a haze of probability; they have a certain chance of being at point A, another chance of being at point B and so on.

Three revolutionary principles

Quantum mechanics (QM) developed over many decades, beginning as a set of controversial mathematical explanations of experiments that the math of classical mechanics could not explain. It began at the turn of the 20th century, around the same time that Albert Einstein published his theory of relativity, a separate mathematical revolution in physics that describes the motion of things at high speeds. Unlike relativity, however, the origins of QM cannot be attributed to any one scientist. Rather, multiple scientists contributed to a foundation of three revolutionary principles that gradually gained acceptance and experimental verification between 1900 and 1930. They are:

Quantized properties: Certain properties, such as position, speed and color, can sometimes only occur in specific, set amounts, much like a dial that “clicks” from number to number. This challenged a fundamental assumption of classical mechanics, which said that such properties should exist on a smooth, continuous spectrum. To describe the idea that some properties “clicked” like a dial with specific settings, scientists coined the word “quantized.”

Particles of light: Light can sometimes behave as a particle. This was initially met with harsh criticism, as it ran contrary to 200 years of experiments showing that light behaved as a wave, much like ripples on the surface of a calm lake. Light behaves similarly in that it bounces off walls and bends around corners, and the crests and troughs of the wave can add up or cancel out. Added wave crests result in brighter light, while waves that cancel out produce darkness. A light source can be thought of as a ball on a stick being rhythmically dipped in the center of a lake. The color emitted corresponds to the distance between the crests, which is determined by the speed of the ball’s rhythm.

Waves of matter: Matter can also behave as a wave. This ran counter to the roughly 30 years of experiments showing that matter (such as electrons) exists as particles.

Quantized properties?

In 1900, German physicist Max Planck sought to explain the distribution of colors emitted over the spectrum in the glow of red-hot and white-hot objects, such as light-bulb filaments. When making physical sense of the equation he had derived to describe this distribution, Planck realized it implied that combinations of only certain colors (albeit a great number of them) were emitted, specifically those that were whole-number multiples of some base value. Somehow, colors were quantized! This was unexpected because light was understood to act as a wave, meaning that values of color should be a continuous spectrum. What could be forbidding atoms from producing the colors between these whole-number multiples? This seemed so strange that Planck regarded quantization as nothing more than a mathematical trick. According to Helge Kragh in his 2000 article in Physics World magazine, “Max Planck, the Reluctant Revolutionary,” “If a revolution occurred in physics in December 1900, nobody seemed to notice it. Planck was no exception …”

Planck’s equation also contained a number that would later become very important to future development of QM; today, it’s known as “Planck’s Constant.”
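
In modern notation (a standard textbook statement, not spelled out in the article), Planck’s condition says that a light-emitting resonator of frequency ν can give up energy only in whole-number multiples of a base value fixed by that constant:

    E = nh\nu, \qquad n = 1, 2, 3, \ldots, \qquad h \approx 6.626 \times 10^{-34}\ \mathrm{J\,s}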

Quantization helped to explain other mysteries of physics. In 1907, Einstein used Planck’s hypothesis of quantization to explain why the temperature of a solid changed by different amounts if you put the same amount of heat into the material but changed the starting temperature.

Since the early 1800s, the science of spectroscopy had shown that different elements emit and absorb specific colors of light called “spectral lines.” Though spectroscopy was a reliable method for determining the elements contained in objects such as distant stars, scientists were puzzled about why each element gave off those specific lines in the first place. In 1888, Johannes Rydberg derived an equation that described the spectral lines emitted by hydrogen, though nobody could explain why the equation worked. This changed in 1913 when Niels Bohr applied Planck’s hypothesis of quantization to Ernest Rutherford’s 1911 “planetary” model of the atom, which postulated that electrons orbited the nucleus the same way that planets orbit the sun. According to Physics 2000 (a site from the University of Colorado), Bohr proposed that electrons were restricted to “special” orbits around an atom’s nucleus. They could “jump” between special orbits, and the energy produced by the jump caused specific colors of light, observed as spectral lines. Though quantized properties were invented as a mere mathematical trick, they explained so much that they became the founding principle of QM.
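
For reference, Rydberg’s 1888 formula for hydrogen (standard textbook form, added here for clarity) can be written as

    \frac{1}{\lambda} = R_H\left(\frac{1}{n_1^2} - \frac{1}{n_2^2}\right), \qquad R_H \approx 1.097 \times 10^7\ \mathrm{m^{-1}},

where n_1 and n_2 are whole numbers labeling Bohr’s “special” orbits. Each allowed pair of orbits yields exactly one wavelength λ, which is why the spectral lines are discrete rather than continuous.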

Particles of light?

In 1905, Einstein published a paper, “Concerning an Heuristic Point of View Toward the Emission and Transformation of Light,” in which he envisioned light traveling not as a wave, but as some manner of “energy quanta.” This packet of energy, Einstein suggested, could “be absorbed or generated only as a whole,” specifically when an atom “jumps” between quantized vibration rates. This would also apply, as would be shown a few years later, when an electron “jumps” between quantized orbits. Under this model, Einstein’s “energy quanta” contained the energy difference of the jump; when divided by Planck’s constant, that energy difference determined the color of light carried by those quanta.
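
Written as an equation (the standard Planck-Einstein relation, supplied here for clarity), the frequency ν of the light, and hence its color, follows from the energy difference ΔE of the jump:

    \Delta E = h\nu \quad\Longleftrightarrow\quad \nu = \frac{\Delta E}{h}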

With this new way to envision light, Einstein offered insights into the behavior of nine different phenomena, including the specific colors that Planck described being emitted from a light-bulb filament. It also explained how certain colors of light could eject electrons off metal surfaces, a phenomenon known as the “photoelectric effect.” However, Einstein wasn’t wholly justified in taking this leap, said Stephen Klassen, an associate professor of physics at the University of Winnipeg. In a 2008 paper, “The Photoelectric Effect: Rehabilitating the Story for the Physics Classroom,” Klassen states that Einstein’s energy quanta aren’t necessary for explaining all of those nine phenomena. Certain mathematical treatments of light as a wave are still capable of describing both the specific colors that Planck described being emitted from a light-bulb filament and the photoelectric effect. Indeed, in Einstein’s controversial winning of the 1921 Nobel Prize, the Nobel committee only acknowledged “his discovery of the law of the photoelectric effect,” which specifically did not rely on the notion of energy quanta.

Roughly two decades after Einstein’s paper, the term “photon” was popularized for describing energy quanta, thanks to the 1923 work of Arthur Compton, who showed that light scattered by an electron beam changed in color. This showed that particles of light (photons) were indeed colliding with particles of matter (electrons), thus confirming Einstein’s hypothesis. By now, it was clear that light could behave both as a wave and a particle, placing light’s “wave-particle duality” into the foundation of QM.
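
Compton’s observed change in color follows a simple relation (the standard Compton-shift formula, added for reference): light scattered through an angle θ comes out with its wavelength lengthened by

    \lambda' - \lambda = \frac{h}{m_e c}\,(1 - \cos\theta), \qquad \frac{h}{m_e c} \approx 2.43 \times 10^{-12}\ \mathrm{m}.

A pure wave picture predicts no such angle-dependent shift, which is why the measurement was taken as confirmation of the photon.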

Waves of matter?

Since the discovery of the electron in 1897, evidence that all matter existed in the form of particles was slowly building. Still, the demonstration of light’s wave-particle duality made scientists question whether matter was limited to acting only as particles. Perhaps wave-particle duality could ring true for matter as well? The first scientist to make substantial headway with this reasoning was a French physicist named Louis de Broglie. In 1924, de Broglie used the equations of Einstein’s theory of special relativity to show that particles can exhibit wave-like characteristics, and that waves can exhibit particle-like characteristics. Then, in 1925 and 1926, two scientists, working independently and using separate lines of mathematical thinking, applied de Broglie’s reasoning to explain how electrons whizzed around in atoms (a phenomenon that was unexplainable using the equations of classical mechanics). In Germany, physicist Werner Heisenberg (teaming with Max Born and Pascual Jordan) accomplished this by developing “matrix mechanics.” Austrian physicist Erwin Schrödinger developed a similar theory called “wave mechanics.” Schrödinger showed in 1926 that these two approaches were equivalent (though Swiss physicist Wolfgang Pauli sent an unpublished result to Jordan showing that matrix mechanics was more complete).
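
De Broglie’s central relation (standard form, added for clarity) assigns any particle of momentum p a wavelength

    \lambda = \frac{h}{p} = \frac{h}{mv},

so the smaller the mass and the slower the motion, the longer and more noticeable the wave; for everyday objects the wavelength is immeasurably small, which is why the effect matters for electrons but not for baseballs.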

The Heisenberg-Schrödinger model of the atom, in which each electron acts as a wave (sometimes referred to as a “cloud”) around the nucleus, replaced the Rutherford-Bohr model. One stipulation of the new model was that the ends of the wave that forms an electron must meet. In “Quantum Mechanics in Chemistry, 3rd Ed.” (W.A. Benjamin, 1981), Melvin Hanna writes, “The imposition of the boundary conditions has restricted the energy to discrete values.” A consequence of this stipulation is that only whole numbers of crests and troughs are allowed, which explains why some properties are quantized. In the Heisenberg-Schrödinger model of the atom, electrons obey a “wave function” and occupy “orbitals” rather than orbits. Unlike the circular orbits of the Rutherford-Bohr model, atomic orbitals have a variety of shapes, ranging from spheres to dumbbells to daisies.
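
A quick way to see Hanna’s point (a standard textbook argument, not from the article): if the electron wave wraps around an orbit of radius r, requiring that “the ends meet” means a whole number of wavelengths must fit in the circumference,

    n\lambda = 2\pi r \quad\Rightarrow\quad mvr = n\,\frac{h}{2\pi}, \qquad n = 1, 2, 3, \ldots,

using de Broglie’s λ = h/mv. This is precisely Bohr’s quantization rule, now derived from the wave picture rather than assumed.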

In 1927, Walter Heitler and Fritz London further developed wave mechanics to show how atomic orbitals could combine to form molecular orbitals, effectively showing why atoms bond to one another to form molecules. This was yet another problem that had been unsolvable using the math of classical mechanics. These insights gave rise to the field of “quantum chemistry.”

The uncertainty principle

Also in 1927, Heisenberg made another major contribution to quantum physics. He reasoned that since matter acts as waves, some properties, such as an electron’s position and speed, are “complementary,” meaning there’s a limit (related to Planck’s constant) to how precisely each property can be known. Under what would come to be called “Heisenberg’s uncertainty principle,” it was reasoned that the more precisely an electron’s position is known, the less precisely its speed can be known, and vice versa. This uncertainty principle applies to everyday-size objects as well, but is not noticeable because the lack of precision is extraordinarily tiny. According to Dave Slaven of Morningside College (Sioux City, IA), if a baseball’s speed is known to within a precision of 0.1 mph, the maximum precision to which it is possible to know the ball’s position is 0.000000000000000000000000000008 millimeters (about 8 × 10⁻³⁰ mm).
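
Slaven’s figure can be checked with a back-of-the-envelope estimate from the uncertainty relation (assuming a regulation 0.145-kg baseball; the exact number depends on the conventions used):

    \Delta x \gtrsim \frac{\hbar}{2m\,\Delta v} = \frac{1.055 \times 10^{-34}\ \mathrm{J\,s}}{2 \times 0.145\ \mathrm{kg} \times 0.045\ \mathrm{m/s}} \approx 8 \times 10^{-33}\ \mathrm{m} = 8 \times 10^{-30}\ \mathrm{mm},

where 0.1 mph ≈ 0.045 m/s, in agreement with the figure quoted above and absurdly far below anything measurable.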

Onward

The principles of quantization, wave-particle duality and the uncertainty principle ushered in a new era for QM. In 1927, Paul Dirac applied a quantum understanding of electric and magnetic fields to found the study of “quantum field theory” (QFT), which treated particles (such as photons and electrons) as excited states of an underlying physical field. Work in QFT continued for a decade until scientists hit a roadblock: Many equations in QFT stopped making physical sense because they produced results of infinity. After a decade of stagnation, Hans Bethe made a breakthrough in 1947 using a technique called “renormalization.” Here, Bethe realized that all the infinite results traced back to two phenomena (specifically, “electron self-energy” and “vacuum polarization”), and that the observed values of electron mass and electron charge could be used to make all the infinities disappear.

Since the breakthrough of renormalization, QFT has served as the foundation for developing quantum theories about the four fundamental forces of nature: 1) electromagnetism, 2) the weak nuclear force, 3) the strong nuclear force and 4) gravity. The first insight provided by QFT was a quantum description of electromagnetism through “quantum electrodynamics” (QED), which made strides in the late 1940s and early 1950s. Next was a quantum description of the weak nuclear force, which was unified with electromagnetism to build “electroweak theory” (EWT) throughout the 1960s. Finally came a quantum treatment of the strong nuclear force using “quantum chromodynamics” (QCD) in the 1960s and 1970s. The theories of QED, EWT and QCD together form the basis of the Standard Model of particle physics. Unfortunately, QFT has yet to produce a quantum theory of gravity. That quest continues today in the studies of string theory and loop quantum gravity.

Robert Coolman is a graduate researcher at the University of Wisconsin-Madison, finishing up his Ph.D. in chemical engineering. He writes about math, science and how they interact with history. Follow Robert @PrimeViridian.

Einstein/Spooky Action At A Distance

Sunday, May 13th, 2018

If one finds this difficult to understand, read it over several times and then read the next post on Quantum Mechanics. Read that over slowly and more than several times.

Understanding all this is worth the effort. It is, in some ways, a look into the far distant future.

Live Science

Biggest Test Yet Shows Einstein Was Wrong About ‘Spooky Action at a Distance’

By Mindy Weisberger, Senior Writer | May 9, 2018 04:53pm ET

A groundbreaking quantum experiment recently confirmed the reality of “spooky action at a distance” — the bizarre phenomenon that Einstein hated — in which linked particles separated by great distances seemingly communicate faster than the speed of light. And all it took was 12 teams of physicists in 10 countries, more than 100,000 volunteer gamers and over 97 million data units — all of which were randomly generated by hand. The volunteers operated from locations around the world, playing an online video game on Nov. 30, 2016, that produced millions of bits, or “binary digits” — the smallest unit of computer data.

Physicists then used those random bits in so-called Bell tests, designed to show that entangled particles, or particles whose states are mysteriously linked, can somehow transfer information faster than light can travel, and that these particles seem to “choose” their states at the moment they are measured.

Their findings, recently reported in a new study, contradicted Einstein’s principle of “local realism,” study co-author Morgan Mitchell, a professor of quantum optics at the Institute of Photonic Sciences in Barcelona, Spain, told Live Science in an email.

“We showed that Einstein’s world-view of local realism, in which things have properties whether or not you observe them, and no influence travels faster than light, cannot be true — at least one of those things must be false,” Mitchell said.

This leaves two mind-bending possibilities: Either our observations of the world actually change it, or particles are communicating with each other in some manner that we can’t see or influence.

“Or possibly both,” Mitchell added.

Einstein’s worldview — Is it true?

Since the 1970s, physicists have tested the plausibility of local realism by using experiments called Bell tests, first proposed in the 1960s by Irish physicist John Bell.

To conduct these Bell tests, physicists compare randomly chosen measurements, such as the polarization of two entangled particles, like photons, that exist in different locations. If one photon is polarized in one direction (say, up), the other will be polarized sideways only a certain percentage of the time. If the number of times that the particle measurements mirror each other goes above that threshold — regardless of what the particles are or the order in which the measurements are selected — that suggests the separated particles “choose” their state only at the moment they are measured. And it implies that the particles can instantly communicate with each other — the so-called spooky action at a distance that bothered Einstein so much.
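
The “threshold” can be made precise (this is the standard CHSH form of Bell’s inequality, added here for reference). For two measurement settings per side, a and a' on one particle and b and b' on the other, any local-realist theory requires the combined correlation

    S = \left|E(a,b) - E(a,b') + E(a',b) + E(a',b')\right| \leq 2,

whereas quantum mechanics allows S as large as 2√2 ≈ 2.83. Every experiment that measures S above 2 therefore rules against local realism.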

These synched responses thereby contradict the notion of genuinely independent existence, a view that forms the foundation of the principle of local realism upon which the rules of classical mechanics are based. But, time after time, tests have shown that entangled particles do demonstrate correlated states that exceed the threshold; that the world is, indeed, spooky; and that Einstein was wrong.

Volunteers in 190 countries played a game that provided researchers with more than 97 million random bits, which the scientists applied to measurements for entangled particles.

However, Bell tests require that the choice of what to measure be truly random. And that’s hard to show, since unseen factors can influence researchers’ selections, and even computers’ random data generation isn’t truly random. This creates a flaw in Bell tests known as the freedom-of-choice loophole — the possibility that “hidden variables” could influence the settings used in the experiments, the scientists reported. If the measurements aren’t truly random, the Bell tests can’t definitively rule out local realism.

For the new study, the researchers wanted to gather an enormous amount of human-produced data, to be certain they were incorporating true randomness in their calculations. That data enabled them to conduct a broader test of local realism than had ever been done before, and at the same time, it allowed them to close the persistent loophole, the researchers claimed.

“Local realism is a question we can’t fully answer with a machine,” Mitchell said in a statement. “It seems we ourselves must be part of the experiment, to keep the universe honest.”

Random number generators

Their effort, dubbed the Big Bell Test, engaged players — or “Bellsters” — in an online tapping game called Big Bell Quest. Players quickly and repeatedly tapped two buttons on a screen, with respective values of one and zero. Their choices streamed to laboratories on five continents, where the participants’ random choices were used to select measurement settings for comparing entangled particles, the researchers reported.
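
To make the logic concrete, here is a toy Monte Carlo sketch in Python (my illustration, not the study’s code): random bits stand in for the Bellsters’ key presses and pick which of two polarizer angles each side uses, while idealized entangled-photon statistics drive the CHSH value S above the local-realist bound of 2.

    # Toy CHSH Bell-test simulation (illustrative only; the real experiments
    # used entangled photons, atoms and superconducting circuits).
    import math
    import random

    # Polarizer angles (radians) that maximize the quantum CHSH value.
    A_SETTINGS = (0.0, math.pi / 4)              # Alice's settings a, a'
    B_SETTINGS = (math.pi / 8, 3 * math.pi / 8)  # Bob's settings b, b'

    def entangled_pair(a, b):
        """Sample +/-1 outcomes for one photon pair measured at angles a, b.
        For the ideal entangled state the correlation is E = cos(2(a - b))."""
        s = 1 if random.random() < 0.5 else -1
        p_same = (1 + math.cos(2 * (a - b))) / 2
        t = s if random.random() < p_same else -s
        return s, t

    def run(trials=200_000):
        sums, counts = {}, {}
        for _ in range(trials):
            i = random.getrandbits(1)  # stand-in for a volunteer's random bit
            j = random.getrandbits(1)
            s, t = entangled_pair(A_SETTINGS[i], B_SETTINGS[j])
            sums[(i, j)] = sums.get((i, j), 0) + s * t
            counts[(i, j)] = counts.get((i, j), 0) + 1
        E = {k: sums[k] / counts[k] for k in sums}
        return abs(E[0, 0] - E[0, 1] + E[1, 0] + E[1, 1])

    print(f"S = {run():.3f}  (local realism requires S <= 2)")
    # Typically prints S close to 2*sqrt(2) = 2.83, violating the bound.

In a real Bell test the settings must be chosen independently on each side, which is exactly why the study went to such lengths to source genuinely random, human-generated bits.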

During the Big Bell Test initiative on Nov. 30, 2016, over 100,000 people used an online game to generate data for a global physics experiment.

Each of the laboratories performed different experiments, using different particles — single atoms, groups of atoms, photons and superconducting devices — and their results showed “strong disagreement with local realism” in a variety of tests, according to the study, which was published online today (May 9) in the journal Nature.

The experiments also demonstrated an intriguing similarity between humans and quantum particles, related to randomness and free will. If the Bell tests’ human-influenced measurements were truly random — not influenced by the entangled particles themselves — then the behaviors of both humans and particles were random, Mitchell explained.
“If we are free, so are they,” he said.

Original article on Live Science.

A South African Hero

Sunday, May 13th, 2018

On May 9, 1994, South Africa’s newly elected parliament chose Nelson Mandela as the country’s first democratically elected president. More than 22 million South Africans had voted in the election, the first time that black citizens had been allowed to participate. Mandela was the overwhelming winner.

Before his election-day win, Mandela had served nearly 30 years as a political prisoner of the South African government under apartheid. As a leader of the African National Congress party, Mandela had long resisted the racist Nationalist government via peaceful demonstrations, boycotts, and acts of civil disobedience. When he organized a paramilitary group to continue the resistance after the government’s massacre of peaceful black activists, he was charged with treason and eventually sentenced to life in prison.

Mandela spent the majority of his sentence in a small cell without a bed or plumbing. He worked in a quarry and was allowed to write and receive only one letter every six months. Visits were limited to 30 minutes per year. Even in prison, he led a movement of civil disobedience that resulted in officials improving the facility’s conditions for inmates.

Mandela served as South Africa’s president from 1994 until 1999, though he was politically active all the way up to his death in 2013 at the age of 95.


Garrison Keillor’s Writer’s Almanac, May 2017


William S. Frankl, MD, All Rights Reserved