Monthly Archives: April 2014

April 30, 1917 (a Monday)

American Friends Service Committee logo.

On this date, the American Friends Service Committee (AFSC) was founded during what was later known as World War I to give young conscientious objectors ways to serve without joining the military or taking lives. They drove ambulances, ministered to the wounded, and stayed on in Europe after the armistice to rebuild war-ravaged communities.

Following that modest beginning, AFSC has responded in numerous ways to human suffering such as:

  • Feeding thousands of children in Germany and Austria after World War I
  • Helping distressed Appalachian mining communities find alternative means to make a living in the 1930s
  • Negotiating with the Gestapo in Germany to aid Jewish refugees
  • After World War II, sending aid teams to India, China, and Japan
  • Giving aid to civilians on both sides of the Vietnam War and providing draft counseling to thousands of young men
  • Sponsoring conferences for young diplomats in emerging African democracies
  • Establishing economic development programs in Asia, Africa, and Latin America from the 1970s to the present
  • Providing extensive support to the modern U.S. civil rights movement and public school desegregation
  • Working with numerous communities such as Native Americans, immigrants, migrant workers, prisoners, and low-income families on education and justice issues
  • Building peaceful communities all over the world

In 1947, along with British Quakers, AFSC received the Nobel Peace Prize, which recognized their work “…from the nameless to the nameless….”

The Anthropocene Begins: April 28, 1784

Figures from Watt’s 1784 patent for a steam locomotive.

On this date, James Watt’s patent for a steam locomotive was granted. According to Dutch chemist and Nobel laureate Paul Crutzen, who coined the term, this date can therefore be considered the beginning of the Anthropocene, a new geologic epoch defined by the massive impact of humankind on the planet. That impact will endure in the geologic record long after our cities have crumbled.

In 2000, in IGBP Newsletter 41, Crutzen and Eugene F. Stoermer proposed using the term “anthropocene” for the current geological epoch, to emphasize the central role of mankind in geology and ecology. Regarding its start, they wrote:

To assign a more specific date to the onset of the “anthropocene” seems somewhat arbitrary, but we propose the latter part of the 18th century, although we are aware that alternative proposals can be made (some may even want to include the entire holocene). However, we choose this date because, during the past two centuries, the global effects of human activities have become clearly noticeable. This is the period when data retrieved from glacial ice cores show the beginning of a growth in the atmospheric concentrations of several “greenhouse gases”, in particular CO2 and CH4. Such a starting date also coincides with James Watt’s invention of the steam engine in 1784.

_____________________________________________________

Welcome to the Anthropocene
_____________________________________________________

April 28, 1975 (a Monday)

On this date, Peter Gwynne, at the time the science editor of Newsweek, pulled together some interviews from scientists and wrote a nine-paragraph story, entitled “The Cooling World”, about how the planet was getting cooler. Ever since, Gwynne’s “global cooling” story – and a similar Time Magazine piece – have been brandished gleefully by those who say it shows global warming is not happening, or at least that scientists – and often journalists – don’t know what they are talking about.

Fox News loves to cite it. So does Rush Limbaugh. Sen. James Inhofe, R-Okla., has quoted the story on the Senate floor. That one article in 1975 was apparently so brilliant that it has managed to disprove the more than 33,000 peer-reviewed papers written since.

His piece has been used by Forbes as evidence of what the magazine called “The Fiction of Climate Science.” It has been set to music on a YouTube video. It has popped up in a slew of finger-wagging blogs and websites dedicated to climate denial.

But, revisionist lore aside, it was hardly a cover story. It was a one-page article on page 64. It was, Gwynne concedes, written with a bit of hyperbole that sometimes marked the magazine’s prose: “There are ominous signs the earth’s weather patterns have begun to change dramatically…” the piece begins, and warns of a possible “dramatic decline in food production.”

Although the story observed – accurately – that there had been a gradual decrease in global average temperatures from about 1940, by about 1980 it was clear that Earth’s average temperature was headed upward.

Even today, “there is some degree of uncertainty about natural variability,” acknowledged Mark McCaffrey, programs and policy director of the National Center for Science Education based in Oakland, California. “If it weren’t for the fact that humans had become a force of nature, we would be slipping back into an ice age, according to orbital cycles.”

But earth’s glacial rhythms are “being overridden by human activities, especially burning fossil fuels,” McCaffrey noted. The stories about global cooling “are convenient for people to trot out and wave around,” he said, but they miss the point:

What’s clear is we are a force of nature. Human activity – the burning of fossil fuels and land change – is having a massive influence. We are in the midst of this giant geoengineering experiment.

April 27, 3993 B.C.E.

Johannes Kepler (1610)

On this date, God created the universe, according to the German mathematician, astronomer, and mystic Johannes Kepler, considered a founder of modern science. He used biblical chronology to arrive at his date. A rare astronomical conjunction in 1604 also helped Kepler become the first to derive the supposed birth year of Christ, a date that is now universally accepted.

As for Kepler’s calculation about the universe’s birthday, scientists in the 20th century developed the Big Bang theory, which showed that his calculations were off by about 13.7 billion years.

April 27, 1819 (a Tuesday)

Scales of Justice

On this date, one Jesse Boorn of Manchester, Vermont was arrested and brought before the Justice of the Peace for examination. The examination lasted from Tuesday until Saturday. Thus began America’s first known wrongful murder conviction case.

When Russel Colvin had disappeared in 1812, suspicion of foul play had fallen on his brothers-in-law, Jesse and Stephen Boorn, who held Colvin in disdain. Seven years later, an uncle of the suspects, Amos Boorn, had a dream in which Colvin appeared to him and said that he had been slain. Colvin did not identify his killers in the dream but said that his remains had been put in a cellar hole on the Boorn farm. Uncle Amos said the dream was repeated three times. The cellar hole was excavated but no remains were found. Shortly afterward, a dog unearthed some large bones from beneath a nearby stump. Three local physicians examined the bones and summarily declared them human.  The patience of the community snapped and action was demanded.

Artist's depiction of the alleged murder of Russel Colvin.

This is when officials took Jesse Boorn into custody. They would have arrested Stephen Boorn as well, but he had moved to New York. While in custody, Jesse’s cellmate, forger Silas Merrill, told authorities that Jesse had confessed to him. In return for agreeing to testify against Jesse, Merrill was released from jail. Faced with mounting evidence against him, Jesse admitted to the murder, but placed principal blame on Stephen, who legally was beyond the reach of the local authorities. However, a Vermont constable met up with Stephen, and Stephen agreed to return to Vermont with him to clear his name. After his return to Vermont, Stephen confessed as well, although he claimed to have acted in self-defense.

The local physicians then changed their minds that the found bones were human, and declared them animal. Nevertheless, the prosecution pressed ahead with its case and both of the Boorn brothers were convicted and sentenced to death. The Vermont legislature commuted Jesse’s sentence to life in prison, but denied relief to Stephen. Shortly before Stephen was to be hanged on January 28, 1820, Colvin was found living in New Jersey. On Colvin’s return to Vermont, both brothers were released.


April 26, 1983 (a Tuesday)

The faked education crisis.

On 26 April 1983, in a White House ceremony, Ronald Reagan took possession of A Nation At Risk: The Imperative For Educational Reform. The product of nearly two years’ work by a blue-ribbon commission, it reported poor academic performance at nearly every level and warned that the education system was “being eroded by a rising tide of mediocrity.”

A true Cold War document written in the hyperbole of the time, the opening paragraph begins:

Our Nation is at risk. Our once unchallenged preeminence in commerce, industry, science, and technological innovation is being overtaken by competitors throughout the world. This report is concerned with only one of the many causes and dimensions of the problem, but it is the one that undergirds American prosperity, security, and civility. . . the educational foundations of our society are presently being eroded by a rising tide of mediocrity that threatens our very future as a Nation and a people. What was unimaginable a generation ago has begun to occur — others are matching and surpassing our educational attainments. If an unfriendly foreign power had attempted to impose on America the mediocre educational performance that exists today, we might well have viewed it as an act of war. As it stands, we have allowed this to happen to ourselves.

A Nation At Risk inaugurated a series of attacks on public schools. “That was the ‘rising tide’ we got engulfed with — the rising tide of negative reports,” Paul Houston, executive director of the American Association of School Administrators, said recently. “It was an overstatement of the problem, and it led to sort of hysterical responses,” he says. For one, it took liberties with the link between economic development and overall education rates. Yes, the connection makes intuitive sense, he says — but when the dot-com boom made millionaires of ordinary Americans in the 1990s, “no one came to my office and thanked me.” A Nation at Risk also led to “a cottage industry of national reports by people saying how bad things are.”

In 1990, Admiral James Watkins, the Secretary of Energy, commissioned Sandia National Laboratories in New Mexico to document the decline described in A Nation at Risk with actual data. When the systems scientists broke the SAT test scores down into subgroups, they discovered seemingly contradictory results: while the overall average score had declined, the average score of every subgroup had increased – an apparent contradiction explained by the statistical phenomenon known as “Simpson’s paradox”! The results of the so-called Sandia Report discredited much of A Nation At Risk.
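Simpson’s paradox is easy to reproduce: if a lower-scoring subgroup grows as a share of all test-takers, every subgroup’s average can rise even as the overall average falls. A minimal sketch in Python, using made-up numbers rather than the actual Sandia data:

```python
def overall_mean(groups):
    """Weighted mean over (count, mean) pairs."""
    total = sum(n for n, _ in groups)
    return sum(n * m for n, m in groups) / total

# Hypothetical (number of test-takers, mean score) for two subgroups.
year_1 = [(1000, 520), (200, 420)]   # few low-scoring test-takers sit the exam
year_2 = [(1000, 530), (1000, 440)]  # participation broadens a decade later

# Every subgroup's mean score improved between the two years...
for (_, m1), (_, m2) in zip(year_1, year_2):
    assert m2 > m1

# ...yet the overall mean declined, because the composition changed.
print(overall_mean(year_1))  # about 503.3
print(overall_mean(year_2))  # 485.0
```

The overall mean drops from roughly 503 to 485 even though both subgroup means rise, purely because of the shifting mix of test-takers – exactly the kind of anomaly the Sandia analysts found when they disaggregated the SAT data.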

Nevertheless, the Republican administration of Bush the First, finding it politically unacceptable, suppressed the report, which was never officially released. Education Week published an article on the Sandia report in 1991, but unlike A Nation at Risk, the Sandia Report critique received almost no attention. The report was finally published as “Perspectives on Education in America” in 1993 in the Journal of Educational Research, but was ignored by the mass media. This was no doubt due, in part, to the statistical illiteracy of most Americans.

The mindset among the American public that “public schools are broken” can trace its roots back to A Nation at Risk.

The mass hysteria sparked by A Nation At Risk has continued unabated for nearly three decades, fueled by politicians and Wall Street. In fact, the U.S. Department of Education released a report in 2008 entitled, A Nation Accountable: Twenty-five Years After A Nation at Risk, stating:

If we were “at risk” in 1983, we are at even greater risk now. The rising demands of our global economy, together with demographic shifts, require that we educate more students to higher levels than ever before. Yet, our education system is not keeping pace with these growing demands [emphasis added].

The US Department of Misinformation

It is more than a little ironic that the over-the-top rhetoric of A Nation at Risk has now spawned a testing craze that, in fact, puts the nation’s children, and thus our future, truly at risk.  The public school system in the United States is under assault today as never before, but not by foreign powers — it is being destroyed by our own politicians and business tycoons.


April 26, 1989 (a Wednesday)

The April 27th march was a protest against the April 26th editorial.

On this date, Deng Xiaoping, the powerful leader of the Communist Party Elders of China, denounced the student demonstrations in Beijing in an editorial published in the People’s Daily. He called the protests dongluan (meaning “turmoil” or “rioting”) by a “tiny minority.” These highly emotive terms were associated with the atrocities of the Cultural Revolution. Rather than tamping down the students’ fervor, Deng’s editorial further inflamed it. The government had just made the second of several grave mistakes that would lead to the Tiananmen Square Massacre of June, 1989.

April 26, 1831 (a Tuesday)

Charles Darwin by G Richmond.

On this date, Charles Darwin graduated from Christ’s College, Cambridge with a B.A. degree.

April 25, 1953 (a Saturday)

James Watson (left) and Francis Crick in 1959.

On this date, James Watson and Francis Crick published an article in the journal Nature describing the structure of DNA in terms of the now-familiar double helix. Watson was working at the Cavendish Laboratory, University of Cambridge, in early October 1952. He met Francis Crick there and they agreed that, working together, they should be able to discover the structure of DNA that had eluded others. Crick brought to the project his knowledge of x-ray diffraction, while Watson brought knowledge of phage and bacterial genetics. In April 1953 they jointly published their theory, complete with a diagram of “two helical chains coiled round the same axis.” Watson (age 25 at the time) was born in Chicago; Crick (age 36 at the time) was born in Northampton, England. Their discovery won them, together with Maurice Wilkins, the Nobel Prize in Physiology or Medicine in 1962.

Human Genome Project director Francis Collins says, even 50 years later, it’s impossible to overstate the importance of knowing the structure of DNA:

It is so intertwined in every bit of what we do experimentally, in terms of perceiving our own position in the scheme of life on this planet. It has become one of those givens that is so central to your thinking that you stop thinking about it, but if somebody took it away from you, your whole intellectual foundation would collapse, and it would be unimaginable what we would be doing now if we didn’t know about the double helix.

Furthermore, DNA is not just an instruction book for the present and something to pass on to future generations – it is also a record of our genetic past. No longer do researchers look for clues to human history merely in fossil bones and stone tools; they also seek “genetic fossils” in the DNA of living peoples.

April 24, 1863 (a Friday)

Abraham Lincoln

On this date, the Union Army of the United States issued General Order No. 100, signed and authorized by President Abraham Lincoln, which provided a code of conduct for federal soldiers and officers when dealing with Confederate prisoners and civilians during the American Civil War. There was no document like it in the world at the time, and other countries soon adopted the code. In fact, its influence can be seen in the Geneva Conventions.

The German-American jurist and political philosopher Francis Lieber was the principal civilian proponent and principal author of the order, and so it has come to be known as the Lieber Code of 1863. It is also known as Instructions for the Government of Armies of the United States in the Field, or Lieber Instructions. Its main sections were concerned with, among other things, how prisoners of war should be treated. More specifically, it forbade the use of torture to extract confessions and described the rights and duties of prisoners of war and of capturing forces, to wit, Article 16:

Military necessity does not admit of cruelty–that is, the infliction of suffering for the sake of suffering or for revenge, nor of maiming or wounding except in fight, nor of torture to extort confessions.

Lieber consistently opposed the abuse of prisoners, and he quickly dispensed with the notion that captured Southern soldiers should be treated as criminals, traitors, or bandits. Instead, they were to be housed humanely and fed “plain and wholesome food.” Torture and public humiliation were forbidden, and chivalry was very much alive: To reward exemplary bravery and honor, captors could even return sidearms to enemy officers.

The irony of a Republican predecessor opposing torture of enemy combatants nearly 150 years before Bush the Second condoned the practice has not been lost on critics of Bush II. Of course, apologists for the more recent Republican president are fond of pointing out that in other areas, such as habeas corpus, Lincoln was hardly a paragon protector of rights and legal ethics. I fail to see how that exonerates Bush II for his deplorable behavior.

As David Bosco, an assistant professor at the American University School of International Service and a contributing writer to Foreign Policy magazine, has written in an article in The American Scholar entitled “Moral Principle vs. Military Necessity”:

Lieber and Lincoln proudly published their code, flawed and ambiguous though it was. The nation’s current leadership has preferred secret memoranda and strained interpretations. Too often now, the noble effort to expand and codify the international law that Lieber gloried in no longer appeals to the world’s most powerful state. For the good of international law and of the United States, that must change.

April 22, 1927 (a Friday): The Political Flood

Flood refugees on the levee in Greenville, Miss. in 1927. (Courtesy Mississippi Department of Archives and History, accession no.: PI/CI/G74.4, no. 46).

And the rains came. They came in amounts never seen by any white man, before or since. They fell throughout the entire Mississippi River Valley, from the Appalachians to the Rockies. They caused widespread flooding that made 1927 the worst year ever in the valley. The Great Flood of 1927 at one point covered 26,000 square miles in water ten feet deep. More water, more damage, more fear, more panic, more misery, more death by drowning than any American had seen before, or would again.

On this date, President Calvin Coolidge issued a proclamation to the nation. He declared, “The Government is giving such aid as lies within its powers …. But the burden of caring for the homeless rests upon the agency designated by Government charter to provide relief in disaster — the American National Red Cross.” He made no mention of emergency appropriations. Rather, Coolidge, as President of the United States and the Red Cross, asked the public to donate $5 million [$55.9 million in 2005 dollars] to the Red Cross. Additionally, the President created a quasi-governmental commission to assist the Red Cross in the relief effort. Coolidge appointed Herbert Hoover, Secretary of Commerce, as chairman.

The flood propelled Secretary of Commerce Herbert Hoover, who was in charge of flood relief operations, into the national spotlight and set the stage for his election to the Presidency.

The flood had the unlikely effect of contributing to both the election of Herbert Hoover as President, and his defeat four years later. He was much lauded for his masterful handling of the refugee camps, but later concerns over the treatment of blacks in those camps caused him to make promises to the African-American community which he later broke, losing the black vote in his re-election campaign.

Flood refugees near Greenville, Miss. in 1927. (Courtesy U.S. Army Corps of Engineers, Memphis District)

Several reports on the terrible situation in the refugee camps, including one by the Colored Advisory Commission led by Robert Russa Moton, were kept out of the media at the request of Herbert Hoover, with the promise of further reforms for blacks after the presidential election.

However, once elected President in 1928, Hoover ignored Robert Moton and the promises he had made to his black constituency. In the following election of 1932, Moton withdrew his support for Hoover and switched to the Democratic Party. In an historic shift, African Americans began to abandon the Republicans, the party of Abraham Lincoln and the Emancipation Proclamation, and turned to Franklin Delano Roosevelt’s Democratic Party instead.

The flood of 1927 changed America. It put Herbert Hoover in the White House, even while his duplicity in dealing with blacks helped begin the shift of black voters from the Republicans to the Democrats. It inspired Congress to pass a law putting responsibility for the Mississippi in Federal hands, making it easier for both Congress and the public to accept an even larger Federal presence during the New Deal years. And the pressures the flood brought to bear on the delicate racial fabric of the Deep South caused ruptures that could never be mended.

April 22, 1989 (a Saturday)

Zhou Yongjun (at right) on the steps of the Great Hall of the People on April 22, 1989, flanking Guo Haifeng, who is holding a scroll with the students’ demands to reform China.

On this date, the state funeral for Hu Yaobang, the reform-minded Chinese Communist leader whom the students were honoring, was held.  A small handful of student leaders, including Zhou Yongjun and Guo Haifeng, appeared on the steps of the Great Hall of the People in Beijing, clutching their petition for Chinese reform.  They knelt down on the steps in the classic Chinese tradition of waiting for the emperor to receive their petition.  Chinese government officials refused to receive the delegation.  This moment became an iconic photo of the Tiananmen Square protests.

The Seven Point Petition

The “Seven Point Petition” had been drafted on April 18, 1989. In the early morning, hundreds of students from Peking University had gathered around the Monument to the People’s Heroes at Tiananmen Square. They had spent the previous night there guarding the wreaths and flowers dedicated to the newly deceased Hu Yaobang. Wang Dan, Guo Haifeng, Li Jinjin, and Zhang Boli all had been among the crowd and proposed writing a formal petition to the government. After much discussion, the following became their seven demands for government reform:

  • Reevaluate and praise Hu Yaobang’s contributions
  • Negate the previous anti-“spiritual pollution” and anti-“Bourgeois Liberalization” movements
  • Allow unofficial press and freedom of speech
  • Publish government leaders’ income and holdings
  • Abolish the “Beijing Ten-Points” [restricting public assembly and demonstrations]
  • Increase education funding and enhance the compensation for intellectuals
  • Report this movement faithfully

The rebuff of the students would prove to be the first of several grave mistakes by the government that led to the Tiananmen Square Massacre in June, 1989.  On April 23, 1989, Zhou was elected the first President of the Autonomous Students Federation of Beijing Universities.  The students had decided that they needed a central organization to speak for the whole array of Beijing schools that were represented in Tiananmen Square.

Retrospective

Hua Tianyou, professor at the Central Academy of Fine Arts, sculpted the May 4th Movement of 1919, a relief on the Monument to the People’s Heroes in Beijing, in June 1953.

On May 4, 1919, thousands of students from 13 Beijing universities had gathered in Tiananmen Square to protest their government’s weak response to the Treaty of Versailles, which included terms that many felt were unfair to China. That movement soon spread to Shanghai and from students to workers, paving the way for the formation of the Communist Party. Party leaders had viewed the May Fourth movement as so critical to the Communist revolution that in 1958, when they unveiled the Monument to the People’s Heroes in the center of Tiananmen Square, the faces of the 1919 protestors were carved into one side.

In 1989, when students once again converged on the square, they chose the monument as their base. “May Fourth was very important to Chinese history,” says Wang Chaohua, a student organizer who appeared on the government list of 21 most-wanted leaders after the Tiananmen crackdown. “Like the students of May Fourth, we wanted to propose something new.” In both 1919 and 1989, says Wang, who recently completed a doctorate in Asian languages and literature at the University of California at Los Angeles, “political authorities did not command the public imagination. The vacuum was filled by intellectual energy.”

April 21, 1843 (a Friday)

Walther Flemming

On this date, the German biologist Walther Flemming was born. By using aniline dyes, he was able to find a structure which strongly absorbed basophilic dyes, which he named chromatin. He reported that chromatin was correlated to threadlike structures in the cell nucleus. The Belgian scientist Edouard Van Beneden (1846-1910) had independently observed them, too, and they were later named the chromosomes (meaning “colored body”) by German anatomist Wilhelm von Waldeyer-Hartz (1836-1921).

Illustration of the book *Zell-substanz, Kern und Zelltheilung* (1882) by Walther Flemming

Flemming’s greatest accomplishment was first describing mitosis in animal cells (1879), one of the major discoveries in the history of science. As one of the first cytologists, Flemming investigated the process of cell division and the distribution of chromosomes to the daughter nuclei, a process he named mitosis, from the Greek mitos, meaning “thread”. Dividing cells had been observed almost forty years earlier by Carl Nägeli, but he misinterpreted evidence of mitosis as something abnormal in the dead cells he had observed. Flemming observed cell division in salamander embryos, where cells divide at fixed intervals, and of course the staining technique he had developed allowed him to observe the chromosomes clearly. The Polish-German botanist Eduard Strasburger (1844-1912) independently identified a similar process of mitosis in plant cells.

Ultimately, Flemming described the whole process of mitosis, from chromosome doubling to their equal partitioning into the two resulting cells, in a book published in 1882. His terms, like prophase, metaphase, and anaphase, are still used to describe the steps of mitosis. His work helped form the basis of the chromosomal theory of inheritance.

April 19, 1895 (a Friday)

Anatomical regions of the human body.

The Basle Nomina Anatomica (BNA) was published in Latin after its unanimous approval on this date at the IX Congress of the Anatomische Gesellschaft in Basel, Switzerland.

In the late nineteenth century some 50,000 terms for various human body parts had been in use. The same structures were described by different names, depending (among other things) on the anatomist’s school and national tradition. Vernacular translations of Latin and Greek, as well as various eponymous terms, were barriers to effective international communication. There was disagreement and confusion among anatomists regarding anatomical terminology. Work on a new international system of anatomical terminology had begun in 1887 and culminated with publication of the BNA. It reduced the number of anatomical terms from 50,000 down to 5,528.

April 18, 1865 (a Tuesday)

Karl Wilhelm von Naegeli, the man who discouraged Gregor Mendel from further work on genetics.

On this date, Austrian monk Gregor Mendel, 42, sent the results of his seven-year study of peas to the eminent biologist Karl Wilhelm von Nägeli in Munich. Within these results were the basic laws of genetics, which the humble Mendel discovered single-handedly in a small garden at the Brünn monastery (now in Brno, Czech Republic). Nägeli failed to see the importance of the work; he suggested that Mendel try new experiments with different plants. Nägeli’s cool reception, coupled with the failure of the new experiments, was instrumental in Mendel’s abandoning serious research.

April 17, 1975 (a Thursday)

Khmer Rouge fighters celebrate as they enter Phnom Penh on April 17, 1975.

On this date, Phnom Penh in Cambodia fell under the control of the Khmer Rouge, the guerrilla group led by Pol Pot that was funded and fueled by Chinese Communists.  Pol Pot immediately directed a ruthless program to “purify” Cambodian society of capitalism, Western culture, religion, and all foreign influences.  He wanted to turn Cambodia into an isolated and totally self-sufficient Maoist agrarian state.  Foreigners were expelled, embassies closed, and the currency abolished.  Markets, schools, newspapers, religious practices, and private property were forbidden.   Members of the Lon Nol government, public servants, police, military officers, teachers, ethnic Vietnamese, Christian clergy, Muslim leaders, members of the Cham Muslim minority, members of the middle-class, intellectuals, and the educated were identified and executed.  Anyone who opposed was killed.

An undated photograph shows forced laborers digging canals in Kampong Cham province, part of the massive agrarian infrastructure the Khmer Rouge planned for the country.

The Khmer Rouge forced all city residents into the countryside and into labor camps. During the three years, eight months, and 20 days of Pol Pot’s rule, Cambodia faced its darkest days; an estimated 2 million Cambodians, or 30% of the country’s population, died from starvation, torture, or execution. Almost every Cambodian family lost at least one relative during this most gruesome holocaust.

Skulls of victims of the Khmer Rouge at the Killing Fields.

Perhaps the most notorious of the atrocities that occurred under the rule of Pol Pot occurred at Security Prison 21 (S-21), formerly the Tuol Svay Prey High School (named after a royal ancestor of King Norodom Sihanouk of Cambodia) in Phnom Penh.  The five buildings of the complex were converted in August 1975 into a prison and interrogation center by the Khmer Rouge regime.  All the classrooms were converted into cells. The windows were enclosed in iron bars and covered in barbed wire. The classrooms on the ground floor were divided into tiny cells, 0.8 x 2 meters each, for one prisoner. Female prisoners were housed on the middle floors and the upper-story classrooms were converted into mass cells.

S-21 Tuol Sleng Prison was formerly a school.

One of the administration offices belonged to Comrade Duch, a former teacher and the infamous commandant of S-21 who recently stood trial and eventually apologized for his crimes. Alongside Duch was a workforce of 1,720 staff, comprising prison warders, office personnel, interrogators, and general workers. Many of the sub-units of the prison were staffed by children between the ages of 10 and 15 who were specially selected and trained for their roles. They became increasingly dissocialized and evil, and were exceptionally cruel and disrespectful towards the adult prisoners and staff. Untrained children also formed the majority of the medical staff.

From 1975 to 1979, an estimated 17,000 people were imprisoned at S-21 (some estimates run as high as 20,000, though the real number is unknown); there were only twelve known survivors. At any one time, the prison held between 1,000 and 1,500 inmates. They were repeatedly tortured and coerced into naming family members and close associates, who were in turn arrested, tortured, and killed.

Thousands of children died in S-21 Tuol Sleng.

The Khmer Rouge required that the prison staff make a detailed dossier for each prisoner.  Included in the documentation was a photograph.  Since the original negatives and photographs were separated from the dossiers in the 1979-1980 period, most of the photographs remain anonymous today.  The photographs are currently exhibited at the Tuol Sleng Genocide Museum, located at the former site of S-21 in Phnom Penh. (Tuol Sleng in Khmer [tuəl slaeŋ] means “Hill of the Poisonous Trees” or “Strychnine Hill”.)

Prisoner Pon Ny, in leg chain (undated).

Every morning, all prisoners were ordered to pull their shorts down to their ankles so that they could be inspected. Though they remained shackled, they were then ordered to exercise by moving their legs and arms up and down. Prisoners were inspected four times a day to check that their shackles had not come loose.

Toilets consisted of small iron and plastic buckets, and prisoners had to ask the guards’ permission before relieving themselves. If they didn’t, they were beaten or whipped with electrical wire as punishment. They had to stay silent at all times unless being interrogated, and risked electric shock if they disobeyed any of the many regulations.

Bathing consisted of a hose poked through a window, splashing the prisoners with water for a short time. This happened only every two or three days at most, and sometimes as rarely as once a fortnight. The unhygienic living conditions left many prisoners infected with skin rashes and other diseases, for which no medicine was provided.

On January 7, 1979, Vietnam invaded and freed the Cambodian people from the Khmer Rouge’s reign of terror. Six hundred thousand Cambodians fled to refugee camps on the Thai border. Fearful of returning to Cambodia, many had no choice but to emigrate to the United States, France, or Australia.

Today, many people and organizations are educating the world about the Cambodian Killing Fields. Only through awareness will the world remember the lessons of the genocide, honor the memories of the 2 million killed, and promote peace and tolerance so as not to relive the same dark days.

Suggested reading:

  • Haing Ngor and Roger Warner, Survival in the Killing Fields (New York, NY: Carroll & Graf Publishers, 2003). [First published in 1987 as A Cambodian Odyssey by Macmillan Publishing Company.]

April 16, 1953 (a Thursday)

On this date in Washington D.C., President of the United States (and former General of the Army) Dwight D. Eisenhower delivered his “The Chance for Peace” speech to the American Society of Newspaper Editors, which was also broadcast nationwide by radio. In his address, he contemplated a world permanently perched on the brink of war and he appealed to Americans to assess the consequences likely to ensue.

There were two themes to this speech. Delivered in the wake of Joseph Stalin’s death, the speech offered the new Soviet leadership a five-point plan for ending the Cold War.  As seen from the perspective of the U.S.S.R., Eisenhower was “demanding unconditional surrender.” The president’s peace plan quickly vanished without a trace.

However, a second theme was woven into his speech and it is this:  Spending on arms and armies is inherently undesirable. Even when seemingly necessary, it constitutes a misappropriation of scarce resources. By diverting social capital from productive to destructive purposes, war and the preparation for war deplete, rather than enhance, a nation’s strength. And while assertions of military necessity might camouflage the costs entailed, they can never negate them altogether.

Every gun that is made, every warship launched, every rocket fired signifies, in the final sense, a theft from those who hunger and are not fed, those who are cold and are not clothed.

This world in arms is not spending money alone. It is spending the sweat of its laborers, the genius of its scientists, the hopes of its children.

Eisenhower also spoke in specifics:

The cost of one modern heavy bomber is this: a modern brick school in more than 30 cities. It is two electric power plants, each serving a town of 60,000 population. It is two fine, fully equipped hospitals.

It is some 50 miles of concrete highway. We pay for a single fighter with a half million bushels of wheat. We pay for a single destroyer with new homes that could have housed more than 8,000 people.

Eisenhower gave a memorable metaphor and sounded a note of hope:

This, I repeat, is the best way of life to be found on the road the world has been taking.

This is not a way of life at all, in any true sense. Under the cloud of threatening war, it is humanity hanging from a cross of iron. These plain and cruel truths define the peril and point to the hope that comes with this spring of 1953 [emphasis added].

Unfortunately, despite Ike’s popularity and eloquence, Americans had no intention of choosing between guns and butter: they wanted both. The 1950s brought new bombers and new schools, fleets of warships and tracts of freshly built homes spilling into the suburbs. Pentagon budgets remained high throughout the Eisenhower era, averaging more than 50 percent of all federal spending and 10 percent of GDP, figures without precedent in the nation’s peacetime history. In 1952, when Ike was elected, the U.S. nuclear stockpile numbered some 1,000 warheads. By the time he passed the reins to John F. Kennedy in 1961, it consisted of more than 24,000 warheads, and it rapidly ascended later that decade to a peak of 31,000.

This arms buildup was driven by an unstated alliance of interested parties: generals, defense officials, military contractors, and members of Congress, all of whom shared a single perspective. In his 1956 book, The Power Elite, C. Wright Mills, a professor of sociology at Columbia, named this perspective “military metaphysics”. Those embracing this mind-set no longer considered genuine, lasting peace to be plausible. Rather, peace was at best a transitory condition, “a prelude to war or an interlude between wars.”

The beneficiaries of military spending rationalized the arms buildup with — and  vigorously promoted a belief in — the existence of looming national peril.  Whether or not the threat was real, every ominous advance in Soviet capabilities was justification for opening the military-spending spigot wider.  The discovery during the 1950s of a “bomber gap” and later a “missile gap,” for example, provided political ammunition to air-power advocates quick to charge that the nation’s very survival was at risk.  That both “gaps” were fictitious was irrelevant.  Ultimately, appropriations poured forth.

[On 7 February 1961, The New York Times ran a two-column headline on its front page that read: KENNEDY DEFENSE STUDY FINDS NO EVIDENCE OF A ‘MISSILE GAP’. Ike had always insisted there was no such gap; now JFK’s own Secretary of Defense, the former Ford Motor Company president Robert S. McNamara, had no choice but to admit that Eisenhower had been right. Not only could the United States survive a full-scale Soviet ICBM attack, it could come out of it with enough weapons left to destroy every city in the U.S.S.R., kill 180 million Soviet citizens, and take out 80 percent of Soviet industrial capacity. Indeed, studies conducted in 1963 indicated that actual Soviet ICBM strength in 1961 amounted to only 3.5 percent of the official U.S. estimate.]

Knowing at the time that the United States enjoyed an edge in bomber and missile capabilities, Eisenhower understood precisely who benefited from fear-mongering.  Yet to sustain the illusion he was fully in command, Ike remained publicly silent about what went on behind the scenes. Only on the eve of his departure from office did he inform the nation as to what the federal government’s new obsession with national security had wrought.

A half century after Eisenhower summoned us to shoulder the responsibilities of citizenship, we still refuse to do so. In Washington, an aura of never-ending crisis still prevails — and with it, military metaphysics. Ike is someone we should have listened to then, and — with the U.S. today mired in perpetual war and flirting with insolvency and long-term economic stagnation — someone we should listen to even more intently now.


April 15, 1857 (a Wednesday)

On this date, a 3-kg carbonaceous chondrite fell at Kaba, near Debrecen, Hungary. The arrival of this meteorite was described as follows in The Geologist (1859), edited by Samuel Joseph Mackie (pp. 285-6):

About 10 pm an inhabitant of Kaba, sleeping in the open air, was awakened by a noise, different from that of thunder, as he described it, and perceived in the serene sky a luminous globe, of dazzling brightness, following a parabolic course during four seconds. This phenomenon was observed by several inhabitants of the same place. As one of them was riding out the next morning, his horse was frightened by the sight of a black stone, deeply bedded in the soil of the road, the ground around it being depressed and creviced. When dug out the meteorite weighed about 7 pounds. The finder broke off some fragments, and the remainder, weighing 5-1/4 lbs., was deposited in the Museum of the Reformed College at Debreczin.

Samples of the Kaba meteorite and the Cold Bokkeveld meteorite were examined and found to contain organic substances by Friedrich Wöhler, who inferred a biological origin. Ironically, it was Wöhler who had shown that it was possible to make organic chemicals by inorganic means. However, it was only later appreciated that complex carbon molecules can be manufactured in space by purely chemical processes.

April 15, 1452

Leonardo's self portrait

On this date, Leonardo da Vinci was born at Anchiano near Vinci in the Florence area of Italy. It is well known that in one of his unpublished notebooks, Leonardo concluded that some fossil sea shells were the remains of shellfish.

Although “fossil” is now a common and widely used word, whose meaning is known to practically everyone, the general acceptance of the idea that fossils are the remains of ancient organisms required millennia to achieve. One reason for this is that the great age of Earth also was not widely appreciated until relatively recently. Without an Earth eons old the idea of ancient life and the idea of fossils are meaningless.

The use of fossils in understanding the distant past can be traced back to at least the sixth century B.C.E., when Xenophanes of Colophon lived. Xenophanes described the occurrence of clam shells in rocks outcropping in mountainous parts of Attica. He recognized that these lithified clam shells were closely similar to clams that were then living along the coastline of the Aegean Sea. To account for the occurrence of these lithified clam shells far from the present sea, he argued that they were the preserved remains of clams that had lived at an earlier time when Attica was covered by an ocean. Hippolytus of Rome (c. 170 – c. 236) in his Refutation of all Heresies (1.14.5-6) records that Xenophanes studied the fossils to be found in quarries:

Xenophanes declared that the sea is salty because many mixtures flow together in it… He believes that earth is being mixed into the sea and over time it is being dissolved by the moisture, saying that he has the following kind of proofs, that sea shells are found in the middle of the earth and in mountains, and the impressions of a fish and seals have been found at Syracuse in the quarries, and the impression of a laurel leaf in the depth of the stone in Paros, and on Malta flat shapes of all marine life. He says that these things occurred when all things were covered with mud long ago and the impressions were dried in the mud.

However, in Xenophanes’ time there were no quantitative methods for verifying this hypothesis, and so his rather modern-sounding explanation for these clams could not be tested and disappeared from view. This interpretation of fossils did not reappear in history until Leonardo da Vinci, although even he did not contribute to the understanding of fossils, since his views were never published.

April 15, 1989 (a Saturday)

Hu Yaobang (r.) and Deng Xiaoping – Sept 1981

On this date, former Chinese Communist Party General Secretary Hu Yaobang, deposed in 1987, died of a massive heart attack. People began to gather in Tiananmen Square to commemorate Hu and voice their discontents. This was the beginning of the events that would lead to the Tiananmen Square massacre in June.

Hu Yaobang was a reformist, who served as General Secretary from 1980 to 1987. He advocated rehabilitation of people persecuted during the Cultural Revolution, greater autonomy for Tibet, rapprochement with Japan, and social and economic reform. As a result, he was forced out of office by the hardliners in January of 1987, and made to offer humiliating public “self-criticisms” for his allegedly bourgeois ideas.

Chinese Students Demonstrate After Hu Yaobang’s Death, photo dated 21-22 April 1989.

One of the charges leveled against Hu was that he had encouraged (or at least allowed) wide-spread student protests in late 1986. As General Secretary, he refused to crack down on such protests, believing that dissent by the intelligentsia should be tolerated by the Communist government.

Official media made just brief mention of Hu’s death, and the government at first did not plan to give him a state funeral. In reaction, university students from across Beijing marched on Tiananmen Square, shouting acceptable, government-approved slogans, and calling for the rehabilitation of Hu’s reputation. Bowing to this pressure, the government decided to accord Hu a state funeral after all.

Subverting the Truth: April 13, 1917 (a Friday)

On this date, Woodrow Wilson, the 28th U.S. president, created the Committee on Public Information (CPI) as an independent agency by Executive Order 2594. The CPI blended advertising techniques with a sophisticated understanding of human psychology, and its efforts represent the first time that a modern government disseminated propaganda on such a large scale. It is fascinating that this phenomenon, often linked with totalitarian regimes, emerged in a democratic state.

‘Enlist U.S. Army’ is the caption of this World War I propaganda poster for enlistment in the US Army.

George Creel, director of the CPI, recruited publicity agent Edward L. Bernays, journalist Walter Lippmann, and others to carry out its mission of reversing negative public sentiment about the Great War, now known as World War I. Bernays was influential in promoting the idea that America’s war efforts were primarily aimed at “bringing democracy to all of Europe”.

The CPI used a number of techniques to dehumanize the enemy and to promote anti-German sentiment in the United States with the goal of encouraging people to support the war “over there”. Atrocities committed by the other side were reported in detail and sometimes with unreliable facts, while questions about the activity of American forces and their allies were suppressed.

The committee’s propaganda and censorship worked beyond all expectations. Mobs lynched German Americans. Nearly 5,000 were jailed for being of German descent. Businesses barred people with German names from working for them. People were coerced into buying war bonds to prove their loyalty. Many people changed their names. For example, Mueller became Miller.

After WW I, Bernays took the techniques he learned in the CPI directly to Madison Avenue and became an outspoken proponent of propaganda as a tool for democratic government. “It was, of course, the astounding success of propaganda during the war that opened the eyes of the intelligent few in all departments of life to the possibilities of regimenting the public mind,” wrote Bernays in Propaganda, published in 1928. “It was only natural, after the war ended, that intelligent persons should ask themselves whether it was not possible to apply a similar technique to the problems of peace.” He also wrote:

The conscious and intelligent manipulation of the organized habits and opinions of the masses is an important element in democratic society. Those who manipulate this unseen mechanism of society constitute an invisible government which is the true ruling power of our country.… We are governed, our minds are molded, our tastes formed, our ideas suggested, largely by men we have never heard of. This is a logical result of the way in which our democratic society is organized. Vast numbers of human beings must cooperate in this manner if they are to live together as a smoothly functioning society.… In almost every act of our daily lives, whether in the sphere of politics or business, in our social conduct or our ethical thinking, we are dominated by the relatively small number of persons…who understand the mental processes and social patterns of the masses. It is they who pull the wires which control the public mind. (Propaganda, 1928)

Bernays was the nephew of Sigmund Freud. Freud divided the mind into the conscious mind, which consists of all the mental processes of which we are aware, and the unconscious mind, which contains the irrational, biologically based instincts behind the primitive urges for sex and aggression. Combining the ideas of Gustave Le Bon (The Crowd: A Study of the Popular Mind, 1895) and Wilfred Trotter (Instincts of the Herd in Peace and War, 1916) on crowd psychology with those of his uncle, Bernays was one of the first to attempt to manipulate public opinion by appealing to, and attempting to influence, the unconscious. He felt this manipulation was necessary in society, which he regarded as irrational and dangerous as a result of the “herd instinct”.

Bernays’ basic idea was that human behavior is driven more by emotion than by logic and that by harnessing that emotion at a group level you could get people to do what you wanted them to do. In Propaganda, he wrote, “If we understand the mechanism and motives of the group mind, is it not possible to control and regiment the masses according to our will without their knowing about it?”

Bernays believed that to maintain order the populace must be kept docile, and that to be kept docile, people must be kept content and happy — or at least be told that they are happy. The real irony is that, in order to convince people of their contentment, Bernays’ method manipulated their mindsets in such a way that they could never actually be contented. He ensured that people would instead be in endless pursuit of happiness. From that point on, no matter how good a product might be, it could never satisfy people indefinitely; only the endless search for the elusive product that might finally satisfy them could keep them placid.

The creation of consumerism didn’t mean people were satisfied; instead they were offered satisfaction as a goal to aim for — a goal whose posts could be continually and cunningly moved just beyond reach. In 1929 Charles F. Kettering, director of research at General Motors, wrote in an article entitled “Keep the Consumer Dissatisfied” that the “key to economic prosperity is the organized creation of dissatisfaction…If everyone were satisfied no one would want to buy the new thing.”
____________________________________________________

Edward Bernays: “Torches of Freedom”
____________________________________________________

Bernays’ method served a greater purpose than domestic tranquility. The Great War spurred the development of mass production techniques to supply huge quantities of war material. After the war, industry could produce consumer goods in much greater quantities and for less. For example, Henry Ford pioneered the mass production of automobiles — in the 1920s, his assembly lines dramatically lowered the cost of an automobile so that millions could afford them. However, those running the corporations were worried about overproduction — that people might actually stop buying things once they had what they needed. “We must shift America from a needs, to a desires culture,” wrote Wall Street banker Paul Mazur (Harvard Business Review, 1927). “People must be trained to desire, to want new things even before the old had been entirely consumed. We must shape a new mentality in America. Man’s desires must overshadow his needs.” Bernays claimed he was the first to tell car companies they could sell cars as a symbol of male sexuality.

In his work for major corporations, one of Bernays’ most spectacular successes was helping to break the taboo against women smoking. George Hill, the president of the American Tobacco Company, asked Bernays to find a way to break it. A. A. Brill, one of the first psychoanalysts in America, told Bernays for a large fee that cigarettes were a symbol of the penis and of male sexual power. Women smoking challenged male sexual identity so directly that men were subconsciously keeping women from smoking. Brill told Bernays that if he could find a way to connect cigarettes with the idea of challenging male power, then women would smoke, because then they would have their own penises.

That gave Bernays the idea to hire beautiful young girls to burst out of several different churches along the route of the 1929 Easter Day Parade in New York City and light up. He carefully instructed them to walk arm in arm at the front of the parade, puffing away. Bernays saw that it was news, not advertising, that would get the message to the people and told the press that there was going to be a protest that day on “lighting the torch of freedom”. Half the city’s reporters and photographers were there when they rounded the corner on main street. It was his phrase that hit the headlines – squarely positioning smoking with female independence and liberty.

From that moment on, smoking was seen as a sign of freedom for women. This was a classic appeal to the emotional rather than the rational. It is quite clear that smoking does not make you free (probably a more appropriate slogan for the washing machine or the pill), but the association made women feel powerful, and it stuck. The numbers of women taking up the habit shot through the roof.

Who knew socks could seem so sexy? Interwoven advertisement, circa 1927, by Joseph Christian Leyendecker.

After this success, Lehman Brothers and other big New York banks financed the development of department stores, confident that they could use the techniques pioneered by Bernays to persuade people to purchase a range of products that, left to themselves, they might well not have bothered with. This period also saw the introduction of the techniques of product placement and pseudo-scientific product endorsement so familiar to us today. Buying things because they say something about us, or make us feel a certain way, was a complete transformation in the 1920s, when most selling was done on the basis of information and function. Bernays spent a lifetime helping companies connect with the “irrational emotion” of their customers.

But the peacetime application by the government of what was, after all, a tool of war, began to trouble Americans who suspected that they had been misled. In The New Republic, John Dewey questioned the paternalistic assumptions of those who disguised propaganda as news. “There is uneasiness and solicitude about what men hear and learn,” wrote Dewey, and the “paternalistic care for the source of men’s beliefs, once generated by war, carries over to the troubles of peace.” Dewey argued that the manipulation of information was particularly evident in coverage of post-Revolutionary Russia.

The objective for Bernays was to provide government and media outlets with powerful tools for social persuasion and control. In an article entitled “The Engineering of Consent” (1947) he argued, “The engineering of consent is the very essence of the democratic process, the freedom to persuade and suggest.” But all of this had little, if anything, to do with real democracy. Adolf Hitler learned from the CPI; he wrote in Mein Kampf (1925) admiringly that “the war propaganda of the English and Americans was psychologically correct…There, propaganda was regarded as a weapon of the first order, while in our country it was the last resort of unemployed politicians and a comfortable haven for slackers. And, as was to be expected, its results all in all were zero.” In fact, so impressed was Nazi propaganda minister Joseph Goebbels with Bernays’ early works Crystallizing Public Opinion (1923) and Propaganda that he relied heavily upon them for his own dubious inspiration in the 1930s. Apparently, that Bernays was a Jew mattered little to Goebbels.

Ironically, Bernays’ propaganda campaign for the United Fruit Company (today’s United Brands) in the 1950s had consequences just as evil and terrifying as if he’d worked directly for the Nazis — it led directly to the CIA-supported overthrow of the democratically-elected government of Guatemala.

The term “banana republic” actually originated in reference to United Fruit’s domination of corrupt governments in Guatemala and other Central American countries. The company brutally exploited virtual slave labor in order to produce cheap bananas for the lucrative U.S. market. When a mildly reformist Guatemalan government attempted to rein in the company’s power, Bernays whipped up media and political sentiment against it in the early years of the Cold War.

“Articles began appearing in the New York Times, the New York Herald Tribune, the Atlantic Monthly, Time, Newsweek, the New Leader, and other publications all discussing the growing influence of Guatemala’s Communists,” wrote Larry Tye in The Father of Spin: Edward L. Bernays and the Birth of PR (1998). “The fact that liberal journals like the Nation were also coming around was especially satisfying to Bernays, who believed that winning the liberals over was essential. . . . At the same time, plans were under way to mail to American Legion posts and auxiliaries 300,000 copies of a brochure entitled ‘Communism in Guatemala — 22 Facts.’”

____________________________________________________

Edward Bernays: How to Sell a War
____________________________________________________

Bernays’ efforts led directly to a brutal military coup. Tye wrote that Bernays “remained a key source of information for the press, especially the liberal press, right through the takeover. In fact, as the invasion was commencing on June 18 [1954], his personal papers indicate he was giving the ‘first news anyone received on the situation’ to the Associated Press, United Press, the International News Service, and the New York Times, with contacts intensifying over the next several days.”

The result, tragically, was decades of tyranny under a Guatemalan government whose brutality rivaled the Nazis as it condemned hundreds of thousands of people (mostly members of the country’s impoverished Maya Indian majority) to dislocation, torture and death. “The propaganda war Bernays waged in Guatemala set the pattern for future U.S.-led campaigns in Cuba and, much later, Vietnam,” according to Tye. Bernays apparently never regretted his work for United Fruit.

Democratic theory, as interpreted by Jefferson and Paine, was rooted in the Enlightenment belief that free citizens could form respectable opinions about issues of the day and use these opinions to guide their own destiny. In 1820, Jefferson wrote in a letter to William C. Jarvis:

I know of no safe depository of the ultimate power of the society but the people themselves; and if we think them not enlightened enough to exercise that control with a wholesome discretion, the remedy is not to take it from them, but to inform their discretion by education.

Communication between citizens was assumed to be a necessary element of the democratic process. But during World War I, America’s leaders felt that citizens were not making the correct decisions quickly enough, so they flooded the channels of communication with dishonest messages that were designed to stir up emotions and provoke hatred of Germany. The war came to an end, but propaganda did not.

It was Bernays’ idea to sell warfare as the spreading of democracy, an idea that rules the American thought process to this very day. The power of this campaign can be seen a full hundred years later: the most common reason given by heads of state for military intervention abroad is to “bring democracy”, whether to Europe, Asia, or the Middle East. The region is irrelevant; the only goal is to bend the will and thoughts of society toward the necessity of a particular intervention. Democracy works so well as a rallying cry because it flatters the recipient of the propaganda, creating the illusion that the collective group is already extremely lucky to “have democracy”, and that those who want to bring democracy elsewhere are performing something noble and needed, for the benefit of humanity. Thus, once the propaganda has taken hold in the collective mind, anyone putting forth a different viewpoint is seen as “against democracy”, or against the essential tenets of the society in which they reside.

Just as troubling, Bernays realized that the same technique could be used for selling products, by appealing to the emotions rather than to the intellect. He helped to shift America from a needs-based economy to a culture of desire. (No, you do not logically need a new car — but just think of how much better you are going to feel when you have one!) In the November 1924 issue of The Atlantic Monthly, journalist Samuel Strauss lamented, “Something new has come to confront American democracy… [T]he American citizen’s first importance to his country is no longer that of citizen but that of consumer.” Rail and airline passengers become “consumers” of the service called “transport”; one attends university classes as a consumer (of the degree, not the knowledge); and a visit to a doctor is for the purpose of consuming medical care.

More recently, soon after the September 11 attacks, members of the Bush administration exhorted Americans to demonstrate their patriotism by maintaining high levels of consumer spending. House Minority Leader Dick Gephardt proclaimed that Americans were “not giving up on America, they’re not giving up on our markets.” Treasury Secretary Paul O’Neill said, “We’re going to show we have backbone.” President Bush declared that the American economy was “open for business,” and Vice President Cheney urged Americans to “stick their thumb in the eye of the terrorists and…not let what’s happened here in any way throw off their normal level of economic activity.” Interestingly, in his memoir Decision Points (pp 443-4), which was published in 2010, Bush commented, “Later, I would be mocked and criticized for telling Americans to ‘go shopping’ after 9/11. I never actually used that phrase, but that’s beside the point. In the threat-filled months after 9/11, traveling on airplanes, visiting tourist destinations, and, yes, going shopping, were acts of defiance and patriotism.”

Newsweek cover, 23 March 2009.

Treating people as consumers and convincing them that this is their existential role has profound political implications. First, it objectifies one’s fellow citizens: a person becomes not a person but a provider of a commercial service on demand, such as transportation, a college degree, or medical care. Second, and implied by the first, no social interaction is expected between the “provider” and the “consumer”. Third, since people are buyers, it is in their interest to buy at the lowest possible price. The consequence of all three is that the transaction, be it for transport, schooling, or medical aid, is an exchange in which the buyer views the seller as a thing that conveys a commodity. Finally, for a consumer, paying taxes to the government is an involuntary reduction in the income available to spend on commodities. The government thereby denies consumers part of what brings them fulfillment — income to spend on commodities — which is why so many people today view the government not as “us” but as “them”. In a nutshell, the governing impulse of the consumer is “I want.”

The word “citizen” has its roots in the word “city” – an inhabitant of a city, a member of a community. As a member of a community, being a citizen means being part of something bigger than oneself by participating in it. Whether we want to acknowledge it or not, everything and everyone is interconnected, interdependent. To a citizen, the provider of a commercial service is a fellow worker and participant in civil society. The transaction between the two is an exchange in which the buyer views the seller as a fellow citizen, an equal with basic human rights, among which is being paid decently. It takes no great insight to realize that obtaining commodities as cheaply as possible implies driving down one’s own income. And a citizen is not “buying healthcare”, but is making sure everyone in his/her community is healthy, because if there is sickness, it is bad for everyone, including oneself. It is equally obvious that minimizing taxes implies minimizing those activities and functions, such as public education, that create a society from a collection of isolated individuals. The governing impulse of the citizen is “we need.”

So, the next time you hear a news reporter on television or radio inform you that the cost of healthcare reform is “borne by the taxpayer”, or improved wages for teachers “will increase our taxes”, realize that you are being fed a not-very-subtle political message: you live alone; you need feel no responsibility for other members of society; and collective action for social improvement reduces your happiness. In other words, you are a consumer, not a citizen.

References:

  • Alan Axelrod, Profiles of Folly: History’s Worst Decisions and Why They Went Wrong (New York: Sterling Publishing, 2008).
  • Edward L. Bernays, Crystallizing Public Opinion (New York: Boni and Liveright, 1923).
  • —————–, Propaganda (New York: Liveright Publishing Corporation, 1928).
  • —————–, The engineering of consent. Annals of the American Academy of Political and Social Science No. 250, p. 113 (March 1947).
  • The Century of the Self, 2002. Film. Directed by Adam Curtis. England: BBC Four. Transcript here.
  • Sigmund Freud. (1912) A note on the unconscious in psychoanalysis, in The Standard Edition [SE] of the Complete Psychological Works of Sigmund Freud (Vintage, 1999) vol. 12: 260-6.
  • —————–. (1915) The unconscious, in SE (Vintage, 1999) vol. 14: 159-204.
  • —————–. (1916-1917) Introductory lectures on psychoanalysis, in SE (Vintage, 1999) vol. 22: 1-182.
  • Thomas Jefferson, letter to William Charles Jarvis, September 28, 1820. Quoted in “A Short Exercise for the Fourth of July”, Putnam’s Monthly Magazine of American Literature, Science and Art, vol. 10, no. 55, pp. 103-4 (July 1857).
  • Stewart Justman. Freud and his nephew. Social Research 61: 457–476 (1994).
  • Charles F. Kettering. Keep the Consumer Dissatisfied. Nation’s Business 17, no. 1: 30–31, 79 (January 1929).
  • Paul Mazur, American Prosperity: Its Causes and Consequences (New York, NY: Viking Press, 1928), pp. 24, 44, 47, 50.
  • John Stauber and Sheldon Rampton, Toxic Sludge is Good For You: Lies, Damn Lies and the Public Relations Industry (Common Courage Press, 2002). [But remember: there are some very important people counting on you, and they really would prefer that you didn’t ever hear about this book, much less buy it.]
  • Samuel Strauss. Things Are in the Saddle. The Atlantic Monthly 134: 577-88 (November 1924).
  • Larry Tye, The Father of Spin: Edward L. Bernays and the Birth of PR (Crown, 1998).
  • Woodrow Wilson: “Executive Order 2594 – Creating Committee on Public Information,” April 13, 1917. Online by Gerhard Peters and John T. Woolley, The American Presidency Project. Retrieved from http://www.presidency.ucsb.edu/ws/?pid=75409.

April 12, 1748 (a Friday)

Antoine Laurent de Jussieu (1748-1836)

On this date, the French botanist Antoine Laurent de Jussieu was born in Lyon. He proposed the first natural system of classifying flowering plants (angiosperms), much of which remains in use today.

In his study of flowering plants, Genera plantarum (1789), Jussieu adopted a methodology based on the use of multiple characters to define groups, an idea derived from the Scottish-French naturalist Michel Adanson. This was a significant improvement over the artificial system of Linnaeus, who had grouped plants according to the number of their stamens and pistils. Jussieu did keep Linnaeus’ binomial nomenclature, resulting in a work that was far-reaching in its impact; many of the present-day plant families are still attributed to Jussieu. For example, Morton’s 1981 History of botanical science counts 76 of Jussieu’s families conserved in the ICBN, versus just 11 for Linnaeus.

April 10, 1901 (a Wednesday)

On this date, Duncan MacDougall, MD, performed his first experiment to test a hypothesis, to wit, “If personal continuity after the event of death is a fact, if the psychic functions continue to exist as a separate individuality after the death of brain and body, then it must exist as a substantial material entity.” This implies that this entity should have mass, so MacDougall asked himself, “Why not weigh on accurate scales a man at the very moment of death?”

The following is an extract of a letter written by Dr. MacDougall to Richard Hodgson, MD, dated 10 November 1901, describing MacDougall’s first experiment. The letter was published in May 1907, along with a report of his subsequent experiments, in the Journal of the American Society for Psychical Research:

[Extract of Dr. MacDougall’s letter of 10 November 1901, reproduced from pp. 13–14 of the 1907 Journal report.]

Interestingly, in a commentary published with the report, the editor of the Journal wrote that he “does not share the hopes which many entertain regarding the possibility of ‘weighing a soul,’ but this does not preclude his [MacDougall’s] recognition of the value of experiment, whatever its outcome. The main point is to have a definite conclusion established, whether it be negative or affirmative.”

According to The New York Times, MacDougall was a “reputable physician” and “at the head of a Research Society which for six years has been experimenting in this field.”

April 8, 563 B.C.E. (?)

Colored lanterns in S. Korea at the Lotus Lantern Festival celebrating Buddha’s birthday.

Shakyamuni Buddha, the historical founder of Buddhism, was born Prince Siddhartha Gotama in the foothills of the Himalayas over 2,500 years ago. His birthday is traditionally celebrated on the first full moon day of the sixth month (Vesakha) of the Indian lunar calendar (which would be the fourth month of the Chinese lunar calendar), except in years with an extra full moon, when Buddha’s birthday falls in the seventh month. Well, except where it starts a week earlier. And in Tibet it’s usually a month later…

Oh, and in Japan, Buddha’s Birthday is always celebrated on April 8.

Confused?

Since the occurrence of the full moon varies from year to year, naturally the actual date varies from year to year (except in Japan).  In Southeast Asia, the day is called Vesak Puja or Visakha or Wesak.   “Puja” means “religious service,” so “Vesak Puja” can be translated “the religious service for the month of Visakha.”  This full moon day is the most commonly observed date for Buddha’s birthday.  Upcoming dates for Vesak Puja include:

  • 2010: May 21
  • 2011: May 10
  • 2012: May 28
  • 2013: May 17
  • 2014: May 6
  • 2015: May 25

In South Korea, Buddha’s birthday is a gala week-long celebration that ends on the first full moon day of the lunar month Vesakha.  Throughout Korea, city streets and temples are decorated with lanterns. At Jogyesa Temple in Seoul, the first day begins with religious ceremonies, followed by a street fair near the temple. In the evening a gala lantern parade stretches for miles through the heart of Seoul.  Here are upcoming dates for the celebration in South Korea:

  • 2010: May 15-May 21
  • 2011: May 4-May 10

Buddha’s birthday in Japan.

In Japan, Buddha’s birthday is always celebrated on April 8, although it is not a national holiday.  This day is called Hana Matsuri or “Flower Festival.” In China, the first celebration of the Buddha’s birth is said to have taken place on April 8 during the Later Zhao dynasty (C.E. 319–351), and in Japan it was first held in 606 at the Gangō-ji temple near Nara by order of Empress Suiko. On this day, the statue of the infant Buddha is placed in a flower-decorated shrine symbolizing the beautiful Lumbini garden where the Buddha was born. Sometimes it is carried on a white elephant in a parade, recalling the legendary elephant that brought the Buddha from heaven to the womb of his mother, Queen Maya. People gather around the shrine and pour sweet tea on the statue of the infant Buddha as a substitute for the nectar which is said to have been sprinkled by celestial beings at the time of his birth. The service is therefore called the Kambutsu (Anointing the Buddha) Service.

Celebrating in Tibet.

The entire fourth month of the Tibetan calendar, which usually begins in May and ends in June, is called Saga Dawa (meaning “fourth month”). The seventh day of Saga Dawa is the date of the historical Buddha’s birth for Tibetans. However, the Buddha’s birth, enlightenment and entry into Nirvana at his death are observed together on the 15th day of Saga Dawa, called Saga Dawa Duchen. This is the single most important holiday for Tibetan Buddhism, usually observed with pilgrimages and other visits to temples and shrines. The highlight of Saga Dawa Duchen is the raising of a huge pole which is festooned with prayer flags galore, as pilgrims circumambulate the central ring area with prayer wheels in motion.

April 8, 1805 (a Monday)

Hugo von Mohl

On this date, the German botanist Hugo von Mohl was born in Stuttgart. In 1823, he entered the University of Tübingen. After graduating with distinction in medicine he went to Munich, where he met a distinguished circle of botanists and found ample material for research. Unmarried, Mohl found his pleasures in his laboratory and library, and in perfecting optical apparatus and microscopic preparations, for which he showed extraordinary manual skill. He suggested using the term protoplasm for the ground substance of cells – the nucleus had already been recognized by Robert Brown and others, but Mohl showed in 1844 that the protoplasm is the source of those movements which at that time excited so much attention.

The origin of the cell was unknown in Mohl’s time. Schwann had regarded cell growth as a kind of crystallization, beginning with the deposit of a nucleus about a granule in the intercellular substance – the “cytoblastema”, as Schleiden called it. But Mohl, as early as 1835, had called attention to the formation of new vegetable cells through the division of a pre-existing cell. Ehrenberg, another high authority of the time, contended that no such division occurs, and the matter was still in dispute when Schleiden came forward with his discovery of “free cell-formation” within the parent cell, and this for a long time diverted attention from the process of division which Mohl had described. All manner of schemes of cell-formation were put forward during the ensuing years by a multitude of observers, and gained currency notwithstanding Mohl’s reiterated contention that there are really but two ways in which the formation of new cells takes place – namely, “first, through division of older cells; secondly, through the formation of secondary cells lying free in the cavity of a cell.”

But gradually the researches of such accurate observers as Unger, Nägeli, Kölliker, Reichert, and Remak tended to confirm Mohl’s opinion that cells spring only from cells, and finally Rudolf Virchow brought the matter to demonstration about 1860. His Omnis cellula e cellula became from that time one of the accepted facts of biology.

Mohl’s early investigations on the structure of palms, cycads, and tree ferns permanently laid the foundation of all later knowledge of this subject.  His later anatomical work was chiefly on the stems of dicotyledons and gymnosperms. He first explained the formation and origin of different types of bark, and corrected errors relating to lenticels. Following his early demonstration of the origin of stomata (1838), Mohl wrote a classical paper on their opening and closing (1850). He received many honors during his lifetime, and was elected foreign fellow of the Royal Society of London in 1868.