Truth-telling

Veridical, today’s ‘Word of the Day’, brings to mind its use in India’s two national epics, the ‘RaamaayaNam’ and the ‘Mahaabhaaratam’. In both epics, ‘truth-saying’ was among the highest moral values one could practice. The etymology of today’s word, i.e. ‘veri-’ = truth and ‘-dic’ = saying (from Latin verus + dicere), is analogous to the Sanskrit सत्य + वद (truth + saying). Neither the word nor the act of ‘truth-telling’ is very common in these mendacious times, but the parallel etymology interested me.

विष्णुः

धर्मे चार्थे च कामे च मोक्षे च भरतर्षभ |

यदिहास्ति तदन्यत्र। यन्नेहास्ति न तत् क्वचित्॥(१/६२/५३)

dharme cārthe ca kāme ca mokṣe ca bharatarṣabha/

yad ihāsti tad anyatra yan nehāsti na tat kvacit//(1.62.53)

“O King! In matters pertaining to Dharma (righteousness), Artha (economics), Kama (desires), and Moksha (liberation), whatever has been said here may be found elsewhere, but whatever is not found here does not exist anywhere else!”


Facts about the FED

Here’s a good article about the FED:

The Incredible Debt Spider: It’s Time to End the Private Fed

By Rand Clifford
January 17, 2013 “Information Clearing House” – The “Federal Reserve Bank” (Fed) is not part of the United States Government. The Fed is a private, for-profit corporation ultimately owned by eight elite banking families:
1. Rothschilds of London and Berlin
2. Lazard Brothers of Paris
3. Israel Moses Seif of Italy
4. Kuhn, Loeb & Co. of Germany and New York
5. Warburg & Company of Hamburg, Germany
6. Lehman Brothers of New York
7. Goldman, Sachs of New York
8. Rockefeller Brothers of New York
Magically, the Fed has exclusive rights to the dollar, and for them it’s even better than money growing on trees. Whether paying nominal U.S. mint printing costs for Federal Reserve Notes (like those papering your wallet), or simply entering digits into their banks’ computers, they lend this “money” back to us at a profit, making the following seem inevitable:
“If the American people ever allow private banks to control the issue of their money, first by inflation and then by deflation, the banks and corporations that will grow up around them, will deprive the people of their property until their children will wake up homeless on the continent their fathers conquered.”
A problem with this quote, attributed to Thomas Jefferson, is that he apparently never said it. But one great thing about truth is that it rings true no matter who says it.

Here’s some truth from Wright Patman, Texas Democratic congressman from 1929 to 1976:

…it is absolutely wrong for the Government to issue interest-bearing obligations. It is not only wrong: it is extravagant. It is not only extravagant, it is wasteful. It is absolutely unnecessary.

Congressman Patman was chairman of the House of Representatives Committee on Banking and Currency for forty years; for twenty of those years he introduced legislation to repeal the Federal Reserve Banking Act of 1913.
Patman also said:

I have never yet had anyone who could, through the use of logic and reason, justify the Federal Government borrowing the use of its own money. I believe the time will come when people will demand that this be changed. I believe the time will come in this country when they will actually blame you and me and everyone else connected with the Congress for sitting idly by and permitting such an idiotic system to continue.

That “idiotic system” is the Fed.
$454,393,280,417.03…for the year 2011, that’s how much Americans paid a consortium of twelve private banks for creating dollars out of nothing, and loaning them to the government at interest.
There may be no greater con job than the Fed. Even the name is a con—any influence the United States Government has over the Fed is a dog and pony show. The Fed is Top Dog, and there are no reserves to pony up.

Public Awareness
Consider Americans accepting control of their money supply by debt predators—European establishments rank with old money. Seems incredible, but mainstream corporate media (MSM) is controlled by those same old spiders, so they also control the mass American mind. Obviously, even the government is in their web.
Many people aware of America’s submission to Rothschild-controlled central banking call the whole bleeding mess The Crime of the Century.
In June of 2012, when Congress renewed the Fed’s charter for another century, MSM made it a non-event. That’s right, renewal of the Fed’s charter for another 100 years but MSM was loading up for something popularly seismic on their radar: Hostess Twinkies in jeopardy.
Shook the very foundation of consumerism, rumbled from coast to coast. Twenty-dollar Twinkies…fifty?
Twinkies will probably be ushered back via carpet bombing of American attention span with irresistible cream-filled news. And the Fed will barely cast shadows.
No Laws, Only Power
If you had the privilege of creating dollars from thin air, and loaning them to government at interest, you might do anything to protect that privilege. Control of dollars means enough power to keep the privilege going until, as Congressman Patman said, “…the time will come when people will demand that this be changed.”
We may be long past that time, having failed to heed all the warnings, including what presidents before and after Lincoln told us about foreign central banking.
President Andrew Jackson put it this way in 1835:

You are a den of vipers. I intend to rout you out, and by the Eternal God I will rout you out. If the people only understood the rank injustice of our money and banking system, there would be revolution before morning.

The U.S. has had three Rothschild-controlled central banks: the First Bank of the United States (1791-1811), the Second Bank of the United States (1816-1836), and “the Fed” (until death do us part?). Jackson vetoed renewal of the charter for the Second Bank of the United States several years early, on July 10, 1832. Not long after his “…den of vipers” declaration, Jackson survived an assassination attempt when both of the assassin’s pistols misfired; later he told his vice president, “The bank, Mr. Van Buren, is trying to kill me….”
Money is power. A monopoly to create “credit money” from nothing and loan it at compound interest, that’s pure power.
Rothschilds have long been reasonably candid about their predation. In 1838, Amschel Bauer Mayer Rothschild said, “Let me issue and control a nation’s money and I care not who makes its laws.”
In 1912, the year before the Fed doomed the U.S. to a bitter end, Nathaniel Meyer Rothschild told a group of international bankers, “Give me control of the credit of a nation, and I care not who makes the laws.”
Earth is home to 196 recognized “sovereign” nations. In 2000, 189 nations had a Rothschild-controlled central bank. But since the advent of the immortal “War on Terror”, Iraq, Afghanistan, Libya, and Sudan have been captured, leaving only Iran, Cuba, and North Korea still free.
What makes challenging the Rothschild Privilege so dangerous is that protecting their privilege is well within Rothschild Power. Money trumps morality, and most everything else.
The four assassinated U.S. presidents, Lincoln, Garfield, McKinley, Kennedy—they all challenged the Rothschild Privilege….
Lincoln fought Rothschild attempts to get involved in financing the Civil War.
So, through their agent, Treasury Secretary Salmon P. Chase, Rothschilds forced through Congress, in 1864, The National Banking Act.
Lincoln soon warned the American people:

The money power preys upon the nation in time of peace and conspires against it in times of adversity. It is more despotic than monarchy, more insolent than autocracy, more selfish than bureaucracy. I see in the near future a crisis approaching that unnerves me, and causes me to tremble for the safety of our country. Corporations have been enthroned, an era of corruption will follow, and the money power of the country will endeavor to prolong its reign by working upon the prejudices of the people, until the wealth is aggregated in a few hands, and the republic is destroyed.

Lincoln also said, “I have two great enemies, the Southern Army in front of me and the bankers in the rear. Of the two, the one at my rear is my greatest foe.”
Instead of saddling citizens with the 24% to 36% interest demanded by bankers to finance the Civil War for the North, Lincoln came up with “Greenbacks”. $449,338,902 of these full legal tender Treasury Notes were printed. The London Times responded quickly:

If that mischievous financial policy, which had its origin in the North American Republic, should become indurated down to a fixture, then that Government will furnish its own money without cost. It will pay off debts and be without a debt. It will have all the money necessary to carry on its commerce. It will become prosperous beyond precedent in the history of the civilized governments of the world. The brains and the wealth of all countries will go to North America. That government must be destroyed, or it will destroy every monarchy on the globe.

President Garfield warned of the dangers to America should these European Central Bankers ever gain power: “Whoever controls the money of a nation, controls that nation and is absolute master of all industry and commerce. When you realize that the entire system is very easily controlled, one way or another, by a few powerful men at the top, you will not have to be told how periods of inflation and depression originate.” Garfield was assassinated in 1881.
President McKinley began his attack against the Central Bankers with Secretary of State John Sherman (1823-1900). They used the “Sherman Antitrust Act” against the Rothschild supported and funded JP Morgan financial empire known as the Northern Trust, which by the late 1800s owned nearly all of America’s railroads.
McKinley’s fight against the Central Bankers ended with his assassination in 1901. Vice President Theodore “Teddy” Roosevelt took power. Rothschild supported Roosevelt immediately dropped the United States government lawsuits against the Northern Trust.
Kennedy was the other president to create a U.S. money system in defiance of the Rothschild Privilege.
On June 4, 1963, Kennedy signed Executive Order 11110, and the Treasury issued $4.3 billion in U.S. Notes (“Silver Certificates”) into circulation. Five months later, Kennedy was assassinated by a “lone gunman” firing magic bullets.
A congressman in the ’70s and ’80s, Larry P. McDonald fought to expose the hidden holdings of international bankers. McDonald died aboard Korean Air Lines Flight 007 when it was “accidentally” shot down in Soviet airspace on August 31, 1983.
Senator John Heinz and former Senator John Tower were staunch critics of the Fed, and both served on the powerful Senate Banking and Finance Committee. Heinz was killed in a plane crash on April 4, 1991. Tower also died in a plane crash, the next day.
If these “coincidences” are in fact coincidence, and not protection of the Rothschild Privilege, that would be a coincidence almost as profound as the Privilege itself.
Only One Solution
Regardless of how much death and destruction a nation is capable of inflicting, and how much “benign global hegemony” they can enforce militarily, if that nation borrows the use of its own money, how can it be regarded as a superpower?
The only real superpower on the planet is Rothschild-controlled central banking and their debt grip on 98% of nations. When the spider spins its fiat money, an equal amount of debt—plus interest—is created. Compound interest.
In the U.S., we will careen from debt crisis to debt crisis while astounding ignorance of the real problem is massaged into the mass American mind by MSM. Threatened government shutdown after threatened government shutdown while austerity dogs the most vulnerable American citizens and debt crushes all but the .001%, unless….
The only way that we might have a future is if we reclaim the right, mandated in the Constitution, of issuing and controlling our own debt-free money. That means a dead Fed is our only hope. Debt is venom, and the Fed will always keep us topped up.
Are we all out of Jacksons, Lincolns, Garfields, McKinleys, Kennedys…or does the U.S. still have some people power?
Perhaps a better way to put it: The U.S. has tremendous people power, but thanks largely to MSM, the people have little idea how to use their power.
Rand Clifford lives in Spokane, Washington. His novels, CASTLING, TIMING, Priest Lake Cathedral, and many earlier articles are published by StarChief Press.


Birthday Song in Sanskrit

Birthday song in Sanskrit:

Janmadinamidam ayi priya sakhe
sham tanotu hi sarvada mudam |
prarthayamahe bhava satayu:
eshwara sada twam ca rakshatu |
punya karmana keerthimarjaya
jeevanam tava bhavatu saarthakam ||
जन्मदिन गानम्
जन्मदिनमिदम् अयि प्रिय सखे
शं तनोतु हि सर्वदा मुदम् ।
प्रार्थयामहे भव शतायु:
ईश्वर सदा त्वाम् च रक्षतु ।
पुण्य कर्मणा कीर्तिमार्जय
जीवनम् तव भवतु सार्थकम् ॥

anwaya: artha: ca (prose & meaning)

अयि प्रिय सखे जन्मदिनमिदम् शं तनोतु हि

ayi priya sakhe, janmadinam idam sham tanotu hi|

hey dear friend, may this birthday bring you peace.

शतायु: भव

satayu: bhava|

may you live a hundred years.

ईश्वर: त्वाम् च सदा रक्षतु

eshwara: twam ca sada rakshatu|

may God always protect you too.

पुण्य कर्मणा कीर्तिमार्जय

punya karmana keerthim arjaya|

gain glory through meritorious acts.

जीवनम् तव भवतु सार्थकम्

jeevanam tava bhavatu saarthakam|

let your life be fruitful.

इति सर्वदा मुदम् प्रार्थयामहे

iti sarvada mudam prarthayamahe|

thus we always pray for your joy.


Imprison Eric Holder!

That black power criminal has got to go ASAP!


A Quick Smile

Here’s a quick smile:

Obama may look like an idiot and talk like an idiot, but don’t let that fool you. He really is an idiot.


Natural History essay – our fate

Natural History in one essay:

The State of the Species, by Charles C. Mann

Reposted from ORION MAGAZINE

The problem with environmentalists, Lynn Margulis used to say, is that they think conservation has something to do with biological reality. A researcher who specialized in cells and microorganisms, Margulis was one of the most important biologists in the last half century—she literally helped to reorder the tree of life, convincing her colleagues that it did not consist of two kingdoms (plants and animals), but five or even six (plants, animals, fungi, protists, and two types of bacteria).

Until Margulis’s death last year, she lived in my town, and I would bump into her on the street from time to time. She knew I was interested in ecology, and she liked to needle me. Hey, Charles, she would call out, are you still all worked up about protecting endangered species?

Margulis was no apologist for unthinking destruction. Still, she couldn’t help regarding conservationists’ preoccupation with the fate of birds, mammals, and plants as evidence of their ignorance about the greatest source of evolutionary creativity: the microworld of bacteria, fungi, and protists. More than 90 percent of the living matter on earth consists of microorganisms and viruses, she liked to point out. Heck, the number of bacterial cells in our body is ten times more than the number of human cells!

Bacteria and protists can do things undreamed of by clumsy mammals like us: form giant supercolonies, reproduce either asexually or by swapping genes with others, routinely incorporate DNA from entirely unrelated species, merge into symbiotic beings—the list is as endless as it is amazing. Microorganisms have changed the face of the earth, crumbling stone and even giving rise to the oxygen we breathe. Compared to this power and diversity, Margulis liked to tell me, pandas and polar bears were biological epiphenomena—interesting and fun, perhaps, but not actually significant.

Does that apply to human beings, too? I once asked her, feeling like someone whining to Copernicus about why he couldn’t move the earth a little closer to the center of the universe. Aren’t we special at all?

This was just chitchat on the street, so I didn’t write anything down. But as I recall it, she answered that Homo sapiens actually might be interesting—for a mammal, anyway. For one thing, she said, we’re unusually successful.

Seeing my face brighten, she added: Of course, the fate of every successful species is to wipe itself out.

OF LICE AND MEN

Why and how did humankind become “unusually successful”? And what, to an evolutionary biologist, does “success” mean, if self-destruction is part of the definition? Does that self-destruction include the rest of the biosphere? What are human beings in the grand scheme of things anyway, and where are we headed? What is human nature, if there is such a thing, and how did we acquire it? What does that nature portend for our interactions with the environment? With 7 billion of us crowding the planet, it’s hard to imagine more vital questions.

One way to begin answering them came to Mark Stoneking in 1999, when he received a notice from his son’s school warning of a potential lice outbreak in the classroom. Stoneking is a researcher at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. He didn’t know much about lice, but as a biologist it was natural for him to noodle around for information about them. The most common louse found on human bodies, he discovered, is Pediculus humanus. P. humanus has two subspecies: P. humanus capitis—head lice, which feed and live on the scalp—and P. humanus corporis—body lice, which feed on skin but live in clothing. In fact, Stoneking learned, body lice are so dependent on the protection of clothing that they cannot survive more than a few hours away from it.

It occurred to him that the two louse subspecies could be used as an evolutionary probe. P. humanus capitis, the head louse, could be an ancient annoyance, because human beings have always had hair for it to infest. But P. humanus corporis, the body louse, must not be especially old, because its need for clothing meant that it could not have existed while humans went naked. Humankind’s great coverup had created a new ecological niche, and some head lice had rushed to fill it. Evolution then worked its magic; a new subspecies, P. humanus corporis, arose. Stoneking couldn’t be sure that this scenario had taken place, though it seemed likely. But if his idea were correct, discovering when the body louse diverged from the head louse would provide a rough date for when people first invented and wore clothing.

The subject was anything but frivolous: donning a garment is a complicated act. Clothing has practical uses—warming the body in cold places, shielding it from the sun in hot places—but it also transforms the appearance of the wearer, something that has proven to be of inescapable interest to Homo sapiens. Clothing is ornament and emblem; it separates human beings from their earlier, un-self-conscious state. (Animals run, swim, and fly without clothing, but only people can be naked.) The invention of clothing was a sign that a mental shift had occurred. The human world had become a realm of complex, symbolic artifacts.

With two colleagues, Stoneking measured the difference between snippets of DNA in the two louse subspecies. Because DNA is thought to pick up small, random mutations at a roughly constant rate, scientists use the number of differences between two populations to tell how long ago they diverged from a common ancestor—the greater the number of differences, the longer the separation. In this case, the body louse had separated from the head louse about 70,000 years ago. Which meant, Stoneking hypothesized, that clothing also dated from about 70,000 years ago.
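
For readers who like to see the arithmetic, here is a minimal sketch, in Python, of the molecular-clock logic described above. The clock rate and the difference count below are illustrative placeholders, not values from Stoneking’s study.

    # Toy molecular-clock estimate. Mutations accumulate along BOTH lineages
    # after a split, hence the factor of two in the denominator.
    def divergence_time(differences, sites_compared, rate_per_site_per_year):
        divergence = differences / sites_compared        # per-site divergence
        return divergence / (2 * rate_per_site_per_year)

    # Hypothetical inputs: 14 differences across 1,000 compared sites, at an
    # assumed rate of 1e-7 substitutions per site per year.
    print(divergence_time(14, 1000, 1e-7))               # -> 70000.0 years

The estimate is only as good as the assumed rate, which is why dates like “about 70,000 years ago” are always rough.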

And not just clothing. As scientists have established, a host of remarkable things occurred to our species at about that time. It marked a dividing line in our history, one that made us who we are, and pointed us, for better and worse, toward the world we now have created for ourselves.

Homo sapiens emerged on the planet about 200,000 years ago, researchers believe. From the beginning, our species looked much as it does today. If some of those long-ago people walked by us on the street now, we would think they looked and acted somewhat oddly, but not that they weren’t people. But those anatomically modern humans were not, as anthropologists say, behaviorally modern. Those first people had no language, no clothing, no art, no religion, nothing but the simplest, unspecialized tools. They were little more advanced, technologically speaking, than their predecessors—or, for that matter, modern chimpanzees. (The big exception was fire, but that was first controlled by Homo erectus, one of our ancestors, a million years ago or more.) Our species had so little capacity for innovation that archaeologists have found almost no evidence of cultural or social change during our first 100,000 years of existence. Equally important, for almost all that time these early humans were confined to a single, small area in the hot, dry savanna of East Africa (and possibly a second, still smaller area in southern Africa).

But now jump forward 50,000 years. East Africa looks much the same. So do the humans in it—but suddenly they are drawing and carving images, weaving ropes and baskets, shaping and wielding specialized tools, burying the dead in formal ceremonies, and perhaps worshipping supernatural beings. They are wearing clothes—lice-filled clothes, to be sure, but clothes nonetheless. Momentously, they are using language. And they are dramatically increasing their range. Homo sapiens is exploding across the planet.

What caused this remarkable change? By geologists’ standards, 50,000 years is an instant, a finger snap, a rounding error. Nonetheless, most researchers believe that in that flicker of time, favorable mutations swept through our species, transforming anatomically modern humans into behaviorally modern humans. The idea is not absurd: in the last 400 years, dog breeders converted village dogs into creatures that act as differently as foxhounds, border collies, and Labrador retrievers. Fifty millennia, researchers say, is more than enough to make over a species.

Homo sapiens lacks claws, fangs, or exoskeletal plates. Rather, our unique survival skill is our ability to innovate, which originates with our species’ singular brain—a three-pound universe of hyperconnected neural tissue, constantly aswirl with schemes and notions. Hence every hypothesized cause for the transformation of humankind from anatomically modern to behaviorally modern involves a physical alteration of the wet gray matter within our skulls. One candidate explanation is that in this period people developed hybrid mental abilities by interbreeding with Neanderthals. (Some Neanderthal genes indeed appear to be in our genome, though nobody is yet certain of their function.) Another putative cause is symbolic language—an invention that may have tapped latent creativity and aggressiveness in our species. A third is that a mutation might have enabled our brains to alternate between spacing out on imaginative chains of association and focusing our attention narrowly on the physical world around us. The former, in this view, allows us to come up with creative new strategies to achieve a goal, whereas the latter enables us to execute the concrete tactics required by those strategies.

Each of these ideas is fervently advocated by some researchers and fervently attacked by others. What is clear is that something made over our species between 100,000 and 50,000 years ago—and right in the middle of that period was Toba.

CHILDREN OF TOBA

About 75,000 years ago, a huge volcano exploded on the island of Sumatra. The biggest blast for several million years, the eruption created Lake Toba, the world’s biggest crater lake, and ejected the equivalent of as much as 3,000 cubic kilometers of rock, enough to cover the District of Columbia in a layer of magma and ash that would reach to the stratosphere. A gigantic plume spread west, enveloping southern Asia in tephra (rock, ash, and dust). Drifts in Pakistan and India reached as high as six meters. Smaller tephra beds blanketed the Middle East and East Africa. Great rafts of pumice filled the sea and drifted almost to Antarctica.

In the long run, the eruption raised Asian soil fertility. In the short term, it was catastrophic. Dust hid the sun for as much as a decade, plunging the earth into a years-long winter accompanied by widespread drought. A vegetation collapse was followed by a collapse in the species that depended on vegetation, followed by a collapse in the species that depended on the species that depended on vegetation. Temperatures may have remained colder than normal for a thousand years. Orangutans, tigers, chimpanzees, cheetahs—all were pushed to the verge of extinction.

At about this time, many geneticists believe, Homo sapiens’ numbers shrank dramatically, perhaps to a few thousand people—the size of a big urban high school. The clearest evidence of this bottleneck is also its main legacy: humankind’s remarkable genetic uniformity. Countless people have viewed the differences between races as worth killing for, but compared to other primates—even compared to most other mammals—human beings are almost indistinguishable, genetically speaking. DNA is made from exceedingly long chains of “bases.” Typically, about one out of every 2,000 of these “bases” differs between one person and the next. The equivalent figure from two E. coli (human gut bacteria) might be about one out of twenty. The bacteria in our intestines, that is, have a hundredfold more innate variability than their hosts—evidence, researchers say, that our species is descended from a small group of founders.

Uniformity is hardly the only effect of a bottleneck. When a species shrinks in number, mutations can spread through the entire population with astonishing rapidity. Or genetic variants that may have already been in existence—arrays of genes that confer better planning skills, for example—can suddenly become more common, effectively reshaping the species within a few generations as once-unusual traits become widespread.

Did Toba, as theorists like Richard Dawkins have argued, cause an evolutionary bottleneck that set off the creation of behaviorally modern people, perhaps by helping previously rare genes—Neanderthal DNA or an opportune mutation—spread through our species? Or did the volcanic blast simply clear away other human species that had previously blocked H. sapiens’ expansion? Or was the volcano irrelevant to the deeper story of human change?

For now, the answers are the subject of careful back-and-forth in refereed journals and heated argument in faculty lounges. All that is clear is that about the time of Toba, new, behaviorally modern people charged so fast into the tephra that human footprints appeared in Australia within as few as 10,000 years, perhaps within 4,000 or 5,000. Stay-at-home Homo sapiens 1.0, a wallflower that would never have interested Lynn Margulis, had been replaced by aggressively expansive Homo sapiens 2.0. Something happened, for better and worse, and we were born.

One way to illustrate what this upgrade looked like is to consider Solenopsis invicta, the red imported fire ant. Geneticists believe that S. invicta originated in northern Argentina, an area with many rivers and frequent floods. The floods wipe out ant nests. Over the millennia, these small, furiously active creatures have acquired the ability to respond to rising water by coalescing into huge, floating, pullulating balls—workers on the outside, queen in the center—that drift to the edge of the flood. Once the waters recede, colonies swarm back into previously flooded land so rapidly that S. invicta actually can use the devastation to increase its range.

In the 1930s, Solenopsis invicta was transported to the United States, probably in ship ballast, which often consists of haphazardly loaded soil and gravel. As a teenaged bug enthusiast, Edward O. Wilson, the famed biologist, spotted the first colonies in the port of Mobile, Alabama. He saw some very happy fire ants. From the ant’s point of view, it had been dumped into an empty, recently flooded expanse. S. invicta took off, never looking back.

The initial incursion watched by Wilson was likely just a few thousand individuals—a number small enough to suggest that random, bottleneck-style genetic change played a role in the species’ subsequent history in this country. In their Argentine birthplace, fire-ant colonies constantly fight each other, reducing their numbers and creating space for other types of ant. In the United States, by contrast, the species forms cooperative supercolonies, linked clusters of nests that can spread for hundreds of miles. Systematically exploiting the landscape, these supercolonies monopolize every useful resource, wiping out other ant species along the way—models of zeal and rapacity. Transformed by chance and opportunity, new-model S. invicta needed just a few decades to conquer most of the southern United States.

Homo sapiens did something similar in the wake of Toba. For hundreds of thousands of years, our species had been restricted to East Africa (and, possibly, a similar area in the south). Now, abruptly, new-model Homo sapiens were racing across the continents like so many imported fire ants. The difference between humans and fire ants is that fire ants specialize in disturbed habitats. Humans, too, specialize in disturbed habitats—but we do the disturbing.

THE WORLD IS A PETRI DISH

As a student at the University of Moscow in the 1920s, Georgii Gause spent years trying—and failing—to drum up support from the Rockefeller Foundation, then the most prominent funding source for non-American scientists who wished to work in the United States. Hoping to dazzle the foundation, Gause decided to perform some nifty experiments and describe the results in his grant application.

By today’s standards, his methodology was simplicity itself. Gause placed half a gram of oatmeal in one hundred cubic centimeters of water, boiled the results for ten minutes to create a broth, strained the liquid portion of the broth into a container, diluted the mixture by adding water, and then decanted the contents into small, flat-bottomed test tubes. Into each he dripped five Paramecium caudatum or Stylonychia mytilus, both single-celled protozoans, one species per tube. Each of Gause’s test tubes was a pocket ecosystem, a food web with a single node. He stored the tubes in warm places for a week and observed the results. He set down his conclusions in a 163-page book, The Struggle for Existence, published in 1934.

Today The Struggle for Existence is recognized as a scientific landmark, one of the first successful marriages of theory and experiment in ecology. But the book was not enough to get Gause a fellowship; the Rockefeller Foundation turned down the twenty-four-year-old Soviet student as insufficiently eminent. Gause could not visit the United States for another twenty years, by which time he had indeed become eminent, but as an antibiotics researcher.

What Gause saw in his test tubes is often depicted in a graph, time on the horizontal axis, the number of protozoa on the vertical. The line on the graph is a distorted bell curve, with its left side twisted and stretched into a kind of flattened S. At first the number of protozoans grows slowly, and the graph line slowly ascends to the right. But then the line hits an inflection point, and suddenly rockets upward—a frenzy of exponential growth. The mad rise continues until the organism begins to run out of food, at which point there is a second inflection point, and the growth curve levels off again as bacteria begin to die. Eventually the line descends, and the population falls toward zero.
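
That flattened S is the textbook logistic curve. A minimal simulation (the growth rate and carrying capacity here are made-up parameters, not Gause’s data) reproduces the slow start, the explosive middle, and the leveling-off; note that the simple model never shows the final collapse, which in a closed tube comes from the nutrients running out.

    # Logistic growth, dN/dt = r * N * (1 - N / K), stepped with Euler's method.
    r, K = 0.9, 1000.0   # illustrative growth rate and carrying capacity
    N, dt = 5.0, 0.1     # five protozoans to start, as in Gause's tubes
    for step in range(1, 201):
        N += r * N * (1 - N / K) * dt
        if step % 40 == 0:
            print(f"t = {step * dt:4.1f}   N = {N:6.1f}")
    # The printed numbers climb slowly, rocket upward, then flatten near K.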

Years ago I watched Lynn Margulis, one of Gause’s successors, demonstrate these conclusions to a class at the University of Massachusetts with a time-lapse video of Proteus vulgaris, a bacterium that lives in the gastrointestinal tract. To humans, she said, P. vulgaris is mainly notable as a cause of urinary-tract infections. Left alone, it divides about every fifteen minutes. Margulis switched on the projector. Onscreen was a small, wobbly bubble—P. vulgaris—in a shallow, circular glass container: a petri dish. The class gasped. The cells in the time-lapse video seemed to shiver and boil, doubling in number every few seconds, colonies exploding out until the mass of bacteria filled the screen. In just thirty-six hours, she said, this single bacterium could cover the entire planet in a foot-deep layer of single-celled ooze. Twelve hours after that, it would create a living ball of bacteria the size of the earth.
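
The doubling arithmetic behind that claim is easy to sketch. In the back-of-the-envelope below, the fifteen-minute doubling time comes from the text, while the one-cubic-micrometer cell volume is my own rough assumption; the point is not to reproduce her exact figures but to show how quickly unchecked doubling overwhelms any earthly container.

    # Unchecked doubling: every 15 minutes for 36 hours.
    doublings = 36 * 60 // 15        # 144 doublings
    cells = 2 ** doublings           # roughly 2.2e43 cells
    CELL_VOLUME_M3 = 1e-18           # assumed: ~1 cubic micrometer per cell
    print(f"{doublings} doublings -> {cells:.2e} cells")
    print(f"volume ~ {cells * CELL_VOLUME_M3:.2e} cubic meters")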

Such a calamity never happens, because competing organisms and lack of resources prevent the overwhelming majority of P. vulgaris from reproducing. This, Margulis said, is natural selection, Darwin’s great insight. All living creatures have the same purpose: to make more of themselves, ensuring their biological future by the only means available. Natural selection stands in the way of this goal. It prunes back almost all species, restricting their numbers and confining their range. In the human body, P. vulgaris is checked by the size of its habitat (portions of the human gut), the limits to its supply of nourishment (food proteins), and other, competing organisms. Thus constrained, its population remains roughly steady.

In the petri dish, by contrast, competition is absent; nutrients and habitat seem limitless, at least at first. The bacterium hits the first inflection point and rockets up the left side of the curve, swamping the petri dish in a reproductive frenzy. But then its colonies slam into the second inflection point: the edge of the dish. When the dish’s nutrient supply is exhausted, P. vulgaris experiences a miniapocalypse.

By luck or superior adaptation, a few species manage to escape their limits, at least for a while. Nature’s success stories, they are like Gause’s protozoans; the world is their petri dish. Their populations grow exponentially; they take over large areas, overwhelming their environment as if no force opposed them. Then they annihilate themselves, drowning in their own wastes or starving from lack of food.

To someone like Margulis, Homo sapiens looks like one of these briefly fortunate species.

THE WHIP HAND

No more than a few hundred people initially migrated from Africa, if geneticists are correct. But they emerged into landscapes that by today’s standards were as rich as Eden. Cool mountains, tropical wetlands, lush forests—all were teeming with food. Fish in the sea, birds in the air, fruit on the trees: breakfast was everywhere. People moved in.

Despite our territorial expansion, though, humans were still only in the initial stages of Gause’s oddly shaped curve. Ten thousand years ago, most demographers believe, we numbered barely 5 million, about one human being for every hundred square kilometers of the earth’s land surface. Homo sapiens was a scarcely noticeable dusting on the surface of a planet dominated by microbes. Nevertheless, at about this time—10,000 years ago, give or take a millennium—humankind finally began to approach the first inflection point. Our species was inventing agriculture.

The wild ancestors of cereal crops like wheat, barley, rice, and sorghum have been part of the human diet for almost as long as there have been humans to eat them. (The earliest evidence comes from Mozambique, where researchers found tiny bits of 105,000-year-old sorghum on ancient scrapers and grinders.) In some cases people may have watched over patches of wild grain, returning to them year after year. Yet despite the effort and care the plants were not domesticated. As botanists say, wild cereals “shatter”—individual grain kernels fall off as they ripen, scattering grain haphazardly, making it impossible to harvest the plants systematically. Only when unknown geniuses discovered naturally mutated grain plants that did not shatter—and purposefully selected, protected, and cultivated them—did true agriculture begin. Planting great expanses of those mutated crops, first in southern Turkey, later in half a dozen other places, early farmers created landscapes that, so to speak, waited for hands to harvest them.

Farming converted most of the habitable world into a petri dish. Foragers manipulated their environment with fire, burning areas to kill insects and encourage the growth of useful species—plants we liked to eat, plants that attracted the other creatures we liked to eat. Nonetheless, their diets were largely restricted to what nature happened to provide in any given time and season. Agriculture gave humanity the whip hand. Instead of natural ecosystems with their haphazard mix of species (so many useless organisms guzzling up resources!), farms are taut, disciplined communities conceived and dedicated to the maintenance of a single species: us.

Before agriculture, the Ukraine, American Midwest, and lower Yangzi were barely hospitable food deserts, sparsely inhabited landscapes of insects and grass; they became breadbaskets as people scythed away suites of species that used soil and water we wanted to dominate and replaced them with wheat, rice, and maize (corn). To one of Margulis’s beloved bacteria, a petri dish is a uniform expanse of nutrients, all of which it can seize and consume. For Homo sapiens, agriculture transformed the planet into something similar.

As in a time-lapse movie, we divided and multiplied across the newly opened land. It had taken Homo sapiens 2.0, behaviorally modern humans, not even 50,000 years to reach the farthest corners of the globe. Homo sapiens 2.0.A—A for agriculture—took a tenth of that time to conquer the planet.

As any biologist would predict, success led to an increase in human numbers. Homo sapiens rocketed around the elbow of the first inflection point in the seventeenth and eighteenth centuries, when American crops like potatoes, sweet potatoes, and maize were introduced to the rest of the world. Traditional Eurasian and African cereals—wheat, rice, millet, and sorghum, for example—produce their grain atop thin stalks. Basic physics suggests that plants with this design will fatally topple if the grain gets too heavy, which means that farmers can actually be punished if they have an extra-bounteous harvest. By contrast, potatoes and sweet potatoes grow underground, which means that yields are not limited by the plant’s architecture. Wheat farmers in Edinburgh and rice farmers in Edo alike discovered they could harvest four times as much dry food matter from an acre of tubers as they could from an acre of cereals. Maize, too, was a winner. Compared to other cereals, it has an extra-thick stalk and a different, more productive type of photosynthesis. Taken together, these immigrant crops vastly increased the food supply in Europe, Asia, and Africa, which in turn helped increase the supply of Europeans, Asians, and Africans. The population boom had begun.

Numbers kept rising in the nineteenth and twentieth centuries, after a German chemist, Justus von Liebig, discovered that plant growth was limited by the supply of nitrogen. Without nitrogen, neither plants nor the mammals that eat plants can create proteins, or for that matter the DNA and RNA that direct their production. Pure nitrogen gas (N2) is plentiful in the air but plants are unable to absorb it, because the two nitrogen atoms in N2 are welded so tightly together that plants cannot split them apart for use. Instead, plants take in nitrogen only when it is combined with hydrogen, oxygen, and other elements. To restore exhausted soil, traditional farmers grew peas, beans, lentils, and other pulses. (They never knew why these “green manures” replenished the land. Today we know that their roots contain special bacteria that convert useless N2 into “bio-available” nitrogen compounds.) After Liebig, European and American growers replaced those crops with high-intensity fertilizer—nitrogen-rich guano from Peru at first, then nitrates from mines in Chile. Yields soared. But supplies were much more limited than farmers liked. So intense was the competition for fertilizer that a guano war erupted in 1879, engulfing much of western South America. Almost 3,000 people died.

Two more German chemists, Fritz Haber and Carl Bosch, came to the rescue, discovering the key steps to making synthetic fertilizer from fossil fuels. (The process involves combining nitrogen gas and hydrogen from natural gas into ammonia, which is then used to create nitrogenous compounds usable by plants.) Haber and Bosch are not nearly as well known as they should be; their discovery, the Haber-Bosch process, has literally changed the chemical composition of the earth, a feat previously reserved for microorganisms. Farmers have injected so much synthetic fertilizer into the soil that soil and groundwater nitrogen levels have risen worldwide. Today, roughly a third of all the protein (animal and vegetable) consumed by humankind is derived from synthetic nitrogen fertilizer. Another way of putting this is to say that Haber and Bosch enabled Homo sapiens to extract about 2 billion people’s worth of food from the same amount of available land.
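
For reference, the overall reaction at the heart of the Haber-Bosch process is the standard one from chemistry textbooks (the essay itself doesn’t spell it out): nitrogen and hydrogen combine over an iron catalyst at high temperature and pressure,

    $$N_2 + 3H_2 \;\rightleftharpoons\; 2NH_3$$

The hydrogen comes from natural gas, which is why the paragraph above describes the fertilizer as synthetic and made from fossil fuels.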

The improved wheat, rice, and (to a lesser extent) maize varieties developed by plant breeders in the 1950s and 1960s are often said to have prevented another billion deaths. Antibiotics, vaccines, and water-treatment plants also saved lives by pushing back humankind’s bacterial, viral, and fungal enemies. With almost no surviving biological competition, humankind had ever more unhindered access to the planetary petri dish: in the past two hundred years, the number of humans walking the planet ballooned from 1 to 7 billion, with a few billion more expected in coming decades.

Rocketing up the growth curve, human beings “now appropriate nearly 40% . . . of potential terrestrial productivity.” This figure dates from 1986—a famous estimate by a team of Stanford biologists. Ten years later, a second Stanford team calculated that the “fraction of the land’s biological production that is used or dominated” by our species had risen to as much as 50 percent. In 2000, the chemist Paul Crutzen gave a name to our time: the “Anthropocene,” the era in which Homo sapiens became a force operating on a planetary scale. That year, half of the world’s accessible fresh water was consumed by human beings.

Lynn Margulis, it seems safe to say, would have scoffed at these assessments of human domination over the natural world, which, in every case I know of, do not take into account the enormous impact of the microworld. But she would not have disputed the central idea: Homo sapiens has become a successful species, and is growing accordingly.

If we follow Gause’s pattern, growth will continue at a delirious speed until we hit the second inflection point. At that time we will have exhausted the resources of the global petri dish, or effectively made the atmosphere toxic with our carbon-dioxide waste, or both. After that, human life will be, briefly, a Hobbesian nightmare, the living overwhelmed by the dead. When the king falls, so do his minions; it is possible that our fall might also take down most mammals and many plants. Possibly sooner, quite likely later, in this scenario, the earth will again be a choir of bacteria, fungi, and insects, as it has been through most of its history.

It would be foolish to expect anything else, Margulis thought. More than that, it would be unnatural.

AS PLASTIC AS CANBY

In The Phantom Tollbooth, Norton Juster’s classic, pun-filled adventure tale, the young Milo and his faithful companions unexpectedly find themselves transported to a bleak, mysterious island. Encountering a man in a tweed jacket and beanie, Milo asks him where they are. The man replies by asking if they know who he is—the man is, apparently, confused on the subject. Milo and his friends confer, then ask if he can describe himself.

“Yes, indeed,” the man replied happily. “I’m as tall as can be”—and he grew straight up until all that could be seen of him were his shoes and stockings—“and I’m as short as can be”—and he shrank down to the size of a pebble. “I’m as generous as can be,” he said, handing each of them a large red apple, “and I’m as selfish as can be,” he snarled, grabbing them back again.

In short order, the companions learn that the man is as strong as can be, weak as can be, smart as can be, stupid as can be, graceful as can be, clumsy as—you get the picture. “Is that any help to you?” he asks. Again, Milo and his friends confer, and realize that the answer is actually quite simple:

“Without a doubt,” Milo concluded brightly, “you must be Canby.”

“Of course, yes, of course,” the man shouted. “Why didn’t I think of that? I’m as happy as can be.”

With Canby, Juster presumably meant to mock a certain kind of babyish, uncommitted man-child. But I can’t help thinking of poor old Canby as exemplifying one of humankind’s greatest attributes: behavioral plasticity. The term was coined in 1890 by the pioneering psychologist William James, who defined it as “the possession of a structure weak enough to yield to an influence, but strong enough not to yield all at once.” Behavioral plasticity, a defining feature of Homo sapiens’ big brain, means that humans can change their habits; almost as a matter of course, people change careers, quit smoking or take up vegetarianism, convert to new religions, and migrate to distant lands where they must learn strange languages. This plasticity, this Canby-hood, is the hallmark of our transformation from anatomically modern Homo sapiens to behaviorally modern Homo sapiens—and the reason, perhaps, we were able to survive when Toba reconfigured the landscape.

Other creatures are much less flexible. Like apartment-dwelling cats that compulsively hide in the closet when visitors arrive, they have limited capacity to welcome new phenomena and change in response. Human beings, by contrast, are so exceptionally plastic that vast swaths of neuroscience are devoted to trying to explain how this could come about. (Nobody knows for certain, but some researchers now think that particular genes give their possessors a heightened, inborn awareness of their environment, which can lead both to useless, neurotic sensitivity and greater ability to detect and adapt to new situations.)

Plasticity in individuals is mirrored by plasticity on a societal level. The caste system in social species like honeybees is elaborate and finely tuned but fixed, as if in amber, in the loops of their DNA. Some leafcutter ants are said to have, next to human beings, the biggest and most complex societies on earth, with elaborately coded behavior that reaches from disposal of the dead to complex agricultural systems. Housing millions of individuals in inconceivably ramose subterranean networks, leafcutter colonies are “Earth’s ultimate superorganisms,” Edward O. Wilson has written. But they are incapable of fundamental change. The centrality and authority of the queen cannot be challenged; the tiny minority of males, used only to inseminate queens, will never acquire new responsibilities.

Human societies are far more varied than their insect cousins, of course. But the true difference is their plasticity. It is why humankind, a species of Canbys, has been able to move into every corner of the earth, and to control what we find there. Our ability to change ourselves to extract resources from our surroundings with ever-increasing efficiency is what has made Homo sapiens a successful species. It is our greatest blessing.

Or was our greatest blessing, anyway.

DISCOUNT RATES

By 2050, demographers predict, as many as 10 billion human beings will walk the earth, 3 billion more than today. Not only will more people exist than ever before, they will be richer than ever before. In the last three decades hundreds of millions in China, India, and other formerly poor places have lifted themselves from destitution—arguably the most important, and certainly the most heartening, accomplishment of our time. Yet, like all human enterprises, this great success will pose great difficulties.

In the past, rising incomes have invariably prompted rising demand for goods and services. Billions more jobs, homes, cars, fancy electronics—these are things the newly prosperous will want. (Why shouldn’t they?) But the greatest challenge may be the most basic of all: feeding these extra mouths. To agronomists, the prospect is sobering. The newly affluent will not want their ancestors’ gruel. Instead they will ask for pork and beef and lamb. Salmon will sizzle on their outdoor grills. In winter, they will want strawberries, like people in New York and London, and clean Bibb lettuce from hydroponic gardens.

All of these, each and every one, require vastly more resources to produce than simple peasant agriculture. Already 35 percent of the world’s grain harvest is used to feed livestock. The process is terribly inefficient: between seven and ten kilograms of grain are required to produce one kilogram of beef. Not only will the world’s farmers have to produce enough wheat and maize to feed 3 billion more people, they will have to produce enough to give them all hamburgers and steaks. Given present patterns of food consumption, economists believe, we will need to produce about 40 percent more grain in 2050 than we do today.

How can we provide these things for all these new people? That is only part of the question. The full question is: How can we provide them without wrecking the natural systems on which all depend?

Scientists, activists, and politicians have proposed many solutions, each from a different ideological and moral perspective. Some argue that we must drastically throttle industrial civilization. (Stop energy-intensive, chemical-based farming today! Eliminate fossil fuels to halt climate change!) Others claim that only intense exploitation of scientific knowledge can save us. (Plant super-productive, genetically modified crops now! Switch to nuclear power to halt climate change!) No matter which course is chosen, though, it will require radical, large-scale transformations in the human enterprise—a daunting, hideously expensive task.

Worse, the ship is too large to turn quickly. The world’s food supply cannot be decoupled rapidly from industrial agriculture, if that is seen as the answer. Aquifers cannot be recharged with a snap of the fingers. If the high-tech route is chosen, genetically modified crops cannot be bred and tested overnight. Similarly, carbon-sequestration techniques and nuclear power plants cannot be deployed instantly. Changes must be planned and executed decades in advance of the usual signals of crisis, but that’s like asking healthy, happy sixteen-year-olds to write living wills.

Not only is the task daunting, it’s strange. In the name of nature, we are asking human beings to do something deeply unnatural, something no other species has ever done or could ever do: constrain its own growth (at least in some ways). Zebra mussels in the Great Lakes, brown tree snakes in Guam, water hyacinth in African rivers, gypsy moths in the northeastern U.S., rabbits in Australia, Burmese pythons in Florida—all these successful species have overrun their environments, heedlessly wiping out other creatures. Like Gause’s protozoans, they are racing to find the edges of their petri dish. Not one has voluntarily turned back. Now we are asking Homo sapiens to fence itself in.

What a peculiar thing to ask! Economists like to talk about the “discount rate,” which is their term for preferring a bird in hand today over two in the bush tomorrow. The term sums up part of our human nature as well. Evolving in small, constantly moving bands, we are as hard-wired to focus on the immediate and local over the long-term and faraway as we are to prefer parklike savannas to deep dark forests. Thus, we care more about the broken stoplight up the street today than conditions next year in Croatia, Cambodia, or the Congo. Rightly so, evolutionists point out: Americans are far more likely to be killed at that stoplight today than in the Congo next year. Yet here we are asking governments to focus on potential planetary boundaries that may not be reached for decades. Given the discount rate, nothing could be more understandable than the U.S. Congress’s failure to grapple with, say, climate change. From this perspective, is there any reason to imagine that Homo sapiens, unlike mussels, snakes, and moths, can exempt itself from the natural fate of all successful species?
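
For readers who haven’t met the term, the standard textbook discounting formula (my addition, not the essay’s) makes the bias concrete: a payoff FV arriving t years from now is valued today at

    $$PV = \frac{FV}{(1 + r)^{t}}$$

At a discount rate of, say, r = 5%, a benefit worth $100 thirty years from now is valued at roughly $23 today. Distant planetary boundaries get the same shrinking treatment.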

To biologists like Margulis, who spend their careers arguing that humans are simply part of the natural order, the answer should be clear. All life is similar at base. All species seek without pause to make more of themselves—that is their goal. By multiplying till we reach our maximum possible numbers, even as we take out much of the planet, we are fulfilling our destiny.

From this vantage, the answer to the question whether we are doomed to destroy ourselves is yes. It should be obvious.

Should be—but perhaps is not.

HARA HACHI BU

When I imagine the profound social transformation necessary to avoid calamity, I think about Robinson Crusoe, hero of Daniel Defoe’s famous novel. Defoe clearly intended his hero to be an exemplary man. Shipwrecked on an uninhabited island off Venezuela in 1659, Crusoe is an impressive example of behavioral plasticity. During his twenty-seven-year exile he learns to catch fish, hunt rabbits and turtles, tame and pasture island goats, prune and support local citrus trees, and create “plantations” of barley and rice from seeds that he salvaged from the wreck. (Defoe apparently didn’t know that citrus and goats were not native to the Americas and thus Crusoe probably wouldn’t have found them there.) Rescue comes at last in the form of a shipful of ragged mutineers, who plan to maroon their captain on the supposedly empty island. Crusoe helps the captain recapture his ship and offers the defeated mutineers a choice: trial in England or permanent banishment to the island. All choose the latter. Crusoe has harnessed so much of the island’s productive power to human use that even a gaggle of inept seamen can survive there in comfort.

To get Crusoe on his unlucky voyage, Defoe made him an officer on a slave ship, transporting captured Africans to South America. Today, no writer would make a slave seller the admirable hero of a novel. But in 1719, when Defoe published Robinson Crusoe, no readers said boo about Crusoe’s occupation, because slavery was the norm from one end of the world to another. Rules and names differed from place to place, but coerced labor was everywhere, building roads, serving aristocrats, and fighting wars. Slaves teemed in the Ottoman Empire, Mughal India, and Ming China. Unfree hands were less common in continental Europe, but Portugal, Spain, France, England, and the Netherlands happily exploited slaves by the million in their American colonies. Few protests were heard; slavery had been part of the fabric of life since the Code of Hammurabi.

Then, in the space of a few decades in the nineteenth century, slavery, one of humankind’s most enduring institutions, almost vanished.

The sheer implausibility of this change is staggering. In 1860, slaves were, collectively, the single most valuable economic asset in the United States, worth an estimated $3 billion, a vast sum in those days (and about $10 trillion in today’s money). Rather than investing in factories like northern entrepreneurs, southern businessmen had sunk their capital into slaves. And from their perspective, correctly so—masses of enchained men and women made the region politically powerful and gave social status to an entire class of poor whites. Slavery was the foundation of the social order. It was, thundered John C. Calhoun, a former senator, secretary of state, and vice president, “instead of an evil, a good—a positive good.” Yet just a few years after Calhoun spoke, part of the United States set out to destroy this institution, wrecking much of the national economy and killing half a million citizens along the way.

Incredibly, the turn against slavery was as universal as slavery itself. Great Britain, the world’s biggest human trafficker, outlawed its slave trade in 1807, though it was among the nation’s most profitable industries. The Netherlands, France, Spain, and Portugal soon followed. Like stars winking out at the approach of dawn, cultures across the globe removed themselves from the previously universal exchange of human cargo. Slavery still exists here and there, but in no society anywhere is it formally accepted as part of the social fabric.

Historians have provided many reasons for this extraordinary transition. But one of the most important is that abolitionists had convinced huge numbers of ordinary people around the world that slavery was a moral disaster. An institution fundamental to human society for millennia was swiftly dismantled by ideas and a call to action, loudly repeated.

In the last few centuries, such profound changes have occurred repeatedly. Since the beginning of our species, for instance, every known society has been based on the domination of women by men. (Rumors of past matriarchal societies abound, but few archaeologists believe them.) In the long view, women’s lack of liberty has been as central to the human enterprise as gravitation is to the celestial order. The degree of suppression varied from time to time and place to place, but women never had an equal voice; indeed, some evidence exists that the penalty for possession of two X chromosomes increased with technological progress. Even as the industrial North and agricultural South warred over the treatment of Africans, they regarded women identically: in neither half of the nation could they attend college, have a bank account, or own property. Equally confining were women’s lives in Europe, Asia, and Africa. Nowadays women are the majority of U.S. college students, the majority of the workforce, and the majority of voters. Again, historians assign multiple causes to this shift in the human condition, rapid in time, staggering in scope. But one of the most important was the power of ideas—the voices, actions, and examples of suffragists, who through decades of ridicule and harassment pressed their case. In recent years something similar seems to have occurred with gay rights: first a few lonely advocates, censured and mocked; then victories in the social and legal sphere; finally, perhaps, a slow movement to equality.

Less well known, but equally profound: the decline in violence. Foraging societies waged war less brutally than industrial societies, but more frequently. Typically, archaeologists believe, about a quarter of all hunters and gatherers were killed by their fellows. Violence declined somewhat as humans gathered themselves into states and empires, but it was still a constant presence. When Athens was at its height in the fifth and fourth centuries BC, it was ever at war: against Sparta (First and Second Peloponnesian Wars, Corinthian War); against Persia (Greco-Persian Wars, Wars of the Delian League); against Aegina (Aeginetan War); against Macedon (Olynthian War); against Samos (Samian War); against Chios, Rhodes, and Cos (Social War).

In this respect, classical Greece was nothing special—look at the ghastly histories of China, sub-Saharan Africa, or Mesoamerica. Similarly, medieval and early modern Europe’s wars came so fast and furious that historians simply gather them into catchall titles like the Hundred Years’ War, followed by the shorter but even more destructive Thirty Years’ War. And even as Europeans and their descendants paved the way toward today’s concept of universal human rights by creating documents like the Bill of Rights and the Declaration of the Rights of Man and of the Citizen, Europe remained so mired in combat that it fought two conflicts of such massive scale and reach that they became known as “world” wars.

Since the Second World War, however, rates of violent death have fallen to the lowest levels in known history. Today, the average person is far less likely to be slain by another member of the species than ever before—an extraordinary transformation that has occurred, almost unheralded, in the lifetime of many of the people reading this article. As the political scientist Joshua Goldstein has written, “we are winning the war on war.” Again, there are multiple causes. But Goldstein, probably the leading scholar in this field, argues that the most important is the emergence of the United Nations and other transnational bodies, an expression of the ideas of peace activists earlier in the last century.

As a relatively young species, we have an adolescent propensity to make a mess: we pollute the air we breathe and the water we drink, and appear stalled in an age of carbon dumping and nuclear experimentation that is putting countless species at risk, including our own. But we are making undeniable progress nonetheless. No European in 1800 could have imagined that in 2000 Europe would have no legal slavery, women would be able to vote, and gay people would be able to marry. No one could have guessed that a continent that had been tearing itself apart for centuries would be free of armed conflict, even amid terrible economic times. Given this record, even Lynn Margulis might pause (maybe).

Preventing Homo sapiens from destroying itself à la Gause would require a still greater transformation—behavioral plasticity of the highest order—because we would be pushing against biological nature itself. The Japanese have an expression, hara hachi bu, which means, roughly speaking, “belly 80 percent full.” Hara hachi bu is shorthand for an ancient injunction to stop eating before feeling full. Nutritionally, the command makes a great deal of sense. When people eat, their stomachs produce peptides that signal fullness to the nervous system. Unfortunately, the mechanism is so slow that eaters frequently perceive satiety only after they have consumed too much—hence the all-too-common condition of feeling bloated or sick from overeating. Japan—actually, the Japanese island of Okinawa—is the only place on earth where large numbers of people are known to restrict their own calorie intake systematically and routinely. Some researchers claim that hara hachi bu is responsible for Okinawans’ famously long life spans. But I think of it as a metaphor for stopping before the second inflection point, voluntarily forswearing short-term consumption to obtain a long-term benefit.

Evolutionarily speaking, a species-wide adoption of hara hachi bu would be unprecedented. Thinking about it, I can picture Lynn Margulis rolling her eyes. But is it so unlikely that our species, Canbys one and all, would be able to do exactly that before we round that fateful curve of the second inflection point and nature does it for us?

I can imagine Margulis’s response: You’re imagining our species as some sort of big-brained, hyperrational, benefit-cost-calculating computer! A better analogy is the bacteria at our feet! Still, Margulis would be the first to agree that removing the shackles from women and slaves has begun to unleash the suppressed talents of two-thirds of the human race. Drastically reducing violence has prevented the waste of countless lives and staggering amounts of resources. Is it really impossible to believe that we would use those talents and those resources to draw back before the abyss?

Our record of success is not that long. In any case, past successes are no guarantee of the future. But it is terrible to suppose that we could get so many other things right and get this one wrong. To have the imagination to see our potential end, but not have the imagination to avoid it. To send humankind to the moon but fail to pay attention to the earth. To have the potential but to be unable to use it—to be, in the end, no different from the protozoa in the petri dish. It would be evidence that Lynn Margulis’s most dismissive beliefs had been right after all. For all our speed and voraciousness, our changeable sparkle and flash, we would be, at last count, not an especially interesting species.

Posted in Uncategorized | Tagged | Leave a comment

Israel, the world’s most dangerous terrorist pseudo-state

Why I Dislike Israel

By Philip Giraldi

October 06, 2012 “Information Clearing House” – Even those pundits who seem to want to distance U.S. foreign policy from Tel Aviv’s demands and begin treating Israel like any other country sometimes feel compelled to make excuses and apologies before getting down to the nitty-gritty. The self-lacerating prologues generally describe how the writer really has a lot of Jewish friends and how he or she thinks Israelis are great people and that Israel is a wonderful country before launching into what is usually a fairly mild critique.

Well, I don’t feel that way. I don’t like Israel very much. Whether or not I have Jewish friends does not define how I see Israel and is irrelevant to the argument. And as for the Israelis, when I was a CIA officer overseas, I certainly encountered many of them. Some were fine people and some were not so fine, just like the general run of people everywhere else in the world. But even the existence of good, upstanding Israelis doesn’t alter the fact that the governments they have elected are essentially part of a long-running criminal enterprise, judging by the serial convictions of former presidents and prime ministers. Most recently, former President Moshe Katsav was convicted of rape, while almost every recent head of government, including the current one, has been investigated for corruption. Further, the Israeli government is a rogue regime by most international standards, engaging as it does in torture, arbitrary imprisonment, and the continued occupation of territories seized by its military. Worse still, it has successfully manipulated my country, the United States, and has done terrible damage both to our political system and to the American people, a crime that I just cannot forgive, condone, or explain away.

The most recent outrage is Israeli Prime Minister Benjamin Netanyahu’s direct interference in U.S. domestic politics through his appearance in a television ad airing in Florida that serves as an endorsement of Republican candidate Mitt Romney. The Netanyahu ad and his involvement in the election have been widely reported in the media and have even been condemned by several leading Jewish congressmen, but they have elicited no response from either Obama or Romney. Both should be condemning in the strongest terms the completely unprecedented intervention by a foreign head of government in an American election. That they are saying nothing is a testament to the power that Israel and its friends in Congress and the media have over the U.S. political establishment. Romney might even privately approve of the ads, as he has basically promised to cede to Netanyahu the right to set the limits of U.S. policy in the Middle East.

And why is Benjamin Netanyahu in such a lather? It is because President Barack Obama will not concede to him a “red line” that would automatically trigger a U.S. attack on Iran. Consider for a moment the hubris of Netanyahu in demanding that Washington meet his conditions for going to war with Iran, a nation that, for all its frequently described faults, has not attacked anyone, has not threatened to attack anyone, and has not made the political decision to acquire a nuclear weapon in spite of what one reads in the U.S. press. At the U.N., Netanyahu’s chart showing a cartoon bomb with a sputtering fuse, reminiscent of something that might have been employed by an anarchist in the 1870s, failed to pass any credibility test even for the inevitable cheerleaders in the U.S. media. If the U.S. is to go to war based on a Netanyahu cartoon, then it deserves everything it gets when the venture turns sour, most likely Iraq Redux, only 10 times worse.

Even more outrageous, and a lot less reported in the media, were the comments made by Patrick Clawson, director of research for the Washington Institute for Near East Policy (WINEP), an organization founded by the American Israel Public Affairs Committee (AIPAC). WINEP is widely viewed as a major component of the Israel Lobby in Washington and is closely tied to the Israeli government, with which it communicates on a regular basis. Clawson heads WINEP’s Iran Security Initiative. At a briefing on Sept. 24, he said, “I frankly think that crisis initiation is really tough, and it’s very hard for me to see how the United States … uh … president can get us to war with Iran.… The traditional way America gets to war is what would be best for U.S. interests.”

Note that Clawson states his conviction that initiating a crisis to get the U.S. involved in a war with Iran, thereby fooling the American people into thinking it is the right thing to do, is actually a “U.S. interest.” He cites Pearl Harbor, Fort Sumter, the Lusitania, and the Gulf of Tonkin as models for how to get engaged. Which inevitably leads to Clawson’s solution: “if the Iranians aren’t going to compromise it would be best if someone else started the war … Iranian submarines periodically go down. Some day one of them may not come up…. We are in the game of using covert means against the Iranians. We could get nastier at that.” Clawson is clearly either approving of Israel’s staging an incident that would lead to war, possibly even a false-flag operation that would implicate the United States directly, or urging the White House to do the job itself.

Clawson, not surprisingly, has never served in the U.S. military and has a Ph.D. in economics from the New School for Social Research, which would at first glance seem to disqualify him from figuring out how to set up a covert operation to sink a submarine and thereby start a war. He might be seen as moderately ridiculous, but like many of his neoconservative colleagues he is well wired into the system. He writes regularly for The Washington Post, The New York Times, and The Wall Street Journal; appears on television as an “expert”; and is a colleague at WINEP of the ubiquitous Dennis Ross, sometimes called “Israel’s lawyer,” who was until recently President Obama’s point man on the Middle East. Clawson is a useful idiot who would be registered as an agent of the Israeli government if the Justice Department were doing its job, but instead he is feted as a man who tells it like it is in terms of American interests. The distortion of foreign-policy decision-making in this country is something that can be attributed to Clawson and his host of fellow travelers, all of whom promote Israel’s perceived interests at the expense of the United States. And they do it with their eyes wide open.

I will deliberately avoid belaboring another Israel Firster, Pamela Geller, and her New York subway posters calling Palestinians savages and Israelis civilized, as I am sure the point has been made about how any lie that can serve the cause of Israel will be aggressively defended as “free speech.” A poster excoriating Jews or blacks in similar terms as “savages” would not have seen the light of day in New York City, another indication of the power of the Lobby and its friends to control the debate about the Middle East and game the system.

And then there are the reasons to dislike Israel and what it represents that go way back. In 1954’s Lavon Affair, the Israelis were prepared to blow up a U.S. Information Center in Alexandria and blame it on the Egyptians. In 1967, the Israelis attacked and nearly sank the USS Liberty, killing 34 crewmen, and then used their power over President Lyndon Johnson to block an investigation into what had occurred. In 1987, Jonathan Pollard was convicted of spying for Israel, with investigators determining that he had been the most damaging spy in the history of the United States. In the 1960s, Israelis stole uranium from a lab in Pennsylvania to construct a secret nuclear arsenal. And the spying and theft of U.S. technology continues. Israel is the most active “friendly nation” when it comes to stealing U.S. secrets, and when its spies are caught, they are either sent home or, if they are Americans, receive a slap on the wrist.

And Israel gets away with killing American citizens — literally — in the cases of Rachel Corrie and Furkan Dogan of the Mavi Marmara. And let’s not forget Israel’s treatment of the Palestinians, which has made the United States complicit in a crime against humanity. Tel Aviv has also played a key role in Washington’s going to war against Iraq, in promulgating a U.S.-led global war on terror against the Muslim world, and in crying wolf over Iran, all of which have served no U.S. interest. Through it all, Congress and the media are oblivious to what is taking place. Israel is a net recipient of over $123 billion in U.S. aid and continues to get $3 billion a year even though its per capita income is higher than that of Spain or Italy. No one questions anything having to do with Israel, while Congress rubber-stamps resolution after resolution virtually promising to go to war on Israel’s behalf.

I have to admit that I don’t like what my own government is doing these days, but I like Israel even less and it is past time to do something about it. No more money, no more political support, no more tolerance of spying, and no more having to listen to demands for red lines to go to war. No more favorable press when the demented Benjamin Netanyahu holds up a cartoon at the U.N. The United States government exists to serve the American people, no more, no less, and it is time that our elected representatives begin to remember that fact.

Philip Giraldi, a former CIA officer, is a contributing editor to The American Conservative and executive director of the Council for the National Interest.

This article was originally posted at AntiWar.

Posted in Uncategorized | Tagged | Leave a comment