The Origin of Humankind
by Richard Leakey
I’m often asked whether I think that Homo, having become a meat eater, might have included their australopithecine cousins in their diet, thus pushing them into extinction. I have no doubt that from time to time early Homo killed vulnerable australopithecines, just as they took antelope and other animal prey when they could. But the cause of australopithecine extinction is likely to have been more prosaic.
We know that Homo erectus was an extremely successful species, since it was the first human to expand its range beyond Africa. It is therefore likely that early Homo grew rapidly in numbers, thus becoming a significant competitor for a resource essential to australopithecine survival: food. Moreover, between 1 million and 2 million years ago ground-living monkeys—the baboons—were also becoming highly successful and burgeoning in numbers, and would also have competed with australopithecines for food. The australopithecines might well have succumbed to a twofold competitive pressure—from Homo on one side and baboons on the other.
At least some lines of evidence support the notion that the physique of early Homo reflected an active pursuit of meat—that is, as a hunter in search of prey. It is salutary to reflect on the fact that, as a means of subsistence, hunting and gathering persisted until very recently in human prehistory; only with the adoption of agriculture a mere 10,000 years ago did our forebears begin to abandon a simple foraging existence. A major question for anthropologists has been, When did this very human mode of subsistence appear? Was it present from the beginning of genus Homo, as I have suggested? Or was it a recent adaptation, having emerged only with the evolution of modern humans, perhaps 100,000 years ago? To answer these questions, we have to pore over clues in the fossil and archeological records, searching for signs of the hunting and gathering mode of subsistence. We will see in this chapter that theories have shifted in recent years, reflecting a change in the way we view ourselves and our ancestors. Before we see how the evidence of prehistory has been scrutinized, it would be helpful to have a picture in mind of the foraging lifestyle, as we know it from modern hunter-gatherers.
The combination of hunting meat and gathering plant foods is unique to humans as a systematic subsistence strategy. It is also spectacularly successful, having enabled humanity to thrive in virtually every corner of the globe, with the exception of Antarctica. Vastly different environments were occupied, from steamy rain forests to deserts, from fecund coastal reaches to virtually sterile high plateaus. Diets varied greatly from environment to environment. The Native Americans of the Northwest harvested salmon in prodigious quantities, for example, while the !Kung San of the Kalahari relied on mongongo nuts for much of their protein.
Yet despite the differences in diet and ecological environment, there were many commonalities in the hunter-gatherer way of life. People lived in small, mobile bands of about twenty-five individuals—a core of adult males and females and their offspring. These bands interacted with others, forming a social and political network linked by customs and language. Numbering typically about five hundred individuals, this network of bands is known as a dialectical tribe. The bands occupied temporary camps, from which they pursued their daily food quest.
In the majority of surviving hunter-gatherer societies that anthropologists have studied, there is a clear division of labor, with males responsible for hunting and females for gathering plant foods. The camp is a place of intense social interaction, and a place where food is shared; when meat is available, this sharing often involves elaborate ritual, which is governed by strict social rules.
To Westerners, the eking out of an existence from the natural resources of the environment by means of the simplest of technologies seems a daunting challenge. In reality, it is an extremely efficient mode of subsistence, so that foragers can often collect in three or four hours sufficient food for the day. A major research project of the 1960s and 1970s conducted by a team of Harvard anthropologists showed this to be true of the !Kung San, whose homeland in the Kalahari Desert of Botswana is marginal in the extreme. Hunter-gatherers are attuned to their physical environment in a way that is difficult for the urbanized Western mind to grasp. As a result, they know how to exploit what to modern eyes seem meager resources. The power of their way of life lies in this exploitation of plant and animal resources within a social system that fosters interdependence and cooperation.
The notion that hunting was important in human evolution has a long history in anthropological thought, going back to Darwin. In his 1871 book The Descent of Man, he suggested that stone weapons were used not only for defense against predators but also for bringing down prey. The adoption of hunting with artificial weapons was part of what made humans human, he argued. Darwin’s image of our ancestors was clearly influenced by his experience while on his five-year voyage on the Beagle. This is how he described his encounter with the people of Tierra del Fuego, at the southern tip of South America:
There can hardly be any doubt that we are descended from barbarians. The astonishment which I felt on first seeing a party of Fuegans on a wild and broken shore will never be forgotten by me, for the reflection at once rushed into my mind—such were our ancestors. These men were absolutely naked and bedaubed with paint, their long hair was tangled, their mouths, frothed with excitement, and their expression was wild, startled and distrustful. They possessed hardly any arts, and like wild animals lived on what they could catch.
The conviction that hunting was central to our evolution, and the conflation of our ancestors’ way of life with that of surviving technologically primitive people, imprinted itself firmly on anthropological thought. In a thoughtful essay on this issue, the biologist Timothy Perper and the anthropologist Carmel Schrire, both at Rutgers University, put it succinctly: “The hunting model assumes that hunting and meat-eating triggered human evolution and propelled man to the creature he is today.” According to this model, the activity shaped our ancestors in three ways, explain Perper and Schrire, “affecting the psychological, social, and territorial behavior of early man.” In a classic 1963 paper on the topic, the South African anthropologist John Robinson expressed the importance that the science accorded to hunting in human prehistory:
[T]he incorporation of meat-eating in the diet seems to me to have been an evolutionary change of enormous importance which opened up a vast new evolutionary field. The change, in my opinion, ranks in evolutionary importance with the origin of mammals—perhaps more appropriately with the origin of tetrapods. With the relatively great expansion of intelligence and culture it introduced a new dimension and a new evolutionary mechanism into the evolutionary picture, which at best are only palely foreshadowed in other animals.
Our assumed hunting heritage took on mythic aspects, too, becoming equivalent to the original sin of Adam and Eve, who had to leave Paradise after eating of the forbidden fruit. “In the hunting model, man ate meat in order to survive in the harsh savanna, and by virtue of this strategy became the animal whose subsequent history is etched in a medium of violence, conquest, and bloodshed,” observe Perper and Schrire. This was the theme taken up by Raymond Dart in some of his writings in the 1950s and, more popularly, by Robert Ardrey. “Not in innocence, and not in Asia, was mankind born,” is the famous opening to Ardrey’s 1961 book African Genesis. The image proved to be powerful in the minds of both the public and the profession. And, as we shall see, image has been important in the way the archeological record has been interpreted in this respect.
A 1966 conference on “Man the Hunter” at the University of Chicago was a landmark in the development of anthropological thinking about the role of hunting in our evolution. The conference was important for several reasons, not least for its recognition that the gathering of plant foods provided the major supply of calories for most hunter-gatherer societies. And, just as Darwin had done almost a century earlier, the conference equated what we know of the lifeways of modern hunter-gatherers with the behavior patterns of our earliest ancestors. As a result, apparent evidence of meat-eating in the prehistoric record—in the form of accumulations of stone tools and animal bones—had a clear implication, as my friend and colleague the Harvard University archeologist Glynn Isaac observed: “Having, as it were, followed an apparently uninterrupted trail of stone and bone refuse back through the Pleistocene it seemed natural ... to treat these accumulations of artifacts and faunal remains as being ‘fossil home base sites.’” In other words, our ancestors were considered to have lived as modern hunter-gatherers do, albeit in a more primitive form.
Isaac promulgated a significant advance in anthropological thinking with his food-sharing hypothesis, which he published in a major article in Scientific American in 1978. In it he shifted the emphasis away from hunting per se as the force that shaped human behavior and toward the impact of the collaborative acquisition and sharing of food. “The adoption of food-sharing would have favored the development of language, social reciprocity and the intellect,” he told a 1982 gathering that marked the centenary of Darwin’s death.
Five patterns of behavior separate humans from our ape relatives, he wrote in his 1978 paper: (1) a bipedal mode of locomotion, (2) a spoken language, (3) regular, systematic sharing of food in a social context, (4) living in home bases, and (5) the hunting of large prey. These describe modern human behavior, of course. But, Isaac suggested, by 2 million years ago “various fundamental shifts had begun to take place in hominid social and ecological arrangements.” They were already hunter-gatherers in embryo, living in small, mobile bands and occupying temporary camps from which the males went out to hunt prey and the females to gather plant foods. The camp provided the social focus, at which food was shared. “Although meat was an important component of the diet, it might have been acquired by hunting or by scavenging,” Isaac told me in 1984, a year before his tragically early death. “You would be hard pressed to say which, given the kind of evidence we have from most archeological sites.”
Isaac’s viewpoint strongly influenced the way the archeological record was interpreted. Whenever stone tools were discovered in association with the fossilized bones of animals, it was taken as an indication of an ancient “home base,” the meager litter of perhaps several days’ activity of a band of hunter-gatherers. Isaac’s argument was plausible, and I wrote in my 1981 book The Making of Mankind that “the food-sharing hypothesis is a strong candidate for explaining what set early humans on the road to modern man.” The hypothesis seemed consistent with the way I saw the fossil and archeological records, and it followed sound biological principles. Richard Potts, of the Smithsonian Institution, agreed. In his 1988 book Early Hominid Activities at Olduvai, he observed that Isaac’s hypothesis “seemed a very attractive interpretation,” noting:
The home-base, food-sharing hypothesis integrates so many aspects of human behavior and social life that are important to anthropologists—reciprocity systems, exchange, kinship, subsistence, division of labor, and language. Seeing what appeared to be elements of the hunting-and-gathering way of life in the record, in the bones and stones, archeologists inferred that the rest followed. It was a very complete picture.
In the late 1970s and early 1980s, however, this thinking began to change, prompted by Isaac and by the archeologist Lewis Binford, then at the University of New Mexico. Both men realized that much of the prevailing interpretation of the prehistoric record was based on unspoken assumptions. Independently, they began to separate what could truly be known from the record from what was simply assumed. The questioning began at the most fundamental level, with the significance of finding stones and animal bones in the same place. Did this spatial coincidence imply prehistoric butchery, as had been assumed? And if butchery could be proved, did that imply that the people who did it lived as modern hunter-gatherers do?
Isaac and I talked often about various subsistence hypotheses, and he would create scenarios in which bones and stones might finish up in the same place but have nothing to do with a hunting-and-gathering way of life. For instance, a group of early humans might have spent some time beneath a tree simply for the shade it afforded, knapping stones for some purpose other than butchering carcasses—for example, they might have been making flakes for whittling sticks, which could be used to unearth tubers. Some time later, after the group had moved on, a leopard might have climbed the tree, hauling its kill with it, as leopards often do. Gradually, the carcass would have rotted and the bones would have tumbled to the ground to lie amid the scatter of stones left there by the toolmakers. How could an archeologist excavating the site 1.5 million years later distinguish between this scenario and the previously favored interpretation of butchering by a group of nomadic hunters and gatherers? My instinct was that early humans did in fact pursue some version of hunting and gathering, but I could see Isaac’s concern over a secure reading of the evidence.
Lewis Binford’s assault on conventional wisdom was rather more acerbic than Isaac’s. In his 1981 book Bones: Ancient Men and Modern Myths, he suggested that archeologists who viewed stone-tool and bone assemblages as the remains of ancient campsites were “making up ‘just-so’ stories about our hominid past.” Binford, who has done little of his work on early archeological sites, derived his views initially from study of the bones of Neanderthals, who lived in Eurasia between about 135,000 and 34,000 years ago.
“I became convinced that the organization of the hunting and gathering way of life among these relatively recent ancestors was quite different than that among fully modern Homo sapiens,” he wrote in a major review in 1985. “If this was true then the almost ‘human’ lifeways depicted in the ‘consensus’ view of the very early hominids stood out as an extremely unlikely condition.” Binford suggested that systematic hunting of any kind began to appear only when modern humans evolved, for which date he gives 45,000 to 35,000 years ago.
None of the early archeological sites could be regarded as remains of living floors from ancient campsites, argued Binford. He reached this conclusion through analyzing other people’s data on the bones at some of the famous archeological sites in Olduvai Gorge. They were the kill sites of nonhuman predators, he said. Once the predators, such as lion and hyena, had moved on, hominids came to the site to pick up what scraps they could scavenge. “The major, or in many cases the only, usable or edible parts consisted of bone marrow,” he wrote. “There is no evidence supporting the idea that the hominids were removing food from the locations of procurement to a base camp for consumption. . . . Similarly, the argument that food was shared is totally unsupported.” This idea presents a very different picture of our forebears, 2 million years ago. “They were not romantic ancestors,” wrote Binford, “but eclectic feeders commonly scavenging the carcasses of dead ungulates for minor food morsels.”