It seems likely that important changes occurred in the frontal lobes of our brain, given their importance in forward thinking, yet I was surprised by CT studies of fossil skulls in which I was involved. These showed that the profile and relative size of the frontal lobes inside the brain cavity had changed much less in modern humans, compared with the obvious external changes in forehead development. At the rear of the skull, the occipital bone is relatively narrower and more evenly curved in modern humans. In erectus and heidelbergensis skulls the occipital was more sharply angled, and this must have been partly related to the powerful neck muscles that attached across the bone in primitive humans. And in Neanderthals the profile of the occipital was influenced by the rather bulging occipital lobes of their brains, the significance of which is still debated—these lobes contain the visual cortex, for example. In modern humans, the parietals are heightened and lengthened, while the arch they make is narrower at the base but wider higher up. The paleoneurologist Emiliano Bruner investigated these aspects of ancient brain shape using geometric morphometrics. He confirmed earlier, more traditional studies showing that blood vessel impressions on the inside of the parietals (reflecting blood supply to the parietal lobes) are altered in modern humans, forming a much more complex network.

So is there anything in the function of the parietal lobes that might explain their expansion in the modern human brain? They are involved in integrating sensory information, in processing data from different parts of the brain, and in social communication, all of which could be reflected in the behavioral changes recognized with the arrival of modern humans. The cognitive archaeologists Thomas Wynn and Frederick Coolidge argued that a key change in the modern human mind must have been the development of an episodic working memory. Memory in humans can be subdivided into declarative memories, such as basic facts and information, and procedural memories, such as strings of words or actions (making a tool, say, or finding a route). It is known from brain studies that these are separate modules, in the sense that brain damage may interfere with one but not the other, and brain imaging studies show they are controlled by different pathways. It is very likely that both of these types of memory are enhanced in the modern human brain, but there is one special and important type of declarative memory called episodic, personal, or autobiographical memory—a storylike reminiscence of an event, with its associated emotions. This can be used mentally to rerun past events, and, just as important, it can also rehearse future events—a sort of “inner-reality” time machine that can run backward or project possible scenarios forward, and which seems closely linked to concepts of self-awareness (“consciousness”). As we have seen already, archaeological evidence suggests that the reach of modern humans across the landscape in terms of food gathering, sourcing raw materials, and social networks increased during the Middle Stone Age and continued to increase during the Later Stone Age in Africa and in contemporaneous industries outside of Africa, such as the Upper Paleolithic. Such developments could reflect the arrival of a modern kind of episodic memory. Moreover, the ability to conjure up vivid inner-reality narratives could also have been critical in the development of religious beliefs, since imaginary scenarios could be created as well as actual ones. Once people could foresee their own deaths, religious beliefs that provided reassurance about such events could perhaps have been selected for their value in promoting survival.

Experiments and observations suggest that the parietal lobes are indeed involved in episodic memory, but it is clear that they are not the only location implicated, since the recall of such memories involves a network of links between the frontal, parietal, and temporal lobes. Moreover, even episodic memory is not a single straightforward path. For example, some patients with selective parietal lobe damage can recall a particular event in detail from a general cue such as “your birthdays” (top-down recall), whereas others need a detailed cue such as a photo of a particular birthday cake (bottom-up recall) to remember one special event properly. But the lower parts of the parietal lobes are also implicated in another vital property of the modern human brain: inner speech. This is our inner voice that consciously and unconsciously guides so much of our thinking and decision making; in a sense it provides one of the most vital bits of software for the hardware of our brains. Indeed, there is evidence that an inability to create and use this program—for example, in people who were born deaf, mute, and blind, and who have been given little sensory stimulation from other humans—greatly limits higher brain functions. Even so, such severely impaired people, when given appropriate inputs from an early age, can develop and use their own codes of inner speech, for example, by recalling the symbols of sign language that they have been taught, instead of spoken words.

Stanley Ambrose, the champion of the impact of the Toba eruption on modern human evolution, also argued for the importance of memory and the development of particular parts of the brain in the success of modern humans. In his view, what was most important was the integration of working memory with prospective memory (dealing with near-future tasks) and constructive memory (mental time traveling), which are centered in the front and lower rear of the frontal lobes. Such links would have facilitated everything from the construction of composite artifacts to the development of the fullest levels of mind reading and social cooperation. In his view, archaic humans like the Neanderthals had developed the memory for short-term planning and the production of composite artifacts, but they lacked the full brain integration and hormonal systems that promoted the levels of trust and reciprocity essential for the much larger social networks of modern humans.

All of this shows how complex our brains are, and what a long way we still have to go to understand their workings in living humans, let alone ones who died 100,000 years ago. Unfortunately for Richard Klein's views of a significant cognitive event about 50,000 years ago, the heightening of the frontal bone and the expansion of the parietal lobes had apparently already occurred 100,000 years earlier, as shown by the shape of the early modern skulls from Omo Kibish and Herto. Overall, there is little evidence so far for any detectable changes in the modern human brain at the time when Klein argues these should have occurred. Brain volume and EQ apparently increased fairly steadily in modern humans until the last 20,000 years, after which they seem to have declined somewhat, and similarly the increasing trend in cerebellum/cerebrum ratio seems to have altered only in the last 20,000 years or so.

Therefore, all we can say is that there is no obvious physical evidence for such a change in the workings of the human brain 50,000 years ago. Perhaps some genetic support will eventually emerge; there are claims that the gene DRD4, which when negatively mutated is linked with attention-deficit/hyperactivity disorder (ADHD), underwent changes around that time. DRD4 affects the efficacy of the neurotransmitter dopamine in the brain, and it has been suggested that a positive effect of such mutations would be to encourage novelty seeking and risk taking—perhaps important qualities for a migration out of Africa. John Parkington is one of several archaeologists and biologists who have argued that the fish oils obtained by early moderns when they began to seriously exploit marine resources would have boosted the brain power of early Homo sapiens—and there are further claims that omega-3 fatty acids would have conferred additional benefits in terms of health and longevity. But unfortunately, at the moment, such changes can only be inferred from indirect evidence such as the archaeological record, which itself can be interpreted in very different ways. Here we need to return to two of the key elements of modernity that might be decipherable from that archaeological evidence: the presence of symbolism and, by inference, complex language.

In chapters 5 and 6, we discussed some of the key behavioral “signatures” of modernity that are usually highlighted by archaeologists—things like figurative art and burials with grave goods. And we saw that there is not yet any strong evidence for figurative or clearly representational art anywhere before the European material dated to about 40,000 years. Equally, there is no evidence of symbolic burials older than the early modern examples known from Skhul and Qafzeh at about 100,000 years, even if older African sites like Herto are suggestive of the ritual treatment of human remains. However, the processing and use of red pigments in Africa does go back considerably farther, to beyond 250,000 years, at sites like Kapthurin and Olorgesailie in Kenya. The record is sporadic after this but emerges at Pinnacle Point in South Africa at about 160,000 years, and much more strongly at sites in North and South Africa from about 120,000 years. In particular, there is the rich material from Blombos Cave, South Africa, which includes about twenty engraved ocher fragments and slabs, mostly dated to around 75,000 years ago, with some extending back to 100,000 years. These fragments seem to be generally accepted as symbolic in intent rather than accidental or utilitarian, but many of the earlier examples are only suggestive of symbolic meaning, rather than definitive.

The evidence seems much stronger in the case of tick shell beads, present at the known limits of the early modern human range at least 75,000 years ago, from cave sites in Morocco, Israel, and South Africa. But even here, the context in which they were being used becomes critical in deciding what level of symbolic meaning they carried. The archaeologist Paul Pettitt suggests an alternative way of looking at symbolic intent by moving away from judging an absolute (and contentious) presence/absence, and instead deconstructing different levels of symbolic meaning, in line with Robin Dunbar's stages of “mind reading” that we discussed in chapter 5. This deconstruction is valuable because it also allows an evolutionary sequence for symbolism to be considered, rather than just an on–off switch, where symbolism is either not there at all or fully developed, with no intermediates. Pettitt points out that symbols can only function as such in recent humans if the “writer” and “reader” are in accord over meaning, but in interpreting archaeological finds we tend to focus on the writer, without considering those who might receive the intended message. He also cautions that unless a symbol is repeated at a number of different sites in a given time period, we cannot judge how widespread the behavior was or how efficiently it conveyed its meaning.

The same symbol might have carried different messages for different individuals, and between different groups, while today we might consider only one of many possible intended meanings, and even that inferred meaning might be wrong. For example, for pigments and perforated shell beads applied to the body, there could have been different levels of use and symbolic meaning, from the simple to the complex. The most basic use might be purely decorative and reflect a personal preference (“I wear red because I like red”). Or the message could be one of enhancement of the signal (“I wear red as I know you will read it as a sign of my strength or be impressed by it”). A third level might reflect status or group identity (“I wear red as I know you will recognize it as the regalia of our clan and infer from it that we are culturally the same”). A fourth and even more complex message might be “I wear red as, like you, I am a successful hunter and have killed an adult eland; it is my right to wear this color, and I therefore command respect from all.” And, finally, the most complex, as part of an elaborate myth or cosmological belief, might be “I wear red only at this specific time of the year, marking when the ancestors created the land. This is a vital part of our beliefs, and by doing this I show that I am the bearer of this knowledge.” Considering just these hypothetical examples, at which of these levels would the tick shells and engraved ocher fragments from Blombos Cave have been functioning about 75,000 years ago? At one level of complexity or at several, and, if several, which ones?

We can probably rule out the simplest level because of the profusion of the shells and their consistency of selection and manufacture, while for the ocher, the engravings generally look carefully and deliberately made in each case. But Blombos is an exceptionally rich example, and at other Middle Stone Age sites there may just be ocher crayons, with no engravings and no beads, so should we even try to judge the level of symbolic intent at such sites? It is certainly possible that some of the earliest occurrences of red ocher in African sites were nonsymbolic, since ocher can also be used as a component of natural glues, as a preservative, or to tan animal hides. But equally, some may also have reflected a low level of symbolic intent, in terms of personal decoration and simple display. Indeed, the application of red ocher to human skin may have begun for purely practical reasons—as an insect repellent, say—or to hides as part of their production, with the red ocher then favored for its attractive (and, later, meaningful) appearance. Personally, I think the proliferation of shell beads and red ocher use along the length of Africa between 75,000 and 100,000 years ago must reflect an increasing intensity of symbolic exchanges within, and probably between, early modern human groups. But perhaps the highest levels of symbolic meaning were still only nascent then.

Pettitt similarly deconstructs ancient burial practices into levels ranging from morbidity (an interest in the dead—shown even by chimpanzees) to mortuary caching (deposition of the body in certain places) to full burial in a special location, with ceremony, or accompanied by symbolic objects. In turn, he links these with Dunbar's levels of mind reading, so the simplest intentionality level (perhaps in apes and early hominins) might be “I believe that you are dead,” followed by “I empathize that you are dead” (perhaps in early Homo), then “I know that you must be deposited in a specific place” (heidelbergensis, earliest moderns, and Neanderthals?), and finally “Because of your role, you must be disposed of in this way, by this method, at this place, as recognized by our social rules” (later modern humans and perhaps some Neanderthals?). It is possible that early humans like heidelbergensis were already treating their dead in some way, while Neanderthals were certainly caching and burying bodies with simple methods, with the possibility of more elaborate burials in some cases. But perhaps only modern humans carried out the most complex disposal practices for their dead.
