Idiot Brain

Dean Burnett

The frontal eye fields, in the frontal lobe, receive information from the retinas and create a “map” of the visual field based on this, supported and reinforced by more spatial mapping and information via the parietal lobe. If something of interest occurs in the visual field, this system can very quickly point the eyes in that direction, to see what it is. This is called overt or “goal” orientation, as your brain has a goal: “I want to look at that!” Say you see a sign that reads “special offer: free bacon”; you direct your attention to it straight away, to see what the deal is, to complete the goal of getting bacon. The conscious brain drives the attention, so it's a top-down system. Alongside all this there's another system at work, called covert orientation, which is more of a “bottom-up” one. In this system, something of biological significance is detected (for instance, the sound of a tiger growling nearby, or a crack from the tree branch you're standing on) and attention is automatically directed towards it before the conscious areas of the brain even know what's going on, hence it's a bottom-up system. This system uses the same visual input as the other one, as well as sound cues, but is supported by a different set of neural processes in different regions.

According to current evidence, the most widely supported model is one where, on detection of something potentially important, the posterior parietal cortex (already mentioned regarding vision processing) disengages the conscious attention system from whatever it's currently doing, like a parent switching the television off when their child is supposed to take out the garbage. The superior colliculus in the midbrain then moves the attention system to the desired area, like a parent moving their child to the kitchen where the garbage is. The pulvinar nucleus, part of the thalamus, then reactivates the attention system, like a parent putting garbage bags in their child's hand and pushing the child towards the door to put the damn things out!

This system can overrule the conscious, goal-oriented top-down system, which makes sense as it's something of a survival instinct. The unfamiliar shape in your vision could turn out to be an oncoming attacker, or that boring office colleague who insists on talking about his athlete's foot.

These visual details don't have to appear in the fovea, the important middle bit of the retina, to attract our attention. Visually paying attention to something typically involves moving the eyes, but it doesn't have to. You'll have heard of “peripheral vision,” where you see something you're not looking at directly. It won't be greatly detailed, but if you're at your desk working at your computer and see an unexpected movement in the corner of your vision that seems the right size and location to be a large spider, you maybe don't want to look at it, in case that's exactly what it is. While you carry on typing, you're very alert to any movement in that particular spot, just waiting to see it again (while hoping not to). This shows that the focus of attention isn't tied directly to where the eyes are pointing.
As with the auditory cortex, the brain can specify which part of the visual field to focus on, and the eyes don't have to move to allow it.

It may sound like the bottom-up processes are the most dominant, but there's more to it. Stimulus orientation overrides the attention system when it detects a significant stimulus, but it's often the conscious brain that determines what's “significant” by deciding the context. A loud explosion in the sky would certainly be something that would count as significant, but, if you're going for a walk on the evening of July Fourth, an absence of explosions in the sky would be more significant, as the brain is expecting fireworks.

Michael Posner, one of the dominant figures in the field of attention research, devised tests that involve getting subjects to spot a target on screen that is preceded by cues which may or may not predict the target location. If there are as few as two cues to look at, people tend to struggle. Attention can be divided between two different modalities (doing a visual test and a listening test at the same time) but if it's anything more complex than a basic yes/no detection test, people typically fall apart trying it. Some people can do two simultaneous tasks if one is something they're very adept at, such as an expert typist doing a math problem while typing. Or, to use an earlier example, an experienced driver holding a detailed conversation while operating a vehicle.

Attention can be very powerful. One well-known study concerned volunteers at Uppsala University in Sweden,14 where subjects reacted with sweaty palms to images of snakes and spiders that were shown on screen for less than 1/300th of a second. It usually takes about half a second for the brain to process a visual stimulus sufficiently for us to consciously recognize it, so subjects were experiencing responses to pictures of spiders and snakes in less than a tenth of the time it actually takes to “see” them. We've already established that the unconscious attention system responds to biologically relevant cues, and that the brain is primed to spot anything that might be dangerous and has seemingly evolved a tendency to fear natural threats like our eight-legged or no-legged friends. This experiment is a great demonstration of how attention spots something and rapidly alerts the parts of the brain that mediate responses before the conscious mind has even finished saying, “Huh? What?”

In other contexts, attention can miss important and very unsubtle things. As with the car example, too much occupying our attention means we miss very important things, such as pedestrians (or, more importantly, fail to miss them). A stark example of this was provided by Dan Simons and Daniel Levin in 1998.15 In their study, an experimenter approached random pedestrians with a map and asked them for directions. While the pedestrians were looking at the map, a person carrying a door walked between them and the experimenter. In the brief moment when the door presented an obstruction, the experimenter changed places with someone who didn't look or sound anything like the original person. At least 50 percent of the time, the map-consulting person didn't notice any change, even though they were talking to a different person from the one they'd been speaking to seconds earlier. This invokes a process known as “change blindness,” where our brains are seemingly unable to track an important change in our visual scene if it's interrupted even briefly.

This study is known as the “door study,” because the door is the most interesting element here, apparently. Scientists are a weird bunch.

The limits of human attention can and do have serious scientific and technological consequences too. For example, heads-up displays, where the instrument display in machines such as airplanes and space vehicles is projected onto the screen or canopy rather than shown on read-outs in the cockpit, seemed like a great idea for pilots. It saves them having to look down to see their instruments, thus taking their eyes off what's going on outside. Safer all round, right?

No, not really. It turned out that when a heads-up display is even slightly too cluttered with information, the pilot's attention is maxed out.16 They can see right through the display, but they're not looking through it. Pilots have been known to land their plane on top of another plane as a result of this (in simulations, thankfully). NASA itself has spent a lot of time working out the best ways to make heads-up displays workable, at the expense of hundreds of millions of dollars.

These are just some of the ways the human attention system can be seriously limited. You might like to argue otherwise, but if you do you clearly haven't been paying attention. Luckily, we've now established you can't really be blamed for that.

_____________

* Some scientists have called this finding into question, arguing that this staggering number of smell sensations is more a quirk of questionable math used in the research than the result of our mighty nostrils.1

† It's important to clarify the difference between illusions and hallucinations. Illusions are when the senses detect something but interpret it wrongly, so you end up perceiving something other than what the thing actually is. By contrast, if you smell something with no source, this is a hallucination: perceiving something that isn't actually there, which suggests something isn't working as it should deep in the sensory-processing areas of the brain. Illusions are a quirk of the brain's workings; hallucinations are more serious.

‡ Not that the eyes aren't impressive, because they are. The eyes are so complex that they're often cited (not a pun) by creationists and others opposed to evolution as clear proof that natural selection isn't real; the eye is so intricate it couldn't just “happen” and therefore must be the work of a powerful creator. But if you truly look at the workings of the eye, then this creator must have designed the eye on a Friday afternoon, or while hung over on the morning shift, because a lot of it doesn't make much sense.

§ Modern camera and computing technology means it's much easier (and considerably less uncomfortable) to track eye movements. Some marketing companies have even used eye scanners mounted on trolleys to observe what customers are looking at in stores. Before this, head-mounted laser trackers were used. Science is so advanced these days that lasers are now old-fashioned. This is a cool thing to realize.

¶ For the record, some people claim that they've had eye surgery and their eye was “taken out” and left dangling on their cheek at the end of the optic nerve, like in a Tex Avery cartoon. This is impossible; there is some give in the optic nerve, but certainly not enough to support the eye like a grotesque conker on a string. Eye surgery usually involves pulling the eyelids back, holding the eye in place with clamps, and numbing injections, so it feels weird from the patient's perspective. But the firmness of the eye socket and fragility of the optic nerve means popping the eye out would effectively destroy it, which isn't a great move for an ophthalmic surgeon.

# Exactly how we “focus” aural attention is unclear. We don't swivel our ears towards interesting sounds. One possibility comes from a study by Edward Chang and Nima Mesgarani of the University of California, San Francisco, who looked at the auditory cortex of three epilepsy patients who had electrodes implanted in the relevant regions (to record and help localize seizure activity, not for fun or anything).13 When asked to focus on a specific audio stream out of two or more heard at once, only the one being paid attention to produced any activity in the auditory cortex. The brain somehow suppresses any competing information, allowing full attention to be paid to the voice being listened to. This suggests your brain really can “tune someone out,” like when they won't stop droning on about their tedious hedgehog-spotting hobby.

6

Personality: a testing concept

The complex and confusing properties of personality

Personality. Everybody has one (except maybe those who enter politics). But what is a personality? Roughly, it's a combination of an individual's tendencies, beliefs, ways of thinking and behaving. It's clearly some “higher” function, a combination of all the sophisticated and advanced mental processes humans seem uniquely capable of thanks to our gargantuan brains. But, surprisingly, many think personality doesn't come from the brain at all.

Historically, people believed in dualism; the idea that the mind and body are separate. The brain, whatever you think of it, is still part of the body; it's a physical organ. Dualists would argue that the more intangible, philosophical elements of a person (beliefs, attitudes, loves and hates) are held within the mind, or “spirit,” or whatever term is given to the immaterial elements of a person.

Then, on September 13, 1848, as a result of an unplanned explosion, railroad worker Phineas Gage had his brain impaled by an iron rod more than 3 feet long. It entered his skull just under his left eye, passed right through his left frontal lobe, and exited via the top of his skull. It landed some 80 feet away. The force propelling the rod was so great that a human head offered as much resistance as a net curtain. To clarify, this was not a paper cut.

You'd be forgiven for assuming this would have been fatal. Even today, “huge iron rod right through the head” sounds like a 100-percent-lethal injury. And this happened in the mid-1800s, when stubbing your toe usually meant a grim death from gangrene. But, no, Gage survived, and lived another twelve years.

Part of the explanation for this is that the iron pole was very smooth and pointed, and traveling at such a speed that the wound was surprisingly precise and “clean.” It destroyed almost all the frontal lobe in the left hemisphere of his brain, but the brain has impressive levels of redundancy built into it, so the other hemisphere picked up the slack and provided normal functioning. Gage has become iconic in the fields of psychology and neuroscience, as his injury supposedly resulted in a sudden and drastic change in his personality. From a mild-mannered and hardworking sort, he became irresponsible, ill-tempered, foul-mouthed, and even psychotic. “Dualism” had a fight on its hands, as this discovery firmly established the idea that the workings of the brain are responsible for a person's personality.

However, reports of Gage's changes vary wildly, and towards the end of his life he was employed long-term as a stagecoach driver, a job with a lot of responsibility and public interaction, so even if he did experience disruptive personality changes he must have got better again. But the extreme claims persist, largely because contemporary psychologists (at the time, a career dominated by self-aggrandizing wealthy white men, whereas now it's . . . actually, never mind) leapt on Gage's case as an opportunity to promote their own theories about how the brain worked; and if that meant attributing things that never happened to a lowly railway worker, what of it? This was the nineteenth century; he wasn't exactly going to find out via Facebook. Most of the extreme claims about his personality changes were seemingly made after his death, so it was practically impossible to refute them.
