Figure 3.1. WalkAide neuroprosthetic stimulator, which corrects a condition known as drop foot. The stimulator (A) activates the common peroneal nerve, which causes activity in the muscles that lift the foot (B). Images courtesy Innovative Neurotronics.
The WalkAide is an example, then, of a neuroprosthetic. It represents a medical device that helps improve (or replace in some cases) bodily function that has been lost due to accident or disease. Other
kinds of neuroprosthetics include stimulators for bladder and bowel control, deep brain stimulation (we will come back to this later on in this book, so stay tuned), and cochlear prosthetics. Usually neuroprosthetics require insertion and implantation of electrodes into the body near the targeted nerves or muscles. The WalkAide, however, is an example of a neuroprosthetic that doesn't need any implantation.
Cochlear prosthetics to improve hearing are the most commonly applied and widely used neuroprosthetics. They represent an instructive example of how the continuing evolution of biomedical engineering and neuroscience from the 1950s to now has dramatically improved neuroprosthetic devices. Originally, cochlear implants were very large and had external pieces fixed to the body that were wired to parts implanted into the inner ear. Now they are small, directly implanted devices. This nicely parallels the story of cardiac pacemakers and defibrillators, which we will discuss in chapter 7 when I put a new twist on Iron Man's origin story!
Another interesting example of this kind of FES is a neuroprosthetic for improving hand function developed by Arthur Prochazka (coincidentally also at the University of Alberta). This “bionic glove” helps people who have problems moving the wrist and hand due to a stroke or spinal cord injury. As long as the person has some ability to move the wrist, the glove can help stimulate the muscles in the forearm that control grip. Sensors detect wrist angle and then trigger small stimulators to electrically activate the flexor muscles of the forearm. Imagine picking up a bottle of water. As you reach out, you first open your hand, make contact with the bottle, then close your hand and lift it up. If you had partial paralysis of your arm and hand, this would be difficult, if not impossible, to do. With the bionic glove, the user reaches out and gets the hand around the bottle. But because she cannot contract the flexor muscles, she extends the wrist a bit more. This signal triggers the glove to flex the fingers and the grip is made. Then the bottle can be picked up and used. In this simple device, three different muscle groups are activated with electrical stimulation.
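To make that trigger idea concrete, here is a minimal Python sketch of the kind of threshold-and-hysteresis rule such a glove could use: when the wrist-angle sensor crosses an extension threshold, stimulation of the forearm flexors switches on and the grip closes. The thresholds, function names, and simulated sensor readings are illustrative assumptions, not Prochazka's actual implementation.

```python
# Hypothetical sketch of a wrist-triggered FES grip: extending the wrist past a
# threshold turns on stimulation of the forearm flexor muscles; relaxing back
# toward neutral turns it off again. All numbers here are assumed values.

EXTENSION_THRESHOLD_DEG = 20.0   # assumed wrist extension that means "close the grip"
RELEASE_THRESHOLD_DEG = 5.0      # assumed angle below which stimulation switches off

def update_stimulation(wrist_angle_deg: float, stimulating: bool) -> bool:
    """Return whether the flexor stimulator should be on for this sensor reading."""
    if not stimulating and wrist_angle_deg >= EXTENSION_THRESHOLD_DEG:
        return True      # user extended the wrist a bit more: trigger the grip
    if stimulating and wrist_angle_deg <= RELEASE_THRESHOLD_DEG:
        return False     # wrist relaxed back toward neutral: release the grip
    return stimulating   # hysteresis: otherwise keep the current state

# Example: a short stream of wrist angles as the user reaches for a bottle.
stim = False
for angle in [2.0, 8.0, 15.0, 22.0, 25.0, 18.0, 4.0]:
    stim = update_stimulation(angle, stim)
    print(f"wrist {angle:5.1f} deg -> stimulator {'ON' if stim else 'off'}")
```

The two-threshold (hysteresis) design keeps the grip from flickering on and off when the wrist hovers near the trigger angle, which is the same practical problem any residual-movement trigger has to solve.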
So far our examples have involved neuroprosthetics that detect signals related to a residual movement someone can still make (like a bit of a walking movement, a bit of a wrist movement). The devices then use that signal to trigger a stimulator to activate muscles that cannot get the normal activation from the nervous system. It is important to understand how these types of devices function to get closer to appreciating how the Iron Man suit could actually work. For starters, the suit would have sensors detecting nervous system commands from nerve or muscle as well as commands from residual movement. The suit would then amplify the normal movement. However, it wouldn't do so by stimulating the muscles as in FES. Instead, the trigger signals would drive the motors controlling the joints in the Iron Man robotic suit. Above we were talking about restoring function in a damaged nervous system with FES and neuroprosthetics. In that way, the neuroprosthetic helps “bridge” the problems in the nervous system to restore some movement ability. This kind of interface is what Tony will need to operate the NTU-150, and it already exists in the form of the Cyberdyne Hybrid Assistive Limb (HAL) wearable robot suit.
HAL is a kind of robot suit that is worn in order to improve physical capability. As we already learned, when someone tries to make a movement, weak electrical signals travel in the nerves and occur in the muscle during a contraction. These weak signals can be detected, measured, and amplified with electrode sensors placed on the skin over the muscles being used. The HAL suit uses this control signal to trigger the motors acting at the joints of the suit. As a result, the suit is controlled directly based on the commands coming from the person wearing it. So, controllers for the elbow joint motors are triggered by nervous system commands going to the muscles that normally flex and extend the elbow. Cyberdyne Inc. likes to call this a “voluntary control system.” This type of system relies on the user's intended movements and then amplifies those movements by making the robot suit perform the appropriate action. An additional layer of control is added using a “robotic autonomous control system,” which is a kind of predictive system that works along with the voluntary triggering. Altogether, HAL applies a hybrid of the two control modes to provide almost human-like movement. We will pay more visits to HAL later on in the book.
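As a rough picture of what “hybrid” control can mean, the Python sketch below blends an EMG-driven voluntary command with a predictive assistive term for a single elbow joint. The gains, weights, and function names are assumptions made for illustration; Cyberdyne's actual controller is not published in this form.

```python
# A minimal sketch (not Cyberdyne's real algorithm) of hybrid assist control:
# a "voluntary" torque estimated from surface EMG over the elbow muscles is
# blended with an "autonomous" predictive term that nudges the joint toward
# the expected posture. Units and gains are illustrative assumptions.

def voluntary_torque(emg_flexor: float, emg_extensor: float, gain: float = 8.0) -> float:
    """Map the difference in rectified EMG (0..1) to an assist torque in N*m."""
    return gain * (emg_flexor - emg_extensor)

def autonomous_torque(joint_angle: float, target_angle: float, stiffness: float = 2.0) -> float:
    """Predictive term that gently pulls the joint toward the expected angle (radians)."""
    return stiffness * (target_angle - joint_angle)

def hybrid_command(emg_flexor: float, emg_extensor: float,
                   joint_angle: float, target_angle: float,
                   voluntary_weight: float = 0.7) -> float:
    """Blend the two modes; the voluntary (EMG-driven) term dominates."""
    v = voluntary_torque(emg_flexor, emg_extensor)
    a = autonomous_torque(joint_angle, target_angle)
    return voluntary_weight * v + (1.0 - voluntary_weight) * a

# Example: the wearer starts to flex the elbow (flexor EMG rises), so the elbow
# motor receives a flexion torque that amplifies the wearer's own effort.
print(hybrid_command(emg_flexor=0.6, emg_extensor=0.1, joint_angle=0.4, target_angle=0.9))
```

The key point the sketch captures is that the wearer's own muscle signals always lead, while the predictive layer fills in the smoothness that pure EMG triggering lacks.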
This basic concept of hybrid control has also been used by a company called Touch Bionics in their development of a fantastic neuroprosthetic hand. Think back to our medieval “Iron Hand” prosthetic shown in figure 2.4. Touch Bionics has created a sophisticated robot hand prosthetic that is driven by the normal muscle activation signals for the fingers. It can also be controlled by touch signals taken from pressure sensors. The Touch Bionics 5 finger i-LIMB hand uses inputs that come from the normal muscle signals to open and close the lifelike plastic fingers in the prosthetic. So, it uses the signals that come from muscles in the stump or remaining part of the person's arm. The i-LIMB hand then can open and close to grasp objects in a way similar to a biological hand (panel A of figure 3.2).
Figure 3.2. Touch Bionics 5 finger i-LIMB hand, which uses inputs that come from muscle signals to open and close the lifelike plastic fingers in the prosthetic. The i-LIMB makes a pinch grip (A), and individual “ProDigits” can be used for people with partial amputations (B and C). Courtesy Touch EMAS Ltd.
ProDigits is an application of this device for people who are missing one or more fingers due to accident or from birth. This device has individually powered and controlled motors for each finger and can be set up to take over for just the fingers needed by the user. This means that a lot more than just an open and closed grip can occur, and more dexterous activities can be done, such as pointing with the index finger and typing on a keyboard. These seem like pretty simple tasks, and they are if you have an intact hand. But they are not if you don't. An example of replacing one finger is shown in panel B of figure 3.2, and replacing function for four fingers is shown in panel C. The i-LIMB hand and ProDigits can be covered in a flexible skin product, making them look just like a real biological hand. Or they can be left uncovered. Tony Stark would go for the covered option if he needed one, I think.
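One way to picture that per-digit setup is as a small configuration of independent motors, one per missing finger, each responding to the same open-or-close command derived from muscle signals while the intact fingers are left alone. The toy Python sketch below makes that assumption explicit; the class names and signal handling are invented for illustration and are not Touch Bionics' software.

```python
# A toy model of per-digit prosthetic control: only the fingers fitted at
# setup time are driven, each by its own motor, from a shared muscle-derived
# open/close command. Names, speeds, and structure are illustrative only.

from dataclasses import dataclass

@dataclass
class DigitMotor:
    name: str
    position: float = 0.0          # 0.0 = fully open, 1.0 = fully closed
    speed: float = 0.2             # fraction of full travel per control cycle

    def step(self, close_signal: bool) -> None:
        """Drive the digit toward closed or open based on the command."""
        delta = self.speed if close_signal else -self.speed
        self.position = min(1.0, max(0.0, self.position + delta))

# Fitted for a user missing only the index finger (compare panel B of figure 3.2);
# a four-finger fitting (panel C) would simply register four DigitMotor entries.
prosthetic_digits = {"index": DigitMotor("index")}

def control_cycle(emg_close: bool) -> None:
    """One control cycle: every fitted digit responds to the open/close command."""
    for digit in prosthetic_digits.values():
        digit.step(emg_close)

for _ in range(5):                 # the user holds a "close" command for five cycles
    control_cycle(emg_close=True)
print(prosthetic_digits["index"].position)   # -> 1.0 (the fitted finger is fully closed)
```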
Now let's return to Tony Stark, someone with a fully intact body and nervous system, wearing a robotic suit to improve and amplify his normal abilities. If you think this through, you will realize that the Iron Man suit could be operated by tracking the nervous system commands and using them to control the suit. Doing this effectively takes the user's muscles out of the equation. That is, it creates the same disconnect between nervous system and movement that exists after a spinal cord injury or stroke. We just finished talking about using signals in the nervous system to trigger muscle activity and devices like robot suits or artificial limbs. The next step is determining the feasibility of using that initial command signal, the one from the brain or spinal cord, to power motors and computers directly. This means thinking about what Doc Ock from Spider-Man or Professor X / Charles Xavier from X-Men can teach Iron Man about connecting machinery to his nervous system. What would it mean for Tony Stark to engineer the Iron Man armor to be able to use this kind of control? Is it even possible, and, if it is, is it dangerous? To answer this we are going to do a little fast-forward and then a rewind!
Figure 3.3. The “neuromimetic telepresence unit” that Tony uses to interface with his brain and to remotely control the Iron Man suit of armor (A), from the graphic novel War Machine (2008). Note the circled “neural access port” that is meant to penetrate Tony's skull. Tony connects to the telepresence unit (and therefore controls the Iron Man suit) from his hospital bed (B), from “This Year's Model” (Invincible Iron Man #290, 1993). Copyright Marvel Comics.
First, the fast-forward part. What we are focusing on here is the issue of somehow using a direct connection between the nervous system and a robotic device. This kind of connection was shown in Iron Man in its most extreme form back in March and April 1993 in “This Year's Model” (Iron Man #290) and “Judgement Day” (Iron Man #291). These stories contain elements of the extended story arc captured in the 2008 Iron Man graphic novel War Machine, in which Tony Stark has to fake his death. Jim Rhodes steps in to become a fill-in “silver” Iron Man (and later becomes War Machine). Tony then has to use a remote-controlled Iron Man (the NTU-150, but I will call it “robot Iron Man”), which is controlled by a direct connection to his nervous system called a “neuromimetic telepresence unit” (hence the name NTU). This unit basically involves a direct link between activity in Tony Stark's brain and activity in robot Iron Man. Included in the graphic novel is a detailed description of this telepresence unit.
The image shown in panel A of figure 3.3 comes from that manual. There is a lot of description in the seven-page pseudomanual printed in the novel! However, for our purposes, the piece I want to key on is the description of the actual headset the user must wear. It is, of course, called a “user interface headset,” so the writers are taking a very literal view of how real scientists actually describe things! Anyway, as written in the manual, the headset “provides a direct electronic control channel” for the operator to use to control the robot Iron Man. This headset interfaces with the operator by “the neural port surgically implanted at the base of the operator's skull just behind the right ear, transmitting commands and information between the Central Nervous System and the neuromimetic operating system.” The image in panel B shows the headset being interfaced (“jacked in”) to Tony's brain and comes from “This Year's Model” (Iron Man #290). In both images, I have circled the key neural link panel. Sounds absolutely like comic book fiction, right? Well, partly it is, but it is also very much like an emerging phenomenon generally known as a “brain computer interface.” To explore this for Iron Man, let's look at the real science behind this concept.