This is an excerpt from the book series Philosophy for Heroes: Act.
How does the brain know which parts of the body belong to it?
To use our body, our brain needs to build a body schema. Without such a schema, we would experience our body only in an abstract sense, the same way we know we have a pancreas or liver [Graziano, 2017]. Having a clear idea of which parts of the body are actually part of oneself, where each part is located, and how it works is essential for survival. Consider, for example, passing through a doorway. Damage to the brain parts responsible for creating the body schema can lead to anorexia nervosa. In someone suffering from this condition, the brain miscalculates the size of the body, and the person may thus hesitate to walk through doorways, thinking he will not fit [Guardia et al., 2012].
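To make the doorway example concrete, here is a minimal sketch in Python; the widths, the safety margin, and the function name are illustrative assumptions, not values from the cited study. The decision is driven by what the body schema reports, not by what a tape measure would say:

```python
# A toy decision based on the body schema rather than the actual body.
# All numbers are illustrative assumptions.
def fits_through(doorway_width_cm: float, believed_shoulder_width_cm: float,
                 margin_cm: float = 5.0) -> bool:
    """The brain asks: do I fit, given what my body schema tells me?"""
    return believed_shoulder_width_cm + margin_cm <= doorway_width_cm

actual_shoulders = 40.0    # what a tape measure would say
distorted_schema = 55.0    # what a miscalibrated body schema reports

print(fits_through(50.0, actual_shoulders))   # True: the body would fit
print(fits_through(50.0, distorted_schema))   # False: the person hesitates or turns sideways
```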
Body schema The body schema is the brain’s simplified description of the status of the body. It is built by correlating what we see and feel our body is doing with signals that the brain sends to the muscles.
Imagine seeing an apple. In order to grasp it, your brain needs to build a body schema using these six pieces of information:
- You have a body;
- The position of your body (and limbs);
- There is an apple;
- The position of the apple;
- The position of the apple relative to your body; and
- The position of your arm relative to your body.
To gather this information, you need both the temporal lobe (“What do I see?”, to identify the object as an apple) and the parietal lobe (“Where do I see it?”, to gauge the position of the apple). Without this body schema, you would have to move your arm and then keep readjusting the muscles until the image of your hand reaches a position where it can grasp the apple. We have to rely on this clumsy method when, for example, we manipulate something we can see only in a mirror. We have to consciously flip the image (and translate left to right and right to left) instead of relying on our body schema.
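As a minimal sketch of these six pieces of information, consider the following Python toy model. The coordinates and names are illustrative assumptions; the point is that with a body schema, the reaching direction is a single subtraction of known positions, whereas without one, the brain would have to re-check the image of the hand after every small correction:

```python
from dataclasses import dataclass

@dataclass
class BodySchema:
    body_position: tuple   # "I have a body, and it is here"
    hand_position: tuple   # where the arm/hand is, relative to the body

def reach_vector(schema: BodySchema, apple_position: tuple) -> tuple:
    """Where is the apple relative to my hand? Answered in one step."""
    return tuple(a - h for a, h in zip(apple_position, schema.hand_position))

schema = BodySchema(body_position=(0, 0, 0), hand_position=(20, 30, 100))  # centimeters
apple = (50, 60, 110)   # "there is an apple, and it is here" (relative to the body)
print(reach_vector(schema, apple))   # (30, 30, 10): move the hand by this much
```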
The Parietal Lobe
The parietal lobe of the cerebrum is positioned at the top of the brain. It is responsible for processing information about the “where” and thus helps us to recognize our limbs. Damage to the posterior parietal cortex (the area at the end of the parietal lobe, directly adjacent to the occipital lobe) can cause disorders like simultanagnosia (the inability to perceive more than one object at a time, that is, to see the forest for the trees), akinetopsia (the inability to perceive motion), apraxia (the inability to produce volitional movement), or optic ataxia (the inability to use visuospatial information to guide movement). These findings support the idea that the parietal lobe is responsible for further processing of visual information from the occipital lobe, especially regarding positioning and movement. At the other end of the parietal lobe, adjacent to the frontal lobe, is the postcentral gyrus. It contains the primary somatosensory cortex, which (like the visual cortex for visual information) is the main sensory receptive area for the sense of touch. Different parts of the primary somatosensory cortex are specialized for sense input from specific body parts. A disproportionately large number of neurons are allocated to process sense data from the lips, tongue, face, and hands.
Primary somatosensory cortex The primary somatosensory cortex in the parietal lobe deals with the processing of tactile sense input. Each body part is represented in the primary somatosensory cortex. The size of each representation correlates with the sensitivity of the body part to tactile stimulation (for example, lips and hands have a larger representation than other body parts).
Between the primary somatosensory cortex and the occipital lobe lies the superior parietal lobe. It is involved in creating and maintaining a mental representation of the internal state of the body. Given its location and role in the perception of self, the superior parietal lobe is also implicated in episodic memory, visuospatial processing, and self-reflection.
Superior parietal lobe The superior parietal lobe deals with creating and maintaining a mental representation of the internal state of the body.
The parietal lobe also contains a brain part called the angular gyrus which receives visual, auditory, and somatosensory information from adjacent brain areas. It deals with putting things into relationship with each other:
- The left angular gyrus is responsible for transferring visual information to Wernicke’s area. It is involved in combining letters into words and words into sentences. It also deals with recognizing the positioning of numbers relative to each other, which is important for mathematical calculations.
- Research has shown that direct electrical stimulation of the right angular gyrus can produce a temporary out-of-body experience [Blanke et al., 2002]. This points to the right angular gyrus being involved in the relationship of the self to the external world.
Angular gyrus The angular gyrus combines visual, auditory, and somatosensory information and puts things into relationship with each other. The left angular gyrus deals with relationships in the external world, especially relating to words and letters, and the right angular gyrus deals with the relationship between the self and the external world.
The supramarginal gyrus (SMG) is adjacent to the primary somatosensory cortex (sense of touch) and the angular gyrus (limb positioning). Its main tasks are to determine and interpret the status of your and other people’s limbs. The left and right SMG have different specializations:
- The left SMG is particularly involved in tool use. This is because tools require a specific grip and can act as an extension of your arms and hands [Andres et al., 2017].
- The right SMG interprets the postures and gestures of other people. If it is disrupted (for example, through magnetic stimulation during an experiment), subjects begin to put more weight on their own point of view. They lose the ability to assess another person’s feelings by looking at that person’s face, body posture, and gestures. Being blind to other people’s emotional states, subjects start to project their own emotions onto others [Silani et al., 2013]. Accordingly, training our right SMG through mindfulness exercises like certain forms of meditation not only increases its size [Sevinc et al., 2019], but also seems to improve our ability to read other people’s emotional states. A limitation of this line of research is that (short) mindfulness exercises seem to have the opposite effect on people with narcissistic personality disorder, enabling them to focus even more on themselves [Ridderinkhof et al., 2017].
Supramarginal gyrus The supramarginal gyrus (SMG) creates an internal representation of the limbs of the body, similar to the adjacent superior parietal lobe (which creates a representation of the internal state of the body). It supports tool use (left SMG) and interpreting the emotional state of other people based on their postures and gestures (right SMG).
Interestingly, both the left and right SMG also seem to be relevant for recognizing and reading words (Hartwigsen et al. [2010], Stoeckel et al. [2009]). In addition, at least the left SMG seems to be involved in phonological memory [Deschamps et al., 2014]. Given that the angular gyrus and Wernicke’s area are adjacent to the supramarginal gyrus, we can see the evolutionary connection between gestures, posture, tool use, and language. The same mental machinery that builds and uses the rules for how limbs connect and move relative to each other is re-used for building sentences and tools.
For the brain, at a conceptual level, connecting limb–joint–limb is the same as connecting subject–verb–object (grammar). Similarly, mechanics could be seen as a form of grammar for how different parts of a machine or tool interact. For example, building and handling a bow requires you to imagine at least five different interacting parts (the bow stave, the bowstring, the arrow shaft, the fletching, and the arrowhead). This depends on the mental ability to form concepts and simulate how those concepts could work together—parallels to language and grammar are obvious. The angular gyrus seems to also be involved in the production (and even the imagination) of music—again, activities that require the application of grammar-like rules [Tanaka and Kirino, 2019].
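A minimal sketch of this analogy follows; the function and the example triples are illustrative assumptions, not a model of the brain. A joint, a verb, and a mechanical coupling can all be treated as relations that connect two parts into a larger structure:

```python
def connect(left, relation, right):
    """A generic 'grammar' rule: two parts joined by a relation."""
    return (left, relation, right)

# Body schema: limb - joint - limb
arm = connect("upper arm", "elbow", "forearm")

# Language: subject - verb - object
sentence = connect("the archer", "draws", "the bow")

# Mechanics: part - coupling - part
bow = connect("bow stave", "is strung with", "bowstring")

print(arm, sentence, bow, sep="\n")
```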
Given its reliance on grammar rules, language is highly predictable, and we can process it efficiently. Often, we already know which words come next before others have finished a sentence—especially when we also take into account our ability to anticipate what other people might be thinking. This comes with the risk of being misled by so-called “garden path sentences.” Try reading the following sentences without getting confused (that is, without your mind predicting one structure and then stumbling over the words at the end):
- “The old man the boat.”
- “The girl told the story cried.”
- “After the student moved the chair broke.”
- “Fat people eat accumulates.”
- “The complex houses married and single soldiers and their families.”
- “The horse raced past the barn fell.”
For example, in the first sentence, “The old man the boat,” we first assume that “old” is an adjective describing “man” and then predict that a verb will follow to describe what the old man does with the boat. Only after carefully reflecting on the structure of the sentence might we notice that “the old” refers to “old people,” while “man” is the verb of the sentence. Similarly, in the other sentences, there is confusion between active and passive (“told”), subject and object (“chair”), adjectives and nouns (“fat,” “complex”), and subordinate clause and main clause (“raced” and “fell”).
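For readers who prefer code, here is a toy sketch of the garden-path effect for the first sentence. The tiny lexicon and the crude grammaticality check are illustrative assumptions and far simpler than real language processing; the point is only that the preferred reading is tried first and reanalysis is needed when it fails:

```python
from itertools import product

LEXICON = {                      # preferred reading first, alternative second
    "the": ["DET"],
    "old": ["ADJ", "NOUN"],      # "the old" as in "old people"
    "man": ["NOUN", "VERB"],     # "to man" as in "to man a boat"
    "boat": ["NOUN"],
}

def looks_like_a_sentence(tags):
    # Crude check: exactly one verb, not in final position, and every
    # adjective must be followed by a noun.
    if tags.count("VERB") != 1 or tags[-1] == "VERB":
        return False
    return all(i + 1 < len(tags) and tags[i + 1] == "NOUN"
               for i, t in enumerate(tags) if t == "ADJ")

def parse(words):
    preferred = tuple(LEXICON[w][0] for w in words)
    if looks_like_a_sentence(preferred):
        return preferred, "first-pass prediction worked"
    # The stumble: the predicted reading fails, so we search the alternatives.
    for tags in product(*(LEXICON[w] for w in words)):
        if looks_like_a_sentence(tags):
            return tags, "needed reanalysis (garden path)"
    return preferred, "no grammatical reading found"

print(parse("the old man the boat".split()))
# (('DET', 'NOUN', 'VERB', 'DET', 'NOUN'), 'needed reanalysis (garden path)')
```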
The Premotor Cortex
While we now know where things are and how they are positioned relative to each other, the question remains which of those things are our own limbs. Just because we see an arm does not automatically mean that it is our arm. If you think this is obvious and that there is no need to think about which limbs are yours, that is exactly because you are using your body schema to make that determination.
A simpler question would be how you determine who is shaking whose hand. Is the other person shaking your hand, are you shaking the other person’s hand, or are you both actively shaking each other’s hands? Just looking at both hands will not answer the question; you need to check whether or not you initiated the movement. This causality between initiating a movement and seeing or feeling your limbs move is the key to discovering which limbs are yours. Hence, we need to look at the frontal lobe’s primary motor cortex, which is directly adjacent to the somatosensory area of the parietal lobe.
Frontal lobe The frontal lobe of the neocortex deals with running a simplified simulation of the world. It provides us the ability to plan and evaluate actions and their future impact. The somatosensory area of the parietal lobe is directly adjacent to the frontal lobe.
The primary motor cortex deals with activating muscles by sending signals to the spine. Like the somatosensory area, different parts of the primary motor cortex are specialized for different muscles in the body. It makes sense to have both areas close to each other, given that the brain needs haptic information to adjust movements, and information about how our limbs move to adjust our sense of touch. For example, simply holding a glass of water requires you to adapt your grip if you feel it slipping. Or when drawing or writing something with a pen, you need to adapt your writing to the resistance of the pen and paper: press too hard and the pen breaks, too lightly and nothing shows on the page.
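The glass example is essentially a feedback loop: haptic input (“it is slipping”) adjusts the motor output (grip force). A minimal sketch, with all numbers being illustrative assumptions rather than physiological values:

```python
def adjust_grip(grip_force: float, is_slipping: bool,
                step: float = 0.5, max_force: float = 10.0, min_force: float = 1.0) -> float:
    """One cycle of the touch-to-movement feedback loop."""
    if is_slipping:
        return min(grip_force + step, max_force)   # squeeze harder
    return max(grip_force - 0.1, min_force)        # relax slightly, but keep holding

force = 2.0
for slipping in [True, True, False, False, True]:  # a made-up sequence of sensations
    force = adjust_grip(force, slipping)
    print(f"slipping={slipping}, grip force={force:.1f}")
```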
While we can control all muscles of our body with the primary motor cortex, we become effective only once we use our muscles in a coordinated fashion. This is the task of the premotor cortex of the frontal lobe (adjacent to the primary motor cortex). For example, when reaching for a glass, your shoulder, arm and hand muscles have to follow a specific motor program. But you are not thinking consciously about those individual steps. You think only about grasping the glass and the motor program is calculated by your brain in the premotor cortex. Other examples of programs created by the premotor cortex are hand-to-mouth movements, or defensive movements (for example, raising your hands to protect your head).
Premotor cortex The premotor cortex is part of the frontal lobe and prepares motor programs to be executed by the adjacent primary motor cortex. It has strong connections to the superior parietal lobe which provides a model of the current state of the limbs.
In that regard, the premotor cortex is similar to the previously discussed superior parietal lobe: both integrate information into a greater whole. The former combines individual actions into motor programs, the latter combines individual sensory information into an idea of the status of the body as a whole. Unsurprisingly, there are strong connections between both parts of the brain. Scientists have found that the creation of the body schema begins in the womb when babies are moving their limbs [Whitehead et al., 2018]. Combining the feeling of resistance (somatosensory area) when intentionally kicking (motor cortex) helps the baby create an initial body schema based on touch. At first, we move our limbs in an uncontrolled way, simply to test how our body reacts. Over time, we create a body schema of our arms, legs, and other body parts in order to be able to coordinate our actions.
The premotor cortex is also strongly connected to the basal ganglia. The reason for this is that, at any given moment, only one motor program should be executed. As the arbiter, the basal ganglia make sure that we focus our movements on one program and do not switch back and forth between different motor programs. They accomplish this by inhibiting the signals of competing motor programs. Without this focus, we could end up trying to, for example, walk forward and backward at the same time.
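A minimal sketch of this arbiter role (program names and activation values are illustrative assumptions): several motor programs compete, the most strongly activated one is released, and the rest are inhibited:

```python
def select_motor_program(activations: dict) -> str:
    """Release the strongest program and inhibit all competitors."""
    winner = max(activations, key=activations.get)
    for program in activations:
        if program != winner:
            activations[program] = 0.0   # inhibition of competing programs
    return winner

programs = {"walk forward": 0.8, "walk backward": 0.6, "reach for glass": 0.3}
print(select_motor_program(programs))   # 'walk forward'; the others are suppressed
print(programs)                         # competing activations are now zero
```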
Also adjacent to the primary motor cortex are the supplementary motor areas (SMA), one in each hemisphere, located next to each other at the top of the frontal lobe. This closeness relates to their function of stabilizing the body and coordinating its two sides. For example, rock climbing has been correlated with strong activation of the supplementary motor area [Carius et al., 2020].
Alien Hand Syndrome
A lack of synchronization between the hemispheres (and therefore between their primary motor cortices) can lead to a condition called “Alien Hand Syndrome” (AHS), in which the hand sometimes reaches for objects without the subject wanting it to, sometimes even resulting in self-harm. Patients sometimes have to physically wrestle with their “alien hand” in order to control it. For example, it can happen that the intent of moving the left hand (stemming from the right hemisphere) does not reach the left hemisphere. If, in addition, the left hemisphere observes that the left hand is moving, it concludes that an outside (“alien”) force must have moved the hand. The left hemisphere does not associate the original intent of the right hemisphere with the observed movement because it never received the information that the right hemisphere wanted to move the hand. In that regard, AHS is similar to reflexes or involuntary muscle spasms due to exertion—we can observe our body moving, but there was no intent to do so. The difference between AHS and reflexes or muscle spasms is that the former requires processing in the brain while the latter involve only the spine or the muscles themselves (see Figure 5.23).
It seems that to correctly identify something as self, the brain requires two independent sources of information. With what we have learned about AHS, we can conclude that these sources do not have to be the visual sense and the sense of touch, but can also be a combination of our visual sense and the internal acknowledgement that a movement was initiated by our brain and not some outside force. The following internal dialog illustrates how the hemispheres combine their information to understand what is going on.
- Right hemisphere: I want to reach that glass of water.
- Left hemisphere: Understood, go ahead.
- Right hemisphere: Moves left hand.
- Left hemisphere: OK, I see the hand moving to grasp the glass of water. I knew this was going to happen, all is fine.
In someone with communication problems between the two hemispheres, the left hemisphere knows as little about why the person’s left hand moved as that person knows about why someone else moved her hand.
- Right hemisphere: I want to reach that glass of water.
- Left hemisphere: I cannot hear you.
- Right hemisphere: Moves left hand.
- Left hemisphere: OK, I see the hand moving to grasp the glass of water. This cannot be caused by this brain as the right hemisphere never reported to me its intent. There must be an outside force moving the hand.
The person could still try to predict his hand’s behavior, but without knowing the internal state of his other hemisphere, it would be only a guess, about as accurate as predicting someone else’s behavior [Cleeremans, 2011]. It is similar to people suffering from dissociative identity disorder (also called multiple personality disorder). They report a feeling that their body became “possessed,” that they experienced sudden impulses or strong emotions not under their control, and that they became passive observers of their own actions [Publishing, 2013].
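The two dialogues can be condensed into a single attribution rule. The following sketch is a deliberate simplification (boolean flags stand in for neural signals; the flags and the function name are illustrative assumptions): a movement is experienced as one’s own only if the observing hemisphere also received the intent to move:

```python
def attribute_movement(intent_received: bool, movement_observed: bool) -> str:
    """Who moved the hand, as judged by the observing hemisphere?"""
    if not movement_observed:
        return "nothing happened"
    if intent_received:
        return "my own movement"         # first dialogue: intact connection
    return "an outside ('alien') force"  # second dialogue: severed connection

print(attribute_movement(intent_received=True, movement_observed=True))
print(attribute_movement(intent_received=False, movement_observed=True))
```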
AHS is actually an umbrella term for a number of similar conditions with different causes. This is because to create the body schema, many different parts of the brain are required. If there are issues in this network (for example, a damaged connection between both hemispheres), the brain misses important information to build a proper body schema. We will discuss the interaction between both hemispheres in more detail in a later article.
Phantom Limb Pain
Now, what about the opposite case? What happens if our body is severely injured and our mind’s body schema no longer aligns with our body? For example, there is a condition called “phantom limb pain,” where the patient suffers pain in a limb he has lost. The pain is located outside of the body, where the original limb used to be. It is not enough for the patient to simply look at the place where the limb should be and convince himself that the pain should not be there. The experience is a conflict between the learned body schema and the available visual data.
Phantom limb pain can be seen as the opposite of alien hand syndrome. Sufferers of phantom limb pain want to lose the part of their body schema that represents their formerly healthy (but now missing and painful) limb; sufferers of AHS want to regain mental control of their limb. The former want to modify their body schema to reflect that the limb is lost; the latter want to modify it so that it contains the limb. Sufferers of phantom limb pain often report that their missing limb is in a clenched state and that they are unable to, for example, open their fist to get rid of the pain.
One way to treat the condition is to suppress the pain with medication. Another is to attach a transplant from a donor or a prosthetic limb. A third alternative targets the body schema itself. But unlike a baby with healthy limbs in the womb, someone who is missing a limb cannot touch anything with the phantom limb. Instead, the patient’s healthy limb is placed beside a mirror positioned between the remaining and the missing limb, so that from the side it looks as if the patient still has two working limbs. He is then advised to repeatedly clench and unclench his healthy limb as well as his missing limb (which the brain thinks he still has and which thus causes pain). After a while, by observing how the missing limb apparently moves, the brain concludes that the patient has somehow regained the ability to move his phantom limb. Studies of patients following a six-week mirror therapy program showed that some patients experienced a significant reduction in phantom limb pain (MacIver et al. [2008], Ramachandran and Rogers-Ramachandran [1996]).
This phenomenon can also be used to extend the body schema of a healthy person. Ask a volunteer to hide his hand behind a screen and place a fake rubber hand in front of him. Then start stroking both the hidden hand and the fake hand with a brush. After a minute or two, the fake hand is integrated into the body schema of the volunteer. The person starts to feel that the fake hand is part of him; when asked to close his eyes and point to his hand, he will point to the fake hand, and he will even experience pain or shock if the fake hand is “injured.” This is called the “rubber hand illusion” [Botvinick and Cohen, 1998].
Rubber hand illusion The rubber hand illusion can be evoked by brushing both a fake rubber hand and the real hand of a human participant. If only the fake rubber hand is visible, the brain will assume it is the real hand and incorporate it into its body schema.
This confirms what we speculated earlier: creating a body schema, modifying a broken body schema, and extending an existing body schema all have in common that they rely on two sources of information. The brain of a baby in the womb learns to connect the action of kicking with the feeling of the resistance of the kick. With phantom limb pain, the brain connects the intent of clenching and unclenching the hand with the visual information from the mirror image. With the fake rubber hand, the brain connects the feeling of brush strokes on the hand with the visual image of the fake rubber hand. We can find this idea of relying on multiple sources to establish our subjective idea of the world in other places as well. For example, think of rumours you hear and how much more believable they become when you hear the same piece of information from several of your friends.
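A minimal sketch of this “two sources” idea (the timestamps and the tolerance are illustrative assumptions): a limb, whether real, mirrored, or rubber, is incorporated into the body schema only if two independent signals, here felt and seen brush strokes, line up in time:

```python
def incorporate_into_schema(felt_strokes, seen_strokes, tolerance=0.1):
    """Accept the seen limb as one's own only if felt and seen strokes are synchronous."""
    if len(felt_strokes) != len(seen_strokes):
        return False
    return all(abs(f - s) <= tolerance for f, s in zip(felt_strokes, seen_strokes))

felt = [0.0, 1.0, 2.0, 3.0]              # seconds at which the hidden real hand is stroked
seen_in_sync = [0.05, 1.02, 1.98, 3.04]  # rubber hand stroked at (almost) the same moments
seen_out_of_sync = [0.5, 1.7, 2.4, 3.9]

print(incorporate_into_schema(felt, seen_in_sync))       # True: the illusion sets in
print(incorporate_into_schema(felt, seen_out_of_sync))   # False: the hand stays foreign
```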
What we can do with mirrors and rubber hands, we can do even better with virtual reality and whole bodies. For example, in one study, people were put into virtual reality (using headsets) where each participant could see a virtual representation (an avatar) of his or her own body from two meters away. Each participant was then shown a hand stroking the avatar while, at the same time, the researchers stroked the participant’s actual back. Just as with the fake hand, the participants reported that they started feeling that the avatar somehow belonged to their own bodies. This was demonstrated by removing their headsets, blindfolding them, moving them to another spot in the room, and then asking them to return to their original spot. In those cases where the real and virtual backs had been stroked in sync, the participants overshot their original location, moving closer to the position of their avatar [Lenggenhager et al., 2007, Miller, 2007, Ehrsson, 2007]. These kinds of out-of-body experiences can also occur without an elaborate experimental setup: some people report having experienced the world from a location outside their physical body, probably caused by a miscalculation by the brain [Arzy et al., 2006]. Besides the treatment of phantom limb pain, virtual reality is also applied in therapies that help amputees improve motor control of prosthetics [Marasco et al., 2018].