From active touch to tactile communication:
What’s tactile cognition got to do with it?
Although visual and auditory cognition are well researched and comparatively well understood, relatively little is known about tactile cognition. Tactile cognition refers to the higher-order processing and integration of tactile information acquired through active touch. Recent developments in cognitive neuroscience mean that we now know far more about the mechanisms underlying tactile cognition than ever before.
Making sense of our touch
Touch provides a rich variety of information about the world around us. The sense of touch is the first sense to develop, and it functions even after seeing and hearing begin to fade. Just before the eighth week of gestation an embryo may develop sensitivity to tactile stimulation.
Touch is our most social sense, and it provides us with our most fundamental means of contact with the external world. Interpersonal touch plays an important role in governing our emotional wellbeing.
The sense of touch provides us with an often-overlooked channel of communication. The notion of “touching with the fingertips” is closely related to communication today, as the “touch generation” of software, games, iPods and mobile phones lets people connect with each other through interactive experiences.
Active touch, also described as haptics, plays a regular and frequent role in our everyday life. Whenever we retrieve keys or lipstick from the bottom of a pocket or purse, or awake at night to switch on a lamp or answer a phone, we must identify by active touch the desired objects as distinct from other objects.
Understanding the tactile brain
It is through the sense of touch that we process tactile information about our environment. Touch messages form the first link in the “chain” of processing required to interpret tactile information.
The tactile processing system involves the basic somatosensory pathways and is divided into different central regions and distinct streams of information processing.
The somatosensory cortex is involved in processing information related to touch. It is located in the parietal lobe of the human brain and receives tactile information from the hands, feet and body. It is well known that a disproportionately large part of the somatosensory cortex is devoted to the representation of the hands, relative to their surface area, compared with other parts of the body.
The human brain is separated into two distinct cerebral hemispheres, connected by the corpus callosum, and the functions of the two cortical hemispheres differ. A study has shown that hemispheric dominance appears to be an organizing principle for the cortical processing of tactile form and location: a left-hemisphere dominance for tactile form recognition (what one is touching) and a right-hemisphere dominance for tactile localization (where one is being touched) (Van Boven et al., 2005).
Interestingly, a left-hemisphere advantage for processing local spatial details and a right-hemisphere advantage for processing global spatial information has been similarly described in the visual system.
Furthermore, a neuroimaging study found that during tactile spatial processing, areas traditionally associated with both visual imagery and visual perception were activated (Ricciardi et al., 2006). Both tactile and visual stimuli thus lead to similar patterns of neural activation, supporting the view that there is a common spatial map accessible through either the tactile or the visual sensory modality.
“Touch to emotion”: neural correlates of the emotional aspects of tactile processing
We use touch to share our feelings with others, and to enhance the meaning of other forms of verbal and non-verbal communication.
In the case of emotions, it is not our hands but the whole body that is crucial to emotional experiences. Given the apparent relationship between bodily-tactile information processing and emotion, it is not surprising that recent neuroscientific research has found evidence for strong neural connections between the somatosensory cortex and the brain regions involved in the processing of emotions: the limbic system.
The limbic system is a set of brain structures including the hippocampus and amygdala, which support a variety of functions including behavior, long term memory and emotion.
While the mechanisms remain unclear, there is evidence for a strong connection between emotion and emotional awareness on the one hand and tactile-bodily cognition on the other. Thus, it is important to consider emotion as a powerful motivator of tactile learning.
Understanding tactile cognitions
In the last decade there has been a dramatic increase in the number of research studies directed at the different concepts of tactile cognition. Drawing on these studies, brief descriptions of the concepts are given below.
Tactile short-term memory can be described as the capacity for holding a small amount of tactile information in mind in an active, readily available state. Visual and auditory short-term memory is said to hold a small amount of information, from about 3 or 4 elements (i.e., words, digits, or letters) to about 9 elements; a commonly cited capacity is 7±2 elements, referred to as the “magical number”. Research has shown that the span for serially presented tactile stimuli is similar to that found in vision (Heller, 1989).
The term working memory refers to a brain system that provides temporary storage and manipulation of the information necessary for such complex cognitive tasks as language comprehension, learning, and reasoning.
Tactile working memory refers to the ability to hold and manipulate tactile information for short periods; that is, the transformation of information while it is in short-term memory storage.
Working memory allows us to hold the characteristics of a tactile stimulus on-line to guide behaviour in the absence of external cues or prompts. Without active working memory, initial tactual percepts may decay quickly.
Studies investigating the neural basis of working memory have shown that the prefrontal cortex (the frontal system of the brain) becomes active while subjects perform working memory tasks, either in the visual or in the auditory modality.
Tactile learning is the process of acquiring new information through tactile exploration. Research studies of tactile information processing in humans have shown that people can be trained to perceive a large amount of information by means of their sense of touch.
On the basis of numerous studies, it has been suggested that active touch, understood from an information-processing approach, constitutes a fully functional cognitive system.
People who are deafblind use active touch in ways that no one else does to explore objects and the environment, to perceive feelings and to act and communicate.
They use many different methods of communication. The method chosen will depend upon the amount of residual sight and hearing and the age of onset of the vision and hearing loss (i.e., whether the person is congenitally or adventitiously deafblind).
There are various tactile communication and tactile language interventions, which are used within the deafblind field, such as haptic communication, full co-active signs, one hand coactive signs and hand-over-hand signing. Recently there has been an interest in understanding the cognitive aspects involved in these tactile communication methods.
When you cannot see or hear things clearly how do you perceive or share your feelings? The sense of touch provides a very powerful means of eliciting and modulating human emotion (Gallace & Spence, 2008).
However, when hearing and vision are limited, emotional interactions occur in a world of physical closeness, and skills are required to perceive and share feelings through active touch. This again underlines the role of emotion as a powerful motivator of tactile learning.
Deafblind individuals are generally more experienced in recognizing stimuli by active touch. What is the impact of combined vision and hearing impairment on tactile cognition? Can studies with persons with deafblindness help us understand tactile cognition, such as tactile working memory, tactile information processing speed or tactile memory?
Working memory tasks include the active monitoring or manipulation of information or behaviors. A study investigating the tactile working memory ability of an adventitiously deafblind woman found a higher average performance level on a tactile memory span test than on both visual and auditory memory span tests (Nicholas & Christensen, in press).
The tactile memory span test measures tactile forward memory and tactile backward memory. Tactile forward memory span is the longest series of objects that a person can independently touch in the correct order. Tactile backward memory span is a more challenging variation in which the complete series of objects must be touched in the correct reverse order. Tactile forward memory is thought to reflect the efficiency of attention, whereas tactile backward memory is thought to be associated with working memory.
Working memory refers to a cognitive system that allows us to actively maintain and manipulate information in mind for short periods of time. This system plays a critical role in many forms of complex cognition such as learning, reasoning, problem solving, and language comprehension.
Working memory for visual sign language indicates similar systems irrespective of access to auditory information and preferred language modality (Rudner et al., 2009). The structure of working memory for sign language is highly similar to working memory for spoken language (Wilson & Emmorey, 1998). This evidence suggests that the structure of working memory for language develops in response to language input regardless of the modality of that input, thus resulting in largely the same architecture across spoken and signed languages (Wilson & Emmorey, 2003).
Tactile information processing speed
Results from a neuropsychological investigation showed that an adventitiously deafblind person took significantly less time to feel and remember objects on a Tactile Form Recognition test (Nicholas & Koppen, 2007). This increased tactile processing speed could reflect how efficiently the person’s attention system was functioning, and suggests that combined auditory and visual deprivation may alter the speed of response to tactile stimuli. Furthermore, results from this study also showed superior performance in tactile memory for the location of objects on tactual performance tests.
Neuroplasticity is the capacity of the nervous system to modify its organization. The issue of neuroplasticity is important to the deafblind field since sensory deprivation is commonly seen within the deafblind population.
Taken as a whole, the results of these two studies indicate that deafblind individuals perform more effectively than sighted-hearing people on tasks of tactile working memory and tactile memory. A possible explanation for this better performance is that deafblind individuals are expected to have more tactile experience, since touch is the sensory system they must rely on for information about their environment.
In other words, (tactile) practice makes perfect. A deafblind person can recognize an object by feeling a portion of it, which then acts as a signal for the whole image; a brief touch of the object is enough to prompt full recognition (Meshcheryakov, 1974).
Further to this, the performance of ten deafblind and ten sighted-hearing participants on four tactile memory tasks was compared. Results showed that deafblind people’s encoding of tactile spatial information is more efficient than that of sighted-hearing people. The explanation given for the superior tactual performance of the deafblind participants was that it was a product of greater tactual experience. This view appears to be consistent with Rönnberg’s (1995) claim that compensation for a deficit by means of unrelated cognitive functions (neuroplasticity), rather than perceptual compensation, accounts for the improvement in performance seen in deafblind individuals across different tasks.
Which neural networks are involved in tactile language processing when hearing and vision are lost simultaneously?
A study which compared neural activation during tactile presentation of words and non-words in a postlingually deafblind subject and six ‘normal’ volunteers found that tactile language activated the language systems, as well as many higher-level systems, of the postlingually deafblind subject. This suggests that tactile languages are equipped with the same expressive power that is inherent in spoken languages.
Finally, it should be noted that an understanding of tactile cognition is needed in the functional assessment (of tactile strengths and weaknesses) of deafblind individuals. The outcomes of an assessment of tactile sensation, perception and cognition, in addition to “embodied experiences” and “bodily-tactile emotions”, can be used as a basis for intervention or intervention planning. However, the assessment should take into account whether the individual has congenital or acquired deafblindness.
The tactile demands the deafblind person has to meet in his or her environment can serve as a starting point for understanding tactile cognition. When assessing a person with acquired deafblindness, structured interviews, adapted psychometric instruments or checklists measuring the tactile prerequisites of everyday activities could be applied. Furthermore, an understanding of tactile cognition could help such individuals become more conscious of their own tactile-bodily awareness.
However, when assessing the tactile processing abilities of congenitally deafblind individuals, an interdisciplinary, integrated assessment is necessary. One should utilize a collaborative team approach and consider the assessment in a dynamic and broader context. This means recognizing ecological and communicational aspects and emphasizing a tactile cognitive assessment approach in day-to-day communication.
Communication is a form of interaction in which meaning is transmitted by the use of signals that are perceived and interpreted by the partner. In the case of deafblindness, this involves the transfer of information by bodily-tactile means. The functional assessment of the congenitally deafblind person must be person-guided and involve careful observation and interaction across environments, learning areas and recreational settings. The assessment should also include the attributes of tactile sensory processing, tactile motor functioning, tactile perceptual processing and, especially, tactile cognitive processing. The fundamental cognitive capacity of the deafblind person should be understood in terms of tactile cognitions (Nicholas & Frölander, 2009).
Children who are deafblind often use their own unique tactile communication signals, such as movements, muscle tension, postures, and gestures, which may be missed or misunderstood by parents or caregivers.
This difficulty with interactions and tactile (communicational) deprivation over a long period can cause emotional, behavioral and relational problems. Thus, these children may become passive and withdrawn, show signs of tactile defensiveness or develop self-abusive or aggressive behaviors. For instance, lack of communication skills due to deafblindness may be a contributory factor to the behavioural difficulties seen in CHARGE syndrome (Nicholas, 2005). Harmonious interactions and mutual sharing of emotions, often achieved through movement and active touch with children who are deafblind, are essential for the development of tactile communication (Janssen et al., 2003). They are also an important step on the path to preventing the development of “challenging” behaviors. Thus, a theoretical and clinical understanding of the emotional aspects of active touch and tactile communication is needed in the deafblind field.
By studying the cognitive and emotional aspects of tactile communication of deafblind persons, future research may find answers to some of the following questions: What is the connection between dual sensory impairment and tactile defensiveness? What is the relationship between tactile working memory abilities and the use of linguistic constructions in tactile communication or tactile language? How are “autobiographic” forms of tactile memory established? How do tactile memories deteriorate over time compared with visual and auditory memories? And how does emotion influence tactile cognition?
Gallace, A., Tan, H. Z., Haggard, P., & Spence, C. (2008). Short term memory for tactile stimuli. Brain Research, 1190, 132-142.
Heller, M. A. (1989). Tactile memory in sighted and blind observers: The influence of orientation and rate of presentation. Perception, 18, 121-133.
Janssen, H.J.M. (2003). Fostering harmonious interactions between deafblind children and their educators. Oisterwijk: Van den Boogaard groep.
Meshcheryakov, A. (1974). Awakening to life. Moscow: Progress Publishers.
Nicholas, J. (2005). Can specific deficits in executive functioning explain the behavioral characteristics of CHARGE syndrome: A case study. American Journal of Medical Genetics, 133A, 300-305.
Nicholas, J. & Koppen, A. (2007). Understanding the tactile brain. Conference proceedings. 14th Deafblind International (Dbl) World Conference, Western Australia, Perth.
Nicholas, J. & Frölander, H.E. (2009). Assessment of cognition in relation to congenital deafblindness: From sensation to dialogue. Conference proceedings. 7th Deafblind International (DbI) European Conference, Senigallia, Italy.
Nicholas, J. & Christensen, M. (in press). Tactile working memory and deafblindness: A case study.
Ricciardi, E., Bonino, D., Gentili, C., Sani, L., Pietrini, P., & Vecchi, T. (2006). Neural correlates of spatial working memory in humans: A functional magnetic resonance imaging study comparing visual and tactile processes. Neuroscience, 139, 339-349.
Rönnberg, J. (1995). Perceptual compensation in the deaf and the blind: Myth or reality? In R.A. Dixon & L. Bäckman (Eds.), Compensating for psychological deficits and declines: Managing losses and promoting gains (pp. 251-274). Mahwah, NJ: Lawrence Erlbaum Associates.
Rudner, M., Andin, J., & Rönnberg, J. (2009). Working memory, deafness and sign language. Scandinavian Journal of Psychology, 50, 495-505.
Van Boven, R. W., Ingeholm, J. E., Beauchamp, M. S., Bikle, P. C., & Ungerleider, L. G. (2005). Tactile form and location processing in the human brain. Proceedings of the National Academy of Sciences USA, 102, 12601-12605.
Wilson, M., & Emmorey, K. (1998). A “word length effect” for sign language: Further evidence on the role of language in structuring working memory. Memory and Cognition, 26, 584–590.
Wilson, M., Emmorey, K., & Iverson, A. (2003). Spatial coding in working memory for sign language: Two effects of spatial similarity. Manuscript submitted for publication.
Jude Nicholas, PhD. Resource Center for the Deafblind and Haukeland University Hospital, Bergen, Norway.
The article is abridged from the full report which is available on the website www.dovblindfodt.dk.