"Tactile signing" refers to the mode or medium, i.e. signing (using some form of signed language or code), using touch. It does not indicate whether the signer is using a tactile form of a natural language (e.g. American Sign Language), a modified form of such a visual sign language, a modified form of a manually coded language, or something else.
Until the 1970s, most people who were deafblind lived lives of isolation. As professionals became aware of this population, attempts were made to serve deafblind people by creating manual alphabets or by modifying the sign languages used by sighted deaf people. See for example the Helen Keller National Center, the LightHouse for the Blind and Visually Impaired, and the Alabama Institute for the Deaf and Blind. Several methods of deafblind communication have been developed, including:
- Hand-over-hand (also known as 'hands-on signing'): The receiver's hand(s) are placed lightly upon the back of the hands of the signer to read the signs through touch and movement. The sign language used in hand-over-hand signing is often a slightly modified version of the local sign language; this is especially the case when used by people who have learned to read sign visually before losing their vision as with Usher syndrome. The sign language used may also be a manually coded version of the local oral language (such as Signed English), or a mid-way point between the two known as contact signing.
- Tracking: The listener lightly places their hand(s) on the wrists or forearms of the signer to help them track the signs visually; because the listener knows the location of their own hands, they can keep the signer's moving hand(s) within their (typically limited) field of vision.
- Protactile: Sharing some qualities with hand-over-hand signing, protactile involves the use of signs on the hands, wrist, elbow, arm, upper back, and when in a seated position, knees and the top of the thigh. Invented by deafblind people, protactile communicates not just words but also information about emotions and the environment.
- Tactile fingerspelling: Words are spelled out using a manual alphabet (see manual alphabet). This may be the best-known method, as it was the one Anne Sullivan used to communicate with Helen Keller. Different manual alphabets may be used, such as the one-handed ASL alphabet or the two-handed alphabets used, for example, in Britain (the two-handed alphabet is rarely used in the United States). Again, the listener places a hand over that of the signer.
- Lorm: A hand-touch alphabet developed in the 19th century by deafblind inventor and novelist Hieronymus Lorm and used in several European countries.
- Tracing or 'print-on-palm': Tracing letters (or shapes) onto the palm or body of receiver. Capital letters produced in consistent ways are referred to as the 'block alphabet' or the 'spartan alphabet'.
- Braille signing: Using six spots on the palm to represent the six dots of a braille cell. Alternatively, the signer may 'type' onto a table as if using a braille typewriter (see Perkins Brailler) while the receiver places their hands on top. This method allows multiple receivers, with their hands stacked on top of one another; however, a receiver sitting opposite the signer will read each braille cell backwards.
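The six-dot cell underlying braille signing can be modeled in a few lines of code. The following is a minimal illustrative sketch, not any standard implementation: the letter patterns for a–e are standard six-dot braille (dots numbered 1–3 down the left column, 4–6 down the right, as on a Perkins Brailler), and the `mirror` function is a simplified model of the column swap a receiver sitting opposite the signer would perceive.

```python
# Six-dot braille cell: dots 1-3 run down the left column,
# dots 4-6 down the right. Each letter is a set of raised dots.
LETTERS = {
    "a": {1},
    "b": {1, 2},
    "c": {1, 4},
    "d": {1, 4, 5},
    "e": {1, 5},
}

def mirror(dots):
    """Swap the two columns (1<->4, 2<->5, 3<->6): a simplified model
    of how the cell appears to a receiver facing the signer."""
    swap = {1: 4, 2: 5, 3: 6, 4: 1, 5: 2, 6: 3}
    return {swap[d] for d in dots}

# 'b' (dots 1, 2) seen from the opposite side becomes dots 4, 5,
# which is no longer 'b' -- the cell reads "backwards".
print(sorted(mirror(LETTERS["b"])))  # [4, 5]

# A column-symmetric letter such as 'c' (dots 1, 4) is unchanged.
print(sorted(mirror(LETTERS["c"])))  # [1, 4]
```

Letters whose dot pattern is symmetric across the two columns (such as 'c') survive the swap unchanged, which is why only some cells are misread from the opposite side.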
Additionally, simple ways of responding, such as a tap for 'yes' or a rubbing motion for 'no', may be included. In Japan, a system developed by a deafblind woman is in use to represent the five vowels and five major consonants of the Japanese language on the fingers: the signer 'types' onto a table and the receiver places their hands on top to 'listen'.
Communicating with children or babies born deaf and blind, who had not had an opportunity to learn a natural (spoken or signed) language, was especially challenging. Some of the approaches developed for them are listed below.
- Co-active signing: The sender moves and manipulates the hands and arms of the deafblind person to form sign shapes or fingerspelt words. This is often used with deafblind children to teach them signs, and with people who have an intellectual disability.
- On-body signing: The body of the deafblind person (e.g. chin, palm, chest) is used together with another person to complete the sign formation. Often used with people who also have an intellectual disability.
As the decades progressed, deafblind people began to form communities in which tactile languages were born. Just as deaf people brought together in communities first used invented forms of spoken language and then created their own natural languages suited to the lives of sighted deaf people (i.e. visual languages), so too deafblind people in communities first used modified forms of visual language and are now creating their own natural tactile languages. For the development of visual sign languages, see for example: Deaf Education; List of sign languages; Nicaraguan Sign Language. One of the most active communities is in the Seattle area of Washington State. See Washington State DeafBlind Citizens.
Comparison to visual sign language
Little data exists on the specifics of variation between visual and tactile sign language use. However, studies suggest a significant degree of difference. In hand-over-hand signing, elements of deaf sign languages known as 'non-manual features' (such as facial expression) are not received and must be substituted with supplementary information produced manually. Common non-manual features used in visual sign languages that are absent in tactile signing include raised eyebrows as a question marker and headshaking as negation.
Tactile signing also resides within a smaller space than is typical in visual sign language. Signs that touch the body may be moved forward into a more neutral space. Other signs which are usually produced in an 'out of range' location (such as the leg) may be modified (either spelled or a variant sign used).
Different rules govern turn-taking, greetings and goodbyes.
An example of a language that naturally developed among the deaf-blind is Bay Islands Sign Language in Honduras.
In 1648 in England, John Bulwer wrote of a couple who were proficient in tactile sign communication:
"A pregnant example of the officious nature of the Touch in supplying the defect or temporall incapacity of the other senses we have in one Master Babington of Burntwood in the County of Essex, an ingenious gentleman, who through some sicknesse becoming deaf, doth notwithstanding feele words, and as if he had an eye in his finger, sees signes in the darke; whose Wife discourseth very perfectly with him by a strange way of Arthrologie or Alphabet contrived on the joynts of his Fingers; who taking him by the hand in the night, can so discourse with him very exactly; for he feeling the joynts which she toucheth for letters, by them collected into words, very readily conceives what shee would suggest unto him. By which examples [referring to this case and to that of an abbot who became deaf, dumb, and blind, who understood writing traced upon his naked arm] you may see how ready upon any invitation of Art, the Tact is, to supply the defect, and to officiate for any or all of the other senses, as being the most faithful sense to man, being both the Founder, and Vicar generall to all the rest."
- Frankel, M. A. (2002). "Deaf-Blind Interpreting: Interpreters' Use of Negation in Tactile American Sign Language." Sign Language Studies 2(2). Washington, D.C.: Gallaudet University Press.
- Mesch, J. (2000). "Tactile Swedish Sign Language: Turn Taking in Conversations of People Who Are Deaf and Blind." In Bilingualism and Identity in Deaf Communities, ed. M. Metzger, 187–203. Washington, D.C.: Gallaudet University Press.
- O'Brien, S., and Steffen, C. (1996). "Tactile ASL: ASL as Used by Deaf-Blind Persons." Gallaudet University Communication Forum, vol. 5. Washington, D.C.: Gallaudet University Press.
- Bulwer, J. (1648). Philocopus, or the Deaf and Dumbe Mans Friend. London: Humphrey and Moseley.