DCAL Research Data Archive

The DCAL Research Data Archive holds the data outputs of the Deafness, Cognition and Language Research Centre.

Browse the DCAL Research Data Archive

The vast majority of research studies on language and cognition are based on languages which are spoken and heard. The Deafness, Cognition and Language Research Centre's research provides a unique perspective on language and thought by placing sign languages and Deaf people at the centre of our understanding of language and communication.

DCAL's research since 2006 has contributed substantially to the recognition that deafness is an important model for exploring questions in linguistics, cognitive sciences and neuroscience.

All metadata for the projects listed below is openly available. Some data is restricted to named researchers; all data marked 'Level 1' is open access.

You can read the DCAL Data Archive and Management Policy on UCL Discovery.

For more information, and to discuss access to the restricted data, contact dcal@ucl.ac.uk.

To cite this collection, please use the following DOI: dx.doi.org/10.14324/000.ds.DCAL

DCAL-funded projects

Normative Data and Assessment Tools for British Sign Language

BSL Norming Study

Vinson, D. P., Cormier, K., Denmark, T., Schembri, A., & Vigliocco, G.

Normative data collected for 300 British Sign Language signs for age of acquisition (AoA, "learning signs"), familiarity ("seeing signs") and iconicity.

Research on signed languages offers the opportunity to address many important questions about language which may not be possible to address via studies of spoken languages alone. Many such studies, however, are inherently limited because hardly any norms exist for lexical variables that appear to play important roles in spoken language processing. Here we present a set of norms for age of acquisition, familiarity and iconicity for 300 British Sign Language (BSL) signs, as rated by deaf signers, in the hope that they will prove useful to other researchers of British Sign Language and other signed languages.

BSL Grammaticality Judgement Task

Cormier, K., Schembri, A., Vinson, D., & Orfanidou, E.

In this study, we examined age of acquisition effects in deaf British Sign Language (BSL) users via a grammaticality judgement task.

Age of acquisition (AoA) effects have been used to support the notion of a critical period for first language acquisition. In this study, we examined AoA effects in deaf British Sign Language (BSL) users via a grammaticality judgement task. When English reading performance and nonverbal IQ were factored out, results showed that accuracy of grammaticality judgement decreases as AoA increases, until around age 8, thus showing the unique effect of AoA on grammaticality judgement in early learners. No such effects were found in those who acquired BSL after age 8. These late learners appear to have first language proficiency in English instead, which may have been used to scaffold learning of BSL as a second language later in life. This test is not suitable as an assessment tool (either for research or clinical purposes) because of the lack of difference in performance between native and non-native signers.

BSL Sentence Reproduction Test

Cormier, K., Adam, R., Rowley, K., Woll, B., & Atkinson, J.

The BSL Sentence Reproduction Test is intended as a screening tool for global fluency in BSL.

The BSL Sentence Reproduction Test (BSLSRT), adapted from a test originally created for American Sign Language (Hauser et al., 2008), is intended as a screening tool that can distinguish signers with native-like vs. non-native-like skills. The stimulus items, based on a set of 49 sentences from Hauser et al. (2008), include 40 BSL sentences varying in length and complexity, presented on video by a deaf native BSL signer. Participants were instructed to copy the signed sentence to camera, exactly as they saw it, regardless of any phonological or lexical variants for the same concepts that they might prefer. Participants were 20 deaf adults: 10 deaf native signers, 5 deaf early learners first exposed to BSL between ages 2 and 6, and 5 late learners first exposed to BSL at age 11 or later. Responses were scored by a team of deaf and hearing sign language researchers, all fluent in BSL. Responses which all scorers agreed were identical to the stimulus were given a score of 1; responses which included any phonological, morphological, lexical or syntactic deviations were given a score of 0 (except for a few specific, agreed-upon acceptable deviations). Results indicate that native signers scored significantly higher than non-native signers. For non-native signers there were no significant differences between early and late learners. These results suggest that the BSLSRT can be used as a screening test for assessing fluency in deaf adults and for exploring age-of-acquisition effects more generally. Note that researchers must be trained in how to administer and particularly how to score the BSLSRT, and they must be competent signers themselves (at least to BSL Level 2). In adapting this test for another sign language, a team of at least 3 highly fluent (preferably native) signer researchers is needed.
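As an informal illustration only (not part of the BSLSRT materials, and with hypothetical function and field names), the binary per-sentence scoring rule described above can be sketched in a few lines of Python:

  def score_response(deviations, acceptable_deviations=frozenset()):
      """Per-sentence scoring as described above: 1 if the reproduction
      contains no deviations beyond the agreed-acceptable set, else 0."""
      return 1 if set(deviations) <= set(acceptable_deviations) else 0

  # Hypothetical example: the total score is the sum of per-sentence scores
  # (maximum 40 for the full test).
  responses = [
      [],                       # reproduced exactly -> 1
      ["handshape deviation"],  # non-accepted deviation -> 0
      ["agreed variant"],       # on the acceptable list -> 1
  ]
  total = sum(score_response(devs, acceptable_deviations={"agreed variant"})
              for devs in responses)
  print(total)  # 2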

Face-to-Face Communication

Test of Child Speechreading

Kyle, F.E., Campbell, R., & MacSweeney, M.

This project developed a test of child speechreading suitable for use with deaf and hearing children and explored the relationship between speechreading and cognitive skills.

This project developed a new Test of Child Speechreading (ToCS) that was specifically designed to be suitable for use with deaf children. ToCS is a child-friendly, computer-based speechreading test that measures speechreading (silent lip-reading) at three psycholinguistic levels: words, sentences, and short stories. Eighty-six severely and profoundly deaf children and 91 hearing children aged between 5 and 14 years participated in the standardisation study. Deaf and hearing children showed remarkably similar performance across all subtests on ToCS. Speechreading improved with age but was not associated with non-verbal IQ. For both deaf and hearing children, performance on ToCS was significantly related to reading accuracy and reading comprehension.

Language Development

Nonsense Sign Repetition Task

Mann, W., Marshall, C., & Morgan, G.

This project considered the effect of phonetics on phonological development in a signed language. Deaf children aged 3-11 acquiring British Sign Language (BSL) and hearing nonsigners aged 6-11 repeated nonsense signs.

The purpose of the project was to investigate the impact of phonetic complexity on children's ability to carry out a phonological task, that is, the repetition of nonsense signs. We tested two groups: Deaf children who are acquiring British Sign Language (BSL) as a first language (ages 3-11) and hearing children with no experience of signing (ages 6-11). This design allowed us to investigate two things: (1) by systematically manipulating phonetic (i.e., visual and motoric) complexity in two phonological parameters, handshape and movement, we could examine how phonetic complexity affects children's accuracy in perceiving and articulating nonsense signs, and how this changes during development; and (2) by comparing children who regularly use sign language (Deaf children) with children who have no experience of sign language (hearing children), we could determine whether the effects of sign language phonetics are universal, and to what extent sign language processing is affected by language experience and language-specific phonological knowledge.

Theory of Mind

Morgan, G.

How does access to language affect theory of mind in deaf children? Hearing children demonstrate spontaneous understanding of false belief on an eye-tracking task.

Based on anticipatory-looking and reactions to violations of expected events, infants have been credited with 'theory of mind' (ToM) knowledge that a person's search behaviour for an object will be guided by true or false beliefs about the object's location. However, little is known about the preconditions for looking patterns consistent with belief attribution in infants. In this study we compared the performance of 17- to 26-month-olds on anticipatory-looking in ToM tasks. The infants were either hearing or were deaf from hearing families and thus delayed in communicative experience gained from access to language and conversational input. Hearing infants significantly outperformed their deaf counterparts in anticipating the search actions of a cartoon character that held a false belief about a target-object location. By contrast, the performance of the two groups in a true belief condition did not differ significantly. These findings suggest for the first time that access to language and conversational input contributes to early ToM reasoning.

Identifying Specific Language Impairment in Deaf Children who use BSL

Mason, K., Rowley, K., Marshall, C. R., Atkinson, J. R., Herman, R., Woll, B., & Morgan, G.

This was the first project in the UK to investigate the prevalence of Specific Language Impairment in deaf children who communicate using British Sign Language.

We describe in detail the performance of 13 signing deaf children aged 5-14 years on normed tests of British Sign Language (BSL) sentence comprehension, repetition of nonsense signs, expressive grammar and narrative skills, alongside tests of non-verbal intelligence and fine motor control. Results show these children to have a significant language delay compared to their peers matched for age and language experience. This impaired development cannot be explained by poor exposure to BSL, or by lower general cognitive, social or motor abilities. As is the case for SLI in spoken languages, we find heterogeneity in the group in terms of which aspects of language are affected and the severity of the impairment. We discuss the implications of the existence of language impairments in a sign language for theories of SLI and clinical practice.

Language Processing

Iconicity and Phonological Judgements

Vinson, D., Thompson, R. L., Skinner, R., & Vigliocco, G.

Experiments on iconicity and British Sign Language processing.

A standard view of language processing holds that lexical forms are arbitrary, and that non-arbitrary relationships between meaning and form such as onomatopoeias are unusual cases with little relevance to language processing in general. Here we capitalize on the greater availability of iconic lexical forms in a signed language (British Sign Language, BSL), to test how iconic relationships between meaning and form affect lexical processing. In three experiments, we found that iconicity in BSL facilitated picture-sign matching, phonological decision, and picture naming. In comprehension the effect of iconicity did not interact with other factors, but in production it was observed only for later-learned signs. These findings suggest that iconicity serves to activate conceptual features related to perception and action during lexical processing. We suggest that the same should be true for iconicity in spoken languages (e.g., onomatopoeias), and discuss the implications this has for general theories of lexical processing.

Hands and Mouth in Sign Production

Vinson, D., Thompson, R. L., Fox, N., & Vigliocco, G.

Experiments on errors produced by hands and mouth in British Sign Language.

In contrast to the single-articulatory system of spoken languages, sign languages employ multiple articulators, including the hands and the mouth. We asked whether manual components and mouthing patterns of lexical signs share a semantic representation, and whether their relationship is affected by the differing language experience of deaf and hearing native signers. We used picture-naming tasks and word-translation tasks to assess whether the same semantic effects occur in manual production and mouthing production. Semantic errors on the hands were more common in the English-translation task than in the picture-naming task, but errors in mouthing patterns showed a different trend. We conclude that mouthing is represented and accessed through a largely separable channel, rather than being bundled with manual components in the sign lexicon. Results were comparable for deaf and hearing signers; differences in language experience did not play a role. These results provide novel insight into coordinating different modalities in language production.

The Deaf Individual and the Community

Bilingualism in Two Sign Languages: Australian Irish Sign Language

Adam, R. & Woll, B.

A sociolinguistic study of Australian Irish Sign Language - a language in attrition.

Little is known about unimodal sign bilingualism: whether it resembles unimodal (spoken) bilingualism, or bimodal (spoken and signed) bilingualism, or whether it has unique qualities. This study is the first to examine this topic, through a study of bilingualism in two Deaf communities in which dialects of two unrelated languages, British Sign Language (BSL) and Irish Sign Language (ISL), are used. The research looks at previously unexplored aspects of code-blending and code-mixing, and compares the data with data on bimodal bilingualism (in a signed and a spoken language) and unimodal bilingualism (in two spoken languages), using a combination of experimental and naturalistic data. The study was based on interviews with bilinguals. As well as phenomena already described for unimodal spoken language bilingualism, including code-switching and code-mixing, the study reports on mouthing, where spoken mouth patterns (in this case English) are produced simultaneously with manual signs. These are usually considered examples of code-blending, reflecting active mixing of two languages. This study provides an initial understanding of how modality interacts with bilingualism and suggests the need for further exploration.

Sign Language and Interpreter Aptitude Test Battery

Stone, C. & Vinson, D.

The aptitude study aims to identify screening tools to assess aptitude for sign language and interpreter education.

Undergraduates (initially sign language naïve or of low fluency) were recruited for the longitudinal study (n = 29, although some participants did not continue for the full study) from university programmes that included Deaf studies and interpreting; students were selected for interpreting depending on their exam results at the end of semester four. A battery of tasks was administered to the undergraduates, which can broadly be split into six areas (summarised in the sketch after this list):

  1. General language skills - Modern Language Aptitude Test (MLAT), administered in semester one (five sub-tests).
  2. General intelligence - digit span and matrix reasoning (WAIS-IV), administered in semesters one and six.
  3. L1 language skills - English reading age (Vernon-Warden), administered in semesters one and six.
  4. L2 language skills - BSL grammaticality judgement task (BSLGJT - DCAL test), administered in semesters one, three, five and six, and an adult non-sign repetition task (NSRT - DCAL test) using the stimuli from the prosodic word study (Morgan et al.), administered in semesters one and six.
  5. Cognitive tasks - Connections A (psychomotor) and B (psychomotor and cognitive control), Patterns (perceptual processing) (provided by Macnamara), and a classic flanker task (provided by Timarova), administered in semesters three, five and six.
  6. Psychological task - the Barratt Impulsiveness Scale (BIS), administered in semesters three, five and six.
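As an illustrative summary only (not a DCAL resource; the task labels and semester numbers simply restate the list above), the administration schedule can be represented as a simple mapping from task to the semesters in which it was given:

  # Hypothetical restatement of the administration schedule described above.
  battery_schedule = {
      "MLAT (general language aptitude, five sub-tests)": [1],
      "Digit span and matrix reasoning (WAIS-IV)": [1, 6],
      "English reading age (Vernon-Warden)": [1, 6],
      "BSL grammaticality judgement task (BSLGJT)": [1, 3, 5, 6],
      "Adult non-sign repetition task (NSRT)": [1, 6],
      "Connections A and B, Patterns, flanker task": [3, 5, 6],
      "Barratt Impulsiveness Scale (BIS)": [3, 5, 6],
  }

  # For example, the tasks administered in semester six:
  semester_six = [task for task, semesters in battery_schedule.items()
                  if 6 in semesters]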

Sign Language Documentation and Change

Changing Languages and Identities

This study looks at the languages (English and BSL) and identities of deaf young people in different educational settings.

This study is motivated by the many changes in the deaf community over the last twenty years related to early diagnosis, use of cochlear implants, type of schooling and access to Higher Education, as well as the impact these changes may have on language and the identity of the deaf community. In this context, we look at three groups of 16-19 year olds: a group attending mainstream programmes and using communication support workers (CSWs)/interpreters to access the curriculum, a group attending a deaf oral school, and another group attending a deaf signing school. One of the main aims is the study of issues related to language and identity. In addition to the deaf students, CSWs and interpreters who work in education are included. All participants take part in interviews and complete a number of linguistic tasks.

Online Measures of Communication

Measuring Language Lateralization with fTCD

Gutierrez-Sigut, E., Payne, H., & MacSweeney, M.

This project used fTCD to assess hemispheric dominance during covert and overt linguistic fluency tasks.

Although there is consensus that the left hemisphere plays a critical role in language processing, some questions remain. Here we examine the influence of overt versus covert speech production on lateralization, the relationship between lateralization and behavioural measures of language performance and the strength of lateralization across the subcomponents of language. The present study used functional transcranial Doppler sonography (fTCD) to investigate lateralization of phonological and semantic fluency during both overt and covert word generation in right-handed adults. The laterality index (LI) was left lateralized in all conditions, and there was no difference in the strength of LI between overt and covert speech. This supports the validity of using overt speech in fTCD studies, another benefit of which is a reliable measure of speech production.
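For background only (the exact computation is not specified in these summaries), the laterality index in event-related fTCD analyses is commonly derived from the epoch-averaged blood flow velocities in the left and right middle cerebral arteries, $V_L(t)$ and $V_R(t)$, for example as

$$\Delta V(t) = \frac{V_L(t) - V_R(t)}{\tfrac{1}{2}\,[V_L(t) + V_R(t)]} \times 100, \qquad \mathrm{LI} = \frac{1}{|W|} \int_W \Delta V(t)\, dt,$$

where $W$ is a short time window around the maximum of $|\Delta V(t)|$ within the task period; a positive LI indicates left lateralisation.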

Assessing Hemispheric Dominance During Rhyme and Line Judgements Using fTCD

Payne, H., Gutierrez-Sigut, E., Subik, J., & MacSweeney, M.

This project used fTCD to assess hemispheric dominance during linguistic and non-linguistic tasks.

Functional transcranial Doppler sonography (fTCD) measures changes in blood flow speed in the left and right middle cerebral arteries. Differential increases between the left and right arteries have been observed in a number of language tasks, showing greater increases in the left middle cerebral artery (e.g. Knecht et al., 1996). Studies to date that have used fTCD to examine lateralisation of language function have predominantly used overt or covert word or sentence generation tasks. Here we sought to further assess the sensitivity of fTCD to language lateralisation during a metalinguistic task: rhyme judgement in response to written words. In addition, this externally paced paradigm allowed us to manipulate the number of stimuli presented to participants and thus assess the influence of cognitive load on the strength of laterality indices. In Experiment 1, 28 right-handed participants performed rhyme and visuo-spatial (line) judgement tasks and showed reliable left and right lateralisation at the group level for each task respectively. In Experiment 2 we directly manipulated the influence of pace (and therefore cognitive load) on the strength of laterality indices for both judgement types. Eighteen of the participants performed both judgement tasks at a 'slow' presentation rate (stimulus duration 3.5 s; 0.28 stimuli per second) and a fast presentation rate (stimulus duration 2.1 s; 0.48 stimuli per second). A significant main effect of pace demonstrated that the strength of the laterality index was increased in rhyme and line judgements during the fast presentation rate. The current results suggest that a different manipulation of task difficulty can influence the strength of lateralisation, contrary to previous studies. We suggest that in earlier studies that used paradigms other than unconstrained word or sentence generation, failure to demonstrate robust hemispheric language lateralisation may indeed be due to insufficient cognitive load during the active period.

Foundations of Communication

Baby Iconicity Project

Morgan, G., England, R., Perniss, P., Thompson, R., & Vigliocco, G.

We developed a test of young children's appreciation of iconicity in BSL. These data revealed differences across deaf and hearing children exposed to BSL and English at different ages.

This study investigates the role of representational gestures in children's object naming. Stefanini, Bello, Caselli, Iverson, & Volterra (2009) previously reported that Italian children aged 24-36 months, when labelling pictures of objects, use a high proportion of representational gestures to accompany their spoken responses. The two studies reported here use the same task as Stefanini et al. (2009) to explore the function of such gestures in (1) typically developing hearing children aged 24-46 months acquiring English and (2) deaf children of deaf and hearing parents aged 24-63 months acquiring British Sign Language (BSL) and spoken English. Hearing children in Study 1 scored within the range of correct spoken responses previously reported for other language groups but produced very few representational gestures. When they did gesture, however, they tended to express the same action meanings as reported in previous research. The action bias in gestures was also observed in the responses of deaf children of hearing parents in Study 2, who labelled pictures with signs, spoken words and gestures. The deaf group with deaf parents used sign language almost exclusively, with few additional gestures. Findings for the different groups are discussed, and the function of representational gestures in spoken and signed vocabulary development is considered in relation to differences between native and non-native sign language acquisition.

The BSL "toy task data" were used to preliminarily investigate whether iconicity - resemblance between form and meaning - is used in the language input as a potential cue to meaning, and might thus play a role in word learning. An extensive body of research has been dedicated to understanding how children learn form-meaning mappings so prodigiously, and a variety of mechanisms - some focusing on the infant's innate learning abilities; others on aspects of the communicative context - have been shown to be involved. In the present project, the idea was that iconicity might be an additional mechanism supporting the process of word learning. We investigated the ways in which caregivers modify their child-directed language in ways that make the link between form and meaning more salient, and thus potentially easier to learn.

Atypical Language

Coding Language Isolates and Late L1 Signers

Adam, R. & Woll, B.

Language and Cognition

Iconicity and Language Processing

Vinson, D., Thompson, R. L., Skinner, R., & Vigliocco, G.

Experiments on iconicity and British Sign Language processing.

A standard view of language processing holds that lexical forms are arbitrary, and that non-arbitrary relationships between meaning and form such as onomatopoeias are unusual cases with little relevance to language processing in general. Here we capitalize on the greater availability of iconic lexical forms in a signed language (British Sign Language, BSL), to test how iconic relationships between meaning and form affect lexical processing. In three experiments, we found that iconicity in BSL facilitated picture-sign matching, phonological decision, and picture naming. In comprehension the effect of iconicity did not interact with other factors, but in production it was observed only for later-learned signs. These findings suggest that iconicity serves to activate conceptual features related to perception and action during lexical processing. We suggest that the same should be true for iconicity in spoken languages (e.g., onomatopoeias), and discuss the implications this has for general theories of lexical processing.

Cognitive Control: Executive Functions

Executive Function in Older Deaf Adults

Atkinson, J., Denmark, T., & Woll, B. 

Norming of executive function tasks with older deaf adults aged 50-89 years.

Data sets were collected at a holiday camp for older deaf adults aged 50-89 in October 2012, including demographic interviews, control tasks for nonverbal intellectual ability (WASI matrices), and measures of wellbeing, language (BSL production, sign repetition and shadowing) and executive cognitive control (Letter fluency, Design fluency, Colour Trails, symbol search, Nelson card sorting, Tower test, and the Sun and Apple Simon task). Some participants also completed Digit span and WASI Matrix Reasoning.

Executive Function and Language Abilities in Deaf Children

Jones, A., Marshall, C., Botting, N., Denmark, T., & Morgan, G.

Longitudinal study testing executive function and language in deaf and hearing children aged 6-13 years.

Data were collected between June 2012 and October 2015 at schools and homes across the UK (N = 258). Seventy-five deaf and 85 hearing children were tested a second time 18-24 months after the first time of testing. Children were tested on two language tasks: the BSL narrative production test (responding in spoken English or BSL) and the EOWPVT (spoken English/BSL productive vocabulary); two control tasks for non-verbal intellectual ability (WASI matrices and WISC Symbol Search); and tasks measuring executive cognitive control (Sun and Apple Simon task (inhibition), Tower of London (planning), Design fluency, Colour Trails, Semantic fluency, Odd one out, and Spatial span (working memory)). Some participants also completed Digit span at the second time of testing, and most oral deaf and hearing children also completed the Bus Story narrative task at Time 2. At Time 1, parents completed a demographic questionnaire, the Language Proficiency Profile (LPP) and the Behavior Rating Inventory of Executive Function (BRIEF). At Time 2, the BRIEF was completed a second time.