Phonetics and Speech Processing
Publications

This is a searchable list of publications of scientists working at or associated with the Institute of Phonetics and Speech Processing. You can choose to sort the list by year or by publication type.

The complete list in BibTeX format can be downloaded here:
Download list of publications (BibTeX)

The “Research Reports of the Institute of Phonetics and Speech Communication” (FIPKM, “Forschungsberichte des Instituts für Phonetik und Sprachliche Kommunikation”) ran for 39 volumes until the series was discontinued in 2002. Some of the volumes published between 1996 and 2002 are available online; the others are available in print on request.
More information


Search


Regular expression, case-insensitive, matched against all BibTeX fields (author, title, etc.)
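The field matching described above can be sketched as a filter over parsed BibTeX records. The following is a hypothetical Python stand-in (the entry dicts and the `search_entries` helper are assumptions for illustration); the live page is generated by bibtexbrowser, whose actual implementation may differ:

```python
import re

def search_entries(entries, pattern):
    """Filter BibTeX records by a case-insensitive regular expression.

    `entries` is a list of dicts mapping field names (author, title, ...)
    to strings -- a hypothetical stand-in for parsed BibTeX records; the
    live page is served by bibtexbrowser (PHP), not by this code.
    """
    rx = re.compile(pattern, re.IGNORECASE)  # case-insensitive, as the form states
    # keep an entry if the pattern matches any of its field values
    return [e for e in entries if any(rx.search(v) for v in e.values())]
```

For example, searching for "ziegler" would return every entry whose author, title, or any other field contains that string in any capitalisation.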


One or more years or ranges of years, e.g.
1993
1995-1998
08-
-99,02-06,14-
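The year syntax shown in the examples above can be sketched as an expansion into an explicit set of years. This is a minimal Python sketch, not the site's real parser (bibtexbrowser is written in PHP), and it makes two labeled assumptions: two-digit years abbreviate 19xx or 20xx, split at a pivot, and open-ended ranges are clamped to configurable bounds:

```python
import re

def expand_years(spec, lo=1900, hi=2025, pivot=50):
    """Expand a year spec such as "-99,02-06,14-" into a sorted year list.

    Hypothetical re-implementation of the search form's syntax; the page
    itself is served by bibtexbrowser, so details may differ. Assumptions:
    two-digit years abbreviate 19xx (if >= pivot) or 20xx, and open-ended
    ranges are clamped to the lo/hi bounds.
    """
    def norm(y):
        y = int(y)
        if y >= 100:
            return y
        return 1900 + y if y >= pivot else 2000 + y

    years = set()
    for part in spec.split(","):
        m = re.fullmatch(r"(\d*)-(\d*)|(\d+)", part.strip())
        if not m:
            raise ValueError(f"unrecognised year spec: {part!r}")
        if m.group(3):                  # single year, e.g. "1993"
            years.add(norm(m.group(3)))
        else:                           # range; an empty side is open-ended
            start = norm(m.group(1)) if m.group(1) else lo
            end = norm(m.group(2)) if m.group(2) else hi
            years.update(range(start, end + 1))
    return sorted(years)
```

Under these assumptions, `expand_years("-99,02-06,14-", lo=1997, hi=2015)` covers 1997-1999, 2002-2006 and 2014-2015.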





Reference

Schmid, G., Ziegler, W. (2006). Audio-Visual Matching of Speech and Non-Speech Oral Gestures in Patients with Aphasia and Apraxia of Speech. Neuropsychologia, 44(4), 546-555.

BibTeX

@article{ekn_bibtex_00147,
  title = {Audio-Visual Matching of Speech and Non-Speech Oral Gestures in Patients with Aphasia and Apraxia of Speech},
  shorttitle = {Audio-Visual Matching of Speech and Non-Speech Oral Gestures in Patients with Aphasia and Apraxia of Speech},
  author = {Schmid, G. and Ziegler, W.},
  year = {2006},
  journal = {Neuropsychologia},
  volume = {44},
  number = {4},
  eprint = {16129459},
  eprinttype = {pubmed},
  pages = {546--555},
  abstract = {BACKGROUND: Audio-visual speech perception mechanisms provide evidence for a supra-modal nature of phonological representations, and a link of these mechanisms to motor representations of speech has been postulated. This leads to the question if aphasic patients and patients with apraxia of speech are able to exploit the visual signal in speech perception and if implicit knowledge of audio-visual relationships is preserved in these patients. Moreover, it is unknown if the audio-visual processing of mouth movements has a specific organisation in the speech as compared to the non-speech domain. METHODS: A discrimination task with speech and non-speech stimuli was applied in four presentation modes: auditory, visual, bimodal and cross-modal. We investigated 14 healthy persons and 14 patients with aphasia and/or apraxia of speech. RESULTS: Patients made substantially more errors than normal subjects on both the speech and the non-speech stimuli, in all presentation modalities. Normal controls made only few errors on the speech stimuli, regardless of the presentation mode, but had a high between-subject variability in the cross-modal matching of non-speech stimuli. The patients' cross-modal processing of non-speech stimuli was mainly predicted by lower face apraxia scores, while their audio-visual matching of syllables was predicted by word repetition abilities and the presence of apraxia of speech. CONCLUSIONS: (1) Impaired speech perception in aphasia is located at a supra-modal representational level. (2) Audio-visual processing is different for speech and non-speech oral gestures. (3) Audio-visual matching abilities in patients with left-hemisphere lesions depend on their speech and non-speech motor abilities}
}

Powered by bibtexbrowser