Interactions between egocentric and allocentric spatial coding of sounds revealed by a multisensory learning paradigm.
Sci Rep 2019; 9(1):7892

Abstract

Although sound position is initially head-centred (egocentric coordinates), our brain can also represent sounds relative to one another (allocentric coordinates). Whether reference frames for spatial hearing are independent or interact has remained largely unexplored. Here we developed a new allocentric spatial-hearing training procedure and tested whether it can improve egocentric sound-localisation performance in normal-hearing adults listening with one ear plugged. Two groups of participants (N = 15 each) performed an egocentric sound-localisation task (pointing to a syllable), in monaural listening, before and after 4 days of multisensory training on triplets of white-noise bursts paired with occasional visual feedback. Critically, one group performed an allocentric task (auditory bisection task), whereas the other processed the same stimuli to perform an egocentric task (pointing to a designated sound of the triplet). Unlike most previous work, we also tested a no-training group (N = 15). Egocentric sound-localisation abilities in the horizontal plane improved for all groups in the space ipsilateral to the ear plug. This unexpected finding highlights the importance of including a no-training group when studying sound-localisation re-learning. Yet performance changes were qualitatively different in trained compared to untrained participants, providing initial evidence that allocentric and multisensory procedures may prove useful when aiming to promote sound-localisation re-learning.

Authors and Affiliations

Center for Mind/Brain Sciences (CIMeC), University of Trento, Trento, Italy. giuseppe.rabini@unitn.it
Department of Psychology and Cognitive Sciences (DiPSCo), University of Trento, Trento, Italy.
Center for Mind/Brain Sciences (CIMeC), University of Trento, Trento, Italy.
Department of Psychology and Cognitive Sciences (DiPSCo), University of Trento, Trento, Italy.
IMPACT, Centre de Recherche en Neuroscience Lyon (CRNL), Bron, France.

Pub Type(s)

Journal Article

Language

eng

PubMed ID

31133688

Citation

Rabini, Giuseppe, et al. "Interactions Between Egocentric and Allocentric Spatial Coding of Sounds Revealed by a Multisensory Learning Paradigm." Scientific Reports, vol. 9, no. 1, 2019, p. 7892.
Rabini G, Altobelli E, Pavani F. Interactions between egocentric and allocentric spatial coding of sounds revealed by a multisensory learning paradigm. Sci Rep. 2019;9(1):7892.
Rabini, G., Altobelli, E., & Pavani, F. (2019). Interactions between egocentric and allocentric spatial coding of sounds revealed by a multisensory learning paradigm. Scientific Reports, 9(1), 7892. https://doi.org/10.1038/s41598-019-44267-3
Rabini G, Altobelli E, Pavani F. Interactions between egocentric and allocentric spatial coding of sounds revealed by a multisensory learning paradigm. Sci Rep. 2019 May 27;9(1):7892. PubMed PMID: 31133688.
TY  - JOUR
T1  - Interactions between egocentric and allocentric spatial coding of sounds revealed by a multisensory learning paradigm.
AU  - Rabini, Giuseppe
AU  - Altobelli, Elena
AU  - Pavani, Francesco
Y1  - 2019/05/27/
PY  - 2018/11/23/received
PY  - 2019/05/08/accepted
PY  - 2019/5/29/entrez
PY  - 2019/5/28/pubmed
PY  - 2019/5/28/medline
SP  - 7892
EP  - 7892
JF  - Scientific reports
JO  - Sci Rep
VL  - 9
IS  - 1
N2  - Although sound position is initially head-centred (egocentric coordinates), our brain can also represent sounds relative to one another (allocentric coordinates). Whether reference frames for spatial hearing are independent or interact has remained largely unexplored. Here we developed a new allocentric spatial-hearing training procedure and tested whether it can improve egocentric sound-localisation performance in normal-hearing adults listening with one ear plugged. Two groups of participants (N = 15 each) performed an egocentric sound-localisation task (pointing to a syllable), in monaural listening, before and after 4 days of multisensory training on triplets of white-noise bursts paired with occasional visual feedback. Critically, one group performed an allocentric task (auditory bisection task), whereas the other processed the same stimuli to perform an egocentric task (pointing to a designated sound of the triplet). Unlike most previous work, we also tested a no-training group (N = 15). Egocentric sound-localisation abilities in the horizontal plane improved for all groups in the space ipsilateral to the ear plug. This unexpected finding highlights the importance of including a no-training group when studying sound-localisation re-learning. Yet performance changes were qualitatively different in trained compared to untrained participants, providing initial evidence that allocentric and multisensory procedures may prove useful when aiming to promote sound-localisation re-learning.
SN  - 2045-2322
UR  - https://www.unboundmedicine.com/medline/citation/31133688/Interactions_between_egocentric_and_allocentric_spatial_coding_of_sounds_revealed_by_a_multisensory_learning_paradigm_
L2  - http://dx.doi.org/10.1038/s41598-019-44267-3
DB  - PRIME
DP  - Unbound Medicine
ER  -