Understanding personalised auditory-visual associations in multi-modal interactions

dc.contributor.author: O'Toole, Patrick
dc.contributor.funder: Science Foundation Ireland
dc.date.accessioned: 2021-11-26T10:12:02Z
dc.date.available: 2021-11-26T10:12:02Z
dc.date.issued: 2021-10-18
dc.description.abstract: Can we sharpen our auditory and visual senses, and better understand the relationship between these modalities, to benefit our interactions with human-computer interfaces? This paper proposes a framework for understanding auditory-visual associations and explores the impact of emotion, personality, age and gender on how information from both modalities is understood. Studies of emotion and personality, and of their association with the auditory and visual senses, have increased within psychology, neuroscience, affective computing and human-computer interaction (HCI). From an HCI perspective, advances in technology and machine learning provide new ways to understand people and to develop systems in which computers work alongside people, supporting more efficient interactions and clearer perceptions of our environment. The proposed framework will be developed alongside a personalised auditory-visual interface that provides intelligent interactions with users, helping them learn from efficient associations between their senses. This research can be used to create personalised auditory-visual-emotion-personality profiles for use in adaptive music-teaching platforms, as well as in mental health and wellness applications for more personalised care programs.
dc.description.sponsorship: Science Foundation Ireland (18/CRT/6222)
dc.description.status: Peer reviewed
dc.description.version: Published Version
dc.format.mimetype: application/pdf
dc.identifier.citation: O'Toole, P. (2021) 'Understanding personalised auditory-visual associations in multi-modal interactions', ICMI '21: Proceedings of the 2021 International Conference on Multimodal Interaction, Montréal QC, Canada, 18-22 October, pp. 812-816. doi: 10.1145/3462244.3481277
dc.identifier.doi: 10.1145/3462244.3481277
dc.identifier.endpage: 816
dc.identifier.isbn: 978-1-4503-8481-0
dc.identifier.startpage: 812
dc.identifier.uri: https://hdl.handle.net/10468/12272
dc.language.iso: en
dc.publisher: Association for Computing Machinery (ACM)
dc.relation.ispartof: ICMI '21: International Conference on Multimodal Interaction, Montréal QC, Canada, 18-22 October 2021
dc.rights: © 2021, the Author. This work is licensed under a Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/)
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: Auditory-visual associations
dc.subject: Multi-modal interactions
dc.subject: Music
dc.subject: Synaesthesia
dc.subject: Machine learning
dc.title: Understanding personalised auditory-visual associations in multi-modal interactions
dc.type: Conference item
Files
Original bundle
Name: 3462244.3481277.pdf
Size: 702.48 KB
Format: Adobe Portable Document Format
Description: Published Version
License bundle
Name: license.txt
Size: 2.71 KB
Description: Item-specific license agreed upon at submission