Research Seminar: João Guerreiro

Date: 9 December 2015

Time: 12:00

Location: Wolfson Lecture Theatre, Computing, Queen Mother Building

Host: Kyle Montague

Title: Text-to-Speeches: Enhancing Blind People’s Information Scanning with Concurrent Speech

Abstract: Blind people rely mostly on the auditory feedback of screen readers to consume digital information. Efficiency is a problem, especially when relevant information must be recognized amid large amounts of irrelevant content. Sighted people use scanning as a strategy to achieve this goal, glancing over all content to identify information of interest that can then be analyzed with further care. In contrast, screen readers rely on a sequential auditory channel that impairs a quick overview of the content compared with its visual presentation on screen. We propose taking advantage of the Cocktail Party Effect, which states that people are able to focus their attention on a single voice among several conversations while still identifying relevant content in the background. Therefore, as opposed to a single sequential speech channel, we hypothesize that blind users can leverage concurrent speech to quickly get the gist of digital information. Grounded in literature reviews that documented several features (e.g. spatial location, voice characteristics) that increase speech intelligibility, we investigated if and how we could take advantage of concurrent speech to accelerate blind people's scanning for digital information. In this talk I will describe the prototypes and user studies that identified concurrent speech as a strong alternative to current solutions, and I will discuss other prospective scenarios for concurrent speech interfaces.

Bio: João Guerreiro has recently submitted his PhD dissertation at Instituto Superior Técnico (IST), University of Lisbon (UL), and is a research assistant at the Visualization and Intelligent MultiModal Interfaces (VIMMI) group at INESC-ID. He holds BSc and MSc degrees in Computer Science from IST, UL. His main research interests include human-computer interaction, non-visual interaction and accessibility. In his thesis, he studied the perception of concurrent speech by blind people in order to improve their efficiency when scanning digital content. His experience in other research projects includes: touch-based braille input and braille reading through multi-point vibrotactile feedback; non-visual, two-handed multitouch exploration and text entry using simultaneous audio feedback; and personal information visualization.

Directions: http://www.computing.dundee.ac.uk/about/travel-information

