EdgeSonic: Image feature sonification for the visually impaired
Research output: Contribution to journal › Conference article › Research › peer-review
EdgeSonic: Image feature sonification for the visually impaired. / Yoshida, Tsubasa; Kitani, Kris M.; Koike, Hideki; Belongie, Serge; Schlei, Kevin.
In: ACM International Conference Proceeding Series, 2011.
RIS
TY - GEN
T1 - EdgeSonic
T2 - 2nd Augmented Human International Conference, AH'11
AU - Yoshida, Tsubasa
AU - Kitani, Kris M.
AU - Koike, Hideki
AU - Belongie, Serge
AU - Schlei, Kevin
PY - 2011
Y1 - 2011
AB - We propose a framework to aid a visually impaired user in recognizing objects in an image by sonifying image edge features and distance-to-edge maps. Visually impaired people usually touch objects to recognize their shape. However, it is difficult to recognize objects printed on flat surfaces, or objects that can only be viewed from a distance, solely with our haptic senses. Our ultimate goal is to aid a visually impaired user in recognizing basic object shapes by transposing them to aural information. Our proposed method provides two types of image sonification: (1) local edge gradient sonification and (2) sonification of the distance to the closest image edge. Our method was implemented on a touch-panel mobile device, which allows the user to aurally explore image content by sliding their finger across the image on the touch screen. Preliminary experiments show that the combination of local edge gradient sonification and distance-to-edge sonification is effective for understanding basic line drawings. Furthermore, our tests show a significant improvement in image understanding with the introduction of proper user training.
KW - Edge detection
KW - Image sonification
KW - Sensory substitution
KW - Visually impaired
UR - http://www.scopus.com/inward/record.url?scp=79953118564&partnerID=8YFLogxK
U2 - 10.1145/1959826.1959837
DO - 10.1145/1959826.1959837
M3 - Conference article
AN - SCOPUS:79953118564
JO - ACM International Conference Proceeding Series
JF - ACM International Conference Proceeding Series
Y2 - 13 March 2011 through 13 March 2011
ER -
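The abstract describes two sonification modes, one of which maps the distance from the touched pixel to the nearest image edge. The paper's actual implementation is not reproduced here; as an illustration only, the following minimal NumPy sketch shows one plausible reading of that idea (the function names, the gradient-threshold edge detector, and the linear distance-to-pitch mapping are assumptions, not the authors' design):

```python
import numpy as np

def edge_map(img, thresh=0.25):
    """Binary edge map from gradient magnitude (simple finite differences)."""
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy) > thresh

def distance_to_edge(edges):
    """Euclidean distance from every pixel to its nearest edge pixel
    (brute force; fine for small illustrative images)."""
    ys, xs = np.nonzero(edges)
    pts = np.stack([ys, xs], axis=1)                  # (n_edges, 2)
    h, w = edges.shape
    grid = np.indices((h, w)).reshape(2, -1).T        # (h*w, 2)
    d = np.linalg.norm(grid[:, None, :] - pts[None, :, :], axis=2).min(axis=1)
    return d.reshape(h, w)

def finger_to_pitch(dist, f_near=880.0, f_far=220.0, d_max=10.0):
    """Map distance at the touched pixel to a tone frequency:
    on an edge -> high pitch, far from any edge -> low pitch."""
    t = np.clip(dist / d_max, 0.0, 1.0)
    return f_near + t * (f_far - f_near)
```

For a line drawing containing a single vertical stroke, touching the stroke yields `dist == 0` and the high `f_near` tone, while moving the finger away lowers the pitch toward `f_far`, giving the kind of aural "closeness" cue the abstract describes.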