EdgeSonic: Image feature sonification for the visually impaired

Publication: Contribution to journal › Conference article › Research › peer-reviewed

Standard

EdgeSonic: Image feature sonification for the visually impaired. / Yoshida, Tsubasa; Kitani, Kris M.; Koike, Hideki; Belongie, Serge; Schlei, Kevin.

In: ACM International Conference Proceeding Series, 2011.


Harvard

Yoshida, T, Kitani, KM, Koike, H, Belongie, S & Schlei, K 2011, 'EdgeSonic: Image feature sonification for the visually impaired', ACM International Conference Proceeding Series. https://doi.org/10.1145/1959826.1959837

APA

Yoshida, T., Kitani, K. M., Koike, H., Belongie, S., & Schlei, K. (2011). EdgeSonic: Image feature sonification for the visually impaired. ACM International Conference Proceeding Series. https://doi.org/10.1145/1959826.1959837

Vancouver

Yoshida T, Kitani KM, Koike H, Belongie S, Schlei K. EdgeSonic: Image feature sonification for the visually impaired. ACM International Conference Proceeding Series. 2011. https://doi.org/10.1145/1959826.1959837

Author

Yoshida, Tsubasa ; Kitani, Kris M. ; Koike, Hideki ; Belongie, Serge ; Schlei, Kevin. / EdgeSonic: Image feature sonification for the visually impaired. In: ACM International Conference Proceeding Series. 2011.

Bibtex

@inproceedings{c9ded67f1765495cbcfdeecf77e3ee2d,
title = "EdgeSonic: Image feature sonification for the visually impaired",
abstract = "We propose a framework to aid a visually impaired user to recognize objects in an image by sonifying image edge features and distance-to-edge maps. Visually impaired people usually touch objects to recognize their shape. However, it is difficult to recognize objects printed on flat surfaces or objects that can only be viewed from a distance, solely with our haptic senses. Our ultimate goal is to aid a visually impaired user to recognize basic object shapes, by transposing them to aural information. Our proposed method provides two types of image sonification: (1) local edge gradient sonification and (2) sonification of the distance to the closest image edge. Our method was implemented on a touch-panel mobile device, which allows the user to aurally explore image context by sliding his finger across the image on the touch screen. Preliminary experiments show that the combination of local edge gradient sonification and distance-to-edge sonification are effective for understanding basic line drawings. Furthermore, our tests show a significant improvement in image understanding with the introduction of proper user training.",
keywords = "Edge detection, Image sonification, Sensory substitution, Visually impaired",
author = "Tsubasa Yoshida and Kitani, {Kris M.} and Hideki Koike and Serge Belongie and Kevin Schlei",
year = "2011",
doi = "10.1145/1959826.1959837",
language = "English",
journal = "ACM International Conference Proceeding Series",
note = "2nd Augmented Human International Conference, AH'11 ; Conference date: 13-03-2011 Through 13-03-2011",
}
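
The abstract's second sonification mode (distance to the closest image edge, mapped to pitch under the user's finger) can be sketched in a few lines. The NumPy code below is an illustrative reconstruction only, not the paper's implementation: the edge detector (gradient-magnitude thresholding), the pitch mapping, and all function names and parameters are assumptions.

```python
import numpy as np

def edge_map(image, threshold=0.25):
    """Binary edge map from central-difference gradient magnitude
    (one simple choice; the paper does not specify its detector)."""
    gy, gx = np.gradient(image.astype(float))
    mag = np.hypot(gx, gy)
    return mag > threshold * mag.max(), np.arctan2(gy, gx)

def sonify_touch(image, x, y, base_hz=220.0, span_hz=880.0):
    """Distance-to-edge sonification: map the touched pixel's distance
    to the nearest edge pixel onto pitch (nearer edge -> higher tone)."""
    edges, angle = edge_map(image)
    ey, ex = np.nonzero(edges)
    # Brute-force Euclidean distance to the closest edge pixel.
    d = np.sqrt((ex - x) ** 2 + (ey - y) ** 2).min()
    d_max = np.hypot(*image.shape)            # normalise by the image diagonal
    freq = base_hz + span_hz * (1.0 - d / d_max)
    # The local gradient direction could drive the paper's other mode
    # (local edge gradient sonification); here it is just returned.
    return freq, np.degrees(angle[y, x])
```

In a touch-screen setting, `sonify_touch` would be called on every finger-move event and the returned frequency fed to a tone generator, so sweeping toward a drawn line produces a rising pitch.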

RIS

TY  - GEN
T1  - EdgeSonic
T2  - 2nd Augmented Human International Conference, AH'11
AU  - Yoshida, Tsubasa
AU  - Kitani, Kris M.
AU  - Koike, Hideki
AU  - Belongie, Serge
AU  - Schlei, Kevin
PY  - 2011
Y1  - 2011
N2  - We propose a framework to aid a visually impaired user to recognize objects in an image by sonifying image edge features and distance-to-edge maps. Visually impaired people usually touch objects to recognize their shape. However, it is difficult to recognize objects printed on flat surfaces or objects that can only be viewed from a distance, solely with our haptic senses. Our ultimate goal is to aid a visually impaired user to recognize basic object shapes, by transposing them to aural information. Our proposed method provides two types of image sonification: (1) local edge gradient sonification and (2) sonification of the distance to the closest image edge. Our method was implemented on a touch-panel mobile device, which allows the user to aurally explore image context by sliding his finger across the image on the touch screen. Preliminary experiments show that the combination of local edge gradient sonification and distance-to-edge sonification are effective for understanding basic line drawings. Furthermore, our tests show a significant improvement in image understanding with the introduction of proper user training.
AB  - We propose a framework to aid a visually impaired user to recognize objects in an image by sonifying image edge features and distance-to-edge maps. Visually impaired people usually touch objects to recognize their shape. However, it is difficult to recognize objects printed on flat surfaces or objects that can only be viewed from a distance, solely with our haptic senses. Our ultimate goal is to aid a visually impaired user to recognize basic object shapes, by transposing them to aural information. Our proposed method provides two types of image sonification: (1) local edge gradient sonification and (2) sonification of the distance to the closest image edge. Our method was implemented on a touch-panel mobile device, which allows the user to aurally explore image context by sliding his finger across the image on the touch screen. Preliminary experiments show that the combination of local edge gradient sonification and distance-to-edge sonification are effective for understanding basic line drawings. Furthermore, our tests show a significant improvement in image understanding with the introduction of proper user training.
KW  - Edge detection
KW  - Image sonification
KW  - Sensory substitution
KW  - Visually impaired
UR  - http://www.scopus.com/inward/record.url?scp=79953118564&partnerID=8YFLogxK
U2  - 10.1145/1959826.1959837
DO  - 10.1145/1959826.1959837
M3  - Conference article
AN  - SCOPUS:79953118564
JO  - ACM International Conference Proceeding Series
JF  - ACM International Conference Proceeding Series
Y2  - 13 March 2011 through 13 March 2011
ER  -

ID: 301831500