Perceptually correct haptic rendering in mid-air using ultrasound phased array

Publication: Contribution to journal › Journal article › Research › peer-reviewed

Standard

Perceptually correct haptic rendering in mid-air using ultrasound phased array. / Raza, Ahsan; Hassan, Waseem; Ogay, Tatyana; Hwang, Inwook; Jeon, Seokhee.

In: IEEE Transactions on Industrial Electronics, Vol. 67, No. 1, 8691689, 01.2020, pp. 739-745.


Harvard

Raza, A, Hassan, W, Ogay, T, Hwang, I & Jeon, S 2020, 'Perceptually correct haptic rendering in mid-air using ultrasound phased array', IEEE Transactions on Industrial Electronics, vol. 67, no. 1, 8691689, pp. 739-745. https://doi.org/10.1109/TIE.2019.2910036

APA

Raza, A., Hassan, W., Ogay, T., Hwang, I., & Jeon, S. (2020). Perceptually correct haptic rendering in mid-air using ultrasound phased array. IEEE Transactions on Industrial Electronics, 67(1), 739-745. [8691689]. https://doi.org/10.1109/TIE.2019.2910036

Vancouver

Raza A, Hassan W, Ogay T, Hwang I, Jeon S. Perceptually correct haptic rendering in mid-air using ultrasound phased array. IEEE Transactions on Industrial Electronics. 2020 Jan;67(1):739-745. 8691689. https://doi.org/10.1109/TIE.2019.2910036

Author

Raza, Ahsan ; Hassan, Waseem ; Ogay, Tatyana ; Hwang, Inwook ; Jeon, Seokhee. / Perceptually correct haptic rendering in mid-air using ultrasound phased array. In: IEEE Transactions on Industrial Electronics. 2020 ; Vol. 67, No. 1. pp. 739-745.

Bibtex

@article{708c7bf01033471aa694a0b74b49157c,
title = "Perceptually correct haptic rendering in mid-air using ultrasound phased array",
abstract = "This paper provides a perceptually transparent rendering algorithm for an ultrasound-based mid-air haptic device. In a series of experiments, we derive a systematic mapping function relating the device command value to the user's final perceived magnitude of mid-air vibration feedback. The algorithm is designed for an ultrasonic mid-air haptic interface that can display vibrotactile feedback at a focal point in mid-air through the ultrasound phased-array technique. The perceived magnitude at the focal point depends on input parameters such as input command intensity, modulation frequency, and the position of the focal point in the workspace. The algorithm automatically tunes these parameters so that the desired perceived output at the user's hand is precisely controlled. Through a series of experiments, the effect of the aforementioned parameters on the physical output pressure is mapped, and the effect of this output pressure on the final perceived magnitude is formulated, resulting in a mapping from the different parameters to the perceived magnitude. Finally, the overall transparent rendering algorithm was evaluated, showing better perceptual quality than rendering with a simple intensity command.",
keywords = "Contact-less haptics, force feedback, haptic perception, mid-air haptic feedback, psychophysics",
author = "Ahsan Raza and Waseem Hassan and Tatyana Ogay and Inwook Hwang and Seokhee Jeon",
note = "Funding Information: Manuscript received October 31, 2018; revised February 20, 2019; accepted March 18, 2019. Date of publication April 15, 2019; date of current version August 30, 2019. This work was supported in part by the Electronics and Telecommunications Research Institute grant funded by the Korean government under Grant 19ZS1300 (The development of smart context-awareness foundation technique for major industry acceleration) and in part by the MSIP through IITP under Grant 2017-0-00179 (HD Haptic Technology for Hyper Reality Contents). (Corresponding author: Seokhee Jeon.) A. Raza, W. Hassan, and T. Ogay are with the Department of Computer Science and Engineering, Kyung Hee University, Yongin-si 17104, South Korea (e-mail: ahsanraza@khu.ac.kr; waseem.h@khu.ac.kr; ta.ogay92@gmail.com). Publisher Copyright: {\textcopyright} 1982-2012 IEEE.",
year = "2020",
month = jan,
doi = "10.1109/TIE.2019.2910036",
language = "English",
volume = "67",
pages = "739--745",
journal = "IEEE Transactions on Industrial Electronics",
issn = "0278-0046",
publisher = "Institute of Electrical and Electronics Engineers",
number = "1",

}

RIS

TY - JOUR

T1 - Perceptually correct haptic rendering in mid-air using ultrasound phased array

AU - Raza, Ahsan

AU - Hassan, Waseem

AU - Ogay, Tatyana

AU - Hwang, Inwook

AU - Jeon, Seokhee

N1 - Funding Information: Manuscript received October 31, 2018; revised February 20, 2019; accepted March 18, 2019. Date of publication April 15, 2019; date of current version August 30, 2019. This work was supported in part by the Electronics and Telecommunications Research Institute grant funded by the Korean government under Grant 19ZS1300 (The development of smart context-awareness foundation technique for major industry acceleration) and in part by the MSIP through IITP under Grant 2017-0-00179 (HD Haptic Technology for Hyper Reality Contents). (Corresponding author: Seokhee Jeon.) A. Raza, W. Hassan, and T. Ogay are with the Department of Computer Science and Engineering, Kyung Hee University, Yongin-si 17104, South Korea (e-mail: ahsanraza@khu.ac.kr; waseem.h@khu.ac.kr; ta.ogay92@gmail.com). Publisher Copyright: © 1982-2012 IEEE.

PY - 2020/1

Y1 - 2020/1

N2 - This paper provides a perceptually transparent rendering algorithm for an ultrasound-based mid-air haptic device. In a series of experiments, we derive a systematic mapping function relating the device command value to the user's final perceived magnitude of mid-air vibration feedback. The algorithm is designed for an ultrasonic mid-air haptic interface that can display vibrotactile feedback at a focal point in mid-air through the ultrasound phased-array technique. The perceived magnitude at the focal point depends on input parameters such as input command intensity, modulation frequency, and the position of the focal point in the workspace. The algorithm automatically tunes these parameters so that the desired perceived output at the user's hand is precisely controlled. Through a series of experiments, the effect of the aforementioned parameters on the physical output pressure is mapped, and the effect of this output pressure on the final perceived magnitude is formulated, resulting in a mapping from the different parameters to the perceived magnitude. Finally, the overall transparent rendering algorithm was evaluated, showing better perceptual quality than rendering with a simple intensity command.

AB - This paper provides a perceptually transparent rendering algorithm for an ultrasound-based mid-air haptic device. In a series of experiments, we derive a systematic mapping function relating the device command value to the user's final perceived magnitude of mid-air vibration feedback. The algorithm is designed for an ultrasonic mid-air haptic interface that can display vibrotactile feedback at a focal point in mid-air through the ultrasound phased-array technique. The perceived magnitude at the focal point depends on input parameters such as input command intensity, modulation frequency, and the position of the focal point in the workspace. The algorithm automatically tunes these parameters so that the desired perceived output at the user's hand is precisely controlled. Through a series of experiments, the effect of the aforementioned parameters on the physical output pressure is mapped, and the effect of this output pressure on the final perceived magnitude is formulated, resulting in a mapping from the different parameters to the perceived magnitude. Finally, the overall transparent rendering algorithm was evaluated, showing better perceptual quality than rendering with a simple intensity command.

KW - Contact-less haptics

KW - force feedback

KW - haptic perception

KW - mid-air haptic feedback

KW - psychophysics

U2 - 10.1109/TIE.2019.2910036

DO - 10.1109/TIE.2019.2910036

M3 - Journal article

AN - SCOPUS:85072113586

VL - 67

SP - 739

EP - 745

JO - IEEE Transactions on Industrial Electronics

JF - IEEE Transactions on Industrial Electronics

SN - 0278-0046

IS - 1

M1 - 8691689

ER -
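
Illustrative sketch

The abstract describes a two-stage mapping: device command parameters (command intensity, modulation frequency, focal-point position) are mapped to a physical output pressure, and that pressure is mapped to a perceived magnitude; the rendering algorithm then tunes the command parameters so a target sensation is delivered consistently across the workspace. The Python sketch below only illustrates that idea; the model forms, constants, and function names (command_to_pressure, pressure_to_perceived, intensity_for_target) are assumptions made for illustration, not the calibration or psychophysical functions reported in the paper.

import numpy as np

# All constants below are hypothetical placeholders, not values from the paper.
A_PRESSURE = 4.0                             # assumed Pa per unit command intensity at the reference position
FREQ_GAIN = {100: 1.0, 200: 1.1, 250: 1.05}  # assumed relative gain per modulation frequency (Hz)
STEVENS_EXPONENT = 0.6                       # assumed power-law exponent for vibrotactile intensity
STEVENS_SCALE = 1.0                          # assumed scale factor of the power law


def command_to_pressure(intensity, mod_freq, focal_height_mm):
    """Assumed calibration: pressure grows linearly with command intensity,
    is scaled by a frequency-dependent gain, and attenuates with focal-point
    height above the array."""
    gain = FREQ_GAIN.get(mod_freq, 1.0)
    attenuation = 1.0 / (1.0 + 0.005 * focal_height_mm)
    return A_PRESSURE * intensity * gain * attenuation


def pressure_to_perceived(pressure_pa):
    """Assumed psychophysical mapping: Stevens-style power law from output
    pressure to perceived magnitude."""
    return STEVENS_SCALE * pressure_pa ** STEVENS_EXPONENT


def intensity_for_target(target_magnitude, mod_freq, focal_height_mm):
    """Invert the two mappings numerically: search the normalized command
    range for the intensity whose predicted perceived magnitude is closest
    to the target."""
    candidates = np.linspace(0.0, 1.0, 1001)
    perceived = pressure_to_perceived(
        command_to_pressure(candidates, mod_freq, focal_height_mm))
    return candidates[np.argmin(np.abs(perceived - target_magnitude))]


if __name__ == "__main__":
    # The same target sensation requires different commands at different focal heights.
    for height in (150, 300):
        cmd = intensity_for_target(target_magnitude=1.0, mod_freq=200, focal_height_mm=height)
        print(f"focal height {height} mm -> command intensity {cmd:.3f}")

Running the sketch shows that the same target magnitude demands a larger command intensity at the higher focal point, which is the kind of position- and frequency-dependent compensation the paper's algorithm performs automatically.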
