*Article* **Multi-Sensory Color Code Based on Sound and Scent for Visual Art Appreciation**

**Luis Cavazos Quero, Chung-Heon Lee and Jun-Dong Cho \***

> Department of Electrical and Computer Engineering, Sungkyunkwan University, Suwon 16419, Korea; luis@skku.edu (L.C.Q.); ch.lee@skku.edu (C.-H.L.)

**\*** Correspondence: jdcho@skku.edu

**Abstract:** The development of assistive technologies is improving independent access to visual artworks for blind and visually impaired people through non-visual channels. Current single-modality tactile and auditory approaches to communicating color content must compromise between conveying a broad color palette and ease of learning, and they suffer from limited expressiveness. In this work, we propose a multi-sensory color code system that uses sound and scent to represent colors. Melodies express each color's hue, while scents convey the saturated, light, and dark dimensions of each hue. In collaboration with eighteen participants, we evaluated the color identification rate achieved with the multi-sensory approach. Compared with an audio-only color code alternative, seven (39%) of the participants improved their identification rate, five (28%) performed the same, and six (33%) performed worse. The participants then evaluated a color content exploration prototype that uses the proposed color code and compared it with a tactile graphic equivalent using the System Usability Scale. For a visual artwork color exploration task, the prototype integrating the multi-sensory color code received a score of 78.61, while the tactile graphic equivalent received 61.53. User feedback indicates that the multi-sensory color code system improved the participants' convenience and confidence.

**Keywords:** assistive technology; multi-sensory interface; auditory interface; scent interface; color
