Editorial

Welcome to MTI—A New Open Access Journal Dealing with Blue Sky Research and Future Trends in Multimodal Technologies and Interaction

by Adrian David Cheok 1 and Cristina Portalés Ricart 2,*
1 Imagineering Institute, Iskandar Malaysia and City University London, London EC1V 4PB, UK
2 Institute of Robotics and Information and Communication Technologies, Universitat de València, Av. de Blasco Ibáñez, 13, València 46010, Spain
* Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2017, 1(1), 1; https://doi.org/10.3390/mti1010001
Submission received: 12 April 2016 / Accepted: 13 April 2016 / Published: 20 April 2016
In this era of massive use of computers and other computational devices (e.g., low-cost wearable sensors, smartphones, and other smart devices), the nature of digital data is becoming more complex and heterogeneous. In addition, emerging technologies are digitizing stimuli beyond the traditional visual and auditory channels, such as touch, taste, and smell, leading to multimodal inputs and outputs and enabling new kinds of communication [1,2,3]. These technologies include both sensors (inputs to the system) and displays (outputs of the system). They are connected to the Internet 24/7 and are ushering in a new era of hyperconnectivity.
On the other side of the technology scale is the human being, who interacts with this digital content through a variety of interfaces. Current user inputs go beyond the traditional keyboard and mouse to include speech [4], touch [5], gestures [6], body movement [7], gaze [8], and more. In this regard, new interaction paradigms are emerging that may rely on a single kind of interface (unimodal) or combine several (multimodal).
These issues open new research avenues, such as how to create a symbiosis between the analog and digital worlds and how to heighten emotional experiences through digital content. For instance, digitally sensing and reproducing taste or smell is challenging, and the process may involve electrical devices and neurological factors [9,10]. Once this is achieved, these newly digitized senses might also be communicated over the Internet as digital data: imagine sending a virtual cake to your mother on her birthday that not only looks like a cake but also smells and tastes like one. This vision may materialize soon, and the way we perceive the world and the way we sense and interact with our reality could change drastically over the next few years.
Topics related to multisensory communication are highly universal: they apply across very different areas of knowledge and will therefore open up new horizons for future research in fields such as human-computer interfaces, entertainment, the arts, learning, medicine, and wellness. Moreover, the recently expanded availability of broadband Internet, the exponential proliferation of mobile and wearable computing devices, and high-speed wireless Internet access have enabled a new era of hyperconnectivity, in which improving multisensory communication becomes critical in a 24/7 world [11]. Hyperconnectivity refers not only to the technology itself, but also to the impact that this technology has on personal lives, business, government, and societal behavior.
Consequently, research on these topics is highly novel and relevant to society as a whole: on the one hand, to the technology-related industries that must innovate to remain competitive; on the other, to end users and consumers (the general public, professionals, scholars, etc.) who, faced with increasingly complex technologies, demand new, more intuitive solutions and alternative means of access.
Multimodal Technologies and Interaction (MTI) is an open access journal focused on these exciting research topics, offering worldwide dissemination, as anyone with an Internet connection can access its contents. One of the aims of the Editors and the Editorial Board is to shorten the traditional review cycle in order to disseminate research more quickly while maintaining the highest standards of peer review. We also encourage authors to publish their research in as much detail as necessary, as there is no restriction on the length of papers.
We envision MTI as an exciting project dealing with both cutting-edge and future research on these topics, and we are fully committed to disseminating high-quality research and to earning a distinguished reputation in the research community. To that end, we count on a highly qualified Editorial Board that includes top researchers in the related fields. Of course, the success of this journal would not be possible without the invaluable contributions of authors and the support of its audience. On behalf of the Editorial Board, we invite you to submit your manuscripts, review papers, and suggestions for Special Issues and guest editors to MTI. We look forward to receiving your research contributions as well as any other suggestions or ideas that can enhance this new journal.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Teh, J.K.S.; Cheok, A.D.; Peiris, R.L.; Choi, Y.; Thuong, V.; Lai, S. Huggy Pajama: A mobile parent and child hugging communication system. In Proceedings of the 7th International Conference on Interaction Design and Children, Chicago, IL, USA, 11–13 June 2008; pp. 250–257.
  2. Wei, J.; Wang, X.; Peiris, R.L.; Choi, Y.; Román Martínez, X.; Tache, R.; Koh, J.T.K.V.; Halupka, V.; Cheok, A.D. CoDine: An interactive multi-sensory system for remote dining. In Proceedings of the 13th ACM International Conference on Ubiquitous Computing, Beijing, China, 17–21 September 2011; pp. 21–30.
  3. Saadatian, E.; Samani, H.; Parsani, R.; Pandey, A.V.; Li, J.; Tejada, L.; Cheok, A.D.; Nakatsu, R. Mediating intimacy in long-distance relationships using kiss messaging. Int. J. Hum. Comput. Stud. 2014, 72, 736–746.
  4. Chun, L.M.; Arshad, H.; Piumsomboon, T.; Billinghurst, M. A combination of static and stroke gesture with speech for multimodal interaction in a virtual environment. In Proceedings of the International Conference on Electrical Engineering and Informatics, Denpasar, Bali, Indonesia, 10–11 August 2015; pp. 59–64.
  5. Portalés Ricart, C.; Perales Cejudo, C.D.; Cheok, A.D. Exploring social, cultural and pedagogical issues in AR-gaming through the Live LEGO House. In Proceedings of the International Conference on Advances in Computer Entertainment Technology, Salzburg, Austria, 15–17 June 2007; pp. 238–239.
  6. Perales, C.D.; Portalés, C.; Sanmartín, F. Sonic Gestures Applied to a Percussive Dialogue in TanGram using Wii Remotes. In Proceedings of the 8th International Conference on Entertainment Computing, Paris, France, 3–5 September 2009; pp. 216–221.
  7. Portalés, C.; Viñals, M.J.; Alonso-Monasterio, P.; Morant, M. AR-Immersive Cinema at the Aula Natura Visitors Center. IEEE MultiMedia 2010, 17, 8–15.
  8. Kassner, M.; Patera, W.; Bulling, A. Pupil: an open source platform for pervasive eye tracking and mobile gaze-based interaction. In Proceedings of the ACM International Joint Conference on Pervasive and Ubiquitous Computing, Seattle, WA, USA, 13–17 September 2014; pp. 1151–1160.
  9. Ranasinghe, N.; Karunanayaka, K.; Cheok, A.D.; Newton Fernando, O.N.; Nii, H.; Gopalakrishnakone, P. Digital taste and smell communication. In Proceedings of the 6th International Conference on Body Area Networks, Beijing, China, 7–8 November 2011.
  10. Nakamura, H.; Miyashita, H. Augmented gustation using electricity. In Proceedings of the 2nd Augmented Human International Conference, Tokyo, Japan, 13 March 2011.
  11. Cheok, A.D. Hyperconnectivity and the Future of Internet Connections, 1st ed.; LAP Lambert Academic Publishing: Saarbrücken, Germany, 2015.
