Interaction Based on Drawing Gestures to Complement the Teaching-Learning Process
DOI: https://doi.org/10.29019/enfoqueute.v7n4.110
Keywords: hand gestures, touchless interaction, information technology in education, human-computer interaction
Abstract
Interaction based on hand gestures has made great progress in recent years. Its usefulness has been demonstrated in several areas, particularly in education, where it can help improve the quality of teaching. However, some topics have received little attention, such as the use of drawing gestures performed in the air with the hand. For this reason, this article analyzes the feasibility of using this type of gesture for educational purposes. To that end, we conducted a study in which participants interacted with the developed application by means of several drawing gestures. The quantitative and qualitative results obtained confirm the validity of our proposal. Specifically, the participants' performance was acceptable according to the values of the metrics used, and the participants described the proposal as interesting and enjoyable. Consequently, the use of applications of this type, in the classroom or at home, could help increase students' interest in the corresponding subject, which should in turn lead to better marks.
License
The articles and research published by the UTE University are carried out under the Open Access regime in electronic format. This means that all content is freely available without charge to the user or his/her institution. Users are allowed to read, download, copy, distribute, print, search, or link to the full texts of the articles, or use them for any other lawful purpose, without asking prior permission from the publisher or the author. This is in accordance with the BOAI definition of open access. By submitting an article to any of the scientific journals of the UTE University, the author or authors accept these conditions.
The UTE applies the Creative Commons Attribution (CC-BY) license to articles in its scientific journals. Under this open access license, as an author you agree that anyone may reuse your article in whole or in part for any purpose, free of charge, including commercial purposes. Anyone can copy, distribute or reuse the content as long as the author and original source are correctly cited. This facilitates freedom of reuse and also ensures that content can be extracted without barriers for research needs.
This work is licensed under a Creative Commons Attribution 3.0 Unported license (CC BY 3.0).
The Enfoque UTE journal guarantees and declares that authors always retain all copyrights and full publishing rights without restrictions [© The Author(s)]. Attribution (BY): any exploitation of the work is permitted, including for commercial purposes, as well as the creation of derivative works, which may likewise be distributed without restriction.