
GIST: Gesture-free Interaction by the Status of Thumb; an interaction technique for Virtual Environments

Muhammad Raees1, *, Sehat Ullah2, Sami Ur Rahman3

Corresponding Author:

Muhammad Raees

Affiliation(s):

Department of Computer Science and IT, University of Malakand, Pakistan
1. [email protected], 2. [email protected], 3. [email protected]
*Corresponding Author: Muhammad Raees, Email: [email protected]

Abstract:

The user interface is of special importance in immersive virtual environments. Interactions based on simple and conceivable hand gestures may enhance the immersiveness of a Virtual Environment (VE). However, owing to the structural characteristics of the human hand, such as its small size and complex shape, recognizing hand gestures is challenging. This work introduces a novel interaction technique that performs the basic interaction tasks through simple hand movements instead of distinct gestures. Using an ordinary camera, the fist posture of the hand is segmented from the image stream with an optimal segmentation model. Like pressing a button with a thumb, the status of the thumb is traced to activate or deactivate interaction. Once interaction is activated, the trajectory of the hand is followed to manipulate a virtual object about an arbitrary axis. Without training or comparison of gestures, the basic interactions required in a VE are performed by perceptive hand movement. By incorporating image processing into the realm of virtual environments, the technique is implemented in a case-study project, FIRST (Feasible Interaction by Recognizing the Status of Thumb). A group of 12 users evaluated the system under moderate lighting conditions. The outcomes of the evaluation revealed that the technique is suitable for Virtual Reality (VR) applications.
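The thumb-as-button idea described above can be sketched in a few lines: segment skin-coloured pixels in YCbCr space and treat the fraction of skin pixels inside a thumb region of interest as the press state. This is a minimal illustration, not the authors' implementation; the threshold bounds, the ROI convention, and the function names are assumptions chosen for the example.

```python
import numpy as np

def rgb_to_ycbcr(img):
    """Convert an RGB uint8 image of shape (H, W, 3) to YCbCr floats."""
    img = img.astype(np.float64)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y  =        0.299    * r + 0.587    * g + 0.114    * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5      * b
    cr = 128 + 0.5      * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

def skin_mask(img, cb_range=(77, 127), cr_range=(133, 173)):
    """Binary mask of skin-coloured pixels using classic YCbCr bounds."""
    ycbcr = rgb_to_ycbcr(img)
    cb, cr = ycbcr[..., 1], ycbcr[..., 2]
    return ((cb >= cb_range[0]) & (cb <= cb_range[1]) &
            (cr >= cr_range[0]) & (cr <= cr_range[1]))

def thumb_raised(img, roi, min_skin_fraction=0.3):
    """Treat the thumb ROI (top, bottom, left, right) as a button:
    'pressed' when enough of it is covered by skin-coloured pixels."""
    top, bottom, left, right = roi
    patch = skin_mask(img)[top:bottom, left:right]
    return bool(patch.mean() >= min_skin_fraction)
```

In a real pipeline the ROI would be derived from the segmented fist contour and the decision smoothed over several frames to suppress flicker; this sketch only shows the per-frame classification step.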

Keywords:

3D interactions, Gestural interfaces, Virtual Reality, Finger Recognition, Computer Vision

Cite This Paper:

Muhammad Raees; Sehat Ullah; Sami Ur Rahman (2019). GIST: Gesture-free Interaction by the Status of Thumb; an interaction technique for Virtual Environments. Journal of Artificial Intelligence and Systems, 1, 125–142. https://doi.org/10.33969/AIS.2019.11008.
