I’m currently exploring new forms of gestural interaction based on static symbolic gestures. The idea is to develop a system that offers an alternative to traditional VR/MR gesture interfaces, which are quite limited and rely heavily on extensive user-interface menus to trigger functions. In this study I focus on basic actions such as scaling and rotating. Once I have finished my studies, I will upload deeper insights and results alongside a working prototype.
By measuring the distances between fingers, I was able to trigger certain actions (coloring billboards) with different static gesture shortcuts.
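A minimal sketch of how such distance-based matching could work, assuming per-frame 3D fingertip positions from a hand-tracking SDK (the function and variable names here are hypothetical, not from the actual prototype):

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def pairwise_distances(tips):
    """All pairwise fingertip distances; together they act as a
    scale-free 'fingerprint' of a static hand pose."""
    return [distance(tips[i], tips[j])
            for i in range(len(tips)) for j in range(i + 1, len(tips))]

def matches(tips, template, tolerance=0.02):
    """A static gesture matches a stored template if every pairwise
    distance is within `tolerance` (metres) of the template value."""
    return all(abs(d - t) <= tolerance
               for d, t in zip(pairwise_distances(tips), template))
```

In a live system, each gesture shortcut would be recorded once as a template of pairwise distances, and the matching gesture would then trigger its action (here, coloring a billboard) every frame it is held.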
Building on the basic gesture recognition, I added the ability to recognize the X-, Y-, and Z-axis by rotating a static gesture.
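One plausible way to resolve the axis is to compare a direction vector derived from the hand's orientation (for example the palm normal) against the world axes and pick the closest one. This is only an illustrative sketch under that assumption; the actual prototype may work differently:

```python
def dominant_axis(direction):
    """Return which world axis ('X', 'Y', or 'Z') the given direction
    vector is most aligned with, ignoring sign, so the same axis is
    detected whichever way the hand faces along it."""
    axes = {'X': (1, 0, 0), 'Y': (0, 1, 0), 'Z': (0, 0, 1)}

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    return max(axes, key=lambda name: abs(dot(direction, axes[name])))
```

Combined with the static gesture match above, this would let one hand pose select an operation (e.g. rotate) while its orientation selects the axis it applies to.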
Here you can see a more refined version with a basic UI and feedback system. The slider shown here is just a placeholder and is not implemented in this version.