Orientation-based input device for XR: a comparison of hand movements
Lately, I have been working on a text input method for XR based on hand orientation. The idea is to reduce the user's effort when typing in an immersive environment, because let's admit it: typing on #AR #VR devices is slow and requires all sorts of hand movements, which makes it tricky to create content on those devices, or simply painful to do a movie search on Amazon Prime or Netflix VR. Relying on orientation is a good start: for example, a finger-swiping gesture can rotate through the same angle as a hand-waving gesture, yet the amounts of movement involved are different, depending on where the orientation-tracking sensor is placed.
I knew from the beginning that I would use the SoundxVision thumb input solution, but I still wanted to examine other placements, such as the wrist, and this is when things got interesting. During my experiments, I found significant differences between the two embodiments in the amount of hand movement and space required, mainly due to the structure of the joints in our hand and wrist. This was very interesting to see, so I decided to make a video comparing the amount of movement and space required by the two approaches to navigate an incomplete virtual keyboard. For the comparison, I used a SoundxVision prototype that can be worn on either the thumb or the wrist. It is powered by a SEEED Studio Xiao BLE Sense board that senses my hand orientation and uses it as input data for navigating a virtual keyboard on my Meta Quest 2.
The setup
On the hardware side, a prototype device (powered by a SEEED Xiao BLE Sense) that can be mounted on either a finger or the wrist is used for collecting orientation data via its inertial measurement unit, an STMicroelectronics LSM6DS3. The collected data is processed so that the input device sends my Meta Quest 2 a directional signal (up, down, left, right) every time the accumulated rotation reaches a certain threshold (ranging from 3 to 12 degrees on the y and z axes, depending on how sensitive you want the device to be).
To demonstrate more clearly how the thumb placement uses less space and hand movement, I included two hand positions in the test: when the hand is up in the air and when it is down.

The setups for the comparison

Result

Video: the comparison

The differences in the amount of movement and space required to navigate the keyboard are obvious, with the thumb embodiment requiring fewer hand movements than the wrist-mounted approach, especially when the user's hand is down. It's also important to note that more hand movement means more precise rotation, but precision can be achieved with both setups: lock the wrist joint, and there is little to no difference between the two approaches in terms of accuracy.
Disclaimer 1: This is only the keyboard layout, and it's rough; it still requires a lot of work before it becomes usable. The goal is to increase typing speed by 20-50%, in the best scenario making it as fast as typing on a phone.
Disclaimer 2: The wrist-based input sensing approach has its pros and cons compared to the finger-based approach; this is just one aspect that I wanted to demonstrate.
Disclaimer 3: The two embodiments are in fact the same, using the same hardware and software; that's the beauty of orientation tracking.