PROOF OF CONCEPT
As my thesis progressed, I started working on a proof of concept of the thumb-mounted controller, using the Adafruit Feather nRF52840 Sense as the mainboard together with two force sensors and an external battery pack.
Implementation
For sensing thumb gestures, a Feather nRF52840 Sense Bluetooth Low Energy (BLE) development board is used. The board comes with a built-in inertial measurement unit (IMU) for collecting motion data, and two force sensors are connected to it for touch sensing. The collected data is converted into input signals and transmitted to an AR device (in this case an iPad) over BLE. The input device can sense swiping in four directions, quick presses and long presses.
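To make the sensing pipeline concrete, the sketch below shows one way the firmware could work. It is a minimal CircuitPython illustration only, not the actual PoC firmware: the pin assignments (A0/A1), thresholds, gesture codes and the use of the Nordic UART service over BLE are all assumptions.

```python
# Minimal sketch of the sensing pipeline: two FSR touch zones on analog pins,
# swipe direction from the on-board IMU, gesture codes sent over BLE UART.
# Pins, thresholds and message format are assumptions for illustration.
import time
import board
import analogio
from adafruit_lsm6ds.lsm6ds33 import LSM6DS33   # IMU on the Feather Sense (revision-dependent)
from adafruit_ble import BLERadio
from adafruit_ble.advertising.standard import ProvideServicesAdvertisement
from adafruit_ble.services.nordic import UARTService

TOUCH_THRESHOLD = 20000      # assumed ADC reading that counts as a touch
LONG_PRESS_S = 0.5           # assumed cut-off between quick and long press
SWIPE_ACCEL = 3.0            # assumed change in m/s^2 that counts as a swipe

fsr = [analogio.AnalogIn(board.A0), analogio.AnalogIn(board.A1)]  # touch zones 1 and 2
imu = LSM6DS33(board.I2C())

ble = BLERadio()
uart = UARTService()
ble.start_advertising(ProvideServicesAdvertisement(uart))

def read_gesture(zone):
    """Watch one touch zone until release and classify the gesture."""
    start = time.monotonic()
    ax0, ay0, _ = imu.acceleration
    swipe = None
    while fsr[zone].value > TOUCH_THRESHOLD:
        ax, ay, _ = imu.acceleration
        if abs(ax - ax0) > SWIPE_ACCEL:
            swipe = "right" if ax > ax0 else "left"
        elif abs(ay - ay0) > SWIPE_ACCEL:
            swipe = "up" if ay > ay0 else "down"
        time.sleep(0.01)
    if swipe:
        return "swipe_" + swipe
    held = time.monotonic() - start
    return "long_press" if held >= LONG_PRESS_S else "quick_press"

while True:
    if ble.connected:
        for zone in range(2):
            if fsr[zone].value > TOUCH_THRESHOLD:
                uart.write(f"zone{zone + 1}:{read_gesture(zone)}\n".encode())
    time.sleep(0.02)
```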
Figure 1 shows the mock-up of the AR input device, which houses the Feather Sense board and two external touch zones, each comprising one force-sensitive resistor (FSR) connected to the board.

Figure 1: Mock-up of the input device

Figure 2 shows the input device mounted on a user's thumb, powered by an external battery pack worn on the wrist.

Figure 2: The input device on hand

Testing
For testing purposes, two AR interfaces were developed using Unity and ARKit to demonstrate the interactions the controller supports for controlling AR/VR content.
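The test interfaces themselves run on the iPad, but the BLE gesture stream can also be checked from a computer before wiring it into the AR scene. The script below is a small, hypothetical test harness using the bleak library; it assumes the firmware exposes the standard Nordic UART service and advertises under the name "ThumbPoC".

```python
# Hedged sketch: print the gesture stream from the PoC on a desktop machine.
# The device name "ThumbPoC" is an assumption; the UUID is the standard
# Nordic UART service TX characteristic (notifications from the peripheral).
import asyncio
from bleak import BleakClient, BleakScanner

NUS_TX = "6e400003-b5a3-f393-e0a9-e50e24dcca9e"

def on_gesture(_sender, data: bytearray):
    print("gesture:", data.decode().strip())

async def main():
    devices = await BleakScanner.discover()
    device = next((d for d in devices if d.name == "ThumbPoC"), None)
    if device is None:
        raise RuntimeError("PoC device not found")
    async with BleakClient(device) as client:
        await client.start_notify(NUS_TX, on_gesture)
        await asyncio.sleep(60)  # listen for a minute

asyncio.run(main())
```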
Video 1 shows an AR user interface in which the top bar is controlled by swiping on the second touch zone and the bottom bar by the first touch zone; the purple cell indicates the current selection, pink indicates a quick press and orange a long press.

Video 1: Swiping test on iPad
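
The selection logic shown in Video 1 is essentially a small state machine. The sketch below illustrates it in plain Python purely for reference; the actual demo is a Unity/ARKit scene, and the message names and the five-cell bar length are assumptions.

```python
# Illustration only: how gesture messages could drive one bar of the Video 1
# interface. The real demo is implemented in Unity/ARKit.
CELLS = 5

class Bar:
    def __init__(self):
        self.selected = 0          # index of the purple (selected) cell
        self.state = "idle"        # "idle", "quick" (pink) or "long" (orange)

    def handle(self, gesture: str):
        if gesture == "swipe_right":
            self.selected = min(self.selected + 1, CELLS - 1)
        elif gesture == "swipe_left":
            self.selected = max(self.selected - 1, 0)
        elif gesture == "quick_press":
            self.state = "quick"   # rendered pink in the demo
        elif gesture == "long_press":
            self.state = "long"    # rendered orange in the demo

top_bar = Bar()
for g in ["swipe_right", "swipe_right", "quick_press"]:
    top_bar.handle(g)
print(top_bar.selected, top_bar.state)   # -> 2 quick
```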

Video 2 shows the full set of interactions performed with the input device, including swiping up/down and left/right as well as quick and long presses.

Video 2: Pressing test on iPad

What's next
A thumb-mounted input device is a unique concept that needs to be proven viable before further development. For this purpose, the first thumb-mounted controller was built to show how users would interact with an AR system through it. In terms of size, it is already smaller than other kinds of controllers such as the Oculus Touch. However, the solution was not very neat and had ergonomic problems: the size does not fit different hands, test users did not find it comfortable to wear, and it lacks discoverability. These issues will be addressed in the Prototype. When it comes to gesture sensing, though, the PoC proved viable and has the potential to be useful in XR systems. In addition, an application was implemented to use the PoC with the Oculus Quest, and even at this size Oculus's hand-tracking system was able to detect the hands.
SoundxVision is born
Realising that this idea could be very useful for future XR systems, we gave the project the name "SoundxVision". This was a turning point for the project: a team was formed to turn the idea into a more appealing input solution for the XR field.