Overview
We are in the very early days of Augmented Reality (AR) and Mixed Reality (MR) glasses adoption. As the field develops, AR/MR reveals possibilities to improve our everyday life and work. But there is still a long way to go before AR glasses are widely adopted as everyday wearables, and the interaction between human and machine is among the challenges. This project's objective is to ideate, prototype, and test a less vision-dependent AR controller concept for everyday wearable AR glasses.
My role in the project was to design the interaction, the software, and the data communication from the hardware device to the AR application.
The Idea
The thumb is the finger we use most on mobile phones and many handheld gadgets. In this project I focus on using the thumb to navigate an augmented reality environment. The thumb's most notable characteristic is that it affords both great precision and a powerful grip, which leads to the idea of an AR controller that makes use of your thumb and can stay on your thumb most of the day.
The concept is a device that attaches to the thumb and uses a finger as the pad for navigation within the AR glasses system. From the two most basic thumb-swiping gestures, a wide range of gestures was developed for the future AR controller. Because navigation relies on the thumb, we can expect it to work even when the device is out of sight.

Figure 1: Two basic gestures of thumb movement
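The two basic gestures above can be thought of as swipes along two axes of the thumb's contact point. A minimal sketch, in Python, of how such a contact trace might be classified; the coordinate convention, function name, and dead-zone threshold are illustrative assumptions, not the project's actual implementation:

```python
# Hypothetical swipe classifier: given the start and end contact
# positions of the thumb (normalized coordinates), decide which of
# the four swipe directions occurred, or None for a tap-like touch.

def classify_swipe(start, end, dead_zone=0.1):
    """Return 'left'/'right' for swipes along a finger and
    'up'/'down' for swipes across fingers; None if movement
    is too small to count as a swipe."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    if max(abs(dx), abs(dy)) < dead_zone:
        return None  # movement below the dead zone: treat as a tap
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "up" if dy > 0 else "down"
```

Keeping the classifier this simple makes it cheap enough to run on-device at the sensor sampling rate.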

Figure 2: Three main ideas about how the device should be on the thumb
Idea #1 was chosen because its design leaves more space for hardware components.
The device in this concept is called "Neo".
The Specification
Functional specs:
- Navigate the AR environment both in sight and out of sight.
- Ability to perform most normal thumb functions even with the AR controller on.
- Comfortable on the thumb for all-day use; this requires good ergonomics and product materials.
- Besides navigation in AR, the device can also be used for secure payment communication via NFC.
Technical specs:
- Data transmission between the navigation device and the AR glasses using an nRF52-series Bluetooth multi-protocol SoC from Nordic Semiconductor.
- A 9-degrees-of-freedom (9DoF) inertial measurement unit for absolute orientation and gesture recognition.
- Sensing of thumb touches and vertical swipes using an array of micro force sensors on the right side of the device, with the ability to sense tapping, holding, and pressing motions.
- NFC for secure payments and object communication.
- A small rechargeable battery with enough capacity for at least one day of use.
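To stay within a single BLE notification, the sensor data has to be packed compactly. A sketch of one possible payload layout, assuming a quaternion orientation plus two force readings; the field order and format string are assumptions for illustration, not Neo's actual protocol:

```python
import struct

# Hypothetical 20-byte sensor payload: an orientation quaternion
# (w, x, y, z as 32-bit floats) followed by two 16-bit force readings.
# 20 bytes fits the ATT payload of the default BLE MTU (23 bytes).
PACKET_FMT = "<4f2H"

def pack_sample(quat, forces):
    """Serialize one sensor sample for a BLE notification."""
    return struct.pack(PACKET_FMT, *quat, *forces)

def unpack_sample(payload):
    """Inverse of pack_sample: return (quaternion, forces)."""
    values = struct.unpack(PACKET_FMT, payload)
    return values[:4], values[4:]
```

A fixed-size little-endian layout like this keeps parsing trivial on both the peripheral and the central side.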
The Prototype
The prototype is an Adafruit Feather Sense kit with two external force sensors. This development kit is equipped with a Nordic Semiconductor nRF52840 and a set of accelerometer, gyroscope, and magnetometer that allows it to sense 9DoF motion. One drawback of this kit is its lack of LE Audio support.
Even though the Adafruit Feather Sense board is much larger than the concept device, it is rather small compared to other development kits on the market. Two force sensors are connected to the board to simulate thumb motion sensing.
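On the board, each force sensor is read as a raw ADC value that needs smoothing before contact detection. A minimal sketch of one such channel; the threshold, window size, and class name are illustrative assumptions, and on real hardware the raw counts would come from CircuitPython's `analogio.AnalogIn` rather than this stub:

```python
from collections import deque

class ForceChannel:
    """Smooths raw 16-bit ADC counts with a short moving average
    and reports whether the thumb is in contact with the sensor."""

    def __init__(self, threshold=3000, window=4):
        self.threshold = threshold          # contact cut-off, raw counts
        self.samples = deque(maxlen=window) # recent readings

    def update(self, raw):
        """Feed one raw reading; return True if contact is detected."""
        self.samples.append(raw)
        avg = sum(self.samples) / len(self.samples)
        return avg > self.threshold
```

The moving average suppresses single-sample noise spikes that would otherwise register as phantom taps.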

Figure 3: Adafruit Feather Sense development board
The Gestures
By combining the force sensors and the IMU, a collection of gestures can be sensed. Figure 4 shows the list of gestures that can be used on Neo and their symbols.

Figure 4: Gestures list and symbols
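Beyond swipe direction, the tap, hold, and press gestures in the list differ in contact duration and peak force. A sketch of how they might be told apart; both cut-off values are illustrative assumptions rather than measured thresholds:

```python
# Hypothetical gesture cut-offs: duration in seconds, force in raw
# ADC counts. Real thresholds would come from user testing.
HOLD_MIN_DURATION = 0.5   # contacts longer than this become holds
PRESS_MIN_FORCE = 45000   # peak force above this becomes a press

def classify_contact(duration, peak_force):
    """Classify one completed thumb contact as tap, hold, or press."""
    if peak_force >= PRESS_MIN_FORCE:
        return "press"
    if duration >= HOLD_MIN_DURATION:
        return "hold"
    return "tap"
```

Checking force before duration means a firm press is recognized even if the user also lingers.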
The Simulation
An AR environment with a pair of devices, an AR device and a controller, demonstrates the data transmission and motion data processing. On the AR device side, an iPad Pro, the application is developed using Unity, C# scripts, and a BLE plug-in for connection establishment; this device acts as the Central role in the communication. On the AR controller side is my iPhone 6s, which acts as the Peripheral role and transfers motion data to the Central for gesture sensing; this application is developed in Swift using the CoreMotion and CoreBluetooth APIs.

Figure 5: iPhone application as an AR controller
Video: Simulation of using gyro data to navigate in the AR environment.
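The gyro-driven navigation in the video boils down to mapping device orientation onto a cursor position in the AR view. A Python sketch of that mapping; the ±30° range of motion is an assumed comfortable wrist range, not a value taken from the project:

```python
import math

# Assumed comfortable range of motion mapped to the full view.
RANGE = math.radians(30)

def orientation_to_cursor(yaw, pitch):
    """Map yaw/pitch (radians) to a normalized cursor position in
    [0, 1] x [0, 1]; neutral orientation lands at the centre (0.5, 0.5)."""
    x = min(max(yaw / RANGE, -1.0), 1.0) * 0.5 + 0.5
    y = min(max(pitch / RANGE, -1.0), 1.0) * 0.5 + 0.5
    return x, y
```

Clamping before scaling keeps the cursor on-screen even when the hand overshoots the assumed range.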
Applications
AR UI
To take advantage of Neo, an augmented reality application consists of two main components: a "Navigation area" and a "Content area". The Navigation area is navigated by swiping the thumb on two fingers; for the Content area, swiping the thumb on one finger is used.
Discord is used in this concept as an example of navigating within an application. The UI was redesigned to suit the AR environment, but the philosophy remains: users can navigate through servers and their channels and reply using simple thumb swiping/sliding gestures. The layout is divided into three columns, each containing child rows that can be navigated by swiping up/down.
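The column-and-row layout described above can be modelled as a small focus state machine: left/right swipes move between columns and up/down swipes move through the rows of the focused column. A sketch under those assumptions; the class name and row counts are illustrative:

```python
class FocusModel:
    """Focus state for a columns-of-rows layout, e.g. the three
    Discord-style columns (servers, channels, messages)."""

    def __init__(self, rows_per_column):
        self.rows_per_column = rows_per_column  # row count per column
        self.column = 0
        self.row = [0] * len(rows_per_column)   # remembered row per column

    def swipe(self, direction):
        """Apply one swipe gesture; return the new (column, row) focus."""
        if direction == "right":
            self.column = min(self.column + 1, len(self.rows_per_column) - 1)
        elif direction == "left":
            self.column = max(self.column - 1, 0)
        elif direction == "down":
            self.row[self.column] = min(self.row[self.column] + 1,
                                        self.rows_per_column[self.column] - 1)
        elif direction == "up":
            self.row[self.column] = max(self.row[self.column] - 1, 0)
        return self.column, self.row[self.column]
```

Remembering a row per column means returning to a column restores the user's previous position in it.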

Voice input control
Human language has a complex structure, and the lack of shared common ground sometimes makes it impossible for our machines to get us right. In this voice-input concept, we propose a way to improve the effectiveness of speech recognition: users can quickly correct the recognizer when it gets their intentions wrong.
This can be separated into "voice commands" and "voice dictation".
Neo can be a good aid for voice recognition.
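One way Neo could aid dictation is by letting a thumb swipe cycle through the recognizer's ranked alternatives instead of forcing the user to re-speak. A minimal sketch of that interaction; the class name is hypothetical, and in a real system the candidate list would come from the speech engine:

```python
class DictationCorrector:
    """Cycles through ranked transcription alternatives for one
    utterance; a thumb swipe advances to the next candidate."""

    def __init__(self, alternatives):
        self.alternatives = alternatives  # ranked transcriptions, best first
        self.index = 0

    def current(self):
        """The transcription currently shown to the user."""
        return self.alternatives[self.index]

    def swipe_next(self):
        """Advance to the next alternative, wrapping around."""
        self.index = (self.index + 1) % len(self.alternatives)
        return self.current()
```

Because correction is a single out-of-sight gesture, the user never has to look away from the dictated text.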


Future Work
Future work will focus on improving power consumption and developing a raycast pointer to act as a navigation aid.