
Touchless User Interface For Electronic Devices – VR / AR Headsets
WO2013132242A1
Filed by Elliptic Laboratories AS
The patent describes a hybrid sensing system that enables reliable, intuitive touchless interaction by combining optical and ultrasonic technologies. It tracks hand position and gestures in three-dimensional space, even when the hand moves beyond the visible range of a camera.
The core innovation is the combination of two complementary sensing mechanisms. Optical sensors (such as infrared or visible-light cameras) are used for precise hand tracking within the camera’s field of view, while ultrasonic sensors provide a much wider detection zone, extending interaction coverage beyond the limits of the optical system. This ensures continuous gesture tracking even when the hand leaves the camera frame, maintaining a seamless user experience.
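The arbitration between the two sensors can be illustrated with a minimal sketch. The class name, confidence field, and threshold below are illustrative assumptions, not details taken from the patent text; the idea is simply to prefer the precise optical estimate while it is available and fall back to the wider ultrasonic zone otherwise.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class HandSample:
    position: Tuple[float, float, float]  # metres, in device coordinates
    confidence: float                      # tracker confidence, 0..1

def fuse(optical: Optional[HandSample],
         ultrasonic: Optional[HandSample],
         min_confidence: float = 0.5) -> Optional[HandSample]:
    """Prefer the precise optical estimate while the hand is in the
    camera's field of view; fall back to the wider ultrasonic
    detection zone when optical tracking is lost or unreliable."""
    if optical is not None and optical.confidence >= min_confidence:
        return optical
    return ultrasonic
```

In a real system the handoff would likely blend the two estimates rather than switch hard, but the selection logic above captures the complementary roles the patent assigns to each sensor.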
This dual-sensor architecture solves a common problem in AR and VR systems: limited or interrupted input when hands move out of view. By integrating ultrasonic sensing, the system can still determine position, velocity, and intent, even when optical input is lost, enabling reliable, low-latency control in immersive environments like mixed-reality headsets, gaming systems, or smart displays.
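How the system could keep estimating position and velocity after the optical input drops out can be sketched with simple dead reckoning over ultrasonic range samples. The function names and the finite-difference approach are assumptions for illustration; the patent does not prescribe this particular estimator.

```python
from typing import List, Tuple

def estimate_velocity(samples: List[Tuple[float, float]]) -> float:
    """Finite-difference velocity (m/s) from timestamped ultrasonic
    range samples (time_s, range_m). Returns 0.0 if there are fewer
    than two samples."""
    if len(samples) < 2:
        return 0.0
    (t0, r0), (t1, r1) = samples[-2], samples[-1]
    return (r1 - r0) / (t1 - t0)

def predict_position(last_pos: float, velocity: float, dt: float) -> float:
    """Dead-reckon the hand's range dt seconds after the last fix,
    bridging gaps in optical coverage."""
    return last_pos + velocity * dt
```

A production tracker would more plausibly use a Kalman-style filter, but even this crude extrapolation shows how ultrasonic data can sustain low-latency control while the hand is outside the camera frame.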
Another key feature of the patent is real-time feedback. By providing continuous visual cues in response to hand motion, the system helps users understand how their gestures affect the interface, supporting faster learning and more natural interaction.
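One way such continuous visual cues could be driven is by mapping the tracked hand position directly to an on-screen cursor every frame. The coordinate ranges and display size below are hypothetical parameters chosen for the sketch.

```python
def hand_to_cursor(x: float, y: float,
                   width: int = 1920, height: int = 1080,
                   x_range=(-0.25, 0.25), y_range=(-0.15, 0.15)):
    """Map a hand position (metres, relative to the device) to screen
    pixels, clamped to the display, so the UI can render an immediate
    visual cue that follows the hand."""
    def norm(v, lo, hi):
        return min(max((v - lo) / (hi - lo), 0.0), 1.0)
    px = round(norm(x, *x_range) * (width - 1))
    py = round((1.0 - norm(y, *y_range)) * (height - 1))  # y grows downward on screen
    return px, py
```

Updating the cue at the sensor rate, regardless of which sensor is currently active, is what lets users see how their gestures affect the interface in real time.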
Compared to vision-only gesture systems, this approach is less sensitive to lighting conditions and occlusions, and consumes less power. It offers a robust and scalable solution for future-generation user interfaces, particularly in wearables, AR/VR devices, and compact consumer electronics.