
U-EYE INTERFACE
WORK IN PROGRESS
(full case study available soon)

CONTEXT
Recent developments in XR are pushing human-computer interaction into new territory. They could bring significant paradigm shifts in user accessibility and in the very nature of interaction.
CHALLENGE
– How might we develop new body-mediated modes of interaction and input?
– How might we allow people with physical limitations to easily interact with interfaces?
UPDATE
The Department of Neurology at University College London (UCL) approached Visyon to discuss the development of an input system based on eye-tracking technologies for AR and VR.

The initial brief was to enable users severely affected by neurological conditions such as amyotrophic lateral sclerosis (ALS) and multiple sclerosis (MS) to become more independent and regain the ability to interact with interfaces when their motor functions are compromised.
The hypotheses
Although ALS and MS are relatively rare conditions, there is a range of other use cases that we believe could benefit from an interface that doesn't require physical or tactile input: for example, patients facing a lengthy recovery from surgery or accidents, or extreme-condition and life-support suits that restrict movement, especially of the hands and fingers.

We believe this technology has the potential to find a number of different applications, ranging from healthcare to industry, education and entertainment.
The challenges
Video- and infrared-based eye-tracking technology is already widely adopted. In most cases it acts as an output sensor, recognising minute gaze movements such as the vestibulo-ocular reflex (VOR) and saccades, and translating them into data and heat maps, for example. Another recent use is infrared eye-tracking in augmentative and alternative communication (AAC), which points towards turning eye movement into input. We believe, however, that the technology can be pushed further still.
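As a concrete illustration of that "output sensor" role, the sketch below aggregates raw gaze samples into a coarse attention heat map. It is a minimal illustration only; the normalised sample format and the grid resolution are assumptions, not parameters from the project.

# Minimal sketch, for illustration only: accumulate normalised gaze
# samples (x, y in [0, 1]) into a coarse grid to build an attention
# heat map. Sample format and grid size are assumptions, not specs.
GRID_W, GRID_H = 32, 18  # assumed heat-map resolution

def heat_map(samples):
    """Count gaze samples per grid cell; busier cells = more attention."""
    grid = [[0] * GRID_W for _ in range(GRID_H)]
    for x, y in samples:
        col = min(int(x * GRID_W), GRID_W - 1)
        row = min(int(y * GRID_H), GRID_H - 1)
        grid[row][col] += 1  # each sample adds weight to its cell
    return grid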
Preliminary laboratory research led us to the first challenge: how to accurately 'filter' eye movements that are a response to visual stimuli from a new class of movements designed to work as inputs. Considering the physiology of the eye and its primary function as a receptor, it became evident that a specific set of gaze patterns would need to be identified or further developed.
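A common starting point for this kind of filtering, and one way to ground the problem, is a velocity-threshold classifier (often called I-VT), which separates slow fixations from fast saccades. The sketch below is an illustration under assumed conditions, not our method: the GazeSample format and the threshold value are hypothetical. Deliberate input gestures would then be searched for in the sequence of labels rather than in the raw signal.

# Minimal velocity-threshold (I-VT) sketch: label each gaze sample as
# 'fixation' or 'saccade' by its angular velocity. The sample format
# and the 30 deg/s threshold are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class GazeSample:
    t: float  # timestamp in seconds
    x: float  # horizontal gaze angle in degrees
    y: float  # vertical gaze angle in degrees

SACCADE_THRESHOLD = 30.0  # assumed velocity threshold, degrees/second

def classify(samples):
    """Return a 'fixation'/'saccade' label for every sample."""
    if not samples:
        return []
    labels = ["fixation"]  # first sample has no predecessor to compare
    for prev, cur in zip(samples, samples[1:]):
        dt = cur.t - prev.t
        if dt <= 0:  # guard against duplicate/out-of-order timestamps
            labels.append(labels[-1])
            continue
        velocity = ((cur.x - prev.x) ** 2 + (cur.y - prev.y) ** 2) ** 0.5 / dt
        labels.append("saccade" if velocity > SACCADE_THRESHOLD else "fixation")
    return labels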
Project status
We are currently working on several hypotheses on how to develop distinct eye movements and categories of gaze that can expand the set of commands in a given system, much like the touchless hand gestures used in AR, VR and wearables. One of the simplest such commands, sketched below, is dwell-based selection.
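Dwell selection, a staple of existing AAC systems, activates a target when a fixation is held on it for a set duration. This is a minimal sketch under assumed conditions; the class name, target geometry and timing value are hypothetical, not project parameters.

# Minimal dwell-selection sketch: a gaze held inside a target's bounds
# for DWELL_TIME_S triggers its action. Geometry and timing are assumed.
import time

DWELL_TIME_S = 0.8  # assumed dwell duration before a target activates

class DwellTarget:
    def __init__(self, x, y, w, h, on_activate):
        self.bounds = (x, y, x + w, y + h)
        self.on_activate = on_activate  # callback fired on a full dwell
        self._enter_time = None

    def update(self, gaze_x, gaze_y, now=None):
        """Feed the latest gaze sample; fires after an uninterrupted dwell."""
        now = time.monotonic() if now is None else now
        x0, y0, x1, y1 = self.bounds
        if x0 <= gaze_x <= x1 and y0 <= gaze_y <= y1:
            if self._enter_time is None:
                self._enter_time = now  # gaze just entered the target
            elif now - self._enter_time >= DWELL_TIME_S:
                self.on_activate()
                self._enter_time = None  # reset so it doesn't re-fire
        else:
            self._enter_time = None  # gaze left; restart the timer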
Next steps
Once a group of viable hypotheses is identified, the team will conduct rounds of user testing in specific use cases, continuing to iterate and refine the parameters until a satisfactory accuracy threshold is reached.
Concept XX
UX Design Vilmar Pellisson
Development Visyon
Support UCL