
Eye Tracking Remote

What
Lead Product Design
User Testing
Collaboration with Dev, A11y and Design System Teams

Who
Role: Lead Product Designer
Team: 10 members (2 Stakeholders, 1 Product Owner, 1 Lead Product Designer, 1 UX Designer, 1 Developer, 2 Researchers, 2 A11y Developers)
Why
Of Spectrum's 27,742,365 customers, over 3.468 million have some form of motor impairment (including but not limited to MS, cerebral palsy, stroke, spinal cord injury, spinal muscular atrophy, and ALS). Of those 3.468 million, over 74% have difficulty using our physical remote. Because our team is dedicated to ensuring that all of our products are usable by all of our users, we identified this as a large gap. To close it, we used eye tracking technology to create a remote that works for everyone. With the new eye tracking remote, users with motor impairments can control their set top box, interact with their TV, and consume content the same way users without motor impairments can.
How
We started the process by identifying competitors in our market and the specific solutions they offer for motor-impaired users. We found that although competitors do have products in the market, those products lack depth and are frustrating to use. We knew we could bring improvements to our users. To frame the entire project from the start, we created two main user personas:
--
Stuart
Gender: Male, Age: 57, Disability: ALS
Stuart was diagnosed with ALS 18 months ago and is looking for tools that allow him to maintain his independence. He would like to operate his TV without needing his caretaker, and gets frustrated when he is left unassisted and cannot change the channel or volume on his own.


Julia
Gender: Female, Age: 11, Disability: Spinal Cord Injury
Julia was in a car accident in which she sustained a significant spinal cord injury. As a result, Julia is paralyzed from the neck down but has retained her ability to speak. Julia's parents want to arm her with the tools she needs to continue to be a kid and maintain her independence. Julia currently uses an eye-gaze tablet to stay connected to her friends, do her homework, and control various IR devices in her home. At times Julia gets agitated by her inability to do many tasks on her own, like turning on the TV.

We then dug into user testing, researching existing eye-gaze, VR, and AR usability studies and conducting our own tests in house to fill the remaining gaps. We designed the initial screens on an eye-gaze tablet running Windows 10 with a built-in eye tracker; because we were working on top of an existing OS, we had to design around various safe areas.
Challenges
Because of the eye-gaze tablet's built-in OS, we were unable to adjust any safe areas, including the large teal toolbar that hugs the right rail. Many users are accustomed to this brand's specific architecture, and we did not want to create a custom experience when the existing user base relies on the OS's native controls. We continued to house all of our settings in the teal toolbar, as users frequently change their settings based on their upcoming task. We were also unable to pivot away from the physical remote's command set, as the current set top box only accepts those specific IR commands; we plan to tackle these challenges in the second phase.


Another major concern was tackling general remote control shortcomings. The main issues identified in our interviews were:
1. There is too much information.
2. The contrast is too low on both text and iconography.
3. The text isn't heavy enough to provide adequate contrast.
4. The existing remote iconography isn't clear enough for all of our users.

We were also unable to use physical texture or sizing, as this remote is not a physical object. Instead, we relied on color, organization, and hierarchy to give users enough information to navigate the interface.

Interaction Steps from top left to bottom right:
1. A circular cursor follows the user's eyes to aid in cursor identification and to improve overall accuracy.
2. When the user holds their gaze on a specific button for 1.2 seconds, the rest of the UI fades by 20% opacity.
3. Once the gaze timer is active, the circular cursor shrinks over the dwell duration until it reaches 84px. This focusing of the cursor aids accuracy and confirms the button press.
4. Once the gaze duration is completed, the cursor returns to its original size and the button changes state for 0.8s.
5. Once the user gazes off the button, the UI returns to 100% opacity.
6. If the user keeps gazing at the same button (for a multi-press), the cursor animation restarts to indicate a separate press.
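The interaction steps above can be sketched as a small state machine. This is a minimal illustration only; the class name, starting cursor size, and the per-frame `update` API are assumptions, not the remote's actual code.

```python
# Illustrative sketch of the dwell interaction; names and the 120px
# starting cursor size are assumptions, not taken from the product.

CURSOR_FULL_PX = 120   # assumed starting cursor diameter
CURSOR_MIN_PX = 84     # cursor shrinks to 84px over the dwell (step 3)
FADE_DELAY_S = 1.2     # UI fades 20% after the gaze is held 1.2s (step 2)
PRESS_STATE_S = 0.8    # button shows its pressed state for 0.8s (step 4)


class DwellCursor:
    """Tracks one gaze target and turns sustained gaze into button presses."""

    def __init__(self, dwell_time_s=2.0):  # user-configurable, 1-3s
        self.dwell_time_s = dwell_time_s
        self.gazed_button = None
        self.gaze_elapsed = 0.0

    def update(self, gazed_button, dt):
        """Advance by dt seconds; returns the pressed button name or None."""
        if gazed_button != self.gazed_button:
            # Step 5: gaze moved to a new target (or off every button).
            self.gazed_button = gazed_button
            self.gaze_elapsed = dt if gazed_button else 0.0
            return None
        if gazed_button is None:
            return None
        self.gaze_elapsed += dt
        if self.gaze_elapsed >= self.dwell_time_s:
            # Step 6: reset the timer so a continued gaze reads as a
            # separate press (the multi-press restarts the animation).
            self.gaze_elapsed = 0.0
            return gazed_button
        return None

    @property
    def ui_opacity(self):
        # Step 2: the rest of the UI fades by 20% after 1.2s of gaze.
        held = self.gazed_button is not None and self.gaze_elapsed >= FADE_DELAY_S
        return 0.8 if held else 1.0

    @property
    def cursor_px(self):
        # Step 3: cursor shrinks linearly toward 84px over the dwell time.
        if self.gazed_button is None:
            return CURSOR_FULL_PX
        t = min(self.gaze_elapsed / self.dwell_time_s, 1.0)
        return CURSOR_FULL_PX - (CURSOR_FULL_PX - CURSOR_MIN_PX) * t
```

In practice a rendering loop would call `update()` each frame with the button currently under the gaze point and the frame's elapsed time.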
Solution
After testing, we found that the ideal settings for both speed and accuracy were dwell controls: a user stares at their desired button and holds their gaze for 1 to 3 seconds to send the command to the TV. We then dug into our existing research to discover the most pressed buttons and the most common combinations of button presses, which helped us organize the content. Based on existing studies, we found the ideal button size for dwell accuracy and speed, further helping our users complete tasks without stress.

On the design side, we bumped up the contrast, increased the typography weight, and clarified the iconography by using more recognizable icons that speak both to the general zeitgeist and to particular functions in Spectrum products. We went with dark mode and used a light blue (over white) to help preserve battery life, as testing surfaced short battery life as a concern and showed lower power usage with our chosen palette. Finally, we improved the interactions to help users know when they had sent a command to their TV via a button press. Although we recommend dwell controls, we also created a custom experience for users who prefer blink and smile controls.