Eyepointing Interface Research at Stanford


A number of years ago, just as Windows 3.1 was being replaced by Windows 95 in the office where I worked, one of the newest employees I worked with was having an incredibly hard time learning to use a mouse.

She held it gingerly, as if afraid she might break it. She had just been given a new computer; her old one was a DOS machine with a keyboard-driven menu. It really didn’t look like she would get the hang of it.

About a year later, she was actively teaching others in the office how to use a fairly complex software system that I had helped build and roll out through the office. Without Solitaire, she might never have mastered that mouse and gotten as far as she did.

As I watched her initial efforts to use the mouse, and the frustration on her face (she had worked for many years in offices that used typewriters, and learning to use a computer was a big step to begin with), I remembered hearing that Microsoft included the game with their operating systems to help people learn to use a mouse. Figuring I had nothing to lose, I showed her how to get to the game.

As we worked on fine-tuning this new software, in which she developed real expertise, she provided a lot of great feedback on the workflows involved in performing different tasks on the system. I think that early experience attuned her to how people interact with computer systems, and our experience together made me pay a great deal of attention to human-computer interface issues.

Three new papers from researchers at Stanford University describe studies involving a user interface that may make it easier for people with disabilities, and people without, to navigate around a computer screen, scroll down pages, and switch between applications. The authors of these papers tell us:

For our research we chose to investigate how gaze-based interaction techniques can be made simple, accurate and fast enough to not only allow disabled users to use them for standard computing applications, but also make the threshold of use low enough that able-bodied users will actually prefer to use gaze-based interaction.

GUIDe: Gaze-enhanced UI Design

EyePoint: Practical Pointing and Selection Using Gaze and Keyboard

Gaze-enhanced Scrolling Techniques

Might the gaze-based system described in these papers be an interface that we use in tomorrow’s computers? I can’t answer that for certain, but if it is, I might be playing a fair amount of Solitaire to get the hang of it.


14 thoughts on “Eyepointing Interface Research at Stanford”

  1. Offering a low-pressure, familiar game to help users learn new techniques is a hugely helpful approach. The strain of learning a new game is gone, as is the strain of potentially making a mistake during a critical activity.

    These gaze pointers sound like a great way to escape some repetitive motion problems; although I’m with you on playing a lot of Solitaire!

  2. Just the suggestion that she play Solitaire changed her disposition completely. She went from confused to enthusiastic in a matter of moments.

    Excellent point on the repetitive motion issues.

    I like the idea of these gaze pointers, and wouldn’t mind the chance to try this interface out at all. I am wondering how confusing they might be to people who might have to use them for the first time in a work situation – hopefully those folks will get a chance to play some solitaire to learn how. 🙂

  3. I’ve also used Solitaire with good success for people unfamiliar with a mouse. It works quite well in a classroom setting, as those who get the hang of it quickly usually don’t mind playing Solitaire a few extra minutes while the rest of the class gets up to speed.

  4. The office I was working in at the time was a courthouse, and the judges weren’t too fond of people playing games or accessing the internet when they were working, though they would let them during breaks.

    I made sure that I let her supervisors know that I told her to play some Solitaire to get used to working her mouse. They had seen how intimidated she was by it – they had no problems with it.

  5. Interesting how they used gaze-based interaction to make things easier for users. I wonder if all that time in my life playing Solitaire actually helped me to understand the Windows GUI better. Interesting read nonetheless.

  6. I have been thinking about using this type of technology in an upcoming project. I was glad to see what you had to say on the subject.


  7. Thank you, Jesse.

    The eyepointing technology does sound pretty interesting, and helpful to people who might have problems using keyboards and other pointing devices.

  8. Dear William (firstly, it’s rather cool to see seobythesea on the Stanford domain), I remember that in 1999 my mother (she is a genetics expert) also used to be afraid of the mouse. However, as proof that people learn fast, I can tell you that today she does 3D animations, uses Catia, and can even program. Life is a miracle, isn’t it?

  9. Hi Emil,

    Life is indeed a miracle, and one that we should be thankful for every day. I love that there are dedicated researchers like the ones working on eye-pointing interfaces that can help people who might not otherwise be able to interact effectively with a mouse and computer, and that people who might hesitate to use a computer at first are diving in and developing skills like your mother has.

  10. Very cool article. In college, I worked in a lab focused on early detection and intervention for autism, and we used a lot of eye-tracking, gaze-based tools for the studies and interventions.

    Also, I have a friend with high-functioning Asperger’s, and he has told me that sometimes it’s easier to communicate on the web (e.g., Facebook) than in conversation, since he is able to digest what is said and then think about a reply, whereas in real time that’s more difficult.

    I think that now that 1 in 88 people are showing autism-like symptoms (many different conclusions to be drawn here), developers and programmers will start focusing on user interfaces that cater to people with disabilities.

    Also check out BCI (Brain Computer Interface).

    Bill, do you think that progress might be made on usable interfaces in general by designing interfaces to suit the needs of people with disabilities? My initial thought is that it would, since we could find commonalities in interface preferences among all kinds of people and find the bottom line.

  11. Hi Bill,

    Thanks for the reply. Yes, some mentors in the ABA program that I work with use the ABA therapy app! It seems to work pretty well, and the kids’ eyes get wide when they see the iPad screen for some reason (don’t think that’s limited to kids with autism, haha).

    I’ll pass this list on to the mentor team; there are a lot of apps on there that I’m sure people haven’t heard of yet. Thanks!


  12. Hi David,

    You’re welcome. Something about gadgets like that just gets (many) kids to open up and get excited.

    Happy to hear that the mentor team may find some new apps on that page.

Comments are closed.