Researchers at Queen’s University’s Human Media Lab (HML) have developed “attentive” computers that sense whether their users are busy and wait their turn before interrupting. From Queen’s University:
Queen’s researchers invent computers that ‘pay attention’ to users
Reduce interruptions from e-mail, cell phones, digital appliances
(Kingston, ON) – With increasing numbers of digital devices vying for our attention and time today, researchers from the Human Media Lab (HML) at Queen’s University have developed a new concept that allows computers to pay attention to their users’ needs.
HML researchers are addressing the problem of the barrage of messages people receive from large numbers of digital appliances. Their Attentive User Interface (AUI) is a new paradigm for interacting with groups of computers that moves beyond the traditional desktop interface.
Current computers are generally designed to act in isolation, interrupting users without regard to what they are doing. As a result, today’s user struggles to keep up with the volume of e-mail, instant messages, phone calls and appointment notifications.
“Today’s digital lifestyle has the unfortunate side effect of bombarding people with messages from many devices all the time, regardless of whether they’re willing or able to respond,” says HML director Dr. Roel Vertegaal. “Like spam [unsolicited e-mail], this problem needs to be addressed.” The HML team is designing devices that determine the level of user attention and the importance of each message relative to what the user is doing. The computer then decides whether to “take a turn” to deliver the message.
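The release does not spell out how a device weighs a message against the user’s state, but the “take a turn” decision can be sketched as a simple comparison of message priority against the current attention level. Everything in the minimal Python sketch below (the Attention levels, the Message fields, the threshold rule) is an illustrative assumption, not HML’s actual algorithm:

```python
from dataclasses import dataclass
from enum import IntEnum

class Attention(IntEnum):
    """Coarse levels of user attention, lowest to highest (assumed scale)."""
    ABSENT = 0           # no user detected near the device
    IDLE = 1             # present but not engaged
    ENGAGED = 2          # actively working on a task
    IN_CONVERSATION = 3  # face-to-face conversation in progress

@dataclass
class Message:
    sender: str
    priority: int  # 0 = bulk mail .. 3 = urgent call (assumed scale)

def should_take_turn(msg: Message, attention: Attention) -> bool:
    """Interrupt only if the message's importance outweighs the cost
    of breaking into the user's current activity."""
    return msg.priority > int(attention)

# A routine e-mail waits while the user is engaged in a task,
# but an urgent call still gets through.
print(should_take_turn(Message("newsletter", 0), Attention.ENGAGED))   # False
print(should_take_turn(Message("urgent call", 3), Attention.ENGAGED))  # True
```

Under a rule like this, a bulk e-mail waits while the user is engaged but a high-priority call still interrupts; a real system would also have to decide how and on which device to deliver the message.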
Next week in Fort Lauderdale, Florida, Dr. Vertegaal and his students, Jeffrey Shell, Alexander Skaburskis and Connor Dickie, will present their findings at the prestigious ACM CHI 2003 Conference on Human Factors in Computing Systems. HML works in collaboration with IBM Almaden Research Center in San Jose, and Microsoft Research in Redmond, Washington. This month the Association for Computing Machinery’s (ACM) flagship publication, Communications of the ACM, features a special issue on Attentive User Interfaces, edited by Dr. Vertegaal.
“The way that we use computers has fundamentally changed,” says Dr. Vertegaal. “There has been a shift over the past four decades from many users sharing a single mainframe computer, to a single user with a single PC, to many people using many portable, networked devices.
“We now need computers that sense when we are busy, when we are available for interruption, and know when to wait their turn – just as we do in human-to-human interactions,” the HML director continues. “We’re moving computers from the realm of being merely tools, to being ‘sociable’ appliances that can recognize and respond to some of the non-verbal cues humans use in group conversation.”
Many of the Queen’s team’s discoveries are rooted in their research into the function of eye contact in managing human group conversation. One of the main underlying technologies they developed is an eye contact sensor that allows each device to determine whether a user is present, and further, whether that user is looking at the device. This allows devices to establish what the user is attending to, and when, whether, and how they should contact the user, if at all.
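The release names the sensor’s two outputs (whether a user is present, and whether that user is looking at the device) but not a software interface. Here is a minimal sketch of how a device might use those two signals to hold a non-urgent notification until it is actually being attended to; the class, method names, and polling loop are all assumptions, with the sensor mocked by random readings:

```python
import random
import time

class EyeContactSensor:
    """Stand-in for HML's computer-vision eye contact sensor, which
    reports whether a user is present and whether that user is looking
    at the device. This mock returns random readings for demonstration."""

    def user_present(self) -> bool:
        return random.random() < 0.8

    def user_looking(self) -> bool:
        return random.random() < 0.3

def deliver_when_attended(sensor: EyeContactSensor, message: str,
                          poll_seconds: float = 0.5,
                          timeout: float = 5.0) -> bool:
    """Hold a non-urgent message until the user actually looks at the
    device, instead of interrupting immediately."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if sensor.user_present() and sensor.user_looking():
            print(message)
            return True
        time.sleep(poll_seconds)
    return False  # never attended; caller may try another device

deliver_when_attended(EyeContactSensor(), "You have 3 new messages")
```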
Funding support for the Human Media Lab comes from the Premier’s Research Excellence Awards (PREA), the Natural Sciences and Engineering Research Council (NSERC), the Institute for Robotics and Intelligent Systems (IRIS), Communications and Information Technology Ontario (CITO) and Microsoft Research.
A number of AUI applications have been developed to date at the Human Media Lab:
• Eye contact sensors use computer vision to track when a person looks at a device.
• Eye contact sensing glasses recognize when people look at each other.
• Eye proxy, a pair of robotic eyes with embedded eye contact sensors, allows a computer to look back at the user, visually communicating its attention.
• An attentive videoconferencing system communicates eye contact over the Internet through video images, optimizing bandwidth on the basis of the joint attention of users.
• Attentive cell phones use eye contact sensing glasses to determine when users are in face-to-face conversations, automatically shifting from audible rings to vibration alerts.
• Attentive speaker phones allow users to initiate calls by looking at an eye proxy representing the remote person.
• Attentive messaging systems (AMS) forward e-mails to the device currently in use.
• Attentive televisions automatically pause when nobody is watching them.
• Attentive home appliances allow people to use their eyes as pointing devices and their mouths as keyboards. These appliances use speech recognition to execute commands; eye contact sensors determine which device is the target of a command (e.g. the attentive desk light activates when the user looks at the fixture and responds when the user says “Turn on”). A sketch of this gaze-plus-speech dispatch appears after this list.
• Auramirror is a video mirror that visualizes the exchange of attention between people in conversations. Each person’s “aura” of attention is represented as a bubble of viscous fluid that grows in the direction of their eye gaze. When two people look at each other, their bubbles connect, representing mutual attention; when they look at the mirror, the bubble pops. This process serves as a metaphor for attention and interruption.
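As promised above, here is a minimal sketch of the gaze-plus-speech dispatch described under attentive home appliances: the eye contact sensor supplies a gaze target, and a recognized utterance is routed only to the appliance being looked at. The sensor and speech recognizer are mocked, and all names here are illustrative assumptions rather than HML code:

```python
class Appliance:
    """A speech-controllable device; real appliances would expose
    richer commands (the names and commands here are assumptions)."""

    def __init__(self, name: str):
        self.name = name
        self.on = False

    def handle(self, command: str) -> None:
        if command == "turn on":
            self.on = True
        elif command == "turn off":
            self.on = False
        print(f"{self.name} is now {'on' if self.on else 'off'}")

def dispatch(utterance: str, appliances: list, gaze_target=None) -> None:
    """Route a recognized utterance to the appliance the user is looking
    at; with no gaze target the command is ambiguous and is dropped."""
    for appliance in appliances:
        if appliance.name == gaze_target:
            appliance.handle(utterance)
            return
    print("No appliance attended; command ignored")

desk_light = Appliance("desk light")
television = Appliance("television")
dispatch("turn on", [desk_light, television], gaze_target="desk light")
dispatch("turn off", [desk_light, television], gaze_target=None)
```

The design point worth noting is the division of labor: gaze disambiguates and speech commands, so when no device is being attended to, the utterance is simply dropped rather than guessed at.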
For high resolution pictures of the above projects: http://www.hml.queensu.ca/press.html
To download HML papers: http://www.hml.queensu.ca/papers.html
For details of the CHI 2003 conference: http://www.chi2003.org