
ABOUT

The CHINZ conference provides a general forum for all those involved with Computer Human Interaction (CHI or HCI) to address design-centred human use of technology. We welcome full-length and short research papers in all areas of this diverse field, and we especially encourage graduate students to present at the conference. There will be poster sessions and hands-on demonstrations during the conference, at which we encourage industry practitioners to present.

In recent years, more and more researchers in the field of HCI have looked to apply principles from the design industry to their work. Many have come to see themselves as Interaction Designers rather than just Interface Designers. The conference will address a wide range of topics around the central theme of Design-Centred HCI. Topics of interest include, but are not limited to:

Interaction design; user-centred design; applying design principles in HCI; social implications of technology (e.g., for disabled or elderly users); mobile technologies; games; robotics; tools and tool support; cyberpsychology; usability; guidelines and heuristics; implications of technology for cognition; educational aspects; and industry case studies.

If you are unsure whether your work is appropriate for the conference, please contact the organisers for more details.

 

KEYNOTE by Stephen Brewster

We are honoured to have Prof Stephen Brewster, one of the leading researchers in multimodal and mobile HCI, as our keynote speaker. His research focuses on multimodality: using multiple sensory modalities (particularly hearing, touch and smell) to create a rich, natural interaction between human and computer.

Abstract

‘Head up’ interaction: can we break our addiction to the screen and keyboard?

Our interactions with mobile devices are based on techniques developed for desktop computers in the 1970s, such as buttons, sliders, windows and progress bars. This seminar will look at the possibility of moving away from these kinds of interactions to ones more suited to mobile devices and their dynamic contexts of use, where users need to be able to look where they are going, carry shopping bags and hold on to children. I will present a range of multimodal (audio and tactile) interactions that we have developed which can be used eyes- and hands-free, allowing users to interact in a ‘head up’ way.

I will present some of the work we have done on input using pressure and on gestures made with the fingers, wrist and head, along with work on output using non-speech audio, 3D sound and tactile displays, in applications such as text entry, camera-phone interfaces and navigation. I will talk about how we designed these for mobile use and the evaluation techniques we have developed to assess whether they are effective for users on the move.

 

CALL FOR PARTICIPATION

 

Hosted by:

Department of Information Systems and Operations Management and Department of Computer Science

University of Auckland

ACM SIGCHI New Zealand