Abstract:

Computers offer valuable assistance to people with physical disabilities. However, designing human-computer interfaces for these users is complicated: their range of abilities is more diverse than that of able-bodied users, which makes analytical modelling harder, and practical user trials are difficult and time-consuming. I have developed a simulator to help with the design and evaluation of assistive interfaces. It can predict the likely interaction patterns when undertaking a task with a variety of input devices, and estimate the time to complete the task in the presence of different disabilities and at different levels of skill. The simulator also addresses shortcomings of existing HCI models, with the aim of being easier to use than those models and of supporting both able-bodied and disabled users.

The simulator is developed according to the concept of the Model Human Processor. It consists of a perception model, a cognitive model and a motor-behaviour model. The perception model simulates the phenomena of visual perception (such as focusing and shifting attention). I have investigated the eye-gaze patterns (using a Tobii X120 eye tracker) of people with and without visual impairment. My model can reproduce the results of previous experiments on visual perception in the context of HCI, and can also simulate the effects of different visual impairments (e.g. wet and dry macular degeneration, diabetic retinopathy and tunnel vision) on interaction. The cognitive model uses the CPM-GOMS model to simulate expert performance, and includes a novel, easy-to-use module that simulates novice performance based on the concept of the dual-space model. Finally, the motor-behaviour model is developed through statistical analysis of cursor traces from motor-impaired users. I have evaluated the hand strength (using a Baseline 7-piece Hand Evaluation Kit) of able-bodied and motor-impaired people and investigated how hand strength affects HCI. The main contributions of my work are:

Identification and calibration of two image-processing algorithms to predict the points of eye-gaze fixation, and the corresponding fixation durations, during visual search on a computer screen by people with and without visual impairment.
Analysis of eye-movement trajectories during visual search on a computer screen, and identification of the most probable strategies for predicting the actual trajectory.
Investigation of the effect of hand strength on human-computer interaction.
Development of a statistical model to predict pointing times of motor-impaired computer users based on their hand strength.
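The three-stage Model Human Processor structure described above can be sketched as a simple timing pipeline. Everything below (the function names, the timing constants, the Fitts'-law-style slope) is a hypothetical illustration of the idea, not the thesis implementation:

```python
# Minimal sketch of an MHP-style pipeline: total task time is the sum
# of perception, cognition and motor estimates per step. All constants
# here are hypothetical placeholders, not values from the thesis.
from math import log2

def perception_time(n_targets, impairment_factor=1.0):
    # Visual-search time grows with the number of on-screen targets;
    # the impairment factor scales it (>1 for e.g. macular degeneration).
    return impairment_factor * (0.1 + 0.04 * n_targets)  # seconds

def cognition_time(skill="expert"):
    # Experts retrieve the next operator quickly (CPM-GOMS style);
    # novices pay an extra decision cost.
    return 0.05 if skill == "expert" else 0.35  # seconds

def motor_time(distance, width, slope=0.15):
    # Fitts'-law-style pointing estimate; the slope could in principle
    # be scaled by measured hand strength for motor-impaired users.
    return 0.1 + slope * log2(distance / width + 1)  # seconds

def task_time(steps, skill="expert", impairment_factor=1.0):
    return sum(perception_time(s["targets"], impairment_factor)
               + cognition_time(skill)
               + motor_time(s["distance"], s["width"])
               for s in steps)

steps = [{"targets": 12, "distance": 300, "width": 40},
         {"targets": 5, "distance": 120, "width": 60}]
print(round(task_time(steps, skill="novice"), 2))  # 2.48 seconds
```

In the actual work, such stages are calibrated against eye-tracking data and cursor traces rather than fixed constants.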
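As a hedged illustration of the last contribution, the sketch below fits pointing time as a linear function of grip strength using ordinary least squares. The data points and resulting coefficients are synthetic, not the thesis's measurements or its actual model form:

```python
# Illustrative only: fitting pointing time as a linear function of
# grip strength with ordinary least squares. The observations and the
# resulting coefficients are synthetic, not the thesis's measurements.

def fit_line(xs, ys):
    # Ordinary least squares for y = a + b*x.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# Hypothetical observations: grip strength (kg) vs. pointing time (s).
grip = [10, 15, 20, 25, 30, 35]
times = [3.1, 2.6, 2.2, 1.9, 1.7, 1.5]

a, b = fit_line(grip, times)
print(round(a + b * 18, 2))  # predicted pointing time at 18 kg: 2.45 s
```

A real model of this kind would be fitted to the cursor traces and hand-strength measurements described in the abstract.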

My studies are already being used to design and develop inclusive computer interfaces (e.g. an accessible game and a new assistive interaction technique). My university has recently been awarded EU funding for the GUIDE project, which will employ results from my PhD research.

Full Thesis:
Download Pradipta Biswas’s Full Thesis

Thesis Advisor:
Peter Robinson

Award Date:
March 1, 2010

Institution:
University of Cambridge Computer Laboratory
Cambridge, UK

Author Contact:
pb400-ta-nullcam-tod-ac-tod-uk