Can we do without GUIs? Gesture and speech interaction with a patient information system

  • Original Article
Personal and Ubiquitous Computing

Abstract

We have developed a gesture input system that provides a common interaction technique across mobile, wearable and ubiquitous computing devices of diverse form factors. In this paper, we combine our gestural input technique with speech output and test whether the absence of a visual display impairs usability in this kind of multimodal interaction. This question is of particular relevance to mobile, wearable and ubiquitous systems, where visual displays may be restricted or unavailable. We conducted the evaluation using a prototype system that combines gesture input and speech output to provide information to patients in a hospital Accident and Emergency Department. One group of participants was instructed to access various services, delivered by automated speech output, using gestural inputs. Throughout their tasks, these participants could see a visual display on which a GUI presented the available services and their corresponding gestures. Another group of participants performed the same tasks without this visual display. We predicted that the participants without the visual display would make more incorrect gestures and take longer to perform correct gestures than the participants with the visual display. We found no significant difference in the number of incorrect gestures made; moreover, participants with the visual display took longer than participants without it. We suggest that for a small set of semantically distinct services with memorable and distinct gestures, the absence of a GUI visual display does not impair the usability of a system with gesture input and speech output.
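The gestural input technique referred to above is based on the authors' directional stroke recognition work. As an illustrative sketch only, not the authors' implementation: a stroke's overall direction can be classified into one of a small number of compass sectors from its start and end points, which is what makes the same gestures usable across devices of diverse form factors.

```python
import math

def stroke_direction(start, end, n_dirs=8):
    """Classify a stroke into one of n_dirs direction sectors.

    Sector 0 is east, numbering proceeds counter-clockwise.
    A simplified sketch of directional stroke classification;
    the real recogniser handles multi-stroke gestures and noise.
    """
    dx, dy = end[0] - start[0], end[1] - start[1]
    # Angle of the stroke in [0, 2*pi), quadrant-aware via atan2.
    angle = math.atan2(dy, dx) % (2 * math.pi)
    sector = 2 * math.pi / n_dirs
    # Offset by half a sector so each sector is centred on its axis.
    return int((angle + sector / 2) // sector) % n_dirs
```

For example, a stroke from (0, 0) to (1, 0) is classified as sector 0 (east), and a stroke from (0, 0) to (0, 1) as sector 2 (north).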




Acknowledgements

The research reported here is part of a UK EPSRC-funded research project ‘Designing for common ground in mobile distributed collaborative systems’, award number GR/R24562/01. We thank Hilary Johnson and Leon Watts for their advice and insightful comments.

Author information

Correspondence to Eamonn O’Neill.

Appendix

Transcript of the speech output played to participants in the experimental evaluation. The text below was prerecorded; the speech output responses to participants’ gestures were generated in real time, based on which gestures were made.

1.1 Gesture Interaction in a Hospital Waiting Room

1.1.1 Background

You have just come to the hospital with a very painful wrist and you want a doctor to check it over. You have never really needed to go to the hospital before, and are therefore unsure of the process of being seen by a doctor.

Upon entering the hospital, the first thing you see is a Reception desk. You are greeted by a nurse at the desk, who takes your personal details and asks you to wait. You sit down in the waiting area. On the wall of the waiting area is a large computer display, listing the services that are available to you and the gesture that invokes each service. When you make a gesture, the corresponding service is delivered to you via loudspeakers.

1.1.2 Evaluation

We are running a test of the gesture-based system to see how it performs and how usable people find it. Please note that we are evaluating the performance of the computer system, not of you. During the evaluation you will be asked to perform specific gestures, which will provide you with particular services during your time at the hospital. Please note that, in addition to these requested gestures, you can perform the ‘help’ gesture at any time to hear what services are available to you and their associated gestures.
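The interaction model in this briefing — a small set of gestures, each mapped to one spoken service, plus a ‘help’ gesture that enumerates the whole mapping — can be sketched as a simple dispatcher. The gesture names and response texts below are illustrative assumptions drawn from the tasks in this transcript, not the system’s actual implementation:

```python
# Hypothetical gesture-to-service mapping; names and texts are
# illustrative, based on the services in the evaluation tasks.
SERVICES = {
    "overview": "The hospital process is: registration, triage, treatment, discharge.",
    "wait_time": "Your estimated time to treatment is forty minutes.",
    "cafe": "The cafe is through the double doors on your left.",
    "music_on": "Playing music.",
    "music_off": "Music off.",
    "next_step": "Please wait; a member of staff will call you.",
    "taxi": "A taxi has been booked for you.",
}

def handle_gesture(gesture: str) -> str:
    """Return the speech output text for a recognised gesture."""
    if gesture == "help":
        # The 'help' gesture enumerates every available service
        # and its gesture, as described in the briefing.
        return " ".join(f"{g}: {text}" for g, text in SERVICES.items())
    # Unrecognised gestures fall back to a prompt for 'help'.
    return SERVICES.get(
        gesture,
        "Gesture not recognised. Make the help gesture to hear the available services.",
    )
```

In the evaluated prototype the output was delivered as speech over loudspeakers; here the returned string stands in for that spoken response.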

You now have <5/10 min> of training time to familiarise yourself with the available services and their associated gestures. If you have any questions, please ask one of the evaluators before the experiment starts.

1.1.3 The Tasks

As you are unfamiliar with the hospital process, you wish to hear an overview of what is involved.

  1. Perform a gesture to hear an overview of the hospital process.

When you gave your details at Reception, you were asked to take a seat in the waiting room and told that a nurse would see you as soon as possible. You are impatient and would like information on how long you will have to wait before treatment.

  2. Perform a gesture to hear the estimated treatment time.

The estimated treatment time is quite long, so you decide to go and get a drink from the café.

  3. Perform a gesture to get directions to the hospital café.

After getting your drink you return to the waiting room and continue to wait. While you are waiting you decide to listen to some music.

  4. Perform a gesture to listen to some music.

You are now called by the nurse for a pre-examination of your injury to see how serious it is. You don’t want to be distracted by the music while you are talking with the nurse.

  5. Perform a gesture to turn the music off.

Although your injury seems not to be too serious, the nurse would like you to be seen by a doctor in case of a possible fracture. The nurse also suggests that you may need to be x-rayed. The nurse asks you to take a seat in the waiting room and tells you that a doctor will see you as soon as possible.

While in the waiting room, you wish to pass the time and therefore decide to listen to some music again.

  6. Perform a gesture to listen to some music.

After waiting for some time, you wonder if you are meant to do anything before seeing the doctor, such as going for the x-rays suggested by the nurse.

  7. Perform a gesture to hear the next step in the process.

The doctor comes to the waiting room and calls you to be seen.

  8. Perform a gesture to turn the music off.

You follow the doctor to a treatment cubicle. After examining you, the doctor decides that your wrist must be x-rayed. You are told to wait in the radiology waiting room.

  9. Perform a gesture to listen to some music.

After a short wait there, you are called by a radiographer.

  10. Perform a gesture to turn the music off.

Your wrist is x-rayed and you are again asked to wait in the radiology waiting room. You are now unsure what you should be doing next.

  11. Perform a gesture to hear the next step in the process.

The radiographer returns and gives you your x-ray plates. You return to the treatment cubicle. The doctor is satisfied that your wrist is not broken but is just slightly sprained. The doctor leaves you with a nurse who bandages your wrist and is then called away.

You are now unsure whether it is all right for you to leave, or whether you have to make a follow-up appointment.

  12. Perform a gesture to hear the next step in the process.

You now know that you have been discharged and are therefore able to leave the hospital and go home.

  13. Perform a gesture to book a taxi.

While you are waiting for your taxi, you decide to go to the hospital café and get some breakfast. However, you have forgotten where the café is.

  14. Perform a gesture to get directions to the hospital café.

You get some breakfast from the café and your taxi arrives to take you home.


Cite this article

O’Neill, E., Kaenampornpan, M., Kostakos, V. et al. Can we do without GUIs? Gesture and speech interaction with a patient information system. Pers Ubiquit Comput 10, 269–283 (2006). https://doi.org/10.1007/s00779-005-0048-1
