HUDs as Immersive Devices

Of late I have become involved in two separate development projects for the University of Otago. Both are in the area of medicine: the first is the Otago Virtual Hospital, a set of already-developed simulations that allow students to practise clinical decision making; the second is a new development involving the creation of scenario-based activities for the Occupational and Aviation Medicine Unit.

My task in the virtual hospital, currently underway, is to analyse the two simulations, optimise the programming and suggest build improvements that will enhance their purpose. The reconfigured scenarios will then be further developed as rezzable builds through the creation of a holodeck-type system, allowing multiple students to use the scenarios simultaneously, as and when needed, rather than relying on the single build currently available.
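
By way of illustration, a holodeck-type system in Second Life or OpenSim generally amounts to a control object that holds the packaged scenario builds in its inventory and rezzes a copy on demand. What follows is a minimal LSL sketch of that idea, not the actual project code; the channel number, offsets and menu text are placeholders.

    // Minimal holodeck-style rezzer sketch (channel and offsets are hypothetical).
    // The control prim holds packaged scene objects in its inventory and
    // rezzes whichever one a student selects from a dialog menu.
    integer MENU_CHANNEL = -73521;  // arbitrary private dialog channel

    default
    {
        state_entry()
        {
            llListen(MENU_CHANNEL, "", NULL_KEY, "");
        }

        touch_start(integer total_number)
        {
            // Offer each inventory object as a button (llDialog allows up to 12)
            list scenes = [];
            integer i;
            for (i = 0; i < llGetInventoryNumber(INVENTORY_OBJECT); ++i)
                scenes += llGetInventoryName(INVENTORY_OBJECT, i);
            llDialog(llDetectedKey(0), "Choose a scenario to rez:", scenes, MENU_CHANNEL);
        }

        listen(integer channel, string name, key id, string message)
        {
            // Rez the chosen scene a few metres above the rezzer
            llRezObject(message, llGetPos() + <0.0, 0.0, 3.0>, ZERO_VECTOR, ZERO_ROTATION, 0);
        }
    }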

The second project, close to commencement, involves the creation of an occupational environment representative of a cement manufacturing plant. The build will comprise immersive spaces descriptive of the actual site, where students and teaching staff can discuss the industrial processes and assess the occupational risks faced by particular workers at the plant. A health clinic room will also be built where the workers can undergo medical examination by the students, resulting in written clinical assessments.

Both projects involve the examination of patients by clinical staff, and it is this aspect of the scenarios that this post investigates.

One of the difficulties with patient examinations in virtual environments such as Second Life and OpenSim is portraying them realistically. While a clinician might walk to the patient adequately using default animations, any realism in the examination movements would then require a number of custom animations to be played out. Take, for example, picking up the patient’s hand to take a pulse: not only would the clinician’s animation of reaching for the hand and holding it correctly for pulse taking need developing, but so would the synchronised raising of the patient’s hand in time with the clinician’s movement.
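
To give a sense of the scripting involved, synchronising two avatars’ animations in LSL usually means two scripts, one holding animation permission for each avatar, triggered by a common signal, since a single script can hold run-time permissions for only one agent at a time. Below is a rough sketch of one half of such a pair; the animation names are hypothetical.

    // One half of a synchronised animation pair (hypothetical animation
    // names). A copy of this script sits in each poseball; a controller
    // broadcasts a link message and both halves start together.
    string ANIM = "take_pulse_clinician";  // "take_pulse_patient" in the other copy

    default
    {
        state_entry()
        {
            llSitTarget(<0.0, 0.0, 0.5>, ZERO_ROTATION);  // needed for llAvatarOnSitTarget
        }

        changed(integer change)
        {
            if (change & CHANGED_LINK)
            {
                key sitter = llAvatarOnSitTarget();
                if (sitter != NULL_KEY)
                    llRequestPermissions(sitter, PERMISSION_TRIGGER_ANIMATION);
            }
        }

        link_message(integer sender, integer num, string msg, key id)
        {
            // The controller sends "sync" at the moment the examination begins
            if (msg == "sync" && (llGetPermissions() & PERMISSION_TRIGGER_ANIMATION))
                llStartAnimation(ANIM);
        }
    }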

Move to something that brings an object into play, for example taking the patient’s blood pressure, and we have to add picking up the BP cuff (through avatar attachment) and putting it on the patient (detaching it from the clinician, attaching it to the patient), along with the synchronised animations. This represents an extensive amount of work developing animations and attachment procedures, together with the associated programming to have them run synchronously, effort that in my view could be better spent on aspects of the scenarios that provide more learning value. It is in the development of these aspects of scenario-based simulations or role-plays that I tend to suggest the use of HUDs (heads-up displays).
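
For a flavour of what even the attachment step alone entails, here is a rough LSL sketch of a rezzed cuff attaching itself to the patient; the attachment point and the touch-to-fit trigger are assumptions for illustration, and a full build would still need the surrounding animations and handover logic.

    // Sketch of the attachment half of the BP-cuff interaction: a rezzed
    // cuff asks the patient's avatar for attach permission, then
    // temp-attaches itself to their arm.
    default
    {
        on_rez(integer start_param)
        {
            llSay(0, "Touch the cuff to have it fitted.");
        }

        touch_start(integer total_number)
        {
            // Whoever touches the cuff is treated as the patient here;
            // a real build would check against the scenario's patient key
            llRequestPermissions(llDetectedKey(0), PERMISSION_ATTACH);
        }

        run_time_permissions(integer perm)
        {
            if (perm & PERMISSION_ATTACH)
                llAttachToAvatarTemp(ATTACH_LUARM);  // left upper arm
        }
    }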

I do, however, often find an initial resistance to this approach. In my view there are two perceptions at work here: the first is that all actions in a simulation need to be played out for learning to occur; the second is that the HUD will interfere with the sense of immersion. In addressing the first I would point to the above example of taking a pulse. Until haptic technology is sufficiently advanced for pulse taking to be performed accurately in a virtual setting, I see no educational benefit in acting out taking the patient’s pulse and incurring the associated development cost. The requirement is that the pulse is taken and recorded, and a HUD can perform this function adequately, e.g. by the user clicking on a ‘take pulse’ button (with an appropriate icon) and the HUD delivering a programmed pulse rate in response. The student role-playing the clinician knows they have taken the patient’s pulse, and an indication can be given to the patient, if the scope of the scenario requires it, that their pulse is being taken. In this sense the pulse taking has been acted out, textually, in much the same manner as has been common practice in MUDs for years.
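
A minimal sketch of such a button in LSL might look as follows; the pulse value and the patient’s key are placeholders that a scenario controller would supply.

    // "Take pulse" HUD button sketch (pulse value and patient key are
    // placeholders). Worn as a HUD; clicking it reports a scenario-defined
    // pulse to the wearer and cues the patient textually.
    integer PULSE_RATE = 96;       // would be set by the scenario script
    key     patient    = NULL_KEY; // would be set when the scenario starts

    default
    {
        touch_start(integer total_number)
        {
            // Report the reading privately to the student wearing the HUD
            llOwnerSay("You take the patient's pulse: " + (string)PULSE_RATE + " bpm.");

            // Act the step out textually for the patient, MUD-style
            if (patient != NULL_KEY)
                llInstantMessage(patient, "The clinician takes your pulse.");
        }
    }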

In addressing the issue of immersion I would point to the HUDs used in video games. In many situations they are a necessary part of the gameplay and are therefore designed in such a way that they graphically belong. Players slip in and out of them as and when needed, say to change a weapon or to peruse a world map, even in the middle of intense play, without any detraction from the immersive experience. In the end, players know the experience isn’t ‘real’ and are prepared to accept ‘unreal’ aspects as long as these merge seamlessly into the whole and contribute positively to the gameplay. In my view HUDs in virtual environment simulations, given purposeful design, will have the same effect of supporting the immersive experience rather than detracting from it.
