Preparing for Emergencies...Virtually

Robert P. Bennett



Just as computers did before it, virtual reality (VR) is now becoming a part of our common culture. In laboratories, schools, businesses and even game rooms, virtual reality is taking over tasks once left to human hands. Each day more and more people are affected by this growing technology. Each day another life is enhanced by a tool first envisioned on the drawing board, then developed and tested in a virtual environment.

In few other places can VR's effects be felt more strongly than in the medical community. Medical personnel are always seeking better ways to treat their patients. Constantly confronting new situations, those who treat the sick and injured must continually find newer, better and faster ways to improve their skills and techniques. This is where virtual reality can become not only a highly useful tool but a life-saving one as well. In a virtual environment, medical personnel of all sorts can develop new instruments and new procedures without the need to experiment on flesh and bone. Using virtual models, they can practice procedures until they feel comfortable enough to work with a patient. This is of particular importance in emergency situations, where the speed and effectiveness of treatment can mean the difference between life and death. In military field operations, for example, computer-based training can teach triage techniques, which are used to assess patient condition and devise treatment in mass-casualty situations. The endless variety of scenarios and models that can be created in a virtual environment can offer experiences that come close to reality.

In the past, emergency medical personnel have been trained in a variety of ways, using textbooks, animal models or some combination of the two. Each method has its problems. There are not enough viable human cadavers to go around, and this shortage has forced the use of animal models, which at least offer the trainee the ability to feel and manipulate tissue. But animal models are expensive, inadequate for demonstrating human anatomy, and of limited use because only one kind of medical malady can be visited upon any single model. These problems make their continued use in the training of medical personnel ultimately impractical. Lab-based training suffers from a further limitation: the labs are not open around the clock, so trainees cannot learn at their own pace.

Text-based learning offers a fast and effective way to accumulate facts, but it does nothing to prepare the emergency medical trainee for real-life situations, whether in the field or in a hospital setting. One can learn from a text what an injury is supposed to look, smell and feel like, but most real-life cases do not fall into the realm of textbook perfection. Without hands-on experience, the trainee is left relatively unskilled in diagnostic and treatment procedures.

Computer workstations offer a better way to train medical personnel: they can reproduce realistic anatomical structures and physiological processes, and they are available twenty-four hours a day. Workstations capable of generating one hundred thousand polygons per second, enough to render anatomical structures somewhat realistically, first came on the scene in late 1984. (It has been estimated that 500,000 polygons per second is optimal for a realistic reconstruction of just the human abdomen.)
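The gap between those two figures can be made concrete with a little arithmetic. The sketch below uses the polygon rates quoted above; the frame rate is an illustrative assumption, not a figure from the article:

```python
# Rough rendering-budget arithmetic (illustrative only).
# Figures from the text: ~100,000 polygons/sec on early workstations,
# an estimated 500,000 polygons/sec for a realistic human abdomen.

def polygons_per_frame(polygons_per_second: int, frames_per_second: int) -> int:
    """How many polygons can be drawn in each frame at a given frame rate."""
    return polygons_per_second // frames_per_second

# Assume 15 frames/sec as a bare minimum for interactive simulation.
early = polygons_per_frame(100_000, 15)   # early workstation budget per frame
needed = polygons_per_frame(500_000, 15)  # budget for a realistic abdomen

print(early, needed)  # the early machines fall short by a factor of five
```

At any fixed frame rate, the per-frame polygon budget scales linearly with throughput, which is why the fivefold gap in polygons per second translates directly into a fivefold gap in achievable scene detail.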

The roots of computer-generated medical simulations can be traced back to two earlier fields, flight simulation and medical information visualization, both of which provided a strong foundation for medical VR applications. While flight simulators provided the basis for realistic, real-time environments, information visualization allowed the creators of medical VR to bring textual facts into scenarios that make otherwise cold knowledge come alive.

Early medical simulations could not be rendered in real time and were therefore of limited use as training tools. Furthermore, most could not be acted upon, and those that could did not respond the way true anatomical structures do.

In his work on medical simulators, Dr. Richard M. Satava outlined five areas that needed to be addressed to ensure that future medical simulations were accurate and closely mimicked real-life situations. According to Satava, and as one can imagine, the graphics used to reproduce anatomical structures have to be of the highest caliber. Virtual structures have to be designed to change and move like those they simulate. They have to react to damage and disease in the same ways as real structures, including the flow of blood when a structure is cut. Furthermore, virtual anatomical objects have to interact with each other, and with the virtual implements introduced to them, just as real objects would. Finally, the virtual objects, whether tissue, organ or bone, have to feel correct and provide the same kinds of pressure-related responses as the structures they are modeled on.

Since the early days of medical VR, many newer and more advanced simulators have been developed. Over the past few years the technology has improved greatly, and simulations are now enhanced by better medical scanning equipment, which yields more realistic computer-generated models. Many of today's virtual models are complex images derived from high-quality digital data, usually handmade tracings of photographs of anatomical structures that are then contoured, translated into an electronic format and overlaid with texture maps.

To create realistic virtual models of anatomical structures, certain questions had to be considered. What does the structure being simulated look like? What does it feel like? How does it react to interaction with other organs and with tools? These are not easy questions to answer. Researchers have to study the malleability of different tissues. They have to learn how structures can and cannot be moved, and how various tools interact with them. One of the many things researchers have discovered is that it is fairly easy to determine where and how a tool contacts these structures; what is much more difficult is determining how the structures respond when probed with fingers. Force feedback, the ability to feel how much pressure is being exerted on an object, is still a major stumbling block in the creation of lifelike medical scenarios. The sensors on devices such as datagloves are no match for the pressure sensors in human skin. They are being improved, but progress is slow.
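The basic idea behind force feedback can be sketched with the simplest possible contact model, a linear spring (Hooke's law): the deeper a virtual fingertip or instrument penetrates the tissue surface, the harder the device pushes back. Real soft tissue is nonlinear and viscoelastic, so both the model and the stiffness value below are illustrative assumptions, not anything from the MusculoGraphics system:

```python
def contact_force(penetration_depth_m: float, stiffness_n_per_m: float) -> float:
    """Spring-model contact force: deeper penetration -> stronger push-back.

    A single linear stiffness is the crudest stand-in for soft tissue,
    which in reality stiffens nonlinearly and creeps over time.
    """
    if penetration_depth_m <= 0.0:
        return 0.0  # tool is not touching the surface, so no force
    return stiffness_n_per_m * penetration_depth_m

# A virtual fingertip pressing 5 mm into tissue, with an assumed
# stiffness of 300 N/m, feels a restoring force of 1.5 newtons.
print(contact_force(0.005, 300.0))
```

A force-feedback device would recompute this at a high rate and drive its motors with the result; the hard part, as the researchers note, is finding tissue models whose response actually matches what a finger feels.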

Unfortunately, present-day virtual reality technology has inherent problems that can seriously impair its use as a training device. One is the lack of a reliable method of simulating odors. Because the recognition of odors is very important in the diagnosis and treatment of many injuries and diseases, the inability to sense them puts a medical trainee using a VR simulator at a distinct, and sometimes dangerous, disadvantage.

Though there have been many attempts to create surgical simulators, little has been done in the field of emergency services. Take, for example, the assessment and treatment of a gunshot wound. This is a typical injury in both military and civilian emergencies, yet it had never been simulated in a way that would give medical trainees the true feeling of such an injury. This is the injury that researchers Scott Delp and Arthur Wong of MusculoGraphics Inc. (Evanston, Ill.) chose, five years ago, to inflict on a virtual leg.

Currently an assistant professor of biomedical engineering and physical medicine at Northwestern University, Dr. Scott Delp has studied the human musculoskeletal system for the past ten years. Delp is also CEO and co-founder of MusculoGraphics, where his chief interest has been creating realistic computer-generated models of anatomical structures to enhance the learning process for medical students. Arthur Wong trained in bioengineering and mechanical engineering, and his background is in software development; over the years he has helped develop software packages used in robotic and bionic apparatus. He is currently chief operations officer at MusculoGraphics. The two became involved in this particular project, the training of emergency medical personnel, because of their company's government contracts. "The reasoning behind the decision to develop this simulator," says Wong, "boiled down to the needs of the military. There was a perceived need by the Dept. of Defense. The initial target audience was for Special Operations Command but we want to be able to do both public sector and private applications such as for civilian emergency medical personnel. The basic skills are very similar. It's just that the military has a short-term requirement."

The thigh model was chosen because a gunshot wound to this area is the most common injury on the battlefield. In working on this kind of injury, the researchers felt, emergency workers would be given the opportunity to see and feel how tissues can be torn and burned. They learn the costs of losing bone and muscle mass, and they come to appreciate how injury to one system affects other systems. In this model, for instance, the bullet itself doesn't do all the damage. While it tears and burns both flesh and muscle, it also shatters bone, and the shattered bone splits into pieces that continue to cut muscle and nerve fibers. The resulting injury presents a wide range of problems, including loss of muscle strength, for the trainee to assess and treat.

The simulation developed at MusculoGraphics has two modes of training. In Tutorial mode, medical trainees can 'fly' through a wound to study the various anatomical structures involved and see how traumatic wounds affect physiological processes. According to Delp, it is relatively simple to represent anatomical structures and the damage that can occur to them; what is more difficult is to demonstrate the physiology within the body and show students how wounds affect the normal operation of different systems. That is the focus of the Treatment mode, in which trainees actually get the chance to interact with the anatomical structures and wounds they are studying. Great pains have been taken to let users learn about such things as the elasticity of various structures and the flow dynamics of blood in and around a wound. Algorithms built into the system prevent trainees from moving instruments and flesh in ways that would be impossible in the real world. Other algorithms impose a kind of time limit on the scenario: by controlling how the wound bleeds, they encourage trainees to move quickly.
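The bleeding-driven "time limit" described above can be sketched as a simple calculation: an uncontrolled bleed drains blood at some rate, and once a critical fraction of blood volume is lost the simulated casualty deteriorates. All of the numbers below (blood volume, bleed rate, shock threshold) are invented for illustration and are not taken from the MusculoGraphics system:

```python
def seconds_until_shock(bleed_rate_ml_per_s: float,
                        total_blood_ml: float = 5000.0,
                        shock_fraction: float = 0.3) -> float:
    """Seconds until the casualty has lost enough blood to enter shock.

    Losing roughly 30% of blood volume is a common rule of thumb for
    the onset of serious hypovolemic shock; every figure here is an
    illustrative assumption, not data from the actual simulator.
    """
    critical_loss_ml = total_blood_ml * shock_fraction
    return critical_loss_ml / bleed_rate_ml_per_s

# An uncontrolled 10 ml/s bleed gives the trainee 150 seconds to
# achieve hemorrhage control before the simulated casualty crashes.
print(seconds_until_shock(10.0))
```

A real simulator would update blood loss continuously and slow the rate as the trainee applies pressure or clamps vessels, but even this constant-rate version shows how a bleeding model turns into time pressure.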

Using both training modes, users of the simulation learn several of the skills they'll need in the real world. They learn to characterize wounds, distinguishing vital from devitalized flesh. Because the researchers went to great lengths to incorporate lifelike bleeding rates and flow patterns, students can practice hemorrhage control. After they've stopped the bleeding, medics have the opportunity to assess the damage done to the leg's flesh and muscle; they see where shrapnel and bone have caused damage and learn how to debride a wound. Finally, trainees learn the steps necessary to prepare the wounded individual for transport from the field.

The long-term goal of this project is to demonstrate that computer-generated models of anatomical structures and injuries can be used to train medical personnel to handle emergency situations in both civilian and military environments. In the simulated environment users can develop and practice their skills at wound assessment and treatment, learning which tools to use for a variety of injuries and how to use them effectively. The project's future lies in greater complexity. Advances in force-feedback technology will let both civilian and military trainees feel what they are doing. A larger database of traumatic injuries will give trainees the opportunity to learn to treat rarer types of injuries. Several serious flaws in the simulation will be corrected, and researchers plan to incorporate software that will evaluate the performance of those who use it. Future uses for this system and others like it are quite broad, from resident education to recertification and paramedical training for nurses and physician assistants.



For Reprint Rights on the Above Article,
Please contact RBennett@Bennet-Tec.Com