Department of Computer Science

Design of a Human Centered Computing Based Reverse Shoulder Surgery Simulator

 

Objectives of this project

Virtual reality technologies are used to create three-dimensional artificial or synthetic environments that allow users to perform "what-if" analyses, comprehend target problems, and compare different solutions across a variety of fields. In this OCAST project, Dr. Cecil (PI) is designing, building, and assessing the impact of an HCI-based Mixed Reality (MR) simulator to train medical residents in orthopedic surgery. The benefits of creating such an MR simulator for training include:

  • Better preparation of medical residents for reverse shoulder surgery
  • Avoiding the use of human cadavers, which carry a risk of infection
  • Avoiding the use of small animals such as rodents (there is opposition from animal rights groups, reflecting changing societal values)
  • The opportunity to practice repetitively without the additional costs associated with physical models

According to the Association for Computing Machinery (ACM), HCI is the discipline concerned with the design, evaluation, and implementation of interactive computing systems for human use, and with the study of the major phenomena surrounding them. An important facet of HCI is user satisfaction.

 

2.1 Participatory design and collaboration with medical surgeons

A key aspect of this project involves exploring participatory design principles in which surgical experts are involved in the design of the MR environments. We are collaborating with a leading group of medical surgeons at Dignity Regional Medical Center (Phoenix, AZ). One of the innovative aspects of this participatory design process is the creation of an information model capturing the overall steps of the surgical procedures as well as identifying the functional and process relationships of the sub-activities and tasks within these procedures. An elided version of this information model (eEML) is shown in Figures 1 and 2: the core task is represented as a verb phrase inside the rectangular (task) box; concepts inside elliptical boxes with arrows pointing from north to south are the information inputs necessary to complete each surgical step; and the mechanisms or tools needed for the surgery are represented as concepts in boxes with arrows pointing from south to north (below the task box). The creation of the simulation modules progressed based on this information model; Dr. Cruz served as its knowledge source. As the 3D environments were being created, feedback was obtained from Dr. Cruz, who interacted with the training simulation modules and suggested changes to the content and the manner of providing cues and training assistance to the trainees.
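The task/input/mechanism structure described above can be sketched as a simple data structure. This is a rough, hypothetical illustration (the task names, inputs, and tools below are placeholders, not taken from the actual eEML model):

```python
from dataclasses import dataclass, field
from typing import List

# Simplified sketch of the eEML task structure described above: each
# surgical task has information inputs (arrows from the north) and
# mechanisms/tools (arrows from the south). All names are illustrative.

@dataclass
class SurgicalTask:
    name: str                      # verb phrase in the task box
    information_inputs: List[str] = field(default_factory=list)
    mechanisms: List[str] = field(default_factory=list)
    subtasks: List["SurgicalTask"] = field(default_factory=list)

    def all_mechanisms(self) -> List[str]:
        """Collect the tools needed for this task and all its subtasks."""
        tools = list(self.mechanisms)
        for sub in self.subtasks:
            tools.extend(sub.all_mechanisms())
        return tools

# Illustrative fragment of the humeral procedure
cut_head = SurgicalTask(
    name="Cut humeral head",
    information_inputs=["resection angle", "retroversion reference"],
    mechanisms=["oscillating saw", "cutting guide"],
)
humeral = SurgicalTask(name="Prepare humerus", subtasks=[cut_head])
print(humeral.all_mechanisms())  # ['oscillating saw', 'cutting guide']
```

A nested structure like this mirrors how the eEML diagrams decompose a procedure into sub-activities while keeping each step's inputs and tools attached to the step itself.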

 


Fig 1: Elided view of an information model (eEML) of the key steps in the reverse shoulder surgery (humeral) 


Fig 2: Elided information model (eEML) of the key steps in the reverse shoulder surgery (glenoid)

2.2 Creation of Surgical training modules

Creation of training module for humerus bone surgical steps

We have completed the design and construction of Module 1, which enables trainees to perform the steps required for cutting the humeral head, helping medical residents improve their skill level for the surgery. C# and the Unity game engine were used to develop this module. Autodesk Maya and Pixologic ZBrush were used to build the CAD models for the Mixed Reality (MR) simulator. An assessment plan has also been developed; it is discussed in the next section of this report. This MR simulator has the following features:

  • The information model (Fig 1) was used as the basis for creating the simulation module for training purposes
  • A medical avatar (Fig 3a) was included to support interactions with users (medical residents); the avatar provides an introduction to the training objectives as well as the function of the various instruments used in the simulator for the reverse shoulder surgery procedure
  • The ability to interact with objects (picking up and inserting instruments on virtual models, Fig 3c)
  • Cues, including text introductions (showing the next procedure for each step), animations (showing how each step should be performed), and instrument labels (showing a label for each instrument with the ability to track the user's eyes; see Fig 3b)
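The "label tracks the user's eyes" behavior listed above is a billboard effect: each frame, the label is rotated to face the viewer. The actual module implements this in C# inside Unity; the standalone sketch below only computes the yaw angle needed to point a label at the user's head position, with made-up coordinates:

```python
import math

# Hypothetical sketch of an eye-tracking billboard label: compute the
# yaw (rotation about the vertical y-axis) that points a label at the
# viewer's head. Positions are (x, y, z) tuples; values are illustrative.

def label_yaw_degrees(label_pos, head_pos):
    """Yaw angle, in degrees, that orients the label toward the head."""
    dx = head_pos[0] - label_pos[0]
    dz = head_pos[2] - label_pos[2]
    return math.degrees(math.atan2(dx, dz))

# Head one meter in front of the label along +z: no rotation needed.
print(label_yaw_degrees((0.0, 1.5, 0.0), (0.0, 1.6, 1.0)))  # 0.0
```

In a game engine the same idea is typically a one-line "look at" call applied every frame; the explicit math above just makes the geometry visible.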


Fig 3: (a) Left: medical avatar; (b) Middle: instrument labels; (c) Right: performing the cutting of the humeral head virtually

 

Creation of training module for glenoid bone surgical activities

The use of MR in our simulation offers a potential way to increase the precision of glenoid pin insertion while simultaneously reducing substantial practical restrictions. For this module, realistically 3D printed humerus and glenoid bones were used as the physical setup; an exact CAD model of the same bones was then created in Pixologic ZBrush and Autodesk Maya and imported into Unity3D for the MR simulator design, with the behavior scripted in C#. The Friedman line (red), the intermediate joint line (blue), and the scapula body line are drawn by the simulation to train residents and enable better glenoid positioning in comparison to freehand placement. Five different realistic bones (humerus and glenoid) will be used to train users in positioning the glenoid, preparing trainees for a variety of situations they may face during actual surgery. Furthermore, the simulation addresses glenoid erosion using a transparent view of the glenoid (Fig 4c).
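One way pin-insertion precision could be quantified, purely as an illustrative metric and not the project's actual assessment code, is the angular deviation between the planned pin axis (for example, one derived from the Friedman line) and the axis the trainee actually drilled:

```python
import math

# Illustrative precision metric (not the project's scoring code): the
# angle between a planned pin direction and the trainee's actual
# direction. Both are 3D direction vectors; values are made up.

def angle_between_deg(u, v):
    """Angle in degrees between two 3D direction vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    cos_theta = max(-1.0, min(1.0, dot / (nu * nv)))  # clamp rounding error
    return math.degrees(math.acos(cos_theta))

planned = (0.0, 0.0, 1.0)      # hypothetical planned pin direction
freehand = (0.17, 0.0, 0.99)   # slightly off-axis attempt
print(round(angle_between_deg(planned, freehand), 1))
```

A per-trial angle like this would make it straightforward to compare line-guided placement in the simulator against freehand placement across the five bone variants.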

The Physical Environment: A core part of our innovative approach is exploring training through an integrated cyber-physical approach. For this, a physical surgical setup (Fig 4b) has been created. We 3D printed bones and built a platform on which trainees can rotate and position them, giving a better understanding of the anatomy and the surgical process. For this purpose, we used an inline joint linkage with steel threaded rods. We also used an acrylic sheet as the base of the setup to keep it lightweight for the interactions.

 


Fig 4: (a) Left: glenoid simulation with lines for positioning; (b) Middle: physical setup for the bones; (c) Right: addressing glenoid erosion

 

2.3 Overview of the assessment plan

In recent years, the role of MR in reverse shoulder surgery has been covered in several studies. Most of them did not consider the importance of Human-Computer Interaction (HCI) when assessing their outcomes. To understand how and to what extent computers may aid learning, and how learning could be made more useful, engaging, effective, and approachable, an HCI-based assessment is necessary to shed light on how humans interact with virtual environments, including VR/MR-based simulation environments. In Year 1, an assessment plan was developed; a summary follows:

Our plan is to compare the impact of teaching/training medical residents the key steps of reverse shoulder surgery using two different training approaches. In the first, the medical resident watches just one step of the surgery in the VR environment and then repeats it in the physical environment; in the second, the resident observes and understands the entire set of surgical steps and then repeats them on the physical setup. We will also assess a concept called dynamic affordance, to see whether a specific view of the 3D training scene helps users (a) understand specific concepts better and (b) gain better surgical skills as a result of that understanding. Furthermore, we will evaluate the role of distractors (such as visual distractions, sounds including alarms, and interruptions during training) in our MR-based environments. Our plan is to compare knowledge and skills acquisition for both residents and nurses.
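Once assessment scores are collected, the two training approaches could be compared with a standard two-sample test. The sketch below uses made-up placeholder scores (not project data) and Welch's t-statistic, which does not assume equal variances between the two groups:

```python
import math
import statistics

# Hypothetical comparison of the two training approaches described above.
# The score lists are invented placeholders, not data from the study.

def welch_t(a, b):
    """Welch's t-statistic for two independent samples."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)  # sample variance
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

step_by_step = [72, 75, 71, 78, 74]     # watch one step, then repeat it
whole_procedure = [70, 73, 69, 74, 71]  # watch all steps, then repeat them

print(round(welch_t(step_by_step, whole_procedure), 2))
```

In practice the statistic would be paired with degrees of freedom (via the Welch-Satterthwaite approximation) to obtain a p-value; a statistics package would normally handle that step.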

 

Recorded video from the simulation:

https://www.dropbox.com/s/bgfxhy87f1547i5/GlenoidStepsHL2.mp4?dl=0

 

Publications

Alireza Sadeghi Milani, Aaron Cecil-Xavier, Avinash Gupta, J. Cecil & Shelia Kennison (2022) A Systematic Review of Human–Computer Interaction (HCI) Research in Medical and Other Engineering Fields, International Journal of Human–Computer Interaction, DOI: 10.1080/10447318.2022.2116530

 

 
