Remapping Control in VR for Patients with AMD

Aug 2021 - Dec 2021
Age-related Macular Degeneration (AMD) is the leading cause of vision loss among persons over 50. We present a two-part interface consisting of a VR-based visualization for AMD patients and an interconnected doctor interface to optimize this VR view. It focuses on remapping imagery to provide customized image optimizations. The system allows doctors to generate a tailored, patient-specific VR visualization. We pilot tested the doctor interface (n=10) with eye care professionals. The results indicate the potential of VR-based eye care for doctors to help visually-impaired patients, but also show a necessary training phase to establish new technologies in vision rehabilitation.  
My Contributions
Drawing-function programming, desk research, UI design, preliminary evaluation, usability testing preparation
Problem Statement
Despite recent advances in treatment, many patients progress to profound visual impairment that affects both distance and near vision. There is a need to develop interfaces that utilize improved display and computational features to help patients with AMD live with low and often decreasing vision.
Research Question
How can a VR-based assessment system help patients with AMD improve their quality of life?

Advisor: Michael Nitsche
3+ Collaborators

Accepted to IEEE VR 2023
Desk Research
Patients with end-stage AMD experience central vision loss with paracentral blur. Ongoing research focuses on developing interfaces for these patients that utilize improved display and computational features.
Image Remapping
Image remapping, as opposed to magnification, is a visual optimization that targets the remaining macular or paramacular area by adjusting, in a customized way, how the image is projected onto the retina.

Ongoing research focuses on developing such interfaces for patients that utilize improved display and computational features. Commercial systems like EyeDaptic introduce smart glasses with video displays that provide different forms of magnification (e.g., basic image magnification or “bubble” magnification), at times with additional image improvements. We developed two interconnected interfaces: a VR visualization tool for potential AMD patients and a control interface that allows doctors to manipulate this VR view to adjust it to each individual patient’s needs. Our system focuses on remapping the field of vision onto the remaining area of the macula. We present the implementation of the system using an HTC Vive Pro and a web-based control interface, as well as an evaluation with eye care professionals to assess the future value of such a VR-based assessment system for eye care practitioners.
Doctor's Interface
We identified several similar user interfaces for eye doctors, such as Electronic Health Record (EHR) systems and ModMed. These interfaces share the following properties:
  • Access the history of the examinations and compare how macular degeneration has developed over time
  • Record relevant questions and answers
ModMed was closest to what we were looking for because it allows doctors to freely draw and annotate on an actual photo of the patient’s eye.

Based on the research, I iterated on the first prototype UI for doctors that allows assessment and image adjustment to present visual remapping to potential patients.
Previous User Interface
The previous version of the doctors' UI allowed doctors to activate eye charts and the warping effect while the patient experienced these changes in the VR viewpoint.
However, it lacked a key feature: hand-adjusting the occluded area, which would allow a more personalized adjustment for each participant. This version also did not allow doctors to adjust the conditions for each eye separately.
New Interface
I designed this interface and programmed the drawing function.

History View
Doctors control the patient visualization through a web interface aimed at medical professionals who test and prescribe treatment. It provides access to features that affect the VR visualization in real time.
Assessment View
Doctors can activate eye charts, hand-adjust the occluded area, and activate the warping effect while the patient experiences these changes in the VR viewpoint. This allows the necessary adjustment of the digital image and provides the medical professional with a range of tools needed to make VR-enhanced visualization beneficial for AMD patients. The design allows doctors to adjust the conditions on each eye separately.

The control interface allows the medical professional to import an Optical Coherence Tomography (OCT) image from the patient’s medical files. These images show the degenerated macula of a patient. With this information as a background, doctors can trace around the scotoma area and create a unique image that is then imported into the VR application. This feature builds on traditional patient data and assessment methods, but the drawing application allows the control application to send irregularly shaped scotomas to the VR visualization. The graphic image simulating the scotoma is generated and displayed in the patient interface. The warping effect, re-sizing, and contrast features still apply.
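The drawing feature boils down to turning a doctor-traced outline into a per-pixel occlusion mask. A minimal sketch of that step, assuming a simple ray-casting point-in-polygon test (function and parameter names here are illustrative, not the project's actual code):

```python
# Sketch (illustrative, not the project's actual code): rasterizing a
# doctor-traced scotoma outline into a boolean occlusion mask that a
# VR renderer could consume.

def point_in_polygon(x, y, polygon):
    """Ray-casting test: is point (x, y) inside the traced outline?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray from (x, y) to the right.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def rasterize_scotoma(polygon, width, height):
    """Turn a traced outline into a per-pixel mask (True = occluded)."""
    return [[point_in_polygon(px + 0.5, py + 0.5, polygon)
             for px in range(width)]
            for py in range(height)]
```

Because the mask is computed per pixel rather than from a fixed shape, the same path handles the irregular scotomas described above.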
Warp Control
Doctors can activate the warping effect while the patient experiences these changes in the VR viewpoint.
VR Training
Doctors can activate the warping effect while the patient experiences these changes in the VR viewpoint.
Patient's Interface
The VR prototype delivers a simulated central scotoma, which follows the movement of the eye using HTC Vive Pro’s integrated Tobii eye tracking. It also provides a linear remapping, which pushes the central image part outside the simulated scotoma to provide typical remapping. Both effects are rendered individually for each eye, allowing adjustment for different conditions.
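The linear remapping described above can be sketched as a radial coordinate transform: pixels at radius r from the gaze point are pushed outward so the image center lands at the scotoma edge rather than inside the blind spot. A minimal sketch assuming a circular scotoma (names and the exact mapping are illustrative, not the project's Unity shader):

```python
# Illustrative sketch of a linear radial remapping for a circular
# scotoma: the source radius r in [0, field_radius] is mapped linearly
# onto [scotoma_radius, field_radius], so r = 0 (the image center)
# lands exactly at the scotoma edge.

def remap_radius(r, scotoma_radius, field_radius):
    """Linearly map r in [0, field_radius] to [scotoma_radius, field_radius]."""
    if r >= field_radius:
        return r  # outside the remapped field: unchanged
    scale = (field_radius - scotoma_radius) / field_radius
    return scotoma_radius + r * scale
```

Applying this per eye, with each eye's own scotoma radius, matches the per-eye adjustment the prototype provides.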
This project runs on Unity 2018.4.15f, using the SteamVR Unity Plugin v2.5 and ViveSR v1.1.0.1 (SRanipal Eye Tracking API). We utilize recently released consumer technology: the HTC Vive Pro eye-tracking head-mounted display (HMD) with embedded Tobii eye tracking, and Vive’s v2 base station spatial tracking.

The doctor's interface is a dynamic HTML page that connects to the patient's VR application via WebSocket. This software architecture allows for a cross-platform as well as remote assessment approach. The control interface can run on multiple systems (PC, laptop, tablet) as long as they are connected to the internet.
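In this kind of architecture, each control action on the web page becomes a small JSON message sent over the WebSocket to the VR application. A hedged sketch of what such a message could look like (the field names and message types are assumptions for illustration, not the project's actual protocol):

```python
import json

# Illustrative sketch of a control message a web control page might
# send to a VR application over a WebSocket connection. Field names
# and message types here are assumptions, not the actual protocol.

def make_warp_message(eye, warp_enabled, scotoma_radius_deg):
    """Serialize one control update for a single eye."""
    return json.dumps({
        "type": "warp_control",
        "eye": eye,                      # "left" or "right" (per-eye control)
        "warp": warp_enabled,
        "scotoma_radius_deg": scotoma_radius_deg,
    })

def parse_message(raw):
    """Decode a control message on the receiving (VR) side."""
    return json.loads(raw)
```

Because the payload is plain JSON over a standard WebSocket, any browser-capable device can act as the control interface, which is what enables the PC/laptop/tablet flexibility noted above.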
Preliminary Evaluation (n=1)
We received feedback on the newly developed interface from Dr. Primo of the Emory Eye Center. We asked questions regarding the inclusion of a higher-resolution 3D object for testing the warping, adjusting the affected area, the testing procedure, and supporting the doctor's view.
  • Draw interface would be very powerful
  • Historical data would be helpful (especially in earlier stages)
  • Warp effect control would be helpful
Things to Consider
  • Black & white is good; red may cause issues for the vision-impaired
  • High contrast is good (red & green are bad colors); too much detail may also be bad
  • Patient self-adjustment (something we had not considered)
  • Add patient engagement activities (checkerboard, dominoes)
User Study (n=10)
The usability study was conducted with the old version of the interface. Participants first tested the doctor control interface, then the VR visualization, before the session closed with a concluding exit interview. A single session lasted up to 40 minutes, and testing was conducted on site at the Emory Eye Center. Data collected consisted of System Usability Scale (SUS) questionnaires, interviews, and video recordings of the actual interaction.
Participants were recruited from the professionals at the Emory Eye Center (n=10; 6 female, 4 male). Overall, participants showed high levels of expertise in the field of eye care but less so in the use of VR.
The results of the SUS questionnaires show a favorable overall response. The mean SUS score for the doctor interface was 68.89 and for the patient interface 68.25. In both cases, usability was above the average reported for comparable interfaces.
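The SUS means above follow the standard scoring procedure for the ten-item questionnaire, which can be sketched as:

```python
# Standard SUS scoring (Brooke, 1996): ten items rated 1-5; odd items
# contribute (rating - 1), even items contribute (5 - rating), and the
# sum is multiplied by 2.5 to yield a 0-100 score.

def sus_score(ratings):
    """ratings: list of ten 1-5 responses, item 1 first."""
    assert len(ratings) == 10
    total = 0
    for i, r in enumerate(ratings):
        if i % 2 == 0:          # odd-numbered item (1, 3, 5, ...)
            total += r - 1
        else:                   # even-numbered item (2, 4, 6, ...)
            total += 5 - r
    return total * 2.5
```

A score of 68 is commonly cited as the average across SUS studies, which is why both interface scores here read as slightly above average.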
After conducting usability tests with both the doctor’s interface and the patient interface, participants were interviewed and asked about their experiences interacting with the system.

Promising Tool
The qualitative feedback showed that both interfaces were accessible and allowed the users to engage with the tools in meaningful ways. There were differences regarding the comfort level of the participants, including the headset’s weight and unfamiliarity with the VR technology, but the principal concept of remapping was seen as a promising approach. One participant highlighted that “the warping was incredible, you know, to be able to look around and see (…) the other parts of the image that you're trying to see” (P10), a sentiment that was echoed by others (P6).

Learning Curve
Multiple experts noted that bringing this system into eye care practices would require a learning process. P4 stated that “depending on who uses it and…how good they are of technical things, there will be a lot of training there”. Likewise, the doctor interface was perceived as needing additional assistance for its users: “Just because somebody is very smart academically doesn't mean that their visual perceptual skills are wonderful. So you need a tool to help them understand kind of what's going on.” (P2)

In Summary
The tool was perceived as helpful; at the same time, the learning curve to properly operate this VR tool was noted. One perceived strength was the value of the combined applications. One expert acknowledged the connection between the doctor and patient interface in “the fact that you have the opportunity to alter the size and just check and double check and make sure that the patient actually interfaced well, with the size of the macular, that you're drawn out. (…) That was definitely a good feature.” (P9)
We have demonstrated a two-part interface that utilizes the variable image corrections available in VR headsets with rich controls for doctors to effectively deploy these features and customize them. As has been shown, the VR and doctor interface have enabled us to create a tool that is useful to doctors and promises their patients with AMD a new type of care. While there is still some training that our users would need to expertly navigate both interfaces, the applications have a level of ease of access that enables a doctor to help build a unique visual experience for potential patients.
One weakness of this study is its focus on eye care specialists. An assessment of the overall system would benefit from a test with AMD patients. This would require clinical trials but allow for an evaluation of the patient interface. As the project continues, we hope to utilize the knowledge gleaned from the user test to create a tool that is accessible and easy to use for many individuals who might be of different ages and have different levels of technical literacy and expectations.