June 24, 2024

Beyond the metaverse: Spatial computing emerges in ophthalmology


It has been nearly 2 years since the first ophthalmology meeting was held in the metaverse.

The Digital Ophthalmic Society held its 2022 meeting in the virtual space just as the metaverse was blowing up across media and becoming the focus of more tech companies.

It is important that physicians understand the difference between augmented and virtual reality as the technologies are implemented into eye care, according to Eric D. Rosenberg, DO.

Source: Eric D. Rosenberg, DO

In those 2 years, virtual reality (VR) and augmented reality (AR) technologies have continued to grow, and not just for use as a virtual meeting space. As a traditionally tech-savvy specialty, ophthalmology has looked to these new realities for their potential in training, patient education and even applications during live surgery.

With all this new technology arriving so rapidly, Eric D. Rosenberg, DO, said it is important to understand how VR and AR are different and how they fit into the future of eye care.

“With virtual reality, we’re talking about a completely virtual environment,” he said. “You are closed in, where everything around you — all your surroundings — is populated from the computer.”

The best example of VR technology is the Oculus (Meta), he said. When someone puts on the headset, the real world is blocked out, and the user can see only what is projected inside.

Whereas VR is 100% computer generated, AR uses computer-generated images to augment the user’s real-world environment.

“You’re looking at your real environment around you, and there are overlay systems that help you navigate your current environment,” Rosenberg said. “The best example of that kind of headset would be the Apple Vision Pro. You can walk around in your own space but pull up your web browser, your calendar or your text messages.”

Whether it is VR, AR or a mix of the two, these technologies describe an emerging human interface known as spatial computing, Tommy Korn, MD, said.

“We’re entering an era where we finally touch and truly comprehend data in three dimensions,” he said. “Previously we thought flat computer monitors or tablets were sufficient to view 3D medical images such as CT or MRI scans. That pales in comparison when you see those same images in spatial computing.”


Applications

Within his practice at Sharp HealthCare in San Diego, Korn and his colleagues created the Spatial Computing Center of Excellence, where the health care system is investigating ways to integrate spatial computing into patient care. When the Apple Vision Pro was released earlier this year, Korn said Sharp HealthCare bought more than 30 headsets on the first day.

“We distributed them to digital health clinicians, surgeons, informaticists and developers within the health system to see what emerging digital services we could develop from this new technology,” he said.

Korn said their team worked with industry partners to create the first spatial computing application in ophthalmology, the Zeiss Surgery Optimizer.

“This AI-powered app helps surgeons prepare for and analyze their cataract surgeries through simulation, analysis and optimization using the Artevo 850 microscope (Zeiss) and Apple Vision Pro,” he said. These contributions were mentioned by Apple CEO Tim Cook at Apple’s keynote event in May.

Rosenberg said that AR allows surgeons to sit in their own offices and discuss cases with colleagues across the world while bringing in computational features they might need, such as a CT scan or biometry.

“You have a virtual representation to be able to discuss what’s going on in this particular patient, which is a good example of a combination AR and metaverse application,” he said. “We had a pretty good showcase of this at the ASCRS Digital Clinical Committee. Tommy Korn was on stage looking out into an auditorium of empty seats, and he was able to bring up his biometry in one application, he was watching a surgical video in another, and what he was alluding to was the interconnectivity that is going to exist. It’s not quite there yet, but it’s going to exist by leveraging what each one of these components does well and then start bringing it together so we have more interoperability in our platforms.”

Figure 1. Example of a lecture being held in the metaverse.

Source: Eric D. Rosenberg, DO

In his experience so far with the Apple Vision Pro, Korn said it has been great at preparing for and analyzing cataract surgery through spatial simulation, video analysis and AI optimization of current techniques.

“These devices can’t be used in operating rooms yet as they require international safety certifications, despite stories of their use in live surgeries worldwide,” he said. “But we’re only in the first inning of spatial computing. If you connect the dots and look at what we’ve done so far, you can extrapolate what’s coming next.”

Looking to the future, “spatial computing in ophthalmology has three key benefits,” Korn said. “First, it allows physicians to view in true 3D. Second, it provides real-time assistance, showing exactly where to make incisions, align astigmatism implants, or overlay corneal and retina imaging for precise procedures. No more guessing, just safe precision. Lastly, it enhances communication among doctors. We rarely have time to talk, but with spatial computing, we can collaborate in real time, manipulating 3D models together using spatial telehealth apps.”

Having an AR component to ophthalmic surgery seems like a logical progression now that more companies have brought digital surgical systems to the market. S.K. Steven Houston III, MD, said the shift from analog to digital visualization in surgery allows for easy integration as technology advances.

“Once you open up that digital platform for surgical visualization, you can start to have that tech stack and continue to build upon it,” he said. “We’re already starting to see where augmented reality is definitely coming forward.”

Houston said the beginning stage of this integration is mostly apparent in cataract surgery, where computer-generated overlays can help with IOL alignment and other parts of surgical planning.

Education

In years past, Rosenberg said the only way to practice becoming a surgeon was to learn in the operating room. However, that requires a patient and is risky because of potential complications. Eyesi Surgical (Haag-Streit) and other simulators can provide lifelike experiences for trainees.

“In medicine, we’re eternally appreciative of those people who help us become the surgeons that we are,” he said. “Now, we kind of have this middle ground with these VR trainers that get people up to speed with their surgical techniques. With the limited time and limited resources that we have, we can really use experienced surgeons in the room — via the metaverse and virtual reality — to help people get out of bad situations, which can make you a better surgeon than just getting the basic steps down.”

Newer VR simulators designed by FundamentalVR, Alcon and others have already shown the potential for successful skills transfer in ophthalmic surgery training. Eventually, these devices could take the next step forward into surgical robotics, Rosenberg said.

“We’re probably going to see them augment what it is we do in terms of helping us accomplish fine motor tasks,” he said. “Once we get comfortable with it, we’re going to start seeing more semiautonomous and autonomous-based robotic solutions where the robots are really going to be doing a lot of major steps, and we’re going to be there assisting the robotic platform. We might even be monitoring and addressing multiple robots at the same time, helping multiple people at the same time, which is going to be a challenge going into the near future with the number of ophthalmologists that are being produced each year.”

While physical AR and VR devices can be the tools for the future of education, the real gold mine is in connecting the field’s top minds in the metaverse setting. As a professor of ophthalmology at Duke University School of Medicine, Sharon Fekrat, MD, FASRS, has seen how students and residents can benefit by learning and gaining experience from top surgeons around the world.


“What we’re able to show in the lecture hall or over Zoom is all two dimensional, but everything we do in real-life ophthalmology training is three dimensional,” she said. “In the metaverse, we can conduct educational sessions, teaching rounds and case conferences using three-dimensional video. You can invite expert ophthalmologists and retina specialists from anywhere in the world to engage in surgical teaching of others or to benefit from the interactive conversations.”

Having those conversations captured in the metaverse will also allow future students, trainees and others to go back to those stored sessions and re-live them, listen to the discussions that took place and learn from experts who may no longer be around, Fekrat said.

“You can actually go back in time and put yourself — your avatar — in that metaverse room and watch it all over again,” she said. “You can be part of it and listen to the conversations that happened among giants in the field.”

MetaMed has been hosting meetings and other events in the metaverse for a few years. Houston said there has been a lot of interest from the physician community, and a segment of particularly tech-savvy doctors has jumped all the way in. Every other Sunday, Rosenberg takes part in RetinaVerse Rounds, in which retina surgeons from all over the world hold grassroots discussions about how they manage different retina cases.

“It’s been a lot of fun because now we’re finally finding a time where we can all sit together, which I don’t get to do with my retina colleagues that often,” he said. “If I’m doing a scleral-fixated IOL and a pars plana vitrectomy, I can find out what I could be doing better. Who better to teach me than a world-renowned retina specialist?”

Houston said sometimes it can take a little convincing to get people to buy a VR headset for the full experience, but for some physicians, it just clicks.

“It’s the ones that come in and make it a regular occurrence,” he said. “They can get a feel of these types of things on an iPad or a desktop, but when they start to come around and spend more time at our rounds, they quickly start to ask about headsets and want to invest in one.”

MetaMed has been working with several societies to move forward with metaverse educational components, Houston said. Academic institutions and industry have also shown interest in finding ways to use the metaverse.

“These societies see that there’s potential, not as a way of replacing in-person meetings but in starting to explore ways that we can have more touchpoints, more regular interactions in these immersive, digital environments,” he said. “Ultimately, it’s really about trying to create an agnostic, immersive environment that has all stakeholders as part of it instead of building out a siloed experience. You can create a space where people can come and interact, and it can benefit all stakeholders in the process.”

Beyond physician education, spatial computing can provide a new way to guide patients through their journey as they undergo treatment. Rosenberg said spatial computing will be key in a shift to more asynchronous care rather than the current model of synchronous care.

“Synchronous care is defined as diagnostics and therapies that are administered at the time of the clinical visit,” he said. “That works for a lot of what we do, but it doesn’t work for everything.”

He said asynchronous care has potential in areas such as glaucoma screening. Typically, a patient must take time out of their schedule every 4 months to come into the clinic for visual field or retinal nerve fiber layer assessments or other tests. VR is already helping in this area, but expanding it to in-home care is in the works.

“When patients get comfortable with these technologies, they can use it in their own homes or at central stations or kiosks that are placed in more convenient places. With this asynchronous care, the kiosk can send the information to their provider to view the results. They don’t necessarily have to come in,” Rosenberg said.

The immersive experience can also provide better educational opportunities for patients, Fekrat said.

“Some of the next steps would involve patients in the waiting areas putting on a headset to see what their surgery would be like or learning about the risks and benefits,” she said. “Having interactive educational media in a VR headset that is specific to their care plan would be a huge benefit over just watching someone talk at them through a TV screen.”

Limitations and next steps

Spatial computing technology is still in its early stages and has its limitations. The biggest barrier to wider use, particularly for AR, is the cost, Rosenberg said.

“The Apple Vision Pro starts at about $3,600, but if you look at the typical first-generation products from Apple, they’re usually quite expensive,” he said. “Ophthalmology as a whole is really technology driven. The problem is that we need to show them that there are use cases, applicability and profitability there.”

Cost is also a limiting factor in training environments. Although Meta’s headsets, which start at $299, are significantly less expensive than Apple’s, the price may still be an issue for some students, residents and fellows.

“I think using VR and even AR would be fabulous in surgical residency training programs,” Fekrat said. “There could be a time where we provide 3D headsets for the residents when they join the program, for example, similar to how some medical schools provide laptops to each student that are loaded with relevant software. If people have the hardware, the incorporation of VR into training programs will be much easier.”

Right now, however, everything comes piecemeal, with residents sharing or borrowing headsets, Fekrat said.

“I would like to see some kind of philanthropic program to donate the hardware to the students and residents for educational and training purposes,” she said.

“The first-generation spatial hardware is very promising,” Korn said, “but the software app ecosystem is still in its infancy. Early adopter startups and companies that develop the right spatial apps for ophthalmology will eventually be rewarded. It’s like planting a tree — you need to wait for it to bear fruit. As the spatial computing ecosystem grows, we’ll see more ophthalmology and health care uses in ways we’ve never imagined. Exciting times, so stay tuned.”

Click here to read the Point/Counter, “What is the most promising application of spatial computing in ophthalmology?”