Using AR and 3D technology for learning Neuroanatomy

Fireside Chat
January 10, 2022
2 minute read
By Edify Admin

In this edition of the Edify Fireside Chat, we were joined by medical visualization and anatomy expert Yuliya Chystaya to talk about how she has used Augmented Reality (AR) and 3D technologies to create an innovative new way of teaching and learning neuroanatomy, the structure of the human brain.

Anatomy has long posed a challenge to teachers and learners: it’s hard to visualize and learn the complex, interlocking 3D structures of the body from the 2D images in medical textbooks and atlases like Gray’s Anatomy (the book, not the TV program—although medical students are advised against trying to learn anatomy from the show). The challenge is compounded when it comes to neuroanatomy. The human brain is one of the most complex structures known to exist, and medical students often struggle to apply what they’ve learnt in classrooms, libraries, and lecture halls to real-life situations. This is so common that in 1994, professor of neurology Ralph Jozefowicz noted that roughly half of all medical students experienced what he called ‘neurophobia,’ or fear of neuroscience, during their training. The gold standard for teaching anatomy has always been cadaveric dissection—demonstrating anatomy using embalmed corpses—but that technique comes with its own ethical and practical drawbacks. New methods of teaching and learning anatomy are badly needed, especially in light of the COVID-19 pandemic, which has made traditional, in-person teaching more difficult.

Yuliya Chystaya’s project is designed to address these issues. As the final project for her MSc in Medical Visualisation and Human Anatomy at the Glasgow School of Art’s School of Simulation and Visualisation, she created an Android app that uses AR visualization techniques to teach the anatomy of the brain. In developing the app, Yuliya used a suite of 3D modelling tools and techniques to turn MRI scans of the brain into 3D models that users can interact with on their phones: 3D Slicer and Autodesk 3ds Max to transform the 2D MRI scans into 3D models, and Unity and ARCore to build the AR Android app. Because the brain is so complex, Yuliya had to identify the different structures in the MRI scans pixel by pixel in order to create 3D models detailed enough to teach neuroanatomy.
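For readers curious about what turning MRI scans into 3D models actually involves, here is a minimal Python sketch of the general technique: label the voxels that belong to a structure, then extract a surface mesh from that labelled region. It is an illustration only, not Yuliya’s actual 3D Slicer and 3ds Max workflow; the file name and intensity thresholds are hypothetical.

```python
# Minimal sketch of the MRI-to-3D-model idea: threshold a volume voxel by
# voxel and extract a surface mesh from the labelled region. Illustration
# only; file name and thresholds are hypothetical.
import nibabel as nib          # reads NIfTI-format MRI volumes
import numpy as np
from skimage import measure    # marching cubes surface extraction

# Load the MRI volume as a 3D array of intensity values (one per voxel).
volume = nib.load("brain_mri.nii.gz").get_fdata()

# Crude "segmentation": keep only voxels whose intensity falls in a band
# chosen for the structure of interest. Real segmentation of small brain
# structures is far more involved and often done slice by slice by hand.
structure_mask = (volume > 120) & (volume < 200)

# Run marching cubes over the binary mask to get a triangle mesh
# (vertices + faces) that a tool like Unity can render as a 3D model.
verts, faces, normals, _ = measure.marching_cubes(
    structure_mask.astype(np.uint8), level=0.5
)

# Write the mesh out as a simple OBJ file for import into a 3D tool.
with open("structure.obj", "w") as f:
    for v in verts:
        f.write(f"v {v[0]} {v[1]} {v[2]}\n")
    for tri in faces + 1:      # OBJ face indices are 1-based
        f.write(f"f {tri[0]} {tri[1]} {tri[2]}\n")
```

In the project itself, this kind of segmentation and mesh clean-up was done interactively in 3D Slicer and 3ds Max before the models were brought into Unity.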

To make sense of the human brain’s dizzying complexity, Yuliya’s app has three elements, or views. The first is the 2D Scene, which compares traditional illustrations and MRI scans of six brain structures and provides explanatory captions and text for each. Then there is the 3D Scene, which shows the brain in three dimensions: as well as rotating the brain and zooming in and out, users can explode the model to see how all the tiny structures inside the brain fit together. Finally, there is the AR Scene. Alongside the Android app, Yuliya created a booklet of 2D images of brain structures; when users scan an image in the booklet with the app (much like scanning a QR code), the 3D model of that structure is superimposed over the image on the phone’s screen, and users can zoom in and interact with it. The app also includes quizzes that test users’ knowledge of the material it presents. Yuliya tested and iterated on the app with 12 participants, who particularly appreciated the 3D models of brain structures and the way the app’s AR features encouraged interactive, active learning.

AR is not going to replace cadaveric dissection any time soon, and Yuliya is aware of its limitations. No two bodies or brains are alike, and no real brain will look exactly like the 3D models in the app—cadaveric dissection is a very useful tool for showing just how much bodies, organs, and brains differ from one another. By the same token, though, handling embalmed cadaveric tissue is very different to handling living tissue, and as visualization and simulation techniques develop, more detailed and realistic models running on more sophisticated platforms can help to bridge that gap.

AR and VR technology is becoming more and more widespread, and with tools like Yuliya’s app and Edify’s suite of VR- and AR-based anatomy and medical lessons, the next generation of neurosurgeons could be trained in virtual reality. You don’t need a headset to see Edify in action: you can access all of Edify’s VR content through our desktop and mobile versions. Read more about Edify’s medical case studies here and see them in action by downloading our BETA.

Curious about how you could leverage virtual reality to enhance learning outcomes? Find out more about how we partner with universities on our dedicated higher education page.

View presentation slides

Yuliya Chystaya is an MSc graduate in Medical Visualisation and Human Anatomy, experienced in visual communication, technology, and the biological sciences, with a range of digital skills and the ability to collaborate and present ideas clearly.
