UGAR Scholars Win Research Prizes

UGAR scholars are the winners of the 2020 Neukom research prizes for outstanding undergraduate research in computational science.

2020 Neukom Research Prizes for Outstanding Undergraduate Research in Computational Science

1st prize: How are Patients Moving Outside of the Clinic? Categorizing Activities using Remotely Captured Wearable Data and Machine Learning.

Megan McCabe (Thayer School), Advisor: Douglas Van Citters

Wearable sensors were leveraged to develop two methods for computing hip joint angles and moments during walking and stair ascent that are more portable than the gold standard. The Insole-Standard (I-S) approach replaced force plates with force-measuring insoles and produced joint angle and moment curves whose shapes match those reported in similar studies. Peaks in the I-S kinetic results are inflated by error introduced when the ground reaction force is applied at the talus. The Wearable-ANN (W-A) approach combines wearables with artificial neural networks to compute the same quantities. Compared against the I-S approach, the W-A approach performs well (average rRMSE = 18%, R² = 0.77).

Chapman, RM, McCabe, MV, & Van Citters, DW. What are Patients Doing Outside the Clinic? Categorizing Activities using Remotely Captured Wearable Data and Machine Learning. ASME J Biomech Eng. Under Review.
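For readers unfamiliar with the metrics quoted above, the sketch below shows one way to compute a relative RMSE and R² between a reference curve (e.g., from the I-S approach) and a wearable-based estimate. It is a minimal NumPy illustration; the normalization used for rRMSE is an assumption, since several conventions exist in the gait literature.

import numpy as np

def rrmse(reference, estimate):
    # Relative RMSE (%): RMSE normalized by the range of the reference curve.
    # Other normalizations (e.g., by mean peak magnitude) are also common.
    rmse = np.sqrt(np.mean((estimate - reference) ** 2))
    return 100.0 * rmse / (reference.max() - reference.min())

def r_squared(reference, estimate):
    # Coefficient of determination between reference and estimated curves.
    ss_res = np.sum((reference - estimate) ** 2)
    ss_tot = np.sum((reference - reference.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Placeholder curves standing in for hip-moment traces over one gait cycle.
t = np.linspace(0, 1, 101)
reference_moment = np.sin(2 * np.pi * t)
estimated_moment = reference_moment + 0.1 * np.random.randn(101)
print(rrmse(reference_moment, estimated_moment), r_squared(reference_moment, estimated_moment))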

Megan's UGAR program participation:

  • WISP intern 
  • Sophomore Research Scholar 
  • Presidential Scholar 
  • Conference grant to present at the Orthopaedic Research Society 2019 Annual Meeting

2nd prize: A Deep Learning Approach to Understanding Real-World Scene Perception in Autism. 

Erica Busch (Psychological and Brain Sciences), Advisor: Caroline Robertson

Around 90% of individuals with autism experience sensory sensitivities, but our understanding of these symptoms is limited by past studies' unrealistic experimental designs and unreproducible results. We use a novel combination of virtual reality, eye tracking, and convolutional neural networks to model the stages of visual processing that predict differences in visual attention between individuals with and without autism. We find that even the earliest stages of the model can predict differences in gaze behavior between autistic and control participants. This suggests that visual processing differences in autism are not principally driven by the semantically meaningful features within a scene but emerge from differences in early visual processing.
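As a rough illustration of what "stages of visual processing" means in a convolutional network (not the study's actual pipeline), the sketch below reads out activations from an early layer and from the full convolutional stack of a pretrained VGG16 in PyTorch; the model choice, layer split, and random placeholder image are all assumptions for illustration.

import torch
from torchvision import models

# Pretrained VGG16: early layers capture low-level image statistics,
# later layers capture more semantically meaningful features.
vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT).eval()

early_stage = vgg.features[:4]   # first convolutional block (illustrative split)
late_stage = vgg.features        # full convolutional stack

image = torch.rand(1, 3, 224, 224)  # stand-in for a preprocessed scene image
with torch.no_grad():
    early_act = early_stage(image)
    late_act = late_stage(image)

# Feature maps like these could then be compared against eye-tracking gaze maps
# to ask which processing stage best predicts where people look.
print(early_act.shape, late_act.shape)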

Erica's UGAR program participation:

3rd prize: Evaluation of a Deep Neural Network for Automated Classification of Colorectal Polyps on Histopathologic Slides. 

Jason Wei (Computer Science), Advisor: Saeed Hassanpour

Histological classification of colorectal polyps plays a critical role in both screening for colorectal cancer and care of affected patients. An accurate, automated system for classifying colorectal polyps on digitized histopathology slides could benefit clinicians and patients. In this study, we developed a deep neural network for classification of four major colorectal polyp types based on digitized histopathology slides from the Dartmouth-Hitchcock Medical Center (DHMC). The neural network achieved performance comparable with pathologist diagnoses made at the point of care. If confirmed in clinical settings, our model could assist pathologists by improving the diagnostic efficiency, reproducibility, and accuracy of colorectal cancer screening.

Jason W. Wei, Arief A. Suriawinata, Louis J. Vaickus, Bing Ren, Xiaoying Liu, Mikhail Lisovsky, Naofumi Tomita, Behnaz Abdollahi, Adam S. Kim, Dale C. Snover, John A. Baron, Elizabeth L. Barry, Saeed Hassanpour, "Evaluation of a Deep Neural Network for Automated Classification of Colorectal Polyps on Histopathologic Slides", JAMA Network Open, 3(4):e203398, 2020.
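For a sense of what such a classifier looks like in code, here is a minimal, hypothetical sketch of a four-class patch classifier built on a pretrained ResNet in PyTorch. The class labels, architecture choice, and aggregation note are illustrative assumptions, not the published DHMC model.

import torch
import torch.nn as nn
from torchvision import models

# Illustrative label set standing in for the four major colorectal polyp types.
CLASSES = ["tubular", "tubulovillous/villous", "hyperplastic", "sessile serrated"]

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(CLASSES))  # replace the ImageNet head
model.eval()

patch = torch.rand(1, 3, 224, 224)  # stand-in for a preprocessed slide patch
with torch.no_grad():
    probs = torch.softmax(model(patch), dim=1)

# A whole-slide prediction would typically aggregate many patch-level
# predictions, e.g., by averaging or voting across the slide.
print(dict(zip(CLASSES, probs.squeeze().tolist())))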

Jason's UGAR program participation:

  • Sophomore Research Scholar 
  • Junior Research Scholar
  • 2 conference grants 

3rd prize: VR-Notes: A Perspective-Based, Multimedia Annotation System in Virtual Reality

Justin Luo (Computer Science), Advisor: X.D. Yang

To improve user productivity in virtual reality (VR), annotation systems allow users to capture insights and observations during VR sessions. I propose VR-Notes, a design for an annotation system in VR that captures the annotator's perspective for "doodle" and audio annotations, aiming to provide a richer experience when those annotations are viewed later. Early results from my experiment showed that the VR-Notes doodle method required less movement and rotation of the headset and controllers than a popular freehand drawing method. Users also preferred the VR-Notes doodle method and scored it higher than the freehand drawing method.
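As a loose sketch of the kind of data a perspective-based annotation might record, the snippet below defines a simple Python record pairing the annotator's headset pose with the doodle strokes and audio clip; all field names are hypothetical and are not taken from the VR-Notes implementation.

from dataclasses import dataclass, field
from typing import List, Optional, Tuple

Vec3 = Tuple[float, float, float]
Quat = Tuple[float, float, float, float]

@dataclass
class VRAnnotation:
    # One annotation, anchored to the annotator's viewpoint at capture time.
    head_position: Vec3                    # headset position when the note was made
    head_orientation: Quat                 # headset orientation (quaternion)
    doodle_strokes: List[List[Vec3]] = field(default_factory=list)  # 3D stroke points
    audio_clip_path: Optional[str] = None  # recorded voice note, if any
    timestamp: float = 0.0

# Replaying the stored pose lets a later viewer see the doodle and hear the
# audio from the same perspective the annotator had when creating it.
note = VRAnnotation(head_position=(0.0, 1.6, 0.0), head_orientation=(0.0, 0.0, 0.0, 1.0))
print(note)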

Justin's UGAR program participation: