Watching the Mice DANNCE

Over the past few years, the lab has been developing an imaging box that records multi-angle video of a mouse inside an enclosure, capturing its locomotion and other behavior. It gives us a way to more accurately quantify the behavioral phenotypes we often observe in mice as a result of our experiments. The box was recently finished and put into use, and now it’s time to test and verify its efficacy.

When we analyze mouse behavior in the lab, we typically use an algorithm like 3-Dimensional Aligned Neural Network for Computational Ethology (DANNCE). It is more robust than traditional techniques because it uses machine learning to build a virtual skeleton of the mouse from keypoints in 3D space and then tracks how those points move over time.

The DANNCE algorithm at work. From Dunn, T.W., Marshall, J.D., Severson, K.S., et al. Geometric deep learning enables 3D kinematic profiling across species and environments. Nat Methods 18, 564–573 (2021). https://doi.org/10.1038/s41592-021-01106-6

This gives us a more refined way to quantify mouse behaviors like grooming and turning associated with Parkinson’s disease, the lab’s ultimate focus.
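To make the keypoint idea concrete, here is a minimal sketch (in Python with NumPy; the array shape, keypoint count, and frame rate are made-up placeholders rather than our actual settings) of how a DANNCE-style output of 3D keypoints per frame could be turned into a simple locomotion measure like centroid speed.

```python
import numpy as np

# Hypothetical DANNCE-style output: 3D keypoint trajectories for one recording.
# Shape: (n_frames, n_keypoints, 3), i.e. x, y, z coordinates per keypoint per frame.
# The dimensions and frame rate here are illustrative, not the lab's actual settings.
n_frames, n_keypoints, fps = 9000, 20, 30
keypoints = np.random.default_rng(0).normal(size=(n_frames, n_keypoints, 3))

# Track the animal's overall position as the centroid of all keypoints in each frame.
centroid = keypoints.mean(axis=1)                          # (n_frames, 3)

# Frame-to-frame displacement of the centroid gives a simple locomotion signal.
step = np.linalg.norm(np.diff(centroid, axis=0), axis=1)   # (n_frames - 1,)
speed = step * fps                                         # distance per second

print(f"mean speed: {speed.mean():.2f} units/s")
```

Tracking the centroid keeps the example simple; in practice, individual keypoints (paws, snout, tail base) carry the finer behavioral detail, such as grooming or turning, that DANNCE is built to capture.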

Our experiment tests the effects of two variables, drug dosage and circadian rhythm, on mouse behavior, specifically locomotion. Using a technique called Principal Component Analysis (PCA), we will take the data of mice moving in three dimensions and compress it onto a single image in which differences in locomotion become visible. With further analysis, we hope to show that our box does indeed pick up the differences between mouse behaviors, however subtle, in a quantitative and informative way.
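As a rough sketch of what that PCA step could look like (in Python with scikit-learn and matplotlib; the condition labels, frame counts, and random data below are placeholders, not our results), the example projects flattened per-frame poses from two hypothetical conditions onto the first two principal components and overlays them on a single 2D plot.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA

# Hypothetical inputs: flattened per-frame poses for two conditions
# (e.g., drug vs. vehicle), each of shape (n_frames, n_keypoints * 3).
rng = np.random.default_rng(1)
drug = rng.normal(loc=0.0, size=(5000, 60))
vehicle = rng.normal(loc=0.5, size=(5000, 60))

# Fit PCA on the pooled data so both conditions share the same axes.
pca = PCA(n_components=2)
pca.fit(np.vstack([drug, vehicle]))

# Project each condition onto the first two principal components and
# overlay them on one 2D image, the kind of summary described above.
for label, poses in [("drug", drug), ("vehicle", vehicle)]:
    scores = pca.transform(poses)
    plt.scatter(scores[:, 0], scores[:, 1], s=2, alpha=0.3, label=label)

plt.xlabel("PC 1")
plt.ylabel("PC 2")
plt.legend()
plt.title("Per-frame poses projected onto two principal components")
plt.savefig("pca_locomotion.png", dpi=150)
```

Fitting PCA on the pooled data means both conditions are projected onto the same axes, so any separation in the resulting image reflects differences in the poses themselves rather than differences in the projection.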
