Translating the 'language of behavior' with artificially intelligent motion capture


Now, a collaboration between the labs of Princeton professors Mala Murthy and Joshua Shaevitz has gone a step further, using the latest advances in artificial intelligence (AI) to automatically track animals' individual body parts in existing video. Their new tool, LEAP Estimates Animal Pose (LEAP), can be trained in a matter of minutes to track an animal's individual body parts over millions of frames of video with high accuracy, without requiring any physical markers or labels.

"The method can be used broadly, across animal model systems, and it will be useful for measuring the behavior of animals with genetic mutations or following drug treatments," said Murthy, an associate professor of molecular biology and the Princeton Neuroscience Institute (PNI).

The paper detailing the new technology will be published in the January 2019 issue of the journal Nature Methods, but its open-access version, released in May, has already led to the software being adopted by a number of other labs. When the researchers combine LEAP with other quantitative tools developed in their labs, they can study what they call "the language of behavior" by observing patterns in animal body movements, said Shaevitz, a professor of physics and the Lewis-Sigler Institute for Integrative Genomics.
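Pose estimators in LEAP's family typically have a neural network predict one confidence map per body part per video frame, then read off each part's position as the peak of its map. The sketch below illustrates only that final read-out step with NumPy; the function name, array shapes, and toy data are illustrative assumptions, not LEAP's actual API.

```python
import numpy as np

def peak_coordinates(confidence_maps):
    """Return the (row, col) of the highest-confidence pixel for each body part.

    confidence_maps: array of shape (n_parts, height, width),
    one predicted heat map per tracked body part (hypothetical layout).
    """
    n_parts, h, w = confidence_maps.shape
    # Flatten each map and find the index of its maximum value.
    flat_idx = confidence_maps.reshape(n_parts, -1).argmax(axis=1)
    # Convert flat indices back to 2-D (row, col) pixel coordinates.
    rows, cols = np.unravel_index(flat_idx, (h, w))
    return np.stack([rows, cols], axis=1)

# Toy example: two 5x5 maps with known peak locations.
maps = np.zeros((2, 5, 5))
maps[0, 1, 3] = 1.0  # part 0 peaks at row 1, col 3
maps[1, 4, 0] = 0.8  # part 1 peaks at row 4, col 0
print(peak_coordinates(maps).tolist())  # [[1, 3], [4, 0]]
```

Running this per frame yields a time series of body-part coordinates, which is the kind of trajectory data the downstream behavioral analyses described here operate on.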