Machine learning approach automates behavior analysis, outperforms human assessment in animal models of epilepsy.
By using state-of-the-art technology to analyze behavior patterns in mice with epilepsy, researchers may be able to study the disorder better and identify potential treatments.
Researchers funded by the National Institutes of Health used AI technology to identify behavioral “fingerprints” in mice that are not evident to the human eye.
Such automated behavioral phenotyping needed only one hour of video recording and did not require researchers to wait for the rare seizure event. The study, supported by the National Institute of Neurological Disorders and Stroke (NINDS), part of NIH, is published in Neuron.
Scientists found that this machine learning-assisted 3D video analysis outperformed the traditional approach, which relies on human observers to label the behavioral signs of epilepsy in animal models during seizures.
The labor-intensive process requires constant video monitoring of the mice over many days or weeks while recording their brain wave activity with electroencephalography (EEG). The team led by Stanford researchers studied mice with acquired and genetic epilepsies.
They found that machine analysis was better able to distinguish epileptic from non-epileptic mice than trained human observers were. The AI program also identified distinct behavioral phenotypes at different points in the development of epilepsy.
Notably, researchers were able to use the AI program to distinguish different patterns of behavior in mice after they were given one of three anti-epileptic drugs. This demonstrates that the tool could be used for rapid, automated anti-epileptic drug testing. Ultimately, the use of automated phenotyping for animal studies of the epilepsies could revolutionize how research is done, speeding discovery and reducing costs.
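To make the idea of screening drugs from behavioral fingerprints concrete, the hypothetical sketch below classifies treatment groups from per-mouse motif-usage vectors. The group labels, feature counts, and synthetic data are illustrative placeholders, not the study's actual pipeline or results.

```python
# Hypothetical sketch: telling drug-treatment groups apart from per-mouse
# behavioral "fingerprints" (e.g., how often each behavioral motif is used).
# All data here are synthetic placeholders, not results from the study.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
groups = ["drug_A", "drug_B", "drug_C"]   # hypothetical treatment labels
n_mice, n_motifs = 12, 40                 # mice per group, motif-usage features

# Give each group a slightly different average usage profile, then sample mice.
group_profiles = rng.normal(size=(len(groups), n_motifs))
X = np.vstack([profile + 0.5 * rng.normal(size=(n_mice, n_motifs))
               for profile in group_profiles])
y = np.repeat(groups, n_mice)

# Cross-validated accuracy of separating the synthetic treatment groups.
clf = LogisticRegression(max_iter=1000)
print("mean CV accuracy:", cross_val_score(clf, X, y, cv=4).mean())
```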
The machine-learning technology used in the study, called MoSeq for Motion Sequencing, locates, tracks, and quantifies the behavior of freely moving mice in each frame of the video.
The information is used to train an unsupervised machine learning model to identify repeated motifs of behavior (called “syllables,” such as a right turn or a head bob to the left). MoSeq also predicts the order (or “grammar”) in which syllables occur, allowing fast and objective characterization of mouse behavior.
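As a rough illustration of this idea, and not the MoSeq implementation itself (MoSeq fits an autoregressive hidden Markov model to features extracted from 3D depth imaging), the sketch below uses an off-the-shelf Gaussian HMM from hmmlearn on synthetic pose features to label each frame with a discrete "syllable" and read off a transition-probability "grammar."

```python
# Illustrative sketch only: a plain Gaussian HMM stands in for MoSeq's
# autoregressive HMM to show unsupervised "syllables" and their "grammar".
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(1)

# Synthetic stand-in for per-frame pose features (e.g., a low-dimensional
# projection of depth frames): 10,000 frames x 10 dimensions of placeholder data.
features = rng.normal(size=(10_000, 10))

# Fit an unsupervised HMM whose hidden states play the role of behavioral
# syllables. The number of states is a modeling choice, not a value from the paper.
n_syllables = 30
model = GaussianHMM(n_components=n_syllables, covariance_type="diag", n_iter=50)
model.fit(features)

# Label every frame with its most likely syllable.
syllables = model.predict(features)

# The learned transition matrix acts as the "grammar": the probability that
# one syllable follows another.
grammar = model.transmat_
print("syllable sequence (first 20 frames):", syllables[:20])
print("grammar shape:", grammar.shape)  # (n_syllables, n_syllables)
```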
Source: NIH