FulMAI: Revolutionizing Marmoset Behavior Analysis

In an article published in the journal Communications Biology, researchers from the Central Institute for Experimental Medicine and Life Science in Kawasaki, Japan, developed an innovative system named Full Monitoring and Animal Identification (FulMAI) that automatically tracks and analyzes the behavior of multiple marmosets in a family cage without using any artificial markers. The system combines video tracking, light detection and ranging (LiDAR), and deep learning to obtain the three-dimensional (3D) trajectory of each marmoset and to detect social behaviors such as grooming.

Study: FulMAI: Revolutionizing Marmoset Behavior Analysis. Image credit: Iuliia Timofeeva/Shutterstock

Background

Marmosets are small nonhuman primates that share many behavioral and social characteristics with humans. They are widely used as model animals for studying brain function, development, aging, and neurological diseases. Evaluating long-term changes in marmoset behavior and brain function requires capturing their natural behavior in a stress-free environment and monitoring their interactions with other individuals in a family group.

However, existing methods for tracking and analyzing marmoset behavior have several limitations, such as requiring artificial markers, being suitable only for single animals or short-term observations, or having low accuracy and resolution. Therefore, there is a need for a system that can capture the natural behavior of multiple marmosets in a free-moving condition and link it to other physiological or cognitive data.

About the Research

In the present paper, the authors designed FulMAI for tracking and analyzing marmoset behavior. The tool combines LiDAR, video tracking, and facial recognition to follow the 3D position and behavior of each marmoset in a family group without any artificial labeling. LiDAR is a technology that uses laser beams to measure the distance and shape of objects. Video tracking uses object detection algorithms to locate the marmosets in video frames. Facial recognition uses deep learning to identify each marmoset by its facial features.

The proposed system consisted of four LiDARs and four cameras installed in front of a home cage fitted with acrylic panels. The cameras captured images of the marmosets, while the LiDARs measured the distance and shape of objects in the cage, providing the video frames and the 3D coordinates of the animals, respectively. The video images were processed by You Only Look Once (YOLO), a real-time object detection algorithm, which was used here to detect the face, body, and behavior of the marmosets.
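
To make the detection step concrete, below is a minimal sketch of how per-frame detection of marmoset faces, bodies, and behaviors might look with an off-the-shelf YOLO implementation. It assumes the ultralytics Python API and a hypothetical custom-trained weights file (marmoset_yolo.pt) and video file; it illustrates the general approach and is not the authors' actual code.

import cv2
from ultralytics import YOLO

# Hypothetical model trained on marmoset classes such as "face", "body", "grooming"
model = YOLO("marmoset_yolo.pt")

cap = cv2.VideoCapture("family_cage.mp4")  # hypothetical recording from one camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    result = model(frame, verbose=False)[0]
    # Each detection gives a bounding box, a class label, and a confidence score
    for box, cls_id, conf in zip(result.boxes.xyxy, result.boxes.cls, result.boxes.conf):
        label = result.names[int(cls_id)]          # e.g. "face", "body", or "grooming"
        x1, y1, x2, y2 = map(int, box.tolist())    # pixel coordinates in the frame
        print(label, round(float(conf), 2), (x1, y1, x2, y2))
cap.release()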

The face images were then classified by a convolutional neural network (CNN), the very deep convolutional network VGG19, to assign an individual identity (ID) to each marmoset. The body coordinates were combined with the LiDAR data to calculate the 3D trajectory of each animal. YOLO was also used to detect grooming, a typical social behavior, and each detected behavior was linked to the ID of the nearest marmoset.
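
As a rough illustration of these two steps, the sketch below builds a VGG19-based classifier head for assigning an individual ID to a cropped face image and a small helper that links a detected behavior to the nearest tracked animal. The class count, input size, and variable names are assumptions made for the example; this is not the authors' implementation.

import numpy as np
from tensorflow.keras.applications import VGG19
from tensorflow.keras import layers, models

NUM_ANIMALS = 3  # e.g. father, mother, and juvenile in one family cage

def build_face_id_model():
    # VGG19 backbone with a small classification head; one output per marmoset ID
    base = VGG19(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
    base.trainable = False  # in this sketch only the new head would be trained
    return models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dense(256, activation="relu"),
        layers.Dense(NUM_ANIMALS, activation="softmax"),
    ])

def nearest_id(behavior_xyz, tracked_positions):
    # Link a detected behavior to the closest tracked marmoset (Euclidean distance)
    dists = {mid: np.linalg.norm(np.array(pos) - np.array(behavior_xyz))
             for mid, pos in tracked_positions.items()}
    return min(dists, key=dists.get)

# Usage sketch:
# model = build_face_id_model()
# pred_id = int(np.argmax(model.predict(face_crop[None, ...])))
# groomer = nearest_id((0.4, 1.1, 0.6),
#                      {"father": (0.5, 1.0, 0.6), "mother": (2.0, 0.3, 1.1)})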

Research Findings

The authors tested the performance and accuracy of their system on a family of three marmosets, a father, a mother, and a juvenile, housed together in a cage for one month. They found that the new tool could track each marmoset continuously for an average of 2.8 minutes and achieved an overall identification accuracy of 98%.

Additionally, the system detected grooming behavior with a precision of 90.5% and a recall of 79.1%, and it captured location preferences, inter-individual distances, and grooming interactions. Operating for over a month without data accumulation issues or system failure, it captured behavioral changes over time and across situations without harming or stressing the marmosets or altering their natural behavior.
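
For reference, precision and recall figures of this kind are computed from counts of true positives (TP), false positives (FP), and false negatives (FN). The counts in the snippet below are purely illustrative, chosen only so the output lands near the reported values; they are not taken from the study.

def precision_recall(tp, fp, fn):
    precision = tp / (tp + fp)  # fraction of detected grooming events that were real
    recall = tp / (tp + fn)     # fraction of real grooming events that were detected
    return precision, recall

print(precision_recall(tp=38, fp=4, fn=10))  # ~(0.905, 0.792)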

FulMAI has proven to be a powerful tool for tracking and analyzing marmoset behavior in a free-moving environment, providing detailed data on the activity, position, interactions, and social behavior of each animal within a family group over its lifetime. Its potential extends to other fast-moving animals that move in three dimensions, such as tamarins, fish, and birds. Moreover, it could serve as a valuable resource for studying brain function, development, aging, and disease in marmosets and other animals, offering insights into how behavior responds to factors such as environmental enrichment, cognitive tasks, or pharmacological interventions.

Conclusion

In summary, the novel system is effective, accurate, flexible, and reliable; it addresses the limitations of existing methods and enables longitudinal, comprehensive analysis of marmoset behavior in a home cage without any artificial markers. It can efficiently track changes in the natural behavior and brain function of marmosets in a stress-free environment over their lifespan.

The researchers acknowledged the system's limitations and challenges and suggested directions for future work. They recommended that further research improve the system by adding more features, such as detecting additional behaviors, incorporating unsupervised learning, and integrating with other devices such as touch screens or starlight cameras. They also highlighted that the new system could contribute to advancing marmoset research and animal behavior science.

Written by

Muhammad Osama

Muhammad Osama is a full-time data analytics consultant and freelance technical writer based in Delhi, India. He specializes in transforming complex technical concepts into accessible content. He has a Bachelor of Technology in Mechanical Engineering with specialization in AI & Robotics from Galgotias University, India, and he has extensive experience in technical content writing, data science and analytics, and artificial intelligence.
