AI-Driven Agricultural Robots: Advancing Farming Efficiency

Agricultural robots have evolved rapidly thanks to advances in control approaches, sensing, and computer science, significantly increasing their importance across farming applications. Their effectiveness has been further improved by integrating control and execution, perception, and decision-making techniques. This article discusses different applications of artificial intelligence (AI) in agricultural robots used in farming.

Image credit: Scharfsinn/Shutterstock

Importance of AI in Agricultural Robots

In recent years, robotic systems have become more accessible and advanced, which has led to their growing adoption in agriculture to reduce labor costs and increase crop yields. Advanced technologies, including AI, big data, and the Internet of Things (IoT), can be combined with autonomous robotic systems to automatically assess crop status, interpret collected data, and plan timely and effective interventions in response to unexpected events and changes in crop conditions.

The use of machine learning (ML) and AI techniques in agricultural robots can significantly increase their productivity and address existing limitations. For instance, vision systems coupled with AI can improve robot perception. Agricultural robots must be able to correctly perceive and understand their surroundings to achieve greater operational effectiveness.

Various AI-based Tools Used in Agriculture

Currently, different sensing devices, such as bump sensors, soil sensors, sonar systems, and red, green, and blue (RGB) cameras, are combined with AI algorithms, ranging from computationally intensive, highly complex models to simpler, faster-to-execute ones, to realize enhanced vision-based robotic perception for specific agricultural tasks.

For instance, low-resolution RGB cameras coupled with different machine vision weed detection AI algorithms can be used in weeding robots to distinguish and recognize crops from weeds. Faster region-based convolutional neural network (Faster R-CNN) algorithms can be deployed in robotic weeding systems for weed detection in peach, dandelion, garden cress, and red radish.
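As an illustration of this detection pattern, a minimal Faster R-CNN inference loop might look as follows. This is a sketch rather than any of the cited systems; the weights file, image path, and three-class label map (background, crop, weed) are hypothetical.

```python
# Sketch of Faster R-CNN inference for weed detection, assuming a
# torchvision model fine-tuned on a hypothetical weed/crop dataset.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(num_classes=3)  # background, crop, weed
model.load_state_dict(torch.load("weed_rcnn.pth"))  # hypothetical fine-tuned weights
model.eval()

image = to_tensor(Image.open("field_row.jpg"))      # hypothetical RGB frame
with torch.no_grad():
    detections = model([image])[0]

for box, label, score in zip(detections["boxes"], detections["labels"], detections["scores"]):
    if score > 0.7:                                  # assumed confidence threshold
        print(f"class {label.item()} at {box.tolist()} ({score:.2f})")
```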

Similarly, extended skip network (ESNet), you only look once version 3 (YOLOv3), Mask R-CNN, and the Haar cascade classifier can be used for weed detection in rice, maize, common bean/maize, and cabbage, respectively. K-nearest neighbors (KNN), support vector machine (SVM), random forest (RF), decision tree, and convolutional neural network (CNN) classifiers can also be used along with RGB cameras for weed detection in 32 types of crops. These AI algorithms have demonstrated accuracies between 70% and 99.75%. Stereo cameras and infrared cameras can also be utilized as alternatives to RGB cameras.
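The classical ML classifiers mentioned above typically operate on pre-extracted image features rather than raw pixels. The following sketch compares three of them on hypothetical per-plant feature vectors; the feature files and labels are illustrative stand-ins.

```python
# Sketch of classical ML weed/crop classification on pre-extracted
# features (e.g., color and texture statistics per plant patch).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

X = np.load("patch_features.npy")   # hypothetical feature matrix
y = np.load("patch_labels.npy")     # hypothetical labels: 0 = crop, 1 = weed
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

for name, clf in [("KNN", KNeighborsClassifier(5)),
                  ("SVM", SVC(kernel="rbf")),
                  ("RF", RandomForestClassifier(200))]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", clf.score(X_te, y_te))
```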

Several robotic solutions have been developed for crop scouting to optimize costs and predict future crop needs. Spectral cameras, stereo cameras, color cameras, and different sensors are used as vision devices in these solutions. Different AI algorithms, including predictive, background segmentation, and object detection algorithms, are also employed to translate the information collected by the sensors into actionable and meaningful data.

For instance, simple linear regression and normalized difference vegetation index (NDVI)-based segmentation can be used with spectral cameras for image segmentation in soybeans. Similarly, Faster R-CNN-based detection and centroid-based counting can be utilized with RGB cameras for fruit counting in greenhouse tomatoes.
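NDVI itself is a simple band ratio, so the segmentation step reduces to thresholding a computed index map. A minimal sketch, assuming co-registered near-infrared and red bands and an illustrative threshold:

```python
# Sketch of NDVI-based vegetation segmentation from a multispectral frame.
import numpy as np

nir = np.load("nir_band.npy").astype(float)   # hypothetical NIR band
red = np.load("red_band.npy").astype(float)   # hypothetical red band

ndvi = (nir - red) / (nir + red + 1e-8)       # NDVI in [-1, 1]
vegetation_mask = ndvi > 0.4                  # assumed threshold for healthy canopy
print("vegetation fraction:", vegetation_mask.mean())
```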

The partial least squares (PLS) algorithm can be used with spectral/thermal cameras to determine water status in grapes. Thus, AI-assisted vision systems can effectively detect most attributes of plant health and status in the visible, near-infrared, and infrared regions.
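PLS regression maps many correlated spectral bands onto a few latent components before regressing on the target variable, which suits high-dimensional reflectance data. A minimal sketch, with hypothetical spectra, reference measurements, and component count:

```python
# Sketch of PLS regression mapping leaf spectra to a water-status indicator.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

spectra = np.load("leaf_spectra.npy")          # hypothetical (n_samples, n_bands)
water_status = np.load("water_status.npy")     # hypothetical reference values
X_tr, X_te, y_tr, y_te = train_test_split(spectra, water_status, random_state=0)

pls = PLSRegression(n_components=10)           # assumed number of latent components
pls.fit(X_tr, y_tr)
print("R^2 on held-out samples:", pls.score(X_te, y_te))
```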

Robotic phenotyping platforms equipped with vision devices, such as RGB cameras, stereo cameras, and infrared sensors, can be used for plant phenotyping. AI plays a critical role in extracting plant trait information from the data collected by the vision sensors. ML techniques such as logistic regression are employed for plant classification, while deep learning (DL) techniques such as Faster R-CNN and CNN detect objects such as leaves, stems, and stalks.

For instance, a CNN and Faster R-CNN with convex hull and plane projection can be used with RGB-depth (RGB-D) sensors for maize stalk detection and stem detection, respectively. Similarly, transfer learning and a CNN with its softmax layer replaced by an SVM can be utilized with RGB cameras for corn stand counting.
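The "softmax replaced by SVM" pattern amounts to using a pretrained CNN purely as a feature extractor and training an SVM on its embeddings. A sketch of that general technique, not the cited study's exact model; the backbone choice and data files are assumptions:

```python
# Sketch of transfer learning: CNN features feeding an SVM classifier head.
import torch
import torchvision
from sklearn.svm import SVC

backbone = torchvision.models.resnet18(weights="IMAGENET1K_V1")
backbone.fc = torch.nn.Identity()   # drop the softmax classification head
backbone.eval()

def embed(batch):                   # batch: (N, 3, 224, 224) image tensor
    with torch.no_grad():
        return backbone(batch).numpy()

X_train = embed(torch.load("train_patches.pt"))   # hypothetical image tensors
y_train = torch.load("train_labels.pt").numpy()   # 1 = seedling, 0 = background

svm = SVC(kernel="rbf")             # SVM replaces the softmax classifier
svm.fit(X_train, y_train)
```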

RGB-D sensors can be equipped with structure-from-motion algorithms, SVMs, and multilayer perceptron algorithms for leaf width, length, and area measurements in sorghum. Thus, phenotyping can be fully automated using machine vision algorithms, as most plant traits can be determined using visual information.
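Once geometric features have been extracted from the 3D reconstruction, the trait estimation itself can be a small regression problem. A sketch of a multilayer perceptron regressor on hypothetical leaf features; the feature set and files are illustrative stand-ins for the pipeline described above:

```python
# Sketch of an MLP regressor estimating leaf area from geometric features
# derived from RGB-D point clouds.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

features = np.load("leaf_features.npy")   # hypothetical (n_leaves, n_features)
leaf_area = np.load("leaf_area.npy")      # hypothetical ground-truth areas (cm^2)
X_tr, X_te, y_tr, y_te = train_test_split(features, leaf_area, random_state=0)

mlp = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0)
mlp.fit(X_tr, y_tr)
print("R^2 on held-out leaves:", mlp.score(X_te, y_te))
```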

Insect, deficiency, and disease detection is a critical task performed by agricultural robots. Several vision-based anomaly detection robotic systems equipped with color cameras, thermal/spectral sensors, and RGB-D sensors have been developed for anomaly detection with enhanced accuracy and detection rates.

These robotic solutions also exploit multiple AI algorithms, including neural networks (NNs) such as SqueezeNet and AlexNet, K-means, and SVM, to achieve high anomaly detection accuracy, for example, over 90% for disease detection in cotton and over 98% in greenhouse tomatoes. For instance, RGB sensors can be equipped with SVM/RF/CNN to detect leaf mold and yellow leaf curl virus in greenhouse tomatoes, while Mask R-CNN and visual geometry group (VGG)16 can be used with thermal/spectral/RGB-D sensors for rust and scab detection in apples.
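K-means is often used in such pipelines as an unsupervised first pass that separates lesion-colored pixels from healthy tissue. A sketch of that idea on a single leaf image; the cluster count and the "lesion = darkest cluster" heuristic are illustrative assumptions:

```python
# Sketch of K-means color clustering to isolate diseased leaf regions.
import cv2
import numpy as np

img = cv2.imread("tomato_leaf.jpg")            # hypothetical leaf image (BGR)
pixels = img.reshape(-1, 3).astype(np.float32)

criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
_, labels, centers = cv2.kmeans(pixels, 3, None, criteria, 5,
                                cv2.KMEANS_RANDOM_CENTERS)

lesion_cluster = int(np.argmin(centers.sum(axis=1)))   # assume darkest cluster is lesion
mask = (labels.reshape(img.shape[:2]) == lesion_cluster).astype(np.uint8) * 255
cv2.imwrite("lesion_mask.png", mask)
```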

Similarly, Otsu segmentation and Hu moments are used with RGB sensors for Pyralidae insect detection. RGB sensors equipped with NNs can detect bacterial blight/magnesium deficiency in cotton and anthracnose/leaf spot in groundnut. Robots with machine vision are increasingly being used for spraying applications to reduce harvest losses, prevent diseases, and control pests and weeds. For instance, RGB sensors/cameras with data fusion algorithms, the Hough transformation, and image analysis can be utilized for spraying applications in different crops.
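In spraying guidance, the Hough transform is commonly applied to detect crop-row lines that the robot follows. A sketch of that step, assuming a downward-looking RGB frame; the green color range and line thresholds are illustrative:

```python
# Sketch of Hough-transform crop-row detection for guiding a sprayer.
import cv2
import numpy as np

frame = cv2.imread("row_view.jpg")                       # hypothetical frame
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
green = cv2.inRange(hsv, (35, 40, 40), (85, 255, 255))   # assumed green range
edges = cv2.Canny(green, 50, 150)

lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                        minLineLength=100, maxLineGap=20)
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        print("candidate row segment:", (x1, y1), "->", (x2, y2))
```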

In the last few years, the steady decline in agricultural workforce availability and constantly rising labor costs have reduced yields and increased production costs, making agricultural robots increasingly important for harvesting. For instance, RGB cameras can be paired with a CNN based on an improved YOLOv5 for apple and Zanthoxylum pepper harvesting, with Mask R-CNN and a custom vision-based strawberry localization method for strawberry harvesting, with an SVM using a radial basis function plus image pre- and post-processing for apple harvesting, and with YOLOv3 for apple and orange harvesting.
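For a flavor of how a YOLO-family detector slots into a harvesting pipeline, the sketch below runs a generic pretrained YOLOv5 model from the public ultralytics/yolov5 repository; the orchard image is hypothetical, and a real harvester would use fruit-specific fine-tuned weights rather than this off-the-shelf model.

```python
# Sketch of YOLOv5 inference ahead of a harvesting manipulator.
import torch

model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)
results = model("orchard.jpg")          # hypothetical orchard image

# Each row of results.xyxy[0]: x1, y1, x2, y2, confidence, class index
for *box, conf, cls in results.xyxy[0].tolist():
    if conf > 0.5:                      # assumed confidence threshold
        print(f"detection class {int(cls)} at {box} ({conf:.2f})")
```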

Recent Developments

In a study recently published in the journal Acta Technologica Agriculturae, researchers demonstrated a prototype AI-driven small-scale autonomous mobile robot for precision agriculture. The study aimed to evaluate modern technologies that can be utilized to construct an affordable and easily operable robot that can perform simple tasks, including spraying and weed detection.

Researchers designed the robot as an end-user autonomous mobile system capable of self-localization and of inspecting and mapping a specific farming area. The decision-making capabilities of the proposed agricultural robot were based on AI algorithms, which allowed it to perform specific actions depending on the situation and the surrounding environment.

The precision agriculture robotic platform depended on image processing algorithms and computer vision for automatic detection of colors and weeds, avoidance of obstacles, and visual navigation. Supervised ML problems, including object detection, instance segmentation, localization, and classification, were investigated and utilized to evaluate robot performance.

Deep neural classification models were employed to perform the object detection and image processing tasks. The TensorFlow object detection application programming interface (API) was used as the robot training platform for weed recognition.
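Models trained with the TensorFlow Object Detection API are typically exported as saved models and then called directly for inference. A sketch of that standard pattern; the export directory, test image, and confidence threshold are hypothetical, and this is not the study's exact code:

```python
# Sketch of running a detector exported from the TensorFlow Object
# Detection API on a single frame.
import tensorflow as tf

detect_fn = tf.saved_model.load("weed_detector/saved_model")  # hypothetical export

image = tf.io.decode_jpeg(tf.io.read_file("plot.jpg"))        # hypothetical frame
input_tensor = tf.expand_dims(image, 0)                       # add batch dimension

outputs = detect_fn(input_tensor)
scores = outputs["detection_scores"][0].numpy()
boxes = outputs["detection_boxes"][0].numpy()
for box, score in zip(boxes, scores):
    if score > 0.6:                                           # assumed threshold
        print("weed candidate at (normalized)", box, f"score={score:.2f}")
```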

A visual navigation solution based on depth and tracking cameras, combined with a GPS-based navigation solution, was used to address the navigation challenges. The developed AI-driven prototype robot successfully performed all tasks, including color and weed detection and obstacle detection and avoidance, demonstrating the feasibility of developing inexpensive AI-driven robotic solutions for precision farming using modern technologies.

In another study published in the journal AI, researchers used spatial AI to design an autonomous robotic platform that can avoid collisions while navigating crop rows, enabling more consistent measurements throughout the wheat growing season.

The MobileNet single shot detector (SSD) was used as the DL model to detect wheat in the field. Researchers trained the MobileNet SSD on wheat images and used a new Luxonis DepthAI stereo camera to increase the frame rate for robot response to field environments in real time.
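On Luxonis hardware, such a detector runs on-camera via the DepthAI pipeline API. A minimal sketch of that setup, assuming the depthai v2 Python API; the compiled model file "wheat.blob" and the thresholds are hypothetical, and this does not reproduce the study's configuration:

```python
# Sketch of an on-camera MobileNet-SSD detection pipeline with DepthAI.
import depthai as dai

pipeline = dai.Pipeline()

cam = pipeline.create(dai.node.ColorCamera)
cam.setPreviewSize(300, 300)          # MobileNet-SSD expects 300x300 input
cam.setInterleaved(False)

nn = pipeline.create(dai.node.MobileNetDetectionNetwork)
nn.setBlobPath("wheat.blob")          # hypothetical compiled wheat model
nn.setConfidenceThreshold(0.5)        # assumed threshold
cam.preview.link(nn.input)

xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("detections")
nn.out.link(xout.input)

with dai.Device(pipeline) as device:
    q = device.getOutputQueue("detections", maxSize=4, blocking=False)
    while True:
        for det in q.get().detections:
            print(det.label, det.confidence,
                  det.xmin, det.ymin, det.xmax, det.ymax)
```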

The camera and newly trained model achieved a frame rate of 18-23 frames per second (fps), which was sufficient for the robot to process its surroundings once every two to three inches of driving. Comparative analysis of the MobileNet SSD model against YOLOv5 and Faster R-CNN object detection models based on model precision and inference speed also demonstrated the effectiveness of the proposed model.
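As a rough plausibility check, the spatial sampling interval follows from speed divided by frame rate. The drive speed below is an assumption for illustration, since the article does not state it:

```python
# Back-of-envelope check: spatial sampling interval = speed / frame rate.
speed_m_s = 1.2                              # assumed drive speed, not from the article
for fps in (18, 23):
    interval_in = speed_m_s / fps / 0.0254   # metres per frame -> inches
    print(f"{fps} fps -> one frame every {interval_in:.1f} in of travel")
# ~2.6 in at 18 fps and ~2.1 in at 23 fps, consistent with the stated
# two-to-three-inch processing interval.
```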

References and Further Reading

Beloev, I., Kyuchukova, D., Georgiev, G., Hristov, G., Zahariev, P. (2021). Artificial Intelligence-Driven Autonomous Robot for Precision Agriculture. Acta Technologica Agriculturae, 24, 48-54. https://doi.org/10.2478/ata-2021-0008

Cheng, C., Fu, J., Su, H., Ren, L. (2023). Recent Advancements in Agriculture Robots: Benefits and Challenges. Machines, 11(1), 48. https://doi.org/10.3390/machines11010048

Fountas, S., Malounas, I., Athanasakos, L., Avgoustakis, I. (2022). AI-Assisted Vision for Agricultural Robots. AgriEngineering, 4(3), 674-694. https://doi.org/10.3390/agriengineering4030043

Gunturu, S., Munir, A., Ullah, H., Welch, S., Flippo, D. (2022). A Spatial AI-Based Agricultural Robotic Platform for Wheat Detection and Collision Avoidance. AI, 3(3), 719-738. https://doi.org/10.3390/ai3030042

Last Updated: Jul 29, 2023

Written by Samudrapom Dam

Samudrapom Dam is a freelance scientific and business writer based in Kolkata, India. He has been writing articles related to business and scientific topics for more than one and a half years. He has extensive experience in writing about advanced technologies, information technology, machinery, metals and metal products, clean technologies, finance and banking, automotive, household products, and the aerospace industry. He is passionate about the latest developments in advanced technologies, the ways these developments can be implemented in a real-world situation, and how these developments can positively impact common people.

