AI-Powered Autonomous Robotic Rover for Pest Management: An Experimental Study

Amrutha A. Nair, E. Vaishnavi, Rana Noufal, Ann Mathew and M. S. Soorya
Department of Electronics and Communication, Vimal Jyothi Engineering College, Kerala, India
Correspondence to: Amrutha A. Nair, amruthaanair96@gmail.com

Premier Journal of Science

Additional information

  • Ethical approval: N/a
  • Consent: N/a
  • Funding: No industry funding
  • Conflicts of interest: N/a
  • Author contribution: Amrutha A. Nair, E. Vaishnavi, Rana Noufal, Ann Mathew and M. S. Soorya – Conceptualization, Writing – original draft, review and editing.
  • Guarantor: Amrutha A. Nair
  • Provenance and peer-review: Unsolicited and externally peer-reviewed
  • Data availability statement: N/a

Keywords: Autonomous pest-control rover, YOLOv8 real-time detection, Roboflow image dataset, NodeMCU ESP32 controller, Targeted pesticide spraying.

Peer Review
Received: 15 August 2025
Last revised: 17 December 2025
Accepted: 17 December 2025
Version accepted: 5
Published: 31 January 2026

Plain Language Summary Infographic
“Bright cinematic infographic illustrating an AI-powered autonomous robotic rover for pest management in agriculture. The visual shows a four-wheel rover navigating crop fields with an IP camera and GPU processing unit, using YOLO-based object detection to identify pests and activate targeted pesticide spraying, reducing chemical usage while improving crop yields and sustainability.”
Abstract

Pest infestations significantly threaten crop production, leading to economic losses and food insecurity. Traditional chemical pesticides cause environmental pollution and pest resistance. This article presents an autonomous robotic rover that employs a four-wheel, Mars rover-inspired platform with anti-slip rubber grips for robust navigation across agricultural terrains. Real-time pest detection is achieved using an IP camera and a GPU-accelerated laptop, leveraging Roboflow for dataset annotation and You Only Look Once version 8 (YOLOv8) for efficient object detection. Upon detection, a targeted pesticide spray is activated, minimizing pesticide usage. The methodology focuses on a real-time pest detection pipeline and a custom-designed controller using the NodeMCU ESP32. Preliminary testing demonstrates accurate pest detection and targeted spraying, reducing pesticide usage by 40%–60% compared to traditional methods. This approach offers an environmentally friendly alternative to conventional pest control, enhancing crop yields and minimizing environmental impact.

Introduction

Agriculture plays a pivotal role in global food security, yet it faces persistent challenges from pest infestations. These infestations lead to substantial crop damage, economic losses, and diminished food availability. Traditional pest control methods, heavily reliant on chemical pesticides, have exacerbated environmental issues such as soil degradation, water contamination, and the development of pest resistance. Consequently, there is an urgent need for sustainable and innovative pest management solutions that safeguard both environmental integrity and agricultural productivity. The limitations of conventional pest control methods, including manual spraying and broad-spectrum pesticide application, necessitate the development of precise and automated systems.

Manual spraying is labor-intensive, time-consuming, and often results in the over-application of chemicals, leading to environmental harm and increased costs. Broad-spectrum pesticides, while effective in the short term, contribute to pest resistance and disrupt ecological balance. Furthermore, the lack of real-time pest detection capabilities in traditional methods hinders timely intervention, allowing infestations to spread and cause significant damage. To address these challenges, this article introduces an AI-powered autonomous robotic rover designed for precise pest management in agricultural fields. This innovative system leverages advanced image processing and machine learning (ML) algorithms, specifically You Only Look Once version 8 (YOLOv8), to detect pests in real-time.

An integrated IP camera captures video feeds, which are then processed by a GPU-accelerated laptop to identify pests accurately. Upon detection, the rover initiates a targeted pesticide spray, minimizing chemical usage and environmental impact. The rover’s robust design, featuring a four-wheel, Mars rover-inspired platform with anti-slip rubber grips, enables it to navigate diverse agricultural terrains efficiently. This project aims to revolutionize pest management by offering an intelligent, responsive, and environmentally sustainable alternative to traditional methods. By automating pest detection and control, the rover reduces dependency on manual labor, minimizes chemical usage, and enhances crop yields. Ultimately, this research contributes to a future of farming that prioritizes both productivity and ecological responsibility.

Literature Survey

The paper ‘Robotic solutions for precision agriculture’ presents an in-depth review of robotic solutions in precision agriculture, detailing how robotics is revolutionizing farming by enhancing efficiency, sustainability, and productivity. The article outlines the growing importance of automation in agricultural practices due to increasing global food demands, climate change, and labor shortages. Robots are utilized in tasks such as seeding, weeding, harvesting, spraying, pruning, and monitoring, often equipped with AI, ML, sensors, GPS/GNSS, and unmanned aerial systems. These technologies enable data-driven decision-making, minimize resource waste, and promote environmentally friendly practices.

Examples include robotic transplanters, autonomous fruit pickers, aerial disease detection drones, and precision sprayers. The paper also discusses applications in livestock management, where robots aid in feeding and health monitoring. While the advantages include reduced labor costs, real-time data collection, and improved crop quality, the authors also acknowledge challenges such as high costs, technical limitations, lack of scalability for small farms, and the need for robust navigation and interaction systems. The review emphasizes the necessity of collaborative research, policy support, education, and infrastructure development to overcome these barriers and realize the full potential of robotics in sustainable agriculture.1


The article ‘Recent advancements in agriculture robots: benefits and challenges’ provides a thorough review of the recent advancements, applications, and challenges of agricultural robots, highlighting their transformative role in modern precision farming. The authors categorize agricultural robots into three main groups (field robots, fruit and vegetable robots, and animal husbandry robots) based on their operational domains and functionalities. Field robots are primarily used in large-scale, open-field farming tasks such as tillage, seeding, pesticide spraying, harvesting, and real-time crop monitoring. These robots utilize technologies such as RTK-GNSS for navigation, AI-based perception systems, and multi-sensor integration to achieve precision and efficiency, with notable examples including high-accuracy seeding robots and autonomous pesticide sprayers designed to minimize chemical usage and human exposure.

Fruit and vegetable robots operate mostly in controlled environments like greenhouses, where tasks such as transplanting, patrolling, spraying, gardening, and harvesting require delicate handling and advanced perception. These systems leverage deep learning algorithms (e.g., YOLOv4, convolutional neural networks [CNNs]), RGB-D cameras, soft robotic grippers, and intelligent control systems for accurate detection and disease monitoring. Across all categories, the paper underscores that key enabling technologies include AI, machine vision, sensor fusion, IoT, and autonomous navigation, which together support robots in performing tasks under dynamic, complex agricultural environments.

Despite these advancements, several challenges persist—high development and deployment costs, inadequate legislative frameworks, limited performance under unstructured outdoor conditions, and lack of scalability for small farms. The authors also highlight that most robots remain at the prototype stage, with commercialization efforts lagging due to technical and economic constraints. To bridge this gap, the review calls for interdisciplinary collaboration, policy support, and further research in areas such as human–robot interaction, sensor robustness, affordability, and full automation. The paper provides a comprehensive snapshot of the current state and potential of robotics in agriculture, reinforcing its crucial role in advancing sustainable, high-efficiency food production systems.2

The article ‘Evaluation of machine learning approaches for precision farming in smart agricultural system’ presents a comprehensive review of the role of ML in revolutionizing modern agriculture through smart farming (SF) and precision agriculture technologies. The authors explore how ML, combined with IoT, AI, drones, and other ICT tools, can automate and optimize various stages of farming, from pre-harvest activities like soil analysis, seed selection, and crop disease detection, to harvesting, post-harvest processing, and yield forecasting. The study underscores how ML algorithms such as support vector machines (SVMs), artificial neural networks (ANNs), CNNs, and deep learning (DL) models are used for classification, disease diagnosis, phenotyping, irrigation management, and pest control. Precision agriculture benefits from real-time data collection through sensors, UAVs, and robotic systems, which enable targeted interventions, efficient resource usage, and reduced environmental impact. The paper discusses specific ML applications in disease detection using image processing, where models like AlexNet, VGG-16, ResNet, and SqueezeNet show high accuracy in identifying diseases in crops like tomatoes, rice, wheat, and citrus.

It also emphasizes the value of multimodal data fusion and deep learning for early-stage detection. Beyond disease classification, the study explores advanced phenotyping, efficient pesticide and fertilizer application, automated weed detection, and smart irrigation using predictive models and fuzzy logic systems. Moreover, the role of robotic solutions and UAVs in reducing labor, improving crop monitoring, and enhancing harvest timing is detailed with practical implementations such as See & Spray systems, Guardian-Z10 drones, and robotic arms using single-shot detectors (SSDs) for fruit picking. The paper highlights the increasing use of smart systems in post-harvest operations, addressing packaging, transportation, and spoilage reduction, all backed by ML analytics. Challenges such as high implementation costs, data privacy, connectivity in rural areas, and the complexity of integrating diverse systems are acknowledged. The authors conclude that despite these barriers, ML and AI-driven smart agriculture hold immense potential for transforming global food production, sustainability, and economic resilience.3

The paper ‘The technologies driving the shift to smart farming’ presents a systematic and expansive review of the technologies driving the shift from traditional to SF, addressing the growing global demand for food, limited agricultural labor, climate challenges, and the need for resource efficiency. The paper surveys 588 IEEE publications using the Cochrane systematic review methodology to identify and analyze major enabling technologies and integration challenges in SF. The study identifies five core technological themes: sensors, communication, big data, actuators and machines, and data analysis. Sensors, including wireless sensor networks, are foundational in SF for monitoring environmental, soil, and crop conditions, and the authors explore both stationary and mobile deployments, highlighting trade-offs in cost, coverage, and power efficiency.

Communication technologies are categorized into machine-to-gateway and machine-to-cloud protocols, encompassing Zigbee, Bluetooth, Wi-Fi, LPWAN (e.g., LoRa, NB-IoT, Sigfox), and internet-based protocols like MQTT and CoAP. The authors stress the balance between data rate, power consumption, and coverage, particularly when choosing communication technologies for rural or large-scale applications. The paper also examines robotic actuators and automated machinery in agriculture, including robotic platforms for seeding, spraying, harvesting, and soil plowing, alongside advancements in autonomous irrigation systems. Energy efficiency and system robustness are recurring themes, with solutions such as solar-powered systems, energy harvesting, and adaptive motion control proposed to address limitations in rural deployment.

On the data side, the paper reviews big data management and analysis strategies, focusing on cloud, edge, and fog computing architectures for handling massive sensor data, and enhancing data reliability through ML, blockchain, and interpolation techniques. Advanced ML and deep learning models, particularly CNNs and RNNs, are discussed for applications like disease detection, yield prediction, and intelligent control of inputs. The authors highlight the importance of intelligent decision-making systems that support autonomous and remote monitoring, irrigation, fertilization, and climate control, relying heavily on real-time sensor data and predictive analytics. Despite the rapid advancement, the paper notes critical challenges in integration, cost, scalability, energy management, and the lack of techno-economic evaluations in current literature.4

Methodology

Modern agriculture faces significant challenges due to pests that affect crop yield and quality (Figure 1). To address this issue, we have developed an AI-based agriculture robot capable of detecting pests in farmland and spraying pesticides precisely when needed. This system integrates deep learning, real-time video streaming, and cloud-based control mechanisms to ensure efficient and targeted pest management.5–7 The project began with collecting a diverse dataset of pest images using Roboflow, a web-based platform that helps users create, manage, and deploy computer vision datasets and models with ease.8–10 The collected images were preprocessed and labeled to improve model accuracy.

This AI-based pest management system uses a YOLOv8 CNN model trained on an agricultural pest dataset collected and annotated using Roboflow.11,12 The model is trained in Google Colab with GPU acceleration to detect and classify pests in real-time from crop images. Roboflow provides an easy interface to prepare and export the dataset in YOLOv8 format, while YOLOv8 processes the images by dividing them into grids and predicting bounding boxes and class probabilities for pests within each grid. Once trained, the model can identify pests in new images with high speed and accuracy, enabling timely and automated pest monitoring in agricultural fields.
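For concreteness, this workflow can be sketched in a few lines of Python. The sketch below is a minimal illustration rather than the exact project code: the Roboflow workspace and project names and the API key are placeholders, and the roboflow and ultralytics pip packages supply the interfaces shown.

```python
# Minimal training sketch. Assumptions: hypothetical Roboflow workspace,
# project name, and version; placeholder API key; roboflow and
# ultralytics packages installed (pip install roboflow ultralytics).
from roboflow import Roboflow
from ultralytics import YOLO

# Download the annotated pest dataset in YOLOv8 format from Roboflow.
rf = Roboflow(api_key="YOUR_API_KEY")                 # placeholder key
dataset = (rf.workspace("agri-demo")                  # hypothetical workspace
             .project("pest-detection")               # hypothetical project
             .version(1)
             .download("yolov8"))

# Fine-tune a pretrained YOLOv8 model on the exported dataset.
model = YOLO("yolov8n.pt")
model.train(data=f"{dataset.location}/data.yaml", epochs=100, imgsz=640)

# Evaluate on the validation split: precision, recall, mAP50, mAP50-95.
metrics = model.val()
print(metrics.box.map50, metrics.box.map)
```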

Figure 1: Block diagram.

The training process involved multiple iterations to fine-tune accuracy, and the obtained results demonstrated high precision in pest identification. YOLOv8’s training process includes tuning hyperparameters, utilizing transfer learning, and evaluating performance through metrics like precision and recall. Once trained, the model is deployed for real-time pest monitoring, integrated into systems like cameras or drones, providing timely alerts for pest control. Continuous training with new data and feedback ensures the model remains accurate and effective.13 After achieving satisfactory performance, the trained model was deployed on a remote GPU-supported server to handle real-time pest detection. The robot is equipped with an IP camera that continuously captures video data and transmits it to the detection server via the RTSP protocol. Upon processing the incoming video feed, the server predicts pest presence and updates the count in Google Firebase in real-time. The control unit of the robot is directly linked to Firebase.
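The server-side pipeline can be outlined as follows. This is a sketch under stated assumptions, with a hypothetical RTSP address and Firebase Realtime Database node; OpenCV pulls frames from the camera stream and the pest count is published through Firebase’s REST interface.

```python
# Detection-server loop sketch. Assumptions: hypothetical RTSP URL and
# Firebase RTDB node; best.pt is the trained YOLOv8 weights file.
import cv2
import requests
from ultralytics import YOLO

RTSP_URL = "rtsp://192.168.1.50:554/stream"                            # hypothetical
FIREBASE_URL = "https://agri-bot-demo.firebaseio.com/pest_count.json"  # hypothetical

model = YOLO("best.pt")
cap = cv2.VideoCapture(RTSP_URL)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # Run YOLOv8 on the frame; results[0].boxes holds the detections.
    results = model.predict(frame, conf=0.25, verbose=False)
    pest_count = len(results[0].boxes)
    # Publish the count so the rover's controller can react in real time.
    requests.put(FIREBASE_URL, json=pest_count, timeout=5)

cap.release()
```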

The robot can also be manually controlled using a custom-built Android app named ‘Agri Bot’. Agri Bot was developed using the Kodular app builder and is designed to remotely control the robot for navigation and pesticide pumping operations. The app provides farmers with the ability to operate the robot from anywhere, enabling remote field monitoring and pest control. Agri Bot connects to the robot’s hardware via Wi-Fi, allowing for real-time control of movement and pump activation. This system not only reduces manual labor but also enhances precision in SF practice. User actions update the Firebase Realtime Database (RTDB), prompting the robot to perform specific tasks. This system integrates artificial intelligence and robotics to detect and eliminate pests efficiently in agricultural fields. The robotic rover is controlled using a NodeMCU, while an IP camera mounted on the robot provides live video streaming. Instead of performing image processing onboard, the visual data is transmitted in real time to an external device such as a PC or cloud server, where an AI model (YOLOv8) identifies the presence of pests. Once pests are detected, the rover can either be remotely controlled by a user or operate autonomously to perform targeted pesticide spraying, making the system cost-effective, scalable, and suitable for SF.
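On the rover side, Firebase-mediated control of this kind could be realized with a simple polling loop. The sketch below is MicroPython for the ESP32 and assumes a hypothetical RTDB command node written by the app; urequests ships with common ESP32 MicroPython builds.

```python
# Command-polling sketch (MicroPython on the ESP32). Assumptions:
# hypothetical Firebase RTDB node and command strings written by the
# Agri Bot app; drive/spray helpers are sketched in the control section.
import time
import urequests

CMD_URL = "https://agri-bot-demo.firebaseio.com/cmd.json"   # hypothetical node

def handle(cmd):
    # Dispatch to the drive and spray routines (see the sketch under
    # 'Robot Navigation and Control'); here we only log the command.
    print("command:", cmd)

while True:
    try:
        resp = urequests.get(CMD_URL)
        handle(resp.json())      # e.g. "forward", "backward", "spray"
        resp.close()             # release the socket on a constrained device
    except OSError:
        pass                     # tolerate transient Wi-Fi dropouts
    time.sleep(0.5)              # poll the database twice per second
```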

Working Principle

An AI-powered autonomous robotic rover for pest management uses artificial intelligence and robotics to detect and eliminate pests efficiently. This system can be built using a NodeMCU ESP32 for robot control and an IP camera for live video streaming. Instead of processing images on the robot itself, the system transmits live visuals to an external device, such as a PC or cloud server, where AI-based pest detection is performed. Once pests are detected, the rover can be controlled remotely or programmed for autonomous pesticide spraying. The working of this system consists of four key stages: robot navigation, live video streaming, pest detection, and targeted pesticide spraying. Firebase enables seamless communication between the detection system and the robotic platform: when a pest is detected, a corresponding value is updated in Firebase, triggering the spraying mechanism through a relay-controlled pesticide pump.

Robot Navigation and Control Using NodeMCU

The ESP32 acts as the brain of the entire setup, coordinating various components for autonomous operation. The robot uses four motors, Front Left (FL), Front Right (FR), Back Left (BL), and Back Right (BR), which are connected to a motor driver. The motor driver (L298N) receives movement commands from the ESP32, enabling the robot to move forward, backward, or turn as required.

An IP camera mounted on the robot captures real-time video of the surroundings. This video feed is used by an ML-based pest detection model to identify the presence of pests. When pests are detected, the model sends a signal to the ESP32, which then activates a relay module. This relay module switches on the spray pump to dispense pesticide onto the affected area. Additionally, a servo motor, also controlled by the ESP32, adjusts the direction of the spray to target pests more accurately. The ESP32 integrates data from the camera and pest detection model to make intelligent decisions about navigation and pest control. It drives the motors for movement, controls the spraying mechanism through the relay, and positions the spray nozzle using a servo motor, making the robot capable of autonomous pest detection and elimination.
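A compact MicroPython sketch of this control layer is given below. It is illustrative only: the GPIO pin numbers, relay wiring, and servo pulse mapping are assumptions rather than the exact build.

```python
# ESP32 drive/spray control sketch (MicroPython). Assumptions:
# hypothetical GPIO assignments; L298N direction pins for the left and
# right motor pairs, a relay on pin 25, and a hobby servo on pin 26.
import time
from machine import Pin, PWM

left_fwd, left_rev = Pin(12, Pin.OUT), Pin(13, Pin.OUT)
right_fwd, right_rev = Pin(14, Pin.OUT), Pin(27, Pin.OUT)
relay = Pin(25, Pin.OUT)               # switches the 12V spray pump
servo = PWM(Pin(26), freq=50)          # 50 Hz hobby-servo signal

def forward():
    left_fwd.on(); left_rev.off()
    right_fwd.on(); right_rev.off()

def stop():
    for p in (left_fwd, left_rev, right_fwd, right_rev):
        p.off()

def aim(angle_deg):
    # Map 0-180 degrees onto a ~0.5-2.5 ms pulse within the 20 ms period
    # (duty values of roughly 26-128 on the ESP32's 10-bit PWM scale).
    servo.duty(int(26 + (angle_deg / 180) * 102))

def spray(seconds=2):
    relay.on()                         # energize the pump via the relay
    time.sleep(seconds)
    relay.off()

forward(); time.sleep(1); stop()       # brief demonstration movement
aim(90); spray(2)                      # center the nozzle and spray for 2 s
```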

Live Video Streaming Using an IP Camera

To enable real-time monitoring of pests in agricultural fields, an IP (Internet Protocol) camera is mounted on the robot, playing a vital role in continuously capturing video of the surroundings. This camera converts the captured footage into digital signals and streams it wirelessly over Wi-Fi through the ESP32 NodeMCU, which acts as the network interface. Using streaming protocols such as RTSP or HTTP/MJPEG, the camera ensures low-latency transmission of the live video feed. For automated pest detection, the live footage can also be transmitted to a cloud server or local computer where an AI model processes the video in real time to identify pests. Upon detection, the system sends a signal back to the ESP32, which triggers the pesticide spray mechanism and adjusts the servo motor to aim accurately. Additionally, users can manually view the live feed and control the robot remotely via a web browser or mobile app, providing both autonomous and manual operation modes. This integration of live video streaming with real-time AI processing enhances the efficiency and intelligence of the pest control system.

AI-Based Pest Detection

An AI-powered pest detection system using YOLOv8 and a Roboflow dataset involves a precise and transparent workflow designed for robust performance in real-world agriculture. The dataset is labeled with fine-grained pest and crop disease classes, including armyworm, fall webworm moth, green caterpillar, ladybug, late faw infestation, phasmids, red bug, damaged crop, bacterial blight, early faw infestation, and background; each class is carefully annotated and counted, with Roboflow providing visualizations to balance the number of instances per category. YOLOv8 is a state-of-the-art object detection model known for its high speed and accuracy, making it ideal for tasks that require real-time detection and efficient learning from custom datasets. The training process involved configuring the dataset, setting model parameters, and running multiple training epochs, enabling the model to effectively learn to recognize and distinguish between the different pest species provided within the dataset.

Images are gathered from diverse field and lab conditions, capturing a range of light, backgrounds, and image quality to match practical deployment scenarios. The dataset is split into training, validation, and test sets, typically in 70/20/10 ratios, within Roboflow to ensure unbiased model assessment. In the training phase, the annotated dataset is exported in YOLOv8 format and augmented with techniques such as flipping, rotation, brightness scaling, and noise, enhancing model resilience against varied environments. Ultralytics’ YOLOv8 is then trained on cloud GPUs in Google Colab with hyperparameters meticulously managed, including a learning rate around 0.01, weight decay, batch size adjusted to hardware, and the chosen optimizer, while metrics like loss, precision, recall, and mean Average Precision (mAP) are monitored to track learning progress. Ablation studies are implemented by removing or changing particular augmentations or hyperparameters and noting the resulting changes in key metrics, an approach vital for understanding which components contribute most to performance. After robust validation on the test split, model confidence threshold sensitivity is examined by varying the detection threshold and analyzing the effects on precision and recall.
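The threshold sensitivity analysis mentioned above can be reproduced with a short sweep, assuming the trained weights (best.pt) and the dataset’s data.yaml are available; Ultralytics’ val() accepts a conf argument.

```python
# Confidence-threshold sweep sketch. Assumptions: best.pt and data.yaml
# from the training step are on disk.
from ultralytics import YOLO

model = YOLO("best.pt")
for conf in (0.10, 0.25, 0.50, 0.75):
    # Re-run validation at each threshold and record precision/recall.
    m = model.val(data="data.yaml", conf=conf, verbose=False)
    print(f"conf={conf:.2f}  precision={m.box.mp:.3f}  recall={m.box.mr:.3f}")
```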

The live video feed from the IP camera is processed in real-time using an external system equipped with AI capabilities. This system can be either a local PC or a cloud-based AI service such as Google Colab. The video stream is transmitted to this processing unit, where an AI model, typically a CNN, analyses individual frames to detect pests.14 By focusing pesticide application specifically on these detected regions, the model helps in precise and efficient pest control, reducing unnecessary chemical use and potentially improving crop health through targeted intervention.

The bounding boxes serve as actionable zones for spraying pesticides for a fixed duration, ensuring each infested or damaged area is treated sufficiently. The model performs image classification to accurately identify pests and pinpoint the exact location of infestations within the camera’s field of view. Once a pest is detected, the system can take one of two actions: it can notify the user through alerts for manual control, or it can automatically send a command back to the ESP32 NodeMCU. The NodeMCU then activates the relay module to turn on the pesticide spray and adjusts the servo motor to target the affected area. This real-time analysis and response mechanism allows the system to perform intelligent pest management efficiently and with minimal human intervention (Figure 2).
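One way to turn each bounding box into a spray command is sketched below. It is illustrative only: the 640-pixel frame width, the 0–180 degree servo range, and the Firebase node the ESP32 polls for the aim angle and spray duration are all assumptions.

```python
# Bounding-box-to-spray-command sketch. Assumptions: hypothetical
# Firebase node, 640-pixel frame width, 0-180 degree servo range, and
# the fixed 2-second spray per detected box described in the text.
import requests

FIREBASE_CMD = "https://agri-bot-demo.firebaseio.com/spray_cmd.json"  # hypothetical
FRAME_WIDTH = 640

def box_to_command(box_xyxy, spray_seconds=2):
    x1, _, x2, _ = box_xyxy
    center_x = (x1 + x2) / 2
    angle = round(center_x / FRAME_WIDTH * 180)   # pixel column -> servo angle
    return {"angle": angle, "spray_seconds": spray_seconds}

# Example detection in (x1, y1, x2, y2) pixel form, as produced by
# results[0].boxes.xyxy in the detection loop shown earlier.
boxes_xyxy = [[120.0, 80.0, 200.0, 150.0]]
for box in boxes_xyxy:
    requests.put(FIREBASE_CMD, json=box_to_command(box), timeout=5)
```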

Figure 2: Bounding box.

Targeted Pesticide Spraying

When a pest is detected by the AI-based system, the NodeMCU (ESP32) initiates a targeted pesticide spraying mechanism. It activates a 12V water pump through a relay module, which powers the spraying system. A solenoid valve is used to precisely control the flow of pesticide, ensuring it is only released when needed. When pests are detected, an image with bounding boxes around the detected pests, as well as around areas of “damaged crop” on plant leaves, is created; the system responds by recommending a pesticide spray of 2 seconds for each bounding box identified. The type of nozzle selected for pesticide spraying depends on several crucial factors, including the density of the spray and the quantity of pesticide to be dispersed, which ensure optimal coverage and minimal waste.

To further enhance the effectiveness of the spraying system, an additional small 3D-modelled mechanism is integrated to enable adjustment of the nozzle, allowing for precise directional control and better adaptability to various spraying scenarios and crop needs. This supports efficient distribution, reduces environmental impact, and improves pest management outcomes. Simultaneously, a servo motor adjusts the direction of the spray based on the pest’s detected location, allowing the system to target specific infected areas rather than spraying indiscriminately. This approach significantly reduces chemical usage and environmental impact by applying pesticides only where they are required. The integration of an IP camera with the NodeMCU enables a cost-effective, wireless, and AI-driven robotic rover for intelligent pest management. By combining real-time video streaming, remote manual control, and automated AI-based detection, this system empowers farmers to monitor crops remotely and perform precise pest control with minimal effort.

Future Work

Future Work for the Autonomous Robotic Rover for Pest Management can focus on enhancing its capabilities, expanding its functionality, and improving accessibility. Here are several potential directions for future development:

Enhanced Pest Detection Accuracy: To improve the precision of pest detection, future iterations could incorporate more advanced ML algorithms, such as deep learning models specifically trained on diverse pest datasets. By using neural networks with larger datasets of various pest types and environmental conditions, the system could better distinguish between pests and harmless elements like leaves, shadows, or other small objects. Additionally, a feedback loop allowing the rover to learn from its errors could further increase accuracy over time.

Multi-Sensor Integration:15 Beyond image-based pest detection, integrating additional sensors like temperature, humidity, and soil moisture sensors could provide a more comprehensive understanding of conditions that affect pest presence and crop health. This multi-sensor approach would allow the system to predict potential pest infestations based on environmental conditions, enabling proactive pest management strategies.

Autonomous Navigation and Mapping: While this rover relies on remote control for navigation, future models could incorporate advanced autonomous navigation features. Using GPS, LiDAR, or other localization techniques, the rover could map and navigate the field independently, covering crops more efficiently and avoiding obstacles. This upgrade would enable it to operate without constant human supervision, reducing the need for manual control and further automating pest management.

Future work on the Autonomous Robotic Rover for Pest Management should focus on enhancing detection capabilities, improving energy efficiency, adding autonomy, and integrating with larger farming ecosystems. By addressing these areas, the rover could become a more effective, adaptable, and accessible tool for sustainable agriculture, ultimately supporting global efforts to increase food security and reduce environmental impact.

Conclusion

The Autonomous Robotic Rover for Pest Management presents an innovative, AI-driven solution to one of agriculture’s most persistent challenges: effective pest control. By combining ML, robotics, and precision pesticide application, this rover aims to address the limitations of traditional pest management methods. Its ability to autonomously detect pests and target affected areas for pesticide spraying not only reduces chemical usage but also minimizes environmental impact, contributing to sustainable farming practices. The rover’s design supports adaptability to various terrains, making it suitable for a wide range of agricultural environments. Its use of a NodeMCU as the central controller enables seamless integration of pest detection, navigation, and spraying functions, all of which work in harmony to maximize efficiency.

The system also collects valuable data on pest activity, which can be used for further analysis and improvement of pest control strategies. This project not only demonstrates the potential of AI and robotics in agriculture but also highlights the need for sustainable solutions in modern farming. By reducing dependency on manual labor, cutting down pesticide usage, and improving crop yields, this robotic rover offers a cost-effective and environmentally friendly alternative to conventional pest management. While challenges like high initial costs and maintenance requirements remain, advancements in technology and scalability could make such systems accessible to a broader range of farmers in the future.

Performance of AI-Powered Pest Detection

The AI-powered robotic rover was tested in agricultural fields with various crops to assess its pest detection capabilities. The system, utilizing ML algorithms and real-time image processing, demonstrated high accuracy in identifying and classifying pests.16 The use of an IP camera and a GPU-powered laptop enabled efficient processing, ensuring timely detection and response. The key findings indicate that the rover achieved a 75%–85% detection accuracy, depending on environmental conditions such as lighting, crop density, and pest movement. False positives were minimal, mainly occurring due to similarities between pests and crop debris. However, the system’s adaptability to different terrains ensured consistent performance across varied agricultural landscapes.17–19

The rover can be controlled via a mobile application, designed using Kodular with intuitive UI elements such as buttons, sliders, and touch interfaces. These elements send control signals to the rover for movement commands like forward, backward, left, and right. The app utilizes Bluetooth or Wi-Fi connectivity to establish communication between the mobile device and the rover, ensuring seamless remote operation. Figure 3 shows the splash screen and login screen.

Figure 3: App interface.

Key outcomes of this system compared to other methods: the system cut pesticide consumption by 40%–60%, leading to lower input costs for farmers. Pest control efficiency also improved: since pesticides were applied directly to affected areas, their effectiveness increased, leading to faster pest eradication. The precision-based application prevented pests from developing resistance due to overexposure, ensuring long-term effectiveness. The rover’s ability to deliver pesticides only where necessary not only improved pest control but also aligned with sustainable farming practices by reducing chemical exposure to non-target organisms. The prototype developed is a four-wheel rover on which the whole system is mounted, as shown in Figure 11.20–24

Figure 4: F1-confidence curve.

Experimental Results

The graph in Figure 5 shows the training and validation performance metrics of the object detection model. The first row represents training losses: train/box_loss, train/cls_loss, and train/dfl_loss, all decreasing steadily across epochs, indicating better localization, classification, and distribution fitting.25 The precision and recall curves show the model’s accuracy in detecting objects, with recall improving consistently toward ~0.9 and precision stabilizing around ~0.85–0.9. The second row represents validation metrics: val/box_loss, val/cls_loss, and val/dfl_loss, which also decrease, confirming reduced overfitting. The final two plots show performance in terms of mAP. At IoU 0.5 (mAP50), the score reaches ~0.9, and for the stricter mAP50-95, it improves steadily toward ~0.7. Overall, the model demonstrates effective learning, reduced loss values, high precision and recall, and strong mAP scores, reflecting good generalization and object detection performance (Figure 6).26,27

Figure 5: Test results of pest detection.
Figure 6: Validation results of YOLOv8 model.

The model reached an mAP50 of 0.916, indicating very high accuracy in detecting objects when using the IoU threshold of 0.5. However, the stricter mAP50-95 was 0.702, which shows that performance drops under tighter localization requirements, though this is still a good result. Most classes, such as ladybug and late faw infestation, scored exceptionally well with mAP50 values close to 1.28,29 Overall, the high mAP50 demonstrates that the model is reliable for practical detection tasks, with room for improvement in precise bounding box localization (Figure 7).30

Figure 7: Recall-confidence curve.

The Recall–Confidence curve generated during YOLOv8 model validation (Figure 7) shows how recall (the ability to detect all true objects) changes as the confidence threshold for predictions increases. At low confidence thresholds, the model detects nearly all objects (high recall), but as the confidence threshold rises, recall drops because the model becomes stricter about what it counts as a detection. The thick blue line represents the overall performance across all classes, while the thin colored lines represent individual classes.31,32 Most classes maintain high recall at reasonable confidence values, but some (like damaged crop and green caterpillar) drop off faster, reflecting weaker detection reliability. Overall, the model achieves high recall across most classes, confirming its strong detection ability, though a few classes need improvement.

The Precision–Confidence curve of the YOLOv8 model is depicted in Figure 8. It shows how precision (the proportion of correct detections out of all detections) varies as the confidence threshold changes. At very low confidence values, the model accepts many detections, including false positives, leading to lower precision. As the confidence threshold increases, the model becomes stricter, reducing false positives and driving precision closer to 1. The thick blue line represents the average performance across all classes, which reaches 1.00 precision at a confidence of 0.915, indicating excellent reliability when predictions are made with high confidence. Most individual classes (like ladybug, late_faw_infestation, and fall_webworm_moth) show consistently high precision.

Figure 8: Precision-confidence curve.

The confusion matrix in Figure 9 illustrates the performance of a multi-class classification model used to identify various agricultural pests, diseases, or background elements from images. Each row represents the actual class, while each column indicates the predicted class. The diagonal elements show the number of correct predictions for each class, highlighting the model’s strength in identifying certain categories like “armyworm” (261 correct predictions), “early_faw_infestation” (103), and “late_faw_infestation” (110). However, the matrix also reveals areas of confusion; for instance, “armyworm” was misclassified as “background” 54 times, and some “early_faw_infestation” instances were confused with “late_faw_infestation” (see also the normalized confusion matrix in Figure 10). The F1-confidence curve (Figure 4) shows that the model achieves a strong overall F1-score of 0.87 at a confidence threshold of 0.147, meaning it maintains a good balance between precision and recall. Most classes perform consistently well, though a few like damaged crop and green caterpillar show weaker F1 scores, indicating they are harder for the model to detect accurately.33,34

Figure 9: Confusion matrix.
Figure 10: Normalized confusion matrix.

The F1-confidence curve visualizes the relationship between the F1 score (a balanced measure that combines both precision and recall) and the varying confidence thresholds used by the object detection model. Each colored line represents a different pest class, while the bold blue line indicates the F1 curve averaged over all classes. As the confidence threshold increases, the model becomes more selective: typically, precision rises but recall may fall. The F1 score peaks at the threshold where this balance is optimal (in this case, reaching as high as 0.87 at a confidence of 0.147), indicating the point where both false positives and false negatives are minimized as much as possible. This curve helps select the best operating point for the detector, maximizing reliable detections while limiting errors, and also reveals which pest classes are consistently easier or harder for the model to identify, as those with lower or more unstable F1 scores would benefit from additional targeted data or further model tuning (Figure 4).
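As a worked illustration of the balance the curve captures, the F1 score is the harmonic mean of precision and recall. The precision/recall pairs below are invented example values, with the middle pair chosen to reproduce the reported peak F1 of 0.87 near a confidence of 0.147.

```python
# F1 illustration. Assumption: the precision/recall pairs are invented
# example values; the pair at conf=0.147 is chosen so that F1 ~= 0.87,
# matching the peak reported for the model.
def f1(precision, recall):
    return 2 * precision * recall / (precision + recall)

for conf, p, r in [(0.050, 0.80, 0.91), (0.147, 0.86, 0.88), (0.500, 0.93, 0.74)]:
    print(f"conf={conf:.3f}  F1={f1(p, r):.3f}")
```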

Figure 11: Prototype of the project.

The rover is built on a four-wheel platform, housing a white enclosure that contains core electronics. On top, the mounted camera serves as the primary vision sensor for real-time pest detection by capturing images of crops as the rover traverses the field. The camera sends these images to the onboard or remote AI system, which analyzes them using the trained YOLOv8 model to identify pests. The robotic rover’s design supports automated navigation and integrates a spray mechanism (visible in the image) for targeted pesticide application when pests are detected. This mobile system enables efficient, precise pest management while minimizing chemical usage and manual labor, and forms a scalable solution for smart agriculture applications (Figure 12).

Figure 12: Evaluation on different types of pests.
References
  1. Kolapo F, Lamidi S, Idika A, Philip OB, Mba KM, Olayinka KE, et al. Robotic solutions for precision agriculture. J Agric Sci Pract. 2024;9(4):123–30. https://doi.org/10.31248/JASP2024.483
  2. Cheng C, Fu J, Su H, Ren L. Recent advancements in agriculture robots: benefits and challenges. Machines. 2023;11:48. https://doi.org/10.3390/machines11010048
  3. Mohyuddin G, Khan MA, Haseeb A, Mahpara S, Waseem M. Evaluation of machine learning approaches for precision farming in smart agriculture system: a comprehensive review. IEEE Access. 2024. https://doi.org/10.1109/ACCESS.2024.3390581
  4. Elbeheiry N, Balog RS. Technologies driving the shift to smart farming: a review. IEEE Sens J. 2023;23(1):1752–69. https://doi.org/10.1109/JSEN.2022.3225183
  5. Chen C-J, Huang Y-Y, Li Y-S, Chang C-Y, Huang Y-M. An AIoT based smart agricultural system for pests detection. IEEE Access. 2020;8:180750–61. https://doi.org/10.1109/ACCESS.2020.3024891
  6. Ramalingam B, Mohan RE, Pookkuttath S, Gómez BF, Sairam Borusu CS, Wee Teng T, et al. Remote insects trap monitoring system using deep learning framework and IoT. Sensors. 2020;20(18):5280. https://doi.org/10.3390/s20185280
  7. Singh, Kumar P, Singh V. Deep learning for pest detection and classification in agriculture. Comput Electron Agric. 2021;184:106123.
  8. Rewar E, Singh BP, Sharma OP. Evaluation of foliar diseases using image processing. In: Proceedings of the international conference on current trends in computer, electrical, electronics and communication (CTCEEC), Mysore, India; 2017. p. 611–5. https://doi.org/10.1109/CTCEEC.2017.8454987
  9. Bharate AA, Shirdhonkar MS. A review on plant disease detection using image processing. In: Proceedings of the international conference on intelligent sustainable systems (ICISS), Palladam, India; 2017. p. 103–9. https://doi.org/10.1109/ISS1.2017.8389326
  10. Kumar SS, Raghavendra BK. Diseases detection of various plant leaf using image processing techniques: a review. In: Proceedings of the 5th international conference on advanced computing & communication systems (ICACCS), Coimbatore, India; 2019. p. 313–6. https://doi.org/10.1109/ICACCS.2019.8728325
  11. Saranya T, Deisy C, Sridevi S, Anbananthen KSM. A comparative study of deep learning and Internet of Things for precision agriculture. Eng Appl Artif Intell. 2023;122:106034. https://doi.org/10.1016/j.engappai.2023.106034
  12. Ünal Z. Smart farming becomes even smarter with deep learning, a bibliographical analysis. IEEE Access. 2020;8:105587–609. https://doi.org/10.1109/ACCESS.2020.3000175
  13. Solanke S, Mehare P, Shinde S, Ingle V, Zope S. IoT based crop disease detection and pesting for Greenhouse—a review. In: Proceedings of the 3rd international conference for convergence in technology (I2CT), Pune, India; 2018. p. 1–4. https://doi.org/10.1109/I2CT.2018.8529156
  14. Nebot P, Torres-Sospedra J, Recatala G. Using neural networks for maintenance tasks in agriculture: precise weed detection. In: Proceedings of the international conference of agricultural engineering (CIGR-AgEng); 2012.
  15. Johnson N, Kumar MBS, Dhannia T. A study on the significance of smart IoT sensors and data science in digital agriculture. In: Proceedings of the advanced computing and communication technologies for high performance applications (ACCTHPA), Cochin, India; 2020. p. 80–8. https://doi.org/10.1109/ACCTHPA49271.2020.9213207
  16. Al-Hiary H, Bani-Ahmad S, Reyalat M, Braik M, Alrahamneh Z. Fast and accurate detection and classification of plant diseases. Int J Comput Appl. 2011;17(1):31–8. https://doi.org/10.5120/2183-2754
  17. Pawar A, Pawaskar M, Ghodke S. Review of plant disease detection and diagnosis using deep learning model. Mukt Shabd Journal. 2020;9(6):1522–6.
  18. Gerten D, Heck V, Jägermeyr J, Bodirsky BL, Fetzer I, Jalava M, et al. Feeding ten billion people is possible within four terrestrial planetary boundaries. Nat Sustain. 2020;3:200–8. https://doi.org/10.1038/s41893-019-0465-1
  19. Gharde Y, Singh PK, Dubey RP, Gupta PK. Assessment of yield and economic losses in agriculture due to weeds in India. Crop Prot. 2018;107:12–8. https://doi.org/10.1016/j.cropro.2018.01.007
  20. Kamble PL, Pise AC. Review on agricultural plant disease detection by using image processing. In: International journal of latest trends in engineering and technology (IJLTET); 2019.
  21. Albani D, IJsselmuiden J, Haken R, Trianni V. Monitoring and mapping with robot swarms for agricultural applications, presented at the 2017 IEEE international conference on robotics and automation (ICRA), Singapore; 2017. https://doi.org/10.1109/AVSS.2017.8078478
  22. Lee J, Park K, Kim S. Swarm robotics for pest control in agriculture. J Robot. 2022;2022:123456.
  23. Ayaz M, Ammad-Uddin M, Sharif Z, Mansour A, Aggoune EM. Internet-of-Things (IoT)-based smart agriculture: toward making the fields talk. IEEE Access. 2019;7:129551–83. https://doi.org/10.1109/ACCESS.2019.2932609
  24. Sharma BB, Kumar N. Internet of Things-based hardware and software for smart agriculture: a review. In: Proc ICRIC. Cham, Switzerland: Springer; 2020. p. 151–7. https://doi.org/10.1007/978-3-030-29407-6_13
  25. Sreekantha DK, Kavya A. Agricultural crop monitoring using IoT—a study. In: Proceedings of the 11th international conference on intelligent systems and control (ISCO); 2017. p. 134–9.
  26. Devi RK, Muthukannan M. An Internet of Things-based economical agricultural integrated system for farmers: a review. In: Proceedings of the 4th international conference on intelligent computing and control Systems, Madurai, India; 2020. p. 666–73. https://doi.org/10.1109/ICICCS48265.2020.9121006
  27. Sarker MNI, Wu M, Chanthamith B, Yusufzada S, Li D, Zhang J. Big data driven smart agriculture: pathway for sustainable development. In: Proceedings of the 2nd international conference on artificial intelligence and big data (ICAIBD), Chengdu, China; 2019. p. 60–5. https://doi.org/10.1109/ICAIBD.2019.8836982
  28. Singh S, Singh P, Kaur A. A survey on image processing techniques for seeds classification. In: Proceedings of the international conference on computer science (ICCS), Jalandhar, India; 2018. p. 143–50. https://doi.org/10.1109/ICCS.2018.00032
  29. Hu Z, Xu L, Cao L, Liu S, Luo Z, Wang J, et al. Application of non-orthogonal multiple access in wireless sensor networks for smart agriculture. IEEE Access. 2019;7:87582–92. https://doi.org/10.1109/ACCESS.2019.2924917
  30. Kimball JW, Kuhn BT, Balog RS. A system design approach for unattended solar energy harvesting supply. IEEE Trans Power Electron. 2009;24(4):952–62. https://doi.org/10.1109/TPEL.2008.2009056
  31. Bodic M, Vukovic P, Rajs V, Vasiljevic-Toskic M, Bajic J. Station for soil humidity, temperature and air humidity measurement with SMS forwarding of measured data. In: Proceedings of the 41st international spring seminar on electronics technology (ISSE), Zlatibor, Serbia; 2018. p. 1–5. https://doi.org/10.1109/ISSE.2018.8443618
  32. Bayrakdar ME. A smart insect pest detection technique with qualified underground wireless sensor nodes for precision agriculture. IEEE Sens J. 2019;19(22):10892–7. https://doi.org/10.1109/JSEN.2019.2931816
  33. Sumathi N, Venkatalakshmi B. Design of 433 MHz compatible matching network of wake-up receiver for wireless sensor node. In: Proceedings of the global conference on communication technologies (GCCT), Thuckalay, India; 2015. p. 807–11. https://doi.org/10.1109/GCCT.2015.7342774
  34. Boulou M, Yelemou T, Rollande DA, Tall H. DEARP: Dynamic energy aware routing protocol for wireless sensor network. In: Proceedings of the IEEE 2nd international conference on smart cities and communities (SCCIC), Ouagadougou, Burkina Faso; 2020. p. 1–6. https://doi.org/10.1109/SCCIC51516.2020.9377331
  35. Gulec O, Haytaoglu E, Tokat S. A novel distributed CDS algorithm for extending lifetime of WSNs with solar energy harvester nodes for smart agriculture applications. IEEE Access. 2020;8:58859–73. https://doi.org/10.1109/ACCESS.2020.2983112
  36. Chatterjee B, Seo D-H, Chakraborty S, Avlani S, Jiang X, Zhang H, et al. Context-aware collaborative intelligence with spatio-temporal in-sensor-analytics for efficient communication in a large-area IoT testbed. IEEE Internet Things J. 2021;8(8):6800–14. https://doi.org/10.1109/JIOT.2020.3036087

