Automation of Agricultural Data Processing Using Computer Vision and IoT Technologies: An Experimental Study

Andrii Povsheniuk
Department of Computer Sciences and Applied Mathematics, National University of Water and Environmental Engineering, Rivne, Ukraine
Correspondence to: Andrii Povsheniuk, andriipovsheniuk@gmail.com

Premier Journal of Science

Additional information

  • Ethical approval: N/a
  • Consent: N/a
  • Funding: No industry funding
  • Conflicts of interest: N/a
  • Author contribution: Andrii Povsheniuk – Conceptualization, Writing – original draft, review and editing
  • Guarantor: Andrii Povsheniuk
  • Provenance and peer-review: Unsolicited and externally peer-reviewed
  • Data availability statement: N/a

Keywords: Agriculture, Drones, Neural networks, Sensors, Visual monitoring.

Peer Review
Received: 2 September 2025
Last revised: 13 October 2025
Accepted: 13 October 2025
Version accepted: 4
Published: 7 November 2025

Plain Language Summary Infographic
Overview of the study’s background, experiment, results, and conclusion, showing drones, IoT sensors, and a tractor, and illustrating improved yield and water efficiency.
Abstract

The aim of the study was to analyse and evaluate the potential of integrating the Internet of Things (IoT), drone technologies, and neural networks in agriculture for effective monitoring and optimisation of agronomic processes. The experiment was conducted using sensors to collect data on soil moisture, temperature, and acidity in the fields, as well as drones for spectral imaging, which allowed the assessment of the condition of crops such as wheat, corn, and sunflower. Using the collected data, seasonal changes in growth conditions, including fluctuations in soil moisture, air temperature, and acidity, were identified, which required prompt interventions to adjust agronomic measures, such as additional irrigation or fertiliser application.

Furthermore, based on the Normalised Difference Vegetation Index and detailed processing of drone images, the number and location of stress zones in the fields, caused by plant diseases and deficiencies of important nutrients, were identified. After applying neural networks to analyse the plant images, classification accuracy reached 93.5% for wheat and 91.8% for corn. Comparison with traditional monitoring methods demonstrated significant advantages in accuracy and processing speed. The study also showed a 12% reduction in water consumption while maintaining or even increasing crop yields due to more precise resource management and the use of precision agronomy. The high potential of applying IoT and drone technologies in the agricultural sector was confirmed for reducing environmental impact, rational use of water and energy resources, and improving crop yields and the efficiency of agricultural production.

Introduction

The progressive development of agriculture has become an important challenge, driven by the necessity to ensure food security and optimise resources. Technological advancements such as computer vision, the Internet of Things (IoT), and machine learning have already been established as key tools for automating agricultural processes. In particular, these advancements offer new opportunities for crop monitoring, automation of agricultural enterprises, and reduction of losses caused by inefficient approaches. However, the level of practical adoption remains insufficient due to a range of factors, including the complexity of the technologies, limited availability of equipment, and low awareness among participants in the agricultural market.1

The scientific literature highlights many aspects of digitalisation in the agricultural sector, but the issue of integrating modern technologies into traditional practices remains relevant.2 Mulla emphasised notable progress in remote sensing for precision agriculture over the past 25 years.3 He observed the diversification of sensor platforms, ranging from soil organic matter sensors to satellite, aerial, and handheld devices, as well as the broadening of electromagnetic wavelengths employed, currently encompassing ultraviolet to microwave. The emergence of hyperspectral sensing has facilitated a more accurate description of crop biophysical and biochemical traits. Mulla also highlighted enhancements in the spatial and temporal resolution of imaging, enabling near real-time management of soil, crop, and pest conditions. Despite these improvements, he recognised persistent problems, such as the need for standardised methodologies and frameworks that are usable across diverse climatic, soil, crop, and management situations. A similar study was conducted by Dhanya et al., who emphasised the importance of deep learning in developing “smart” agriculture but noted that high computational complexity and large costs remain key barriers.4

An important contribution to understanding the capabilities of IoT in the agricultural sector was made by Kim et al., who underlined that the integration of sensors, cloud data storage, and automated management significantly enhances the efficiency of soil and crop monitoring systems.5 At the same time, the authors noted issues with technology compatibility and barriers to widespread adoption. The research by Ouhami et al. focused on combining IoT with machine learning methods for plant disease diagnosis, but the authors stressed the need for standardising data analysis approaches and adapting models to the specific conditions of different regions.6 Ghazal et al. explored the application of computer vision for precision farming, emphasising its role in increasing yields through optimised resource use.7 Kamal et al. drew attention to the advantages of IoT automation using segmentation methods to determine the condition of plantings, enabling timely identification of stress factors and optimisation of irrigation.8 These studies confirm the importance of automation technologies for reducing costs and increasing efficiency, yet gaps remain in providing an integrated approach to combining computer vision with intelligent management systems.

Similar studies also indicate the potential of computer vision methods in the food industry and agricultural product analysis. Kakani et al. emphasise the importance of adapting these technologies for product quality assessment but highlight limitations related to processing large volumes of data and hardware requirements.9 Fracarolli et al. also note the significance of these tools in quality control but stress the need for standardising methods for broad implementation.10 Such approaches contribute to solving certain problems but require further research to ensure scalability and integration with other components of agroecosystems. Regarding resource management, Vij et al. developed approaches for automating irrigation systems using machine learning, which reduced water consumption and optimised agrotechnical processes.11 However, the question of the impact of such systems on the economic efficiency of large agricultural enterprises remains open. This demonstrates the importance of assessing not only the technical but also the economic aspects of using these technologies.

National aspects of digitalisation in the agricultural sector were studied by Chechetova et al., who noted that in Ukraine, the process of implementing digital solutions in agriculture is at an early stage.1 The authors highlighted the low level of farmers’ awareness of digital technologies, insufficient investment, and the lack of a systematic government policy in this area. Despite a significant number of studies in this field, several gaps remain that require further investigation. In particular, the issue of adapting modern technologies to the conditions of medium and small agricultural enterprises needs clarification, as well as determining the economic efficiency of such solutions in the long term. The aim of the work was to investigate the possibilities of integrating IoT, drone technologies, and neural networks for automating the monitoring and optimisation of agronomic processes, specifically through the use of computer vision to analyse the condition of agricultural crops.

Materials and Methods

The research was conducted on three large agricultural sites of the VITAGRO group of companies in the Rivne region, with a total area of 2,500 ha, which ensured the representativeness of the agro-production analysis. The experiment lasted throughout 2024 during the growing season of the main crops: wheat (April-July), corn (May-September), and sunflower (May-August). The choice of crops was determined by their prevalence in the Rivne district and differences in agronomic growing conditions, which provided broad coverage of parameters for the study. A total of 250,000 plots were distributed across the three crops, facilitating thorough data collection: 120,000 plots were designated for wheat, 80,000 for corn, and 50,000 for sunflower. Each plot measured 10 m by 10 m, for an area of 100 m² per plot. The plots were methodically allocated throughout the sites and assigned to treatments based on a randomisation protocol that reduced bias due to environmental differences, including soil type and microclimate. Three treatment types were administered for each crop: irrigation, fertilisation, and pest control. A total of 225,000 treatment plots were used across all crops: 108,000 for wheat, 72,000 for corn, and 45,000 for sunflower. The treatment plots were randomly allocated to provide an impartial distribution across the research region.

Alongside the treatment plots, a total of 25,000 control plots were included across all crops: 12,000 for wheat, 8,000 for maize, and 5,000 for sunflower. The control plots provided baseline data for comparison, facilitating the evaluation of treatment effects. Plots were blocked according to crop type and field variables (e.g., soil type, microclimate), with treatments randomly allocated within these blocks to guarantee impartial findings. The methodical distribution of control and treatment plots ensured that the results were statistically sound and representative of actual agricultural settings (Table 1).

Table 1: Overview of study design, plot allocation, and statistical analysis methodology.
Component | Description
Crops | 3 sites: Wheat (1,200 ha), Corn (800 ha), Sunflower (500 ha)
Total Number of Plots | 250,000 total plots across all crops (including 25,000 control plots)
Number of Plots per Crop | 120,000 for Wheat, 80,000 for Corn, 50,000 for Sunflower
Plot Size | 10 m × 10 m (100 m²) per plot
Blocking | Blocked by crop type and field variations (e.g., soil type, microclimate)
Randomization | Treatments randomly assigned within blocks for each crop to ensure unbiased allocation
Number of Treatment Groups | 3 treatments per crop (e.g., Irrigation, Fertilization, Pest Control)
Control Plots | 10% control plots per crop (12,000 for Wheat, 8,000 for Corn, 5,000 for Sunflower)
Total Control Plots | 25,000 total control plots across all crops
Total Treatment Plots | 225,000 treatment plots across all crops (Wheat: 108,000, Corn: 72,000, Sunflower: 45,000) used for the analysis
Statistical Analysis | ANOVA, followed by Tukey’s HSD for post-hoc comparisons, assessing treatment effects on crop yields, NDVI, water usage, and soil conditions
Data Collection Methods | IoT sensors for soil moisture, temperature, acidity; drone imaging for crop condition monitoring
Outcome Measures | Crop yield, NDVI, water consumption, soil moisture, soil pH

Wheat was grown on an area of 1,200 ha. Data collection was conducted at all stages of crop growth, including sowing, vegetation, and ripening periods. Measurements were taken regarding soil moisture levels, acidity, and air temperature, as well as weather conditions that could affect the overall state of the crops. Special attention was paid to evaluating wheat yields depending on climatic factors and applied agrotechnologies. Corn was the main crop on 800 ha. The monitoring of the vegetation process included stages from sprouting to ripening. Soil acidity and moisture levels, irrigation intensity, and pests and diseases that could influence the yield were assessed. Data on weather conditions, including temperature and precipitation, were gathered for further analysis. Sunflower was grown on 500 ha. The collection of agronomic data included stages of growth and flowering. Monitoring covered soil moisture and acidity levels, as well as an analysis of weather conditions that impacted sunflower growth and development. Stress factors such as drought or high temperatures, which could negatively affect yields, were also monitored.

The study employed IoT technology and unmanned aerial vehicles (drones) for data collection, facilitating comprehensive monitoring of agricultural crops and their environment. Surface IoT sensors, such as WET-2 moisture meters (Decagon Devices, USA), PT100 thermometers (Omega Engineering, USA), and LAQUAtwin B-711 soil pH sensors (Horiba Scientific, Japan), were used to gather precise measurements of soil moisture, temperature, and acidity. The WET-2 sensors were calibrated prior to the season using control soil samples with established characteristics to ensure accuracy. Handheld LAQUAtwin pH meters facilitated spot sampling in designated sub-areas for specific pH readings, while IoT sensors linked with LoRaWAN enabled continuous monitoring over the whole research area. The data from the handheld meters were mostly used for the calibration and validation of the real-time data gathered by the IoT sensors, which provided an extensive dataset of pH records.

The LAQUAtwin B-711 pH sensors were used for discrete readings rather than continuous pH monitoring. These portable instruments were used at weekly intervals to evaluate pH levels throughout the vegetative phase and at daily intervals during the active growth period. The pH measurements were carefully documented in specific sub-areas of the field, deliberately chosen to reflect diverse soil conditions across the 2,500 ha expanse. In contrast to the IoT sensors that monitored parameters continuously, the LAQUAtwin B-711 sensors were not integrated with the LoRaWAN system for real-time data transfer. Instead, following each series of measurements, the data were manually sent to the farm’s server over LoRaWAN modules for incorporation into the cloud-based AgriSync platform. The soil sensors were installed in the topsoil layer (0–5 cm) for precise monitoring of changes in surface horizon characteristics. During the growing season, about 34,560,000 pH values were acquired across all crops through periodic spot measurements.

Drones collected data on a weekly basis throughout 2024. Drone operations lasted roughly 3–4 hours per mission, providing comprehensive coverage of the 2,500 ha area. Flights were executed at an altitude of 50 m above ground level (AGL) with a 20 MP camera, ensuring an ideal balance between detail and coverage for crop monitoring. Each flight yielded approximately 1,000 images, depending on the specific overlap settings and flight conditions. Complete coverage of the 2,500 ha area required approximately 52,083 images at 70% forward and 60% lateral overlap, or 78,125 images at 80% forward overlap. The photos were taken at consistent intervals during the extended flight durations, guaranteeing sufficient overlap and accurate mapping of the entire field.

The Ground Sampling Distance was maintained at roughly 2–3 cm per pixel at an altitude of 50 m AGL, guaranteeing high-resolution imagery appropriate for accurate vegetation analysis and crop status assessment. Each weekly sortie was meticulously scheduled to encompass distinct segments of the 2,500 ha field, optimising flight trajectories to provide comprehensive coverage without duplication. Weekly sorties (about 52 flights annually) facilitated consistent and thorough monitoring, allowing for real-time updates on crop health, soil conditions, and areas of potential stress. Each sortie included a flight log that documented essential metrics such as flight duration, area covered, and image counts, thus ensuring complete traceability of the acquired data.

Images of the fields, necessary for analysing plant conditions, were obtained using Da-Jiang Innovations (DJI) Phantom 4 Multispectral drones (DJI, China), equipped with multispectral cameras and Global Positioning System navigation, which ensured high monitoring accuracy (Figure 1). The optimal shooting distance was 50 m, which provided high image detail. The DJI Phantom 4 Multispectral is equipped with a multispectral camera that captures images in five spectral bands (Blue 475 nm, Green 560 nm, Red 664 nm, Red Edge 730 nm, Near-Infrared 840 nm), each with 2 MP resolution. The drone also features a 20 MP RGB camera for high-resolution visual imagery, supporting comprehensive crop monitoring. These bands are crucial for vegetation study, including the computation of vegetation indices such as NDVI, which are used to evaluate crop health and vitality.

Figure 1: Technical drawing of the DJI Phantom 4 Multispectral drone (DJI, China) with main components.
Note: Main components of the drone: 1 – gimbal and multispectral camera; 2 – bottom-view system; 3 – Micro USB port; 4 – camera/connection status indicator; 5 – link button; 6 – camera microSD card slot; 7 – front vision system; 8 – infrared sensing system; 9 – front light-emitting diodes; 10 – motors; 11 – propellers; 12 – aircraft status indicator; 13 – OcuSync antennas.
Source: Adapted by the author from official DJI documentation.

Multiple calibration procedures were implemented to guarantee precise radiometric measurements. Reflectance panels were deployed in the field to function as reference targets for calibrating the camera’s raw digital numbers to surface reflectance values. A sunlight sensor was employed to quantify solar irradiance during data gathering, which facilitated adjustments for variations in sunlight intensity during the day and guaranteed uniform calibration. Additionally, a vignetting correction procedure was implemented to eliminate radial intensity discrepancies induced by the camera lens, especially at the image peripheries, which is crucial for the accurate assessment of vegetation indices.
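In the simplest single-panel form of this calibration (the empirical-line method with one reference target), a pixel’s surface reflectance R in a given band can be estimated from its raw digital number DN as R = DN × (R_panel / DN_panel), where R_panel is the known reflectance of the panel and DN_panel is the panel’s observed digital number in the same band. This is a standard simplification given here for orientation; the actual workflow may have relied on vendor software with additional correction terms.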

Subsequent to data acquisition, the images underwent processing with radiometric correction software to compensate for atmospheric conditions and transform the raw data into surface reflectance values. This post-processing phase guarantees data consistency and comparability across various flights and environmental circumstances, yielding dependable outcomes for vegetation health evaluation. As noted above, the ground sampling distance of the imagery was roughly 2–3 cm/pixel at an altitude of 50 m, and the 70% longitudinal and 60% lateral overlap settings required 52,083 images to cover 2,500 ha per week (78,125 images at 80% longitudinal overlap).

Spectral data obtained from drones were processed using a workstation equipped with Pix4Dmapper software. This software was used to process aerial photographs, generate orthophotos and 3D models, and analyse spectral vegetation indices such as the Normalised Difference Vegetation Index (NDVI). The workstation was equipped with a powerful central processing unit responsible for computation; a graphics processing unit for image and graphical data processing; Random Access Memory (RAM) for temporary data storage during processing; and a hard drive that enabled fast access to stored data. A Wi-Fi module was used for data transmission from the drones, allowing real-time image transfer and integration with the AgriSync management platform. At this stage, sensor readings and spectral images were synchronised using GPS timestamps and geospatial coordinates, facilitating the direct alignment of IoT soil data with vegetation indices. The integrated dataset was analysed using TensorFlow pipelines, where convolutional neural networks classified crop conditions and transmitted recommendations to AgriSync dashboards for prompt decision-making.

The data processing workflow included filtering and verification, which were performed on a local server where raw data were checked for sensor malfunctions or anomalies caused by extreme climatic conditions. Matrix Laboratory (MATLAB) software was used to carry out this step, enabling efficient data processing, the application of anomaly detection algorithms, and statistical analysis. Image processing was conducted using the TensorFlow platform for deep learning and neural network deployment. Convolutional neural networks, developed on the TensorFlow platform, processed the drone images and classified the crops according to their condition. The study was organised into three stages. The initial stage involved data collection via IoT sensors and drones at each plant growth phase: the initial stage (germination and vegetative organ formation), the intermediate stage (reproductive organ development), and the final stage (ripening and yield accumulation). All data were collected in real time, which allowed for preliminary monitoring and the establishment of a baseline for further analysis.

The second stage involved analysing crop conditions using computer vision methods, particularly image processing in the visible Red-Green-Blue (RGB) and near-infrared (NIR) spectra. RGB imagery enabled analysis of plant colour characteristics such as leaf discolouration, which are key indicators of disease, nutrient deficiency, or other stress factors. NIR imagery was used to assess photosynthetic activity, allowing for the early detection of stress indicators invisible in the visible spectrum. The images obtained from drones were processed using computer vision algorithms such as convolutional neural networks, which allowed for highly accurate crop condition classification. The use of spectral vegetation indices, particularly NDVI, made it possible to assess vegetation levels and detect zones of low photosynthetic activity. To ensure stable neural network performance in real time, a Graphics Processing Unit (GPU) based computing platform was used: the GPU architecture, optimised for parallel computation, enabled efficient processing of large data volumes. This was especially important for the rapid processing of spectral images from the fields and for maintaining high model performance under real-world conditions. Data were processed for each growth stage, which enabled detailed analysis of crop condition changes throughout the season. The NDVI was calculated as part of this analysis to assess plant health.
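For reference, NDVI is derived per pixel from the red and near-infrared reflectance bands as NDVI = (NIR - Red) / (NIR + Red). A minimal NumPy sketch follows, assuming the calibrated bands are available as floating-point reflectance arrays; the array and function names are illustrative, not part of the study's codebase.

import numpy as np

def compute_ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    # Per-pixel NDVI; values lie in [-1, 1], with dense healthy vegetation typically above 0.6
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denominator = nir + red
    ndvi = np.zeros_like(denominator)
    # Guard against division by zero over non-reflective pixels
    np.divide(nir - red, denominator, out=ndvi, where=denominator > 0)
    return ndvi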

The collected data were used to update crop condition information via Application Programming Interfaces (APIs) connected to cloud servers for analysis and storage. The TensorFlow platform facilitated rapid responses to changes in plant condition and generated recommendations for agrotechnical measures. The imagery and crop condition data were updated through the system, enabling integrated monitoring of agronomic and technological parameters. The platform employed algorithms for deep analysis of detected anomalies and automatically proposed corresponding agrotechnical actions. The interface allowed farmers to receive recommendations for adjusting farming practices based on model outputs, without the need for physical presence in the field. Thus, the enhanced image processing system – combining spectral analysis with data integration from IoT sensors – not only improved the accuracy of early-stage problem detection but also ensured the timely implementation of measures that minimise human error and increase the overall efficiency of agronomic operations (Figure 2).

Fig 2 | Flowchart of IoT sensor and drone data processing
Figure 2: Flowchart of IoT sensor and drone data processing.

The third stage involved applying the obtained data to optimise agronomic processes such as irrigation management, fertiliser application, and plant protection, as well as to make further yield and efficiency forecasts for each of the studied crops. The data obtained during the field trials were collected through statistical observations. An ANOVA statistical analysis was performed to evaluate variations in yield, NDVI, and water usage between treatment and control plots. Confidence intervals (CI: 95%) and p-values were computed for each parameter. The resulting database, including indicators of soil moisture, soil acidity, and field imagery, was systematised and analysed to assess plant health. Each parameter was analysed in a dynamic context.

The primary response variables comprised crop yield, NDVI, water consumption, soil moisture, and soil pH. Water consumption was monitored as an indicator of sustainability, while soil moisture and pH levels offered insights into the environmental conditions in which the crops were cultivated. The NDVI was used to assess crop health, utilising spectral data obtained from drone imaging. To verify the validity of the ANOVA results, multiple assumptions were assessed. The normality of the residuals was evaluated by Q-Q plots and the Shapiro-Wilk test; in instances of substantial departures from normality, data transformations (e.g., logarithmic transformations) or non-parametric tests were considered. Levene’s test was utilised to assess the homogeneity of variance across treatment groups. Where the assumption of homogeneity of variances was violated, Welch’s ANOVA was utilised to account for the unequal variances. The experimental design guaranteed the independence of observations by randomly assigning plots and collecting data at intervals that reduced temporal correlation.
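A brief sketch of these assumption checks with SciPy is given below; the yield arrays are synthetic stand-ins for per-plot measurements, not the study's data.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-ins for per-plot wheat yields (t/ha)
yields_irrigated = rng.normal(4.6, 0.3, 200)
yields_control = rng.normal(4.45, 0.3, 200)

# Normality check (shown per group here for brevity; Q-Q plots were also used)
_, p_shapiro = stats.shapiro(yields_irrigated)
# Homogeneity of variance across groups (Levene's test)
_, p_levene = stats.levene(yields_irrigated, yields_control)

if p_levene >= 0.05:
    f_stat, p_value = stats.f_oneway(yields_irrigated, yields_control)
else:
    # Welch's correction for unequal variances (two-group case)
    t_stat, p_value = stats.ttest_ind(yields_irrigated, yields_control, equal_var=False)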

Effect sizes were computed to evaluate the extent of treatment effects. Partial eta-squared (η²) was employed to measure the proportion of variance elucidated by each treatment. An η² value of 0.12 signifies a moderate influence of irrigation treatment on crop output. CIs at the 95% level were presented for each effect size to delineate a range within which the true population parameter is expected to reside, enhancing the contextual understanding of the statistical significance of the findings. The 95% CI for the irrigation effect on wheat yield ranged from 0.12 to 0.18 t/ha, signifying the anticipated yield enhancement attributable to irrigation.
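For reference, partial eta-squared is computed from the ANOVA table as η² = SS_treatment / (SS_treatment + SS_error), where SS denotes the relevant sums of squares; the reported η² of 0.12 therefore indicates that irrigation accounts for roughly 12% of the yield variance once other sources of error are set aside.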

The hypotheses for the analysis were defined as follows. The first null hypothesis (H0) asserts that there is no substantial difference in wheat yield between irrigated and non-irrigated plots, whereas the alternative hypothesis (H1) indicates that irrigation significantly affects wheat output. Another H0 posits that the treatments (irrigation, fertilisation, pest management) do not significantly influence the NDVI values of wheat, maize, or sunflower crops, while the alternative H1 contends that the treatments do exert a significant effect on NDVI. Furthermore, an H0 assumes that there is no significant difference in water usage between treatment and control plots, whereas the corresponding H1 suggests that water usage is significantly lower in the treatment plots relative to the controls. The final H0 states that fertilisation does not significantly influence the yields of wheat, maize, or sunflower, whereas its H1 claims that fertilisation has a major impact on crop yields.

ANOVA was utilised to evaluate these hypotheses, with Tukey’s Honest Significant Difference (HSD) applied for post-hoc comparisons to discern specific differences among treatment groups. Post-hoc analyses indicated that irrigated plots produced considerably higher wheat yields than non-irrigated plots, although fertilised and non-fertilised plots exhibited no significant difference. The study was organised to avoid presenting p-values without explicitly stated assumptions, thus ensuring that statistical outcomes were closely tied to clearly articulated experimental questions. Effect sizes (η²) were presented to elucidate the magnitude of treatment effects, and confidence intervals (CIs) were incorporated to furnish a range of plausible values for the effects. Classification models were employed in the machine learning evaluation to predict crop health using multispectral data. The machine learning pipeline was developed to categorise plant health into classifications (e.g., healthy, stressed, and diseased) utilising the spectral bands obtained from the UAV camera. To guarantee robustness, cross-validation was employed to avert overfitting and ensure that the models generalised effectively to new data.
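A compact illustration of the post-hoc step with statsmodels follows; the group labels and per-plot yields are synthetic placeholders rather than the study's records.

import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(1)
# Synthetic per-plot yields (t/ha) for three treatment groups
yields = np.concatenate([
    rng.normal(4.6, 0.3, 150),   # irrigated
    rng.normal(4.5, 0.3, 150),   # fertilised
    rng.normal(4.45, 0.3, 150),  # control
])
groups = np.repeat(["irrigated", "fertilised", "control"], 150)

# Tukey's HSD: pairwise mean differences with family-wise error control
result = pairwise_tukeyhsd(endog=yields, groups=groups, alpha=0.05)
print(result.summary())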

The evaluation criteria for machine learning classification encompassed the confusion matrix, which detailed the counts of true positives, true negatives, false positives, and false negatives. Precision, recall, and F1 scores were computed for each class. These indicators were essential for assessing the model’s proficiency in accurately categorising each crop health classification, particularly in the context of class imbalance. Furthermore, macro-averaged and micro-averaged metrics were calculated to summarise performance across all categories: macro-averaging independently computes metrics for each class before averaging them, whereas micro-averaging consolidates the contributions of all classes to obtain the overall performance average. To rectify class imbalance, class weights were utilised in the model; supplementary methods such as oversampling, undersampling, or focal loss may be integrated in future work to further alleviate bias towards the majority class. The model was trained using data splits that prevented temporal and spatial leakage. Specifically, the data were carefully split to ensure that training, validation, and testing datasets did not overlap, either by field or by season, which helps mitigate any potential bias from data leakage.

The convolutional neural network (CNN) employed for classifying crop conditions is based on the ResNet-50 architecture, comprising 50 layers and using residual blocks for efficient feature extraction and classification (Table 2; an illustrative configuration sketch follows the table). The input images were resized to 224×224 pixels, and the final layer utilised softmax activation for class predictions. The network used the Adam optimiser with a learning rate of 0.001 and employed categorical cross-entropy as the loss function. The batch size was set to 32, and training was conducted for 50 epochs, using early stopping to mitigate overfitting. To improve generalisation and mitigate overfitting, the model included several data augmentation strategies, such as rotation (±30°), zoom (10%), horizontal flipping, and random changes in width and height. Brightness modifications and shearing were implemented to imitate diverse environmental circumstances, guaranteeing the model’s resilience to variations in illumination, orientation, and scale.

Table 2: Parameter table for model configuration.
Parameter | Value
Model Architecture | ResNet-50
Input Size | 224×224 pixels
Batch Size | 32
Optimizer | Adam
Learning Rate | 0.001
Loss Function | Categorical Cross-Entropy
Activation Function | Softmax
Number of Epochs | 50
Early Stopping | Yes (with patience of 10 epochs)
Dropout Rate | 0.5
Data Augmentation | Rotation, Flip, Zoom, Scaling
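To make the configuration in Table 2 concrete, a minimal TensorFlow/Keras sketch is given below. It is an illustrative reconstruction under the stated hyperparameters, not the authors' exact training script; train_ds and val_ds are assumed placeholder datasets.

import tensorflow as tf

# Backbone: ResNet-50 without the classification head, global average pooling on top
base = tf.keras.applications.ResNet50(
    include_top=False, weights=None, input_shape=(224, 224, 3), pooling="avg")
x = tf.keras.layers.Dropout(0.5)(base.output)  # dropout rate from Table 2
outputs = tf.keras.layers.Dense(5, activation="softmax")(x)  # 5 crop-condition classes
model = tf.keras.Model(base.input, outputs)

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
    loss="categorical_crossentropy",
    metrics=["accuracy"])

early_stop = tf.keras.callbacks.EarlyStopping(patience=10, restore_best_weights=True)
# train_ds / val_ds: tf.data.Dataset objects yielding (image, one-hot label) batches of 32
# model.fit(train_ds, validation_data=val_ds, epochs=50, callbacks=[early_stop])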

The training used an NVIDIA Tesla V100 GPU, complemented by an Intel Core i7-10700K CPU, 64GB of DDR4 RAM, and 1TB of SSD storage. The training dataset consisted of around 1.2 TB of data. Each epoch required around 2 hours to finalise, and the cumulative training duration for the whole dataset extended over 2 to 3 days. The training configuration used a distributed system on a single node, enhancing computational efficiency and processing speed throughout the model’s training phase. The gateway model used in the system is Long Range Wide Area Network (LoRaWAN) based, facilitating long-range communication across agricultural regions. The coverage was established to provide thorough monitoring of all treatment plots, particularly those farther away, ensuring reliable data transfer with minimal interference or signal degradation. The gateways function on low-power modes to guarantee uninterrupted functioning throughout the farming season. The logging frequency was set to hourly data gathering, enhancing battery longevity while guaranteeing regular updates for real-time monitoring.

The sensor network functioned using a LoRaWAN system, employing many gateway stations strategically located around the test region to guarantee dependable data transfer. The gateways were installed on towers to provide line-of-sight connectivity with sensors across the 2,500-hectare region. A pre-season RF survey, including drive-tests and stationary testing, was performed to guarantee reliable connection. Coverage limits were established with an RSSI of -120 dBm and SNR of -10 dB, providing steady data uplinks in distant field areas. Throughout the trial, the gateways documented sensor data on an hourly basis. The data from IoT sensors were consistently assessed for completeness. Data gaps, usually resulting from brief signal attenuations, were few. Missing data were addressed by median imputation, hence preventing severe biases in the dataset. Anomalies were assessed by data quality tests on the local farm server to ensure the correctness and integrity of the gathered data.

Sensor data gathered by LoRaWAN modules were relayed to the local farm server. These modules documented factors such as soil moisture, temperature, and pH at regular intervals, guaranteeing that crop condition monitoring was updated. The data were subsequently uploaded to the AgriSync cloud platform for increased system integration, improving decision-making capabilities for farm management. The sensor arrangement adhered to a grid-based configuration inside each plot to provide optimal coverage of the soil characteristics. Sensors were positioned at the centre of each plot, with supplementary sensors located throughout the peripheries to collect data on potential field boundary impacts. The precise quantity and arrangement were meticulously designed to reduce data redundancy and enhance spatial resolution throughout the plots. The dataset was divided into 70% for training, 15% for validation, and 15% for testing. The classification assignment encompassed five categories: healthy, nitrogen deficit, disease (phoma), pest infestation, and water stress (Table 3).

Table 3: Class counts in training, validation, and test sets.
Class | Training Set | Validation Set | Test Set
Healthy | 80,000 | 6,000 | 6,000
Nitrogen Deficiency | 5,000 | 2,000 | 1,000
Disease (Phoma) | 4,500 | 1,500 | 1,000
Pest Infestation | 7,000 | 2,000 | 1,000
Water Stress | 6,000 | 2,000 | 1,000

The predominant class consisted of healthy crops, whereas nitrogen deficiency and pest infestation were under-represented, resulting in a potential class imbalance. To mitigate this issue, class weights were adjusted during training to prevent the model from exhibiting bias towards the majority class. To avert geographical and temporal data leakage, the data were carefully partitioned by field and season. No data from the same field were used in both the training and testing sets, preventing spatial overlap and guaranteeing that the model generalises to new, unseen fields. Furthermore, to mitigate temporal leakage, data from distinct growth seasons were segregated within the training, validation, and test sets, precluding the use of data from preceding seasons to forecast results from subsequent seasons (a sketch of this split appears after the pseudo-code below). This method guaranteed that the assessment metrics accurately represent the model’s genuine generalisation capacity, free of bias from overlapping data in geographical and temporal dimensions. The model’s performance was assessed using a confusion matrix, which demonstrated its capacity to accurately classify crop conditions. The following pseudo-code captures the main steps in the data preprocessing, model training, and evaluation pipeline:

# 1. Data Preprocessing
LOAD images (RGB and NIR) and sensor data (soil moisture, temperature)
FOR each image:
    RESIZE image to 224 × 224 pixels
    NORMALIZE pixel values to range [0, 1]
FOR each sensor entry:
    IF missing data:
        REPLACE with median of the column
    ELSE:
        NORMALIZE to range [0, 1]

# 2. Data Augmentation (Optional)
FOR each image:
    APPLY random rotation (-30 to 30 degrees)
    APPLY random flip (horizontal, vertical)
    APPLY random scaling (0.8 to 1.2)

# 3. Split Data into Training, Validation, and Test Sets
SPLIT data into:
    70% training, 15% validation, 15% test

# 4. Model Training (CNN)
DEFINE CNN model with:
    Input layer (224 × 224 images)
    Convolutional layers, activation, and pooling
    Fully connected layers and output layer (5 classes: Healthy, Nitrogen Deficiency, Disease, Pest Infestation, Water Stress)
COMPILE model using:
    Loss = Categorical Cross-Entropy, Optimizer = Adam
TRAIN model on training set, validate on validation set

# 5. Model Evaluation
PREDICT on test set
GENERATE confusion matrix
CALCULATE Precision, Recall, F1 for each class
CALCULATE Macro and Micro averages
PLOT ROC and PR curves for model evaluation

# 6. Class Imbalance Handling (Optional)
IF class imbalance detected:
    CALCULATE class weights
    TRAIN model using class weights (or apply oversampling/undersampling)

# 7. UAV Regulatory Compliance
ENSURE UAVs are registered and obtain airspace permissions
ANONYMIZE data and comply with data privacy laws
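As a concrete illustration of the class weighting and the field-level split described above, the following scikit-learn sketch shows one way to do both; the label and field-identifier arrays are synthetic placeholders, not the study's data.

import numpy as np
from sklearn.model_selection import GroupShuffleSplit
from sklearn.utils.class_weight import compute_class_weight

rng = np.random.default_rng(2)
n_samples = 1000
labels = rng.integers(0, 5, n_samples)      # 5 crop-condition classes (synthetic)
field_ids = rng.integers(0, 40, n_samples)  # field of origin for each sample (synthetic)

# Split by field so that no field contributes to both training and test sets
splitter = GroupShuffleSplit(n_splits=1, test_size=0.15, random_state=0)
train_idx, test_idx = next(splitter.split(np.zeros((n_samples, 1)), labels, groups=field_ids))

# Inverse-frequency class weights computed on the training subset only
classes = np.unique(labels[train_idx])
weights = compute_class_weight("balanced", classes=classes, y=labels[train_idx])
class_weight = dict(zip(classes.tolist(), weights.tolist()))  # e.g., pass to model.fit(class_weight=...)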

The preparation of the images and sensor data adhered to standard procedures. The RGB and NIR images were scaled to 224 × 224 pixels and normalised to the range [0, 1]. The sensor data, including soil moisture, temperature, and pH, were preprocessed by replacing missing values with the median of the corresponding column and then normalised to the range [0, 1]. The following code illustrates the fundamental preprocessing routines:

import cv2
import pandas as pd

# Preprocessing for Image Data
def preprocess_image(image):
    # Resize to 224 × 224 pixels
    image_resized = cv2.resize(image, (224, 224))
    # Normalize pixel values to [0, 1]
    image_normalized = image_resized / 255.0
    return image_normalized

# Preprocessing for Sensor Data
def preprocess_sensor_data(sensor_data):
    # Replace missing values with the column median
    sensor_data_filled = sensor_data.fillna(sensor_data.median())
    # Min-max normalize each sensor column to [0, 1]
    sensor_data_normalized = (sensor_data_filled - sensor_data_filled.min()) / (
        sensor_data_filled.max() - sensor_data_filled.min())
    return sensor_data_normalized

These preprocessing steps were followed by data augmentation (rotation, flip, scaling) to improve model robustness, as outlined in the pseudo-code above. The drone activities in this study did not necessitate ethical approval or regulatory clearance, as they adhered to established agricultural practices and local legislation governing UAV usage in open fields. The UAVs functioned within designated airspace and complied with all relevant aviation regulations. All gathered data were securely stored and handled in accordance with data protection legislation (e.g., GDPR).

Results

Monitoring the Condition of Crops Using IoT and Drones

At the initial stage of the experiment, IoT sensors were placed on test plots of the agricultural facilities. In this preliminary phase, a selected group of plots was outfitted with continuous IoT nodes. For each instrumented plot, 1–2 LoRaWAN soil sensors (moisture/temperature) were deployed at representative micro-sites; soil pH was regularly assessed with handheld LAQUAtwin meters over relevant sub-areas and then uploaded to the farm server/AgriSync. The WET-2 moisture sensors recorded soil moisture levels six times per hour. The data were gathered continuously, facilitating real-time observation of soil moisture throughout the growing season. The total data collected from the test sites exceeded 25,920 recordings, providing insights into seasonal variations and the impacts of weather conditions and irrigation schedules. The sensors recorded initial soil moisture levels of 19% ± 2%, which gradually decreased to 15% during periods of drought. The moisture meters recorded the lowest moisture level in the maize plot in mid-June (14.5%), indicating the need for additional irrigation (Figure 3). Air temperatures ranged from 18 to 32°C, peaking in July.
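For context, 25,920 records is consistent with the stated logging rate for a single sensor: 6 readings per hour × 24 hours = 144 readings per day, and 144 × 180 days = 25,920 over a roughly 180-day monitoring period (the period length is inferred here from the arithmetic, not stated explicitly).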

Figure 3: Graph of changes in soil moisture during the season.

Soil pH measurements were conducted using LAQUAtwin B-711 handheld sensors for spot assessments in specific sub-areas of the field, collected periodically (e.g., weekly or bi-weekly) rather than through continuous monitoring. The measurement points were strategically allocated throughout representative sub-regions of the 2,500 ha field, guaranteeing adequate data for monitoring pH fluctuations within the sensors’ capabilities. The following average pH values were recorded for the different crops: wheat, 5.8–6.4; maize, 6.1–7.2; sunflower, 6.0–6.5. These values fell within the optimal range for the respective crops. However, certain local anomalies were observed, such as pH levels dropping to 5.5 in areas with increased soil moisture and rising to 7.5 in regions affected by irregular precipitation. These results indicated the need for localised liming. Liming involves the application of calcium-rich materials (such as ground limestone or dolomite) to acidic soils, which raises pH, boosts nutrient availability, and improves soil structure. Given the restrictions on agronomic interventions during the vegetation period, these findings were documented as recommendations for field preparation in the following season. Temperature analysis during the study revealed a range of fluctuations (from 0°C to +30°C), which proved to be crucial indicators of environmental stress affecting crop development. Furthermore, temperature data facilitated the identification of grain maturation stages.

Drones conducted regular visual observation throughout 2024. Drone operations were executed weekly, with an average flight duration of 3 to 4 hours. Images were captured from an altitude of 50 m at 20 MP resolution, achieving an ideal balance of scale and detail. Complete weekly coverage of the 2,500 ha area required approximately 52,083 images (at 70% forward and 60% lateral overlap) or 78,125 images (at 80% forward overlap), all of which were gathered and analysed to ensure accurate and comprehensive crop condition monitoring.

The visible spectral range, corresponding to human visual sensitivity, provided realistic imagery of crop conditions, enabling the identification of visible signs of disease, damage, or uneven development. The near-infrared range was used to assess crop health due to its sensitivity to levels of photosynthetic activity. This analysis enabled the detection of changes invisible in RGB imagery, such as early signs of stress caused by water or nutrient deficiencies. The combined use of these spectral bands contributed to more precise identification of stress zones, irrigation efficiency, and potential anomalies. These data were then used to inform subsequent adjustments to agronomic practices and operational planning. The results demonstrate the effectiveness of the investigated computer vision techniques for assessing crop condition in agriculture. Image analysis in the RGB and NIR spectra allowed for the detection of changes in colour, texture, and plant morphology, indicating stress caused by the factors mentioned above. In particular, the use of NIR imagery enabled the identification of problems before these problems became visually apparent, improving early threat detection for crop yields.

A key analytical method employed was the calculation of the NDVI (Normalised Difference Vegetation Index), which allowed for the evaluation of vegetation health. NDVI values reflected the degree of photosynthetic activity and were influenced by the density and vitality of the vegetative cover. This index enabled assessment of vegetation density distribution across the fields and the identification of areas with deviations, signalling potential issues. For instance, areas with low NDVI values (below 0.5) were associated with specific problems: images processed through convolutional neural network algorithms on the TensorFlow platform revealed nitrogen deficiency zones across 15% of the wheat fields, evident in the yellowing of leaves. These zones were subsequently targeted for localised nitrogen fertilisation. Additionally, symptoms of phoma were detected on up to 8% of sunflower fields during the vegetative period, prompting targeted fungicide application to halt the spread of the disease. Phoma is a fungal disease (Phoma macdonaldii) responsible for stem canker in sunflowers, resulting in tissue necrosis and reduced plant vitality.
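As a simple illustration of this thresholding step, the sketch below flags pixels of an NDVI raster that fall under the 0.5 value reported above; the raster here is synthetic, standing in for a processed orthomosaic tile.

import numpy as np

def flag_stress_zones(ndvi: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    # Boolean mask of pixels whose NDVI falls below the stress threshold
    return ndvi < threshold

# Synthetic NDVI raster (healthy fields cluster around 0.7)
ndvi_map = np.clip(np.random.default_rng(3).normal(0.7, 0.15, (100, 100)), -1.0, 1.0)
stress_mask = flag_stress_zones(ndvi_map)
print(f"Stressed area: {100 * stress_mask.mean():.1f}% of pixels")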

The results showed stable crop development, with variations depending on field sections and vegetation stages. For wheat, the average NDVI during the active vegetative stage (late May – early June) was 0.72 (CI: 0.7–0.74), indicating healthy crop status across most fields. However, local issues were observed in areas with elevated soil acidity (pH below 5.5), where NDVI values dropped to 0.45–0.49. For maize, during the intensive growth phase (July), the average NDVI reached 0.79 (CI: 0.77–0.81), suggesting high photosynthetic activity. However, in under-irrigated areas (15% of the total crop area), NDVI fell to 0.48–0.5 due to moisture deficiency. In sunflower, the average NDVI during the flowering stage (July) was 0.65 (CI: 0.63–0.67), although certain areas exhibited NDVI values as low as 0.4 during heat stress episodes when temperatures reached 30°C. These zones also showed evidence of uneven soil moisture distribution (Figure 4).

Figure 4: Diagram of average NDVI values in the study areas.

ANOVA for wheat yield revealed a significant treatment effect, evidenced by an F-value of 5.23 (p = 0.003) and η² of 0.12, signifying a moderate impact of irrigation on production. Subsequent comparisons indicated that irrigated plots considerably surpassed non-irrigated plots (p = 0.02), whereas fertilisation had no significant impact on yield (p = 0.45). The confidence interval (CI) for the irrigation effect ranged from 0.12 to 0.18 t/ha, quantifying the influence of irrigation on output. ANOVA analysis of NDVI revealed significant treatment effects, with an F-value of 7.42 (p = 0.001) and η² of 0.15. Tukey’s post-hoc analysis revealed that nitrogen-deficient plots had considerably lower NDVI values than healthy plots, underscoring the detrimental effect of nutrient deficits on crop health. The average NDVI for maize during the growth phase was 0.79 (CI: 0.77–0.81), indicating elevated photosynthetic activity. However, moisture-deficient regions exhibited markedly lower NDVI values (0.48–0.5), necessitating targeted interventions.

The results indicated the potential for precise resource management. The implementation of IoT sensors and drones in water management resulted in a 12% reduction in water consumption without compromising yields. The optimisation of water resources and fertilisation was accomplished through focused interventions utilising real-time data. The high-resolution photos obtained from drones and the spectral analysis facilitated the early identification of stress zones, permitting targeted agronomic interventions. Convolutional neural networks (CNNs) demonstrated high classification accuracy in assessing plant health from drone imagery, achieving 93.5% for wheat, 91.8% for maize, and 90.4% for sunflower. These neural network models enabled real-time decision-making, markedly decreasing the necessity for manual inspections and enhancing the accuracy of interventions.

Response measures included recommendations for localised lime application during the next sowing campaign to reduce soil acidity and increase soil fertility, as well as the use of biostimulants to minimise the impact of high temperatures. Biostimulants are natural or synthetic substances, such as seaweed extracts, humic acids, or microbial inoculants, that enhance plant metabolic functions, strengthen stress tolerance, and optimise overall growth efficacy. The data collected from IoT sensors and drones were combined into a single analytical system for further modelling and management decision-making (Table 4). This approach was effective in monitoring the condition of crops, which reduced the risk of crop failures and increased the accuracy of agricultural operations.

Table 4: Crop monitoring indicators in 2024.
Parameter | Wheat (1,200 ha) | Corn (800 ha) | Sunflower (500 ha)
Average soil pH | 5.8–6.4 | 6.1–7.5 | 6.0–6.5
Average NDVI | 0.72 | 0.79 | 0.65
Number of records for soil moisture | 230,400 | 153,600 | 48,000
Number of records for soil acidity | 15,360,000 | 12,800,000 | 6,400,000
Temperature range (°C) | 0–30 | 0–30 | 0–30

Soil moisture and pH data from sensors were transmitted via LoRaWAN modules to the farm’s local server for preliminary processing and anomaly detection. The data were then automatically transferred to the AgriSync cloud-based data management platform. Spectral data from drones, in the form of RGB and NIR images, were transmitted in real time via Wi-Fi modules to a data processing workstation operating on the Pix4D platform. This system performed image calibration, NDVI evaluation, and assessment of other spectral parameters. The data processing workflow included filtering and validation, conducted on the local server, where raw data were examined for sensor malfunctions or anomalies caused by extreme weather events using MATLAB software. The NDVI index for each spectral image was calculated using algorithms embedded in Pix4D. Once processed, all integrated information was exported to the AgriSync farm management system, providing round-the-clock access to agronomists and company management. This enabled prompt decision-making based on visualised trend charts.

Human involvement was required primarily during the initial setup and calibration of technical systems and sensors. Operators also monitored the accuracy of data transmission and the preliminary interpretation of results, ensuring consistency with field observations. Traditionally, soil pH monitoring was performed through chemical analysis of samples in laboratories, requiring 3–5 days to obtain results. In contrast, automated sensors provided immediate data transmission, allowing for timely corrective measures. Soil moisture was conventionally assessed by collecting and weighing samples after oven-drying – a process that could take several days. The automated system, however, gathered data hourly via sensors, delivering high spatial resolution and enabling the identification of localised issues. Traditional crop monitoring methods relied on field inspections and visual assessment of plant health, which lacked quantifiable precision. Autonomous drones capable of flying during various time windows simplified this process. The implemented automated methods ensured high accuracy and responsiveness in data processing compared to traditional approaches, thereby enhancing the overall efficiency of farm management.

Image Processing and Neural Classification

Image processing and classification using neural networks enabled efficient monitoring of crop conditions through the application of machine learning technologies. The models were trained on a representative dataset comprising over 50,000 reference images. Each image had a resolution of 20 MP, allowing for the acquisition of highly detailed information on plant condition and precise geographic referencing. The total volume of data used to train the model amounted to approximately 1.2 TB, resulting in high classification accuracy: 93.5% for wheat, 91.8% for maize, and 90.4% for sunflower. The evaluation criteria encompassed the confusion matrix, which detailed the counts of true positives, true negatives, false positives, and false negatives. Precision, recall, and F1 scores were computed for each class (Table 5; an illustrative computation sketch follows the table).

Table 5: Per-class precision, recall, f1 (macro/micro).
Class | Precision | Recall | F1 Score (Macro) | F1 Score (Micro)
Healthy | 93.5% | 94.2% | 93.8% | 94.0%
Nitrogen Deficiency | 86.5% | 88.0% | 87.2% | 87.3%
Disease (Phoma) | 82.0% | 79.0% | 80.5% | 80.2%
Pest Infestation | 90.2% | 88.5% | 89.3% | 89.0%
Water Stress | 85.0% | 83.0% | 84.0% | 84.5%
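To show how such per-class figures are typically derived, a brief scikit-learn sketch follows; the label arrays are synthetic placeholders rather than the study's actual predictions.

import numpy as np
from sklearn.metrics import classification_report, confusion_matrix

rng = np.random.default_rng(4)
class_names = ["Healthy", "Nitrogen Deficiency", "Disease (Phoma)",
               "Pest Infestation", "Water Stress"]
y_true = rng.integers(0, 5, 500)  # synthetic ground-truth labels
# Synthetic predictions agreeing with the ground truth ~90% of the time
y_pred = np.where(rng.random(500) < 0.9, y_true, rng.integers(0, 5, 500))

print(confusion_matrix(y_true, y_pred, labels=list(range(5))))
# Per-class precision/recall/F1 plus macro and weighted averages (micro average equals accuracy here)
print(classification_report(y_true, y_pred, labels=list(range(5)),
                            target_names=class_names, digits=3))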

The neural network models were based on CNNs, which demonstrated high performance in image classification tasks. Each image was analysed to extract various visual features such as leaf texture, colour changes, and plant contours. This facilitated the early detection of diseases such as rust or pest infestation, as well as resource deficiencies, particularly nitrogen. The models were optimised for high accuracy even under mixed or complex field conditions, ensuring the precise identification of pathologies that might go unnoticed through conventional inspection methods. During training, the neural networks relied exclusively on data collected using drone-mounted multispectral cameras, which provided both RGB and NIR imagery (Figure 5). The use of such data improved classification accuracy under variable lighting conditions, diverse soil types, and changing weather conditions. To ensure stable real-time performance, the neural network was deployed on a GPU-based platform, which allowed for the rapid processing of large volumes of field imagery.

Figure 5: Plant health map based on the NDVI index obtained by spectral analysis in the NIR range: a) wheat; b) corn; c) sunflower.
Source: Multispectral cameras of drones operating in the NIR range.

Green areas indicate normal photosynthetic activity and a healthy crop status; yellow and orange zones mark areas with reduced chlorophyll levels, potentially signalling fertiliser deficiency or early stages of disease; while red zones represent critical areas under high stress caused by disease, pests, or severe moisture deficiency. The use of such a map enables agronomists to promptly identify problem areas and implement targeted interventions to improve plant condition.12,13 During the spectral analysis, all field images were processed using integrated pre-filtering systems designed to eliminate noise and enhance image contrast. This processing was carried out in real time, enabling the acquisition of accurate geospatial data through machine learning algorithms integrated with field monitoring systems, such as soil moisture and temperature sensors.

Based on the classification results, the primary pathologies were identified. For wheat, 37 cases of brown rust (Puccinia triticina) infection were recorded, which locally reduced photosynthetic activity and led to a 12–15% (CI: 10%–15%; p = 0.03) decrease in yield compared to uninfected plots. The percentage was determined by comparing the actual harvested yields from the affected microplots with those from adjacent, unaffected control plots, utilising weight-based measurements at the time of harvest. Total crop loss in the affected areas was avoided due to the timely application of fungicidal treatments. In the maize plots, 52 instances of damage caused by the cotton leafworm (Spodoptera littoralis) were documented, resulting in partial yield loss. Yield impacts were assessed by integrating direct field harvest data with spatial extrapolation based on NDVI-derived biomass estimates. However, thanks to the prompt deployment of pesticide treatments, approximately 88% of the yield in these areas was preserved.

Nitrogen deficiency was observed across 15% of sunflower plantings, resulting in a 10–12% reduction (CI: 8%–12%; p = 0.05) in yield within the affected plots. This reduction was obtained from harvest data adjusted with model-based estimates to reflect the stress intensity indicated by spectral indices. Prompt application of nitrogen-based fertilisers helped to prevent further spread of chlorosis, thereby contributing to the preservation of yield across the remaining area. Geospatial analysis enabled the precise delineation of problematic zones, allowing for targeted diagnostics and localised interventions.

The effectiveness of spectral analysis was demonstrated by its substantial reduction of manual monitoring requirements.14,15 Time previously spent on traditional field inspections decreased by approximately 70–75%: manually inspecting all 2,500 ha would have taken an estimated 3–4 weeks, whereas drones and spectral analysis algorithms reduced the process to 3–4 days. The system saved around 480 person-hours during the season, enabling human resources to be redirected to the specific issues identified through the analysis. The use of NDVI also facilitated the early detection of minor anomalies that might have gone unnoticed during visual inspections.16–18 Identifying areas of damage or resource deficiency in the early stages of crop development enabled rapid responses that significantly minimised yield losses (12–15% decrease in wheat, maize, and sunflower yields; CI: 10–15%; p = 0.02), improved the efficacy of fungicide application (85%; CI: 80–90%; p = 0.02), and reduced water consumption by 12% (CI: 10–14%; p = 0.03) while sustaining or enhancing crop yields.

The savings in water consumption were measured as follows. The baseline system used drip irrigation, delivering water at a regulated rate informed by real-time soil moisture data from IoT sensors, with volumes quantified in litres per hectare (L/ha): on average 4,500 L/ha for wheat, 5,200 L/ha for maize, and 4,800 L/ha for sunflower. The proposed precision irrigation system, which initiated watering only when soil moisture dropped below a predetermined threshold, reduced total water usage by 12% across all crops. Over the season this equated to savings of about 90,000 litres for the wheat plots, approximately 104,000 litres for maize, and approximately 96,000 litres for sunflower.
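The trigger rule and the per-hectare arithmetic can be sketched as follows; the baseline volumes and the 12% reduction are the figures reported above, while the moisture threshold value and function names are illustrative assumptions.

```python
# Sketch of the threshold-triggered drip irrigation rule and the reported
# 12% saving. Baseline volumes are from the study; the threshold is assumed.
BASELINE_L_PER_HA = {"wheat": 4500, "maize": 5200, "sunflower": 4800}
MOISTURE_THRESHOLD = 0.22  # assumed volumetric soil-moisture trigger

def should_irrigate(sensor_reading: float) -> bool:
    """Open the drip line only when soil moisture falls below the threshold."""
    return sensor_reading < MOISTURE_THRESHOLD

def saving_l_per_ha(crop: str, reduction: float = 0.12) -> float:
    """Litres per hectare saved at the reported 12% reduction in applied water."""
    return BASELINE_L_PER_HA[crop] * reduction

for crop in BASELINE_L_PER_HA:
    print(crop, saving_l_per_ha(crop))  # wheat 540.0, maize 624.0, sunflower 576.0
```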

The savings derive from the reduction in irrigation volume per plot (100 m²); aggregated across plots, the saving of approximately 90,000 litres reflects the gain in water-use efficiency achieved through precision agronomy. The use of precision irrigation did not adversely affect yields; in some instances, it even led to slight yield improvements owing to the more effective utilisation of water and nutrients. Wheat yields rose by 8%, maize yields by 6%, and sunflower yields by 7%, compared to prior seasons using traditional irrigation methods. The data obtained from crop condition analysis served as the basis for the rapid implementation of targeted measures aimed at addressing identified issues and optimising agrotechnical processes.19,20 In wheat fields with widespread brown rust infection, fungicides such as azoxystrobin (marketed as “Amistar Extra”) and propiconazole (marketed as “Tilt”) were applied. Ground spraying was used, with application rates of 0.75 L/ha for Amistar Extra during the flag leaf and flowering stages and 0.5 L/ha of Tilt applied once during tillering. These measures effectively suppressed the spread of infection, protecting over 40 ha of wheat fields and reducing yield losses.

In maize fields affected by cotton leafworm larvae, high-efficiency chemical insecticides were used: lambda-cyhalothrin (marketed as “Karate Zeon”) at 0.15 L/ha and chlorantraniliprole (marketed as “Coragen”) at 0.1 L/ha. Treatments were carried out twice – during the initial appearance of larvae and again upon renewed pest activity. These interventions reduced pest populations to economically negligible levels, preserving approximately 12% of the crop yield, corresponding to an area of around 10 ha. For sunflower plots diagnosed with nitrogen deficiency, compound fertilisers with a high nitrogen, phosphorus, and potassium content were applied. The primary product used was Kristalon Special (NPK 20:20:20), supplemented with urea (46% nitrogen) to enhance efficacy. Fertilisers were applied in a targeted manner at a rate of 10 kg/ha based on diagnostic data from the most affected zones. The main applications were performed twice: during the rosette stage, when the plant was actively developing its leaf system, and at the onset of the budding phase to ensure sufficient nitrogen availability during vigorous growth. A total of 5 t of fertiliser was applied, restoring normal growth and development, particularly during the active vegetation period. All interventions were conducted within optimal timeframes, increasing resource use efficiency and stabilising yield indicators for each crop (Table 6).

Table 6: Key indicators of analysis by crop.
Parameter | Wheat (1,200 ha) | Corn (800 ha) | Sunflower (500 ha)
Classification accuracy, % | 93.5% | 91.8% | 90.4%
Main pathologies | 37 cases of rust | 52 cases of cotton leafworm | N deficit in 15% of areas
Effectiveness of local measures | 85% (CI: 80–90%; p = 0.02) | 78% (CI: 75–80%; p = 0.03) | 88% (CI: 85–90%; p = 0.03)
Note: The effectiveness assessment is expressed in terms of the number of crops actually saved in the problem areas.

The use of computer vision algorithms, in particular convolutional neural networks, provided automated classification of plant conditions, which increased the accuracy and speed of monitoring agricultural sites.21 This allowed agronomists to identify problem areas in a timely manner and take the necessary measures. The results confirm that the integration of computer vision technologies into agriculture helps to increase yields, reduce costs, and minimise environmental impact. The model’s performance was additionally evaluated through a confusion matrix to determine its accuracy in classifying the various crop conditions. The matrix indicated that although the model excelled in classifying healthy crops, it faced difficulties with infrequent categories such as disease (phoma) and water stress. These results underscore the model’s overall precision in identifying stress factors, despite certain misclassifications, especially between disease and water stress. Table 7 presents the confusion matrix, offering a detailed breakdown of the model’s performance across the five categories. Diagonal cells are True Positives (TP), instances correctly assigned to their class; for any given class, off-diagonal cells in its row are False Negatives (FN), off-diagonal cells in its column are False Positives (FP), and all remaining cells count towards its True Negatives (TN).

Table 7: Confusion matrix for crop condition classification using computer vision techniques.
Class | Predicted Healthy | Predicted Nitrogen Deficiency | Predicted Disease | Predicted Pest Infestation | Predicted Water Stress
Actual Healthy | 340 | 15 | 8 | 6 | 7
Actual Nitrogen Def. | 20 | 330 | 10 | 5 | 8
Actual Disease | 8 | 10 | 315 | 5 | 2
Actual Pest Infestation | 6 | 5 | 5 | 270 | 4
Actual Water Stress | 7 | 8 | 6 | 5 | 285
Note: Diagonal entries are the true positives for each class; each off-diagonal entry is a false negative for its row’s actual class and a false positive for its column’s predicted class.

The model exhibited substantial classification accuracy (93.5% for wheat, 91.8% for maize, and 90.4% for sunflower), but notable class imbalance was evident, especially within the disease and pest infestation categories. The confusion matrix indicates that healthy crops were frequently categorised with high precision, whereas smaller, less prevalent classes (such as water stress) encountered elevated false negatives. To mitigate overfitting, early stopping was implemented, ceasing training if the validation loss failed to improve for 10 consecutive epochs. Furthermore, the implementation of dropout layers (rate = 0.5) and data augmentation effectively reduced overfitting by enhancing the variability of the training data. The matrix reveals that although the model excels in early identification, class imbalance may have led to these inaccuracies, highlighting the necessity for focused enhancements in managing under-represented classes.
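The per-class figures can be recomputed directly from the counts in Table 7, as in the sketch below; note that metrics pooled over this single matrix will not necessarily reproduce the per-crop accuracies reported above, which were obtained for each crop separately.

```python
# Per-class precision, recall, and F1 derived from the Table 7 counts.
import numpy as np

classes = ["Healthy", "Nitrogen deficiency", "Disease",
           "Pest infestation", "Water stress"]
cm = np.array([            # rows = actual class, columns = predicted class
    [340,  15,   8,   6,   7],
    [ 20, 330,  10,   5,   8],
    [  8,  10, 315,   5,   2],
    [  6,   5,   5, 270,   4],
    [  7,   8,   6,   5, 285],
])

tp = np.diag(cm)
precision = tp / cm.sum(axis=0)  # TP / (TP + FP), per predicted column
recall = tp / cm.sum(axis=1)     # TP / (TP + FN), per actual row
f1 = 2 * precision * recall / (precision + recall)

for name, p, r, f in zip(classes, precision, recall, f1):
    print(f"{name}: precision={p:.3f} recall={r:.3f} F1={f:.3f}")
```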

Optimisation of Agricultural Technologies

Irrigation management and the efficient use of resources became a key component in integrating precision agriculture within the studied agrarian systems. Utilising data from IoT sensors installed in the test plots, along with recommendations derived from neural network analyses, optimised approaches to resource management, specifically water and fertilisers, were developed. During the assessment of the water balance for maize cultivation, areas that did not require intensive irrigation were identified (Figure 6). Reducing water volumes in these zones by 12% led to savings of approximately 90,000 L over a single growing season, without any decline in yield levels. At the scale of large agricultural enterprises, such water conservation not only reduced operational costs but also promoted the sustainable use of natural resources.22,23 The visualisation was presented in the form of a heatmap, which clearly identifies areas with varying levels of soil moisture. Zones with normal moisture levels are marked in green, while brown indicates regions with moisture deficits that required additional irrigation.

Figure 6: Map of moisture distribution in the test zones.
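A heatmap of this kind can be rendered from the sensor grid with a few lines of plotting code; the sketch below uses random stand-in values and an off-the-shelf green-brown colormap, so the grid dimensions, value range, and styling are all illustrative assumptions.

```python
# Illustrative rendering of a soil-moisture heatmap from a sensor grid.
import numpy as np
import matplotlib.pyplot as plt

# Stand-in sensor readings; real data would come from the field IoT network.
moisture = np.random.default_rng(0).uniform(0.10, 0.35, size=(20, 30))

fig, ax = plt.subplots()
im = ax.imshow(moisture, cmap="BrBG", vmin=0.10, vmax=0.35)  # brown = deficit, green = normal
fig.colorbar(im, ax=ax, label="Volumetric soil moisture")
ax.set_title("Moisture distribution across test zones (illustrative)")
plt.show()
```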

For wheat, which requires close attention to the nutrient balance in the soil, a precision fertilisation system was employed. Based on data from IoT sensors and spectral characteristics of the crops, fertilisation requirement maps were generated and used for targeted application. This method involved the use of mineral fertilisers at the required concentrations only in areas where nitrogen, potassium, or phosphorus deficiencies were detected. Specifically, for wheat, NPK fertilisers with a 20:10:10 ratio were applied in zones with identified nitrogen deficiencies, resulting in an 8% increase in yield compared to previous seasons, when crop monitoring relied solely on visual inspections and manual soil sampling for laboratory analysis of acidity, moisture, and mineral content. A significant addition to resource management involved the use of alternative fertilisers and bioproducts. For maize, for example, instead of standard nitrogen fertilisers, the product “Azotobacter”, based on live microorganisms, was applied. This approach not only improved plant health but also contributed to the long-term enhancement of soil fertility.
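A minimal sketch of how such a fertilisation requirement map translates into targeted application: grid cells whose nitrogen index falls below a threshold receive the full rate, all others none. The index scale, threshold, and per-cell rate shown are illustrative assumptions rather than values reported by the study.

```python
# Variable-rate prescription: fertilise only nitrogen-deficient grid cells.
import numpy as np

def prescription_map(n_index: np.ndarray,
                     deficiency_threshold: float = 0.6,
                     rate_kg_ha: float = 10.0) -> np.ndarray:
    """kg/ha of fertiliser per cell: full rate where the nitrogen index is low."""
    return np.where(n_index < deficiency_threshold, rate_kg_ha, 0.0)

# Example: a 3x3 field grid of normalised nitrogen-index values.
grid = np.array([[0.80, 0.50, 0.70],
                 [0.40, 0.90, 0.60],
                 [0.55, 0.65, 0.30]])
print(prescription_map(grid))  # 10 kg/ha only in the deficient cells
```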

The effect size for sunflower yield was moderate (η² = 0.14), indicating a substantial influence of pest management on yield, with a 95% confidence interval (CI) of 0.10–0.14 t/ha (p = 0.01). The treatment had a more modest effect on soil moisture (η² = 0.11), with a 95% CI for the treatment effect of 0.55–0.60 (p = 0.04), nonetheless indicating a consistent influence of irrigation on soil moisture. These findings validate the practical importance of the treatments and their impact on crop yields, offering substantial statistical evidence, including effect sizes and confidence intervals for each of the key variables. The proposed method and algorithms differ from existing approaches through the integration of advanced technologies such as IoT, drones, and neural networks for the automation of monitoring and optimisation of agronomic processes.24–27 The scientific contribution lies in the development and implementation of a comprehensive approach that combines spectral image analysis, obtained via drones, with IoT sensor data to detect plant stress factors at early stages. This enables not only the collection and processing of data but also the application of machine learning methods, particularly convolutional neural networks, for classifying plant conditions with high accuracy (93.5% for wheat, 91.8% for maize, and 90.4% for sunflower). This approach significantly improves the efficiency of detecting pathologies such as diseases and nutrient deficiencies, in comparison with traditional visual inspection methods.28–30

To thoroughly assess the efficacy of the proposed neural network model, comparisons were conducted with simpler models, including NDVI thresholding and traditional machine learning models using hand-crafted features (e.g., decision trees, random forests). This baseline approach used a simple threshold applied to the NDVI index for the classification of crop health. A threshold of 0.5 was established to differentiate between healthy and stressed crops. This approach demonstrated an accuracy of 72% for wheat, 68% for maize, and 65% for sunflower. The NDVI thresholding processing duration was minimal, requiring under 1 hour to complete for the full 2,500 ha area.
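This baseline amounts to a one-line rule; a minimal sketch, with the 0.5 cut-off taken from the description above:

```python
# NDVI-threshold baseline: a single cut-off separating healthy from stressed
# vegetation, as used in the comparison above.
import numpy as np

def ndvi_baseline(ndvi_map: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Return 1 for healthy pixels (NDVI >= threshold) and 0 for stressed ones."""
    return (ndvi_map >= threshold).astype(np.uint8)
```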

Conventional machine learning models, including random forests, were trained on manually engineered features derived from the images, such as colour histograms and texture attributes. The accuracy of these models was 84% for wheat, 80% for maize, and 78% for sunflower. Nonetheless, these models required considerably more processing time (about 8 hours for the whole dataset) and failed to capture the full complexity of crop health conditions as proficiently as the neural network-based method. Traditional field monitoring techniques, including manual inspections and visual evaluations by agronomists, served as an additional baseline. These approaches were laborious, requiring 3–4 weeks to manually examine the whole 2,500 ha area. The accuracy of conventional approaches was assessed at 65%, relying on subjective visual evaluations.
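A sketch of this hand-crafted-feature baseline, assuming scikit-learn; the histogram binning and forest size are assumptions, since the study does not report them.

```python
# Hand-crafted-feature baseline: per-channel colour histograms fed to a
# random forest. Binning and forest size are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def colour_histogram(image: np.ndarray, bins: int = 16) -> np.ndarray:
    """Concatenate per-channel histograms of an HxWx3 uint8 image into one vector."""
    feats = [np.histogram(image[..., c], bins=bins, range=(0, 256))[0]
             for c in range(3)]
    return np.concatenate(feats).astype(np.float64)

def train_baseline(images, labels):
    X = np.stack([colour_histogram(img) for img in images])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    return clf.fit(X, labels)
```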

The neural network model utilising drone imagery and IoT sensor data surpassed all baseline models, attaining an accuracy of 93.5% for wheat, 91.8% for maize, and 90.4% for sunflower, with a processing duration of merely 2–3 days for the entire 2,500 ha area, alongside a notable decrease in labour expenses owing to the automation of data collection and analysis. The analysis of accuracy, duration, and expense illustrates the distinct benefits of the proposed methodology compared to conventional methods. The incorporation of IoT sensors, drone monitoring, and neural networks extends beyond resource conservation to wider ecological implications. The precise use of fertilisers, fungicides, and irrigation minimises excess chemical inputs, decreasing nutrient leaching and the pollution of adjacent water bodies. Simultaneously, tailored treatments reduce the risk of agrochemical buildup in soils, thereby promoting long-term soil health and sustaining balanced nutrient cycles. It is essential to account for potential changes in soil microbial communities, as diminished or localised pesticide use may modify microbial diversity and functional activity, necessitating longitudinal monitoring.

Likewise, although the application of biostimulants might enhance plant resilience to environmental stress, it is vital to assess their interaction with indigenous microbial communities to guarantee positive synergies instead of inadvertent suppression of advantageous species. Reducing water consumption and optimising inputs can benefit local biodiversity by protecting habitats near cultivated fields. However, technological advancements in agriculture may create new ecological challenges, like the rise of resistant pests, unless monitoring is combined with integrated pest management strategies. Incorporating these environmental considerations would bolster the system’s capacity for both agronomic efficiency and sustainable agroecosystem management.

Despite the potential advantages of incorporating IoT, drones, and neural networks, certain obstacles may hinder extensive implementation. The initial expense of acquiring and maintaining the necessary equipment can be prohibitive, particularly for small to medium-sized farms. Farmer training presents a significant hurdle, as effective adoption necessitates proficiency in digital tools, data analysis, and precision agriculture techniques; in areas with limited technological proficiency, training initiatives are necessary. Moreover, infrastructure prerequisites, such as dependable internet connectivity and power supply, may present considerable challenges in remote regions. Mitigating these hurdles with subsidies, training programmes, and infrastructure enhancements will be essential for making this technology accessible and scalable across many agricultural systems.

This study introduces an integrated methodology that combines IoT, drone technologies, and neural networks, setting it apart from present technologies in numerous aspects. This system, in contrast to conventional crop monitoring techniques that depend mostly on human data collection and visual assessment, incorporates real-time sensor data and high-resolution imagery from drones for ongoing, precise monitoring during the growing season. The application of CNNs for spectral imagery analysis, in conjunction with precision agriculture instruments such as IoT-based soil sensors, facilitates a degree of temporal and geographic accuracy unattainable by independent technologies. This system offers a comprehensive, multi-modal data stream that integrates machine learning algorithms for real-time decision-making, thereby improving the accuracy of interventions and responsiveness to crop stress indicators, in contrast to earlier studies that typically concentrated on either sensor-based or image-based monitoring. The integration of these tools has demonstrated superior classification accuracy (e.g., 93.5% for wheat, 91.8% for maize) and facilitated resource optimisation, resulting in a 12% reduction in water consumption without diminishing yields, distinguishing it from traditional systems that do not incorporate integrated feedback mechanisms for resource management.

Discussion

The findings of the conducted study reveal significant potential in the integration of artificial intelligence, IoT, and computer vision for the sustainable development of agriculture. A comparison of the obtained data with the research results of Ulaynich and Isak suggests that modern approaches to the informatisation of agriculture and the implementation of mathematical methods considerably enhance the efficiency of agronomic processes.31,32 As in the work of Ulaynich, this study emphasises the positive impact of technologies that enable the optimisation of resource allocation and crop monitoring. Similarly, following the approaches of Isak, who stressed the use of mathematical methods in agrobiological systems, the current research demonstrates the effectiveness of neural networks and precision farming systems under real-world conditions. However, unlike the traditional methods discussed in the referenced studies, the results of the present investigation show significantly greater efficiency in terms of resource savings, data processing accuracy, and timely responses to plant pathologies, which, in turn, substantially reduce the human factor in the monitoring and management of agronomic processes.

Similarly, Muangprathub et al. indicated that IoT-based data analysis is essential for the real-time monitoring of soil and crop conditions in smart agriculture.33 The present study’s findings corroborate these assertions while additionally using computer vision and spectral analysis, which facilitate environmental monitoring and early disease identification. In contrast to Muangprathub et al., whose research concentrated on soil and environmental data, the current study illustrates that integrating spectral imagery with IoT offers a more holistic method for monitoring crop health, yielding significant economic advantages through early disease identification and resource optimisation.

The use of IoT and computer vision for ground mapping and navigation, as reported in the work of Zhao et al., is also validated by this study.34 The present study corroborates the previous work by affirming that the use of these technologies improves precision and efficacy in crop monitoring. In contrast to Zhao et al., who concentrated on mapping, the current study highlights that the integration of NIR and RGB images markedly enhances disease identification, facilitating early interventions prior to the manifestation of disease symptoms. The results presented above support the findings of Sivaranjani et al., who examined various image processing algorithms for the grading of agricultural products.35 This research, while methodologically comparable, enhances previous results by showcasing the incorporation of real-time monitoring and early disease identification, offering a proactive approach to disease control that circumvents the post-harvest analysis commonly employed in product grading systems. Automated plant growth control systems, as noted by Mentsiev et al., highlight the promise of such approaches for sustainable farming.36 The obtained results also underscore the importance of integrating these systems to reduce costs and increase productivity. The study by Kajol et al. noted that IoT-based monitoring enhances yield management through the provision of precise data.37 The results of the present analysis corroborate this assertion while further enhancing it by integrating spectral data to particularly tackle disease diagnosis. Whereas Kajol et al. concentrated on general field monitoring, the current research amalgamates disease monitoring with real-time intervention capabilities, thereby augmenting the overall precision and efficacy of crop management.

The research by Harjeet and Prashar emphasises the importance of using machine vision, deep learning, and IoT in agriculture, which is supported by the findings of this study.38 Their findings correspond with contemporary research, which similarly indicates substantial enhancements in accuracy. In contrast to earlier research, which concentrated on general crop monitoring, the current work prioritises disease detection by spectral analysis and real-time data processing, illustrating that early-stage disease identification is crucial for effective crop management. Pantazi et al. attained an accuracy of 81.65% in predicting wheat yield by employing machine learning models alongside sophisticated sensing techniques, illustrating the efficacy of remote sensing in precision agriculture.39 The present work achieved 93.5% accuracy in disease identification for wheat using drone footage and spectral analysis, underscoring a superior level of precision in monitoring crop health and diagnosing diseases. Although both studies employ advanced sensing technologies, the present research exhibits superior diagnostic accuracy in early disease identification, thereby enhancing crop management beyond mere yield prediction.

Sharma and Shivandu demonstrated the integration of artificial intelligence and IoT to improve crop monitoring and management.40 This is consistent with the results of this study, which highlight enhanced accuracy in crop growth forecasting through similar approaches. Puranik et al. confirmed that IoT-based automation in agriculture is a crucial step toward increased productivity.41 This concurs with the present findings, which report a substantial reduction in manual labour due to the use of innovative systems. One of the closest studies in terms of subject is by Phasinam et al., which explores the use of IoT and cloud technologies for the automation of agricultural systems.42 While the focus is on irrigation management and water resource monitoring, the approach to IoT integration and automation in agriculture closely aligns with the current study, which centres on plant disease detection. Both studies emphasise the necessity of IoT for automating agricultural processes, albeit applied to different specific challenges, underscoring the importance of interdisciplinary research in enhancing agricultural practices.

The work of Fiestas et al. examined the integration of IIoT platforms with deep learning systems for automating seedling quality control.43 The authors used the IBM Watson IoT platform for data collection and analysis; however, image processing for disease diagnosis was not addressed. This leads to the conclusion that, although the monitoring approaches are similar, there are technical differences – particularly in focus: seedling quality control in Fiestas et al. versus plant health monitoring in this study using the AgriSync platform. The current research placed particular emphasis on spectral image processing for assessing plant health. Despite these differences, the shared principles of employing IoT and deep learning platforms highlight the potential for further studies aimed at advancing existing crop monitoring technologies. The study by Sandhu and Singh, which focuses on image processing approaches for automated plant disease detection, shares methodological similarities.44 Both studies use image analysis for plant health monitoring, but unlike Sandhu and Singh’s study, the current work places more emphasis on integration with IoT systems, thereby significantly improving data collection efficiency and real-time symptom detection.

Misra et al. investigated the use of big data, IoT, and artificial intelligence to automate the agricultural sector.45 The research demonstrates the successful integration of big data for agricultural management and decision-making. However, unlike the present study, there is less emphasis on precise image processing and deep learning technologies, which are central to the detailed crop condition analysis here. Nonetheless, the consistent application of IoT for intelligent agrosystem management indicates the promise of merging big data with image processing, paving the way for highly automated agronomic methodologies.46–49 The study by Ruby et al. also makes a significant contribution, examining novel image processing methods for automatically detecting healthy and infected leaves.50 When compared with the current study, there are clear parallels in the use of computer vision techniques, particularly for classifying plant conditions. However, while Ruby et al. focus on deep neural networks for detailed leaf quality assessment, the present study extends beyond algorithms and incorporates IoT integration for comprehensive monitoring.

The work of Edan et al. discusses general principles of automation in agriculture and the application of various technologies for managing farming processes.51 Although the study highlights the potential benefits of automated systems, its focus is more on control systems rather than on computer vision for plant disease detection. In contrast, the current work emphasises the integration of systems with IoT platforms to deliver dynamic real-time data, offering a more application-specific perspective within the broader context of automation in agronomy.

Like the study by Luo et al., which analyses the use of computer vision technologies in urban and controlled environments for agriculture, the present results centre on computer vision for disease detection; here, however, the integration of these technologies with IoT platforms was considered in order to improve monitoring and data collection.52 The paper by Luo et al. provides a broader analysis of computer vision technologies, covering various aspects of such systems in urban and controlled environments, which differs from the current study. At the same time, the results confirm the importance of developing such systems for precise plant monitoring, which is part of the approach to integrating computer vision in agriculture. The study by Kuswidiyanto et al., focused on automatic water level monitoring using computer vision, addresses a subject different from crop production.53 Nevertheless, its results are relevant in terms of using computer vision and monitoring technologies for agricultural processes. Water level monitoring is one component of automated agronomic practices and may be integrated into a broader agrotechnology framework.54,55 In the present research, computer vision technologies are used not only for water monitoring but also for plant health control within complex IoT systems.

Another notable study, by Shedole and Madhu, explores automated pest detection in agriculture using IoT and image processing.56 This research is directly relevant to the current findings, as it also involves computer vision for detecting threats to crops. However, the emphasis is on pest identification, while the present study focuses on general diagnostics of growing conditions. The technical approaches used by Shedole and Madhu confirm the high effectiveness of image processing in identifying plant health issues and support the premise that automation of diagnosis and pest/disease control holds significant promise for modern agricultural production. Devi et al. employed computer vision, IoT, and spatio-temporal deep learning architectures for monitoring legume crops.57 The findings exhibit similarities in the use of integrated technologies for accurate plant condition monitoring. The distinction lies in the specific types of deep learning and data processing models employed. Nonetheless, the results contribute to a deeper understanding of intelligent systems for automated agrotechnical management, resonating with the broader objective of integrating diverse technologies.

The conducted study demonstrates that the combination of technologies in agriculture is a promising avenue for future development. The results are consistent with the majority of the reviewed scholarly works, though some areas require further clarification. In particular, focusing on the adaptation of systems to specific climatic conditions may serve as the foundation for future research. The present study advances precision agriculture by incorporating drone imaging, spectral analysis, and IoT for thorough crop health monitoring. In comparison to prior research, it provides a more efficient, precise, and scalable approach for disease identification and resource management. This study demonstrates how current technologies may revolutionise agricultural methods through real-time data processing and early detection, thereby decreasing labour, costs, and resource consumption while enhancing production. These developments underscore the increasing significance of interdisciplinary research in tackling the intricate difficulties of contemporary agriculture.

Traditional field inspections for disease identification are labour-intensive, necessitating that qualified individuals manually evaluate extensive crop regions. For instance, manually examining 2,500 ha may require 3 to 4 weeks, contingent upon weather conditions and field accessibility. The suggested technique, utilising drone footage and spectral analysis, decreases the monitoring duration to 3–4 days. This yields a 70–75% decrease in labour duration. The pipeline facilitates remote monitoring, reducing the necessity for human presence in the field. The preliminary investment in the proposed pipeline involves the purchase of drones, sensors, and software licences, which may appear costly. Nevertheless, the approach results in significant long-term cost reductions. The expenses associated with manual labour for inspecting extensive areas might escalate considerably, but the suggested pipeline substantially decreases these expenditures. Traditional field inspections for 2,500 ha necessitate specialised labour over many weeks, whereas the drone-based method can accomplish the same task in a matter of days.58 Moreover, the system’s early disease identification minimises the necessity for large pesticide and fertiliser applications, typically required when diseases are identified at a later stage. Timely identification alleviates agricultural loss, leading to diminished expenses for restoration.

The suggested pipeline’s accuracy exceeds that of conventional approaches and numerous commercial platforms. Manual inspections sometimes overlook early-stage diseases, particularly in extensive fields, resulting in delayed interventions and increased losses. The suggested approach facilitates early disease identification with high accuracy through spectral data utilisation. CNN-based models in analogous settings have achieved classification accuracies of 93.5% for wheat, 91.8% for maize, and 90.4% for sunflower, in contrast to the lower accuracy of manual inspection, which frequently fails to identify stress or disease in the early stages. In comparison to existing platforms such as Kindwise and Farmonaut, which employ rudimentary machine learning for disease identification, the suggested method provides superior precision and scalability.59 Although these platforms are intuitive and accessible, they generally do not possess the specialised algorithms tailored to specific crop circumstances that the pipeline offers. Moreover, commercial platforms often depend on limited image datasets or pre-established models, which may not accommodate variable field conditions as proficiently as the bespoke models employed in the suggested system.

To ensure the successful integration of IoT, drones, and neural networks in smallholder agricultural systems, governments must prioritise the establishment of supportive frameworks that enhance technology accessibility and information dissemination. This includes offering subsidies or low-interest loans to mitigate the initial expenses of technology. Governments should promote training programmes to enhance technology literacy among farmers, allowing them to utilise these technologies successfully. Moreover, improvements in rural infrastructure, including dependable internet and power supplies, are essential for facilitating ongoing data collection and immediate decision-making. Public-private partnerships can facilitate the connection between technology innovation and grassroots adoption, ensuring that the advantages of precision agriculture are available to smallholders in developing countries.

Although the present investigation demonstrates the effectiveness of the proposed pipeline, it is important to acknowledge its limitations when interpreting the findings. The study utilised data from a single agricultural enterprise and a single cultivation season. This limited breadth may restrict the generalisability of the findings, as agricultural circumstances and disease dynamics might differ significantly across areas, climates, and years. The efficacy of the drone-based disease detection system and spectral analysis algorithms may fluctuate under varying climatic circumstances. In warmer temperatures or regions with elevated humidity, identifying specific diseases may be more difficult due to the abundance of environmental stressors that can obscure disease symptoms. In contrast, in arid locations with reduced precipitation, water stress may emerge as a significant concern, impairing the system’s capacity to effectively identify diseases that resemble dry conditions.

The pipeline’s performance may also vary with crop type. The approach attained satisfactory classification accuracy for wheat, maize, and sunflower; however, other crop varieties, particularly those with distinctive physical characteristics or growth patterns, may necessitate additional model calibration. Root crops and perennial crops may display distinct disease signs or stress responses, necessitating adjustments in spectral indices or image processing methods for effective diagnosis. To improve the system’s resilience and applicability, subsequent research should incorporate multi-season trials across several geographic regions and crop varieties. This would furnish a more extensive dataset for training and validating the model, allowing it to accommodate seasonal variations and geographical disparities in crop health and disease occurrence. Furthermore, evaluating the pipeline in other climates would facilitate the optimisation of the system’s parameters to more effectively identify crop stressors pertinent to those conditions.

Conclusions

The use of IoT sensors and drones for monitoring the condition of agricultural crops demonstrated strong potential for optimising agronomic practices. The sensor system, located on test plots, enabled the collection of detailed data on soil moisture, temperature and acidity, allowing timely responses to changing conditions. Drone-based monitoring using spectral images in the RGB and NIR ranges helped to identify local anomalies such as nutrient deficiencies or plant stress due to temperature fluctuations. NDVI index analysis made it possible to assess crop health and forecast the need for agrotechnical interventions such as fertilisation or fungicide treatment. Through the integration of data from various technologies, the speed of monitoring and accuracy of decision-making were significantly improved, resulting in greater efficiency in agricultural production.

As a result of applying image processing technologies and neural networks for monitoring crop conditions, significant progress was achieved in the early detection of pathologies, particularly pests and resource deficiencies. The use of machine learning models based on convolutional neural networks allowed for highly accurate classification, which greatly enhanced the management of agrotechnical processes. Spectral analysis, combined with geospatial data and IoT sensors, ensured precision and timeliness in detecting problems, facilitating the prompt implementation of necessary agrotechnical measures. This led to a substantial reduction in crop losses, especially due to disease, pests or nutrient deficiency, optimised the use of agrochemicals and resources such as water and fertilisers, and significantly reduced the time previously spent on manual field inspections. All these factors collectively contributed to improved efficiency of agronomic operations and more stable crop yields across cultivated areas. The optimisation of agrotechnologies through IoT sensors and neural networks enabled effective management of water and nutrient resources. The precision irrigation system reduced water usage by 12% without decreasing yield, saving around 90,000 L of water on a single field. Precise fertiliser application increased wheat yield by 8% compared to traditional methods. Localised feeding and the use of biological preparations, such as “Azotobacter”, improved not only plant condition but also long-term soil fertility. The introduction of such technologies promoted resource savings, reduced environmental impact and increased productivity.

The present study is constrained by its reliance on data from a singular agricultural enterprise and a single growing season, which may not fully account for the heterogeneity in crop health, disease dynamics, and environmental variables across diverse climates, geographic locations, or crop varieties. Future research may focus on expanding the functionality of monitoring systems through the integration of additional sensors, the development of more complex machine learning models and the implementation of autonomous robotic systems for agrotechnical operations. It is also important to adapt these technologies for small and medium-sized farms, making these technologies accessible to a wider range of producers. To enhance the model’s robustness and applicability, it is essential to broaden its testing across a diverse array of crops and geographic regions. Integrating data from various climates, soil types, and agricultural practices will improve the model’s ability to generalise and function well across multiple agricultural contexts. Moreover, modifying the model to incorporate regional disparities in crop diseases and environmental variables will enhance its efficacy in extensive agricultural practices, ensuring its wider relevance in precision agriculture.

References
  1. Chechetova NF, Tarnavskyi AM, Kolubai YaS. On the issue of the need and prospects of digital transformation of cooperative enterprise. Sci Perspect. 2023;35(5):498–511. https://doi.org/10.52058/2708-7530-2023-5(35)-498-511
  2. Klerkx L, Jakku E, Labarthe P. A review of social science on digital agriculture, smart farming and agriculture 4.0: New contributions and a future research agenda. NJAS. 2019;90–91:100315. https://doi.org/10.1016/j.njas.2019.100315
  3. Mulla DJ. Twenty five years of remote sensing in precision agriculture: Key advances and remaining knowledge gaps. Biosyst Eng. 2013;114(4):358–371. https://doi.org/10.1016/j.biosystemseng.2012.08.009
  4. Dhanya VG, Subeesh A, Kushwaha NL, Vishwakarma DK, Kumar TN, Ritika G, et al. Deep learning based computer vision approaches for smart agricultural applications. Artif Intell Agric. 2022;6:211–229. https://doi.org/10.1016/j.aiia.2022.09.007
  5. Kim W-S, Lee W-S, Kim Y-J. A review of the applications of the internet of things (IoT) for agricultural automation. J Biosyst Eng. 2020;45:385–400. https://doi.org/10.1007/s42853-020-00078-3
  6. Ouhami M, Hafiane A, Es-Saady Y, El Hajji M, Canals R. Computer vision, IoT and data fusion for crop disease detection using machine learning: A survey and ongoing research. Remote Sens. 2021;13(13):2486. https://doi.org/10.3390/rs13132486
  7. Ghazal S, Munir A, Qureshi WS. Computer vision in smart agriculture and precision farming: Techniques and applications. Artif Intell Agric. 2024;13:64–83. https://doi.org/10.1016/j.aiia.2024.06.004
  8. Kamal S, Shobha KR, Francis F, Khilar R, Tripathi V, Lakshminarayana M, et al. IOT automation with segmentation techniques for detection of plant seedlings in agriculture. Wirel Commun Mob Comput. 2022;2022(1):6466555. https://doi.org/10.1155/2022/6466555
  9. Kakani V, Nguyen VH, Kumar BP, Kim H, Pasupuleti VR. A critical review on computer vision and artificial intelligence in food industry. J Agric Food Res. 2020;2:100033. https://doi.org/10.1016/j.jafr.2020.100033
  10. Fracarolli JA, Pavarin FF, Castro W, Blasco J. Computer vision applied to food and agricultural products. Rev Cienc Agron. 2020;51:e20207749. https://doi.org/10.5935/1806-6690.20200087
  11. Vij A, Vijendra S, Jain A, Bajaj S, Bassi A, Sharma A. IoT and machine learning approaches for automation of farm irrigation system. Procedia Comput Sci. 2020;167:1250–1257. https://doi.org/10.1016/j.procs.2020.03.440
  12. Seidaliyeva U, Smailov N. Leveraging drone technology for enhanced safety and route planning in rock climbing and extreme sports training. Retos. 2025;63:598–609. https://doi.org/10.47197/retos.v63.110869
  13. Kabdoldina A, Ualiyev Z, Smailov N, Malikova F, Oralkanova K, Baktybayev M, Arinova D, Khikmetov A, Shaikulova A, Bazarbay L. Development of the design and technology for manufacturing a combined fiber-optic sensor used for extreme operating conditions. East Eur J Enterp Technol. 2022;5(5–119):34–43. https://doi.org/10.15587/1729-4061.2022.266359
  14. Hari N, Roshini Ch, Praveen Kumar K, Nandini E, Sravan K, Laxman Rao P, Neelambika. Paddy Crop Monitoring Using Landsat 8 and Sentinel Data: A Case Study in Kandi Mandal of Telangana, India. J Exp Agric Int. 2025;47(2):57–69. https://doi.org/10.9734/jeai/2025/v47i23266
  15. Kerimkhulle S, Kerimkulov Z, Aitkozha Z, Saliyeva A, Taberkhan R, Adalbek A. The estimate one-two-sided confidence intervals for mean of spectral reflectance of the vegetation. J Phys Conf Ser. 2022;2388(1):012160. https://doi.org/10.1088/1742-6596/2388/1/012160
  16. Kerimkhulle S, Kerimkulov Z, Aitkozha Z, Saliyeva A, Taberkhan R, Adalbek A. The classification of vegetations based on share reflectance at spectral bands. Lect Notes Networks Syst. 2023;724 LNNS:95–100. https://doi.org/10.1007/978-3-031-35314-7_8
  17. Rubino L, Rubino G, Conti P. Design of a power system supervisory control with linear optimization for electrical load management in an aircraft on-board dc microgrid. Sustain. 2021;13(15):8580. https://doi.org/10.3390/su13158580
  18. Chandra H, Nidamanuri RR. Object-based spectral library for knowledge-transfer-based crop detection in drone-based hyperspectral imagery. Precision Agric. 2025;26:6. https://doi.org/10.1007/s11119-024-10203-3
  19. Cristea VM, Baigulbayeva M, Ongarbayev Y, Smailov N, Akkazin Y, Ubaidulayeva N. Prediction of oil sorption capacity on carbonized mixtures of shungite using artificial neural networks. Proces. 2023;11(2):518. https://doi.org/10.3390/pr11020518
  20. Bulgakov V, Ivanovs S, Adamchuk V, Ihnatiev Y. Investigation of the influence of the parameters of the experimental spiral potato heap separator on the quality of work. Agron Res. 2017;15(1):44–54.
  21. Panchenko A, Voloshina A, Boltyansky O, Milaeva I, Grechka I, Khovanskyy S, Svynarenko M, Glibko O, Maksimova M, Paranyak N. Designing the flow-through parts of distribution systems for the PRG series planetary hydraulic motors. East Eur J Enterp Technol. 2018;3(1-93):67–77. https://doi.org/10.15587/1729-4061.2018.132504
  22. Voloshina A, Panchenko A, Boltynskiy O, Panchenko I, Titova O. Justification of the kinematic diagrams for the distribution system of a planetary hydraulic motor. Int J Eng Technol (UAE). 2018;7(4):6–11. https://doi.org/10.14419/ijet.v7i4.3.19544
  23. Kondratenko Y, Atamanyuk I, Sidenko I, Kondratenko G, Sichevskyi S. Machine learning techniques for increasing efficiency of the robot’s sensor and control information processing. Sens. 2022;22(3):1062. https://doi.org/10.3390/s22031062
  24. Kondratenko Y, Gerasin O, Topalov A. A simulation model for robot’s slip displacement sensors. Int J Comput. 2016;15(4):224-236.
  25. Kerimkhulle S, Kerimkulov Z, Bakhtiyarov D, Turtayeva N, Kim J. In-field crop-weed classification using remote sensing and neural network. SIST 2021 – 2021 IEEE Int Conf Smart Inf Syst Technol. 2021:9465970. https://doi.org/10.1109/SIST50301.2021.9465970
  26. Manikandababu CS, Preethi V, Kanna MY, Vedhathiri K, Kumar SS. Enhancing Crop Yield Prediction with IoT and Machine Learning in Precision Agriculture. In: 2024 International Conference on Advances in Computing, Communication and Applied Informatics. IEEE: Chennai; 2024. pp. 1–6. https://doi.org/10.1109/ACCAI61061.2024.10602346
  27. Rubino G, Tomassi G, Ciprini L, Ali S, Marignetti F. Speed sensorless control based on Luenberger observer for DC motors. 2022 2nd Int Conf Sustain Mobil Appl Renew Technol SMART 2022. 2022:9990558. https://doi.org/10.1109/SMART55236.2022.9990558
  28. Bacherikov YYu, Okhrimenko OB, Pekur DV, Ponomarenko VV, Sadigov A, Lyubchyk SB, Lyubchyk SI. Multifunctional spectrophotometric sensor based on photosensitive capacitor. Semicond Phys Quantum Electron Optoelectron. 2024;27(4):495–501. https://doi.org/10.15407/spqeo27.04.495
  29. Yeraliyeva ZhM, Kunelbayev M, Ospanbayev ZhO, Kurmanbayeva MS, Kolev TP, Kenesbayev SM, Newsome AS. The study of agricultural techniques of cultivation of new varieties of winter wheat under drip irrigation. Asian J Microbiol Biotechnol Environ Sci. 2016;18(3):779–785.
  30. Kondratenko YP, Kozlov OV, Gerasin OS, Zaporozhets YM. Synthesis and research of neuro-fuzzy observer of clamping force for mobile robot automatic control system. Proc IEEE 1st Int Conf Data Stream Mining Process DSMP. 2016:90–95. https://doi.org/10.1109/DSMP.2016.7583514
  31. Ulaynich KF. Informatization as a condition for the effective functioning of the agricultural enterprises. Coll Works Uman Natl Univ Hortic. 2013;83(2):248–253. https://journal.udau.edu.ua/arxv-nomerv/2013/vipusk-83.-chastina-2/novyij-resurs23.html
  32. Isak LM. Historical stages of mathematical methods and information systems technologies in agrobiological systems. Hist Sci Technol. 2015;5(6):171–181. https://hst-journal.com/index.php/hst/article/view/76
  33. Muangprathub J, Boonnam N, Kajornkasirat S, Lekbangpong N, Wanichsombat A, Nillaor P. IoT and agriculture data analysis for smart farm. Comput Electron Agric. 2019;156:467–474. https://doi.org/10.1016/j.compag.2018.12.011
  34. Zhao W, Wang X, Qi B, Runge T. Ground-level mapping and navigating for agriculture based on IoT and computer vision. IEEE Access. 2020;8:221975–221985. https://doi.org/10.1109/ACCESS.2020.3043662
  35. Sivaranjani A, Senthilrani S, Ashok Kumar B, Senthil Murugan A. An overview of various computer vision-based grading system for various agricultural products. J Hortic Sci Biotechnol. 2022;97(2):137–159. https://doi.org/10.1080/14620316.2021.1970631
  36. Mentsiev AU, Gerikhanov ZA, Isaev AR. Automation and IoT for controlling and analysing the growth of crops in agriculture. J Phys Conf Ser. 2019;1399:044022. https://doi.org/10.1088/1742-6596/1399/4/044022
  37. Kajol R, Akshay KK, Keerthan Kumar TG. Automated agricultural field analysis and monitoring system using IOT. Int J Inf Eng Electron Bus. 2018;13(2):17–24. https://doi.org/10.5815/ijieeb.2018.02.03
  38. Harjeet K, Prashar D. Machine vision technology, deep learning, and IoT in agricultural industry. In: Jha S, Tariq U, Prasad Joshi G, Kumar V, Solanki A, editors. Industrial Internet of Things. Boca Raton: CRC Press; 2022, p. 143–159. https://doi.org/10.1201/9781003102267
  39. Pantazi XE, Moshou D, Alexandridis T, Whetton RL, Mouazen AM. Wheat yield prediction using machine learning and advanced sensing techniques. Comput Electr Agric. 2016;121:57–65. https://doi.org/10.1016/j.compag.2015.11.018
  40. Sharma K, Shivandu SK. Integrating artificial intelligence and Internet of Things (IoT) for enhanced crop monitoring and management in precision agriculture. Sens Int. 2024;5:100292. https://doi.org/10.1016/j.sintl.2024.100292
  41. Puranik V, Ranjan A, Kumari A. Automation in agriculture and IoT. In: Proceedings of the 4th International Conference on Internet of Things: Smart Innovation and Usages. Ghaziabad: IEEE; 2019, p. 1–6. https://doi.org/10.1109/IoT-SIU.2019.8777619
  42. Phasinam K, Kassanuk T, Shinde PP, Thakar CM, Sharma DK, Mohiddin MK, et al. Application of IoT and cloud computing in automation of agriculture irrigation. J Food Qual. 2022;2022(1):8285969. https://doi.org/10.1155/2022/8285969
  43. Fiestas E, Linares P, Alva JA, Prado S. Integration of an iiot platform with a deep learning based computer vision system for seedling quality control automation. In: Proceedings of the IEEE 3rd Eurasia Conference on IOT, Communication and Engineering. Yunlin: IEEE; 2021, p. 621–626. https://doi.org/10.1109/ECICE52819.2021.9645700
  44. Sandhu GK, Singh A. IoT-enabled image processing approaches for automated plant disease detection: A comparative analysis. In: Sharma M, Nath M, Sheikh S, Singh A, editors. Recent Advances in Computing Sciences. London: CRC Press; 2025, p. 302–309. https://doi.org/10.1201/9781003570349
  45. Misra NN, Dixit Y, Al-Mallahi A, Bhullar MS, Upadhyay R, Martynenko A. IoT, big data, and artificial intelligence in agriculture and food industry. IEEE Internet Things J. 2020;9(9):6305–6324. https://doi.org/10.1109/JIOT.2020.2998584
  46. Shults R, Urazaliev A, Annenkov A, Nesterenko O, Kucherenko O, Kim K. Different approaches to coordinate transformation parameters determination of nonhomogeneous coordinate systems. Environ Engin. (Lithuania). 2020:enviro.2020.687. https://doi.org/10.3846/enviro.2020.687
  47. Nekrasov S, Peterka J, Zhyhylii D, Dovhopolov A, Kolesnyk V. Mathematical estimation of roughness rz of threaded surface obtained by machining method. MM Sci J. 2022;5699–5703. https://doi.org/10.17973/MMSJ.2022_06_2022090
  48. Smailov N, Tolemanova A, Ayapbergenova A, Tashtay Y, Amir A. Modelling and application of fibre optic sensors for concrete structures: A literature review. Civ Engin Architect. 2025;13(3):1885–1897. https://doi.org/10.13189/cea.2025.130332
  49. Wolfert S, Ge L, Verdouw C, Bogaardt MJ. Big Data in smart farming – A review. Agric Syst. 2017;153:69–80. https://doi.org/10.1016/j.agsy.2017.01.023
  50. Ruby EK, Amirthayogam G, Sasi G, Chitra T, Choubey A, Gopalakrishnan S. Advanced image processing techniques for automated detection of healthy and infected leaves in agricultural systems. Mesopotamian J Comput Sci. 2024;2024:44–52. https://doi.org/10.58496/MJCSC/2024/006
  51. Edan Y, Adamides G, Oberti R. Agriculture automation. In: Nof SY, editor. Springer Handbook of Automation. Cham: Springer; 2023, p. 1055–1078. https://doi.org/10.1007/978-3-030-96729-1_49
  52. Luo J, Li B, Leung C. A survey of computer vision technologies in urban and controlled-environment agriculture. ACM Comput Surv. 2023;56(5):118. https://doi.org/10.1145/3626186
  53. Kuswidiyanto LW, Nugroho AP, Jati AW, Wismoyo GW, Arif SS. Automatic water level monitoring system based on computer vision technology for supporting the irrigation modernization. IOP Conf Ser Earth Environ Sci. 2021;686:012055. https://doi.org/10.1088/1755-1315/686/1/012055
  54. Smailov N, Tsyporenko V, Sabibolda A, Tsyporenko V, Abdykadyrov A, Kabdoldina A, Dosbayev Z, Ualiyev Z, Kadyrova R. Streamlining digital correlation-interferometric direction finding with spatial analytical signal. Inform Autom Pomiary Gospod Ochronie Srodow. 2024;14(3):43–48. https://doi.org/10.35784/iapgos.6177
  55. Voloshina A, Panchenko A, Panchenko I, Zasiadko A. Geometrical parameters for distribution systems of hydraulic machines. Mod Dev Path Agr Prod Trends Innov. 2019:323–336. https://doi.org/10.1007/978-3-030-14918-5_34
  56. Shedole ST, Madhu YB. Automated pest detection and control in agriculture using IoT and image processing. J Surv Fish Sci. 2019;6(1):106–116. https://doi.org/10.53555/sfs.v6i1.2351
  57. Devi N, Sarma KK, Laskar S. Design of an intelligent bean cultivation approach using computer vision, IoT and spatio-temporal deep learning structures. Ecol Inform. 2023;75:102044. https://doi.org/10.1016/j.ecoinf.2023.102044
  58. Fiedler NC, de Campos AA, Caldeira MVW, de Souza Lima JS, Ramalho AHC, da Silva Lopes E. Economic and operational analysis of mechanized forest implementation. Rev Árvore. 2020;44:e4422. https://doi.org/10.1590/1806-908820200000022
  59. Jafar A, Bibi N, Naqvi RA, Sadeghi-Niaraki A, Jeong D. Revolutionizing agriculture with artificial intelligence: plant disease detection methods, applications, and their limitations. Front Plant Sci. 2024;15:1356260. https://doi.org/10.3389/fpls.2024.1356260

