| Resource | Description | Action |
|---|---|---|
| Production Inference Engine | Live YOLOv8 pipeline hosted on Hugging Face Space. | Run Live Demo |
| Technical Architecture | Interactive Reveal.js slide deck (Socratic Method). | View Presentation |
| Research Foundations | Full mathematical breakdown and ablation study (CVPR). | Read Research Paper |
Real-time wheat head detection with per-object confidence scoring (tuned YOLOv8s at 1024px resolution)
Automated agronomic analytics: Yield estimation (t/ha), Spatial Uniformity (CV%), Revenue projection, and one-click PDF Executive Briefing export
The AgriVision Decision Support System is an end-to-end machine learning pipeline engineered to replace traditional, labor-intensive manual crop counting. Ingesting high-resolution drone imagery, the system uses a custom-tuned YOLOv8 architecture to detect wheat heads across complex field topologies, enabling the reproducible extraction of localized visual features at a precision and scale manual inspection cannot match.
Going beyond raw object detection, the architecture acts as a deterministic conduit between pixel space and actionable business intelligence. Bounding-box outputs are fused with region-specific agronomic constants, including Thousand Grain Weight (TGW) and localized grain-density metrics, to compute key agricultural indicators: rapid field yield estimates (t/ha), spatial Coefficient of Variation (CV%) to highlight localized physiological stress, projected financial returns, and presentation-ready PDF briefings synthesized for farm management stakeholders.
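The detection-to-tonnage conversion described above can be sketched as follows. This is a minimal illustration of the arithmetic only: the function name, the grains-per-head figure, and the TGW value are hypothetical placeholders, not the constants shipped in `src/config.py`.

```python
# Illustrative yield estimation: wheat-head detections -> tonnes per hectare.
# All constants below are assumed placeholder values, not project defaults.

def estimate_yield_t_ha(
    head_count: int,
    sampled_area_m2: float,
    grains_per_head: float = 32.0,   # assumed regional average
    tgw_g: float = 38.0,             # Thousand Grain Weight, in grams
) -> float:
    """Convert a head count over a sampled ground area into t/ha."""
    heads_per_m2 = head_count / sampled_area_m2
    grams_per_m2 = heads_per_m2 * grains_per_head * (tgw_g / 1000.0)
    # 1 ha = 10,000 m^2 and 1 t = 1,000,000 g, so g/m^2 * 0.01 = t/ha
    return grams_per_m2 * 10_000 / 1_000_000

# Example: 4,500 heads detected across 10 m^2 of sampled tiles
print(round(estimate_yield_t_ha(4500, 10.0), 2))  # 5.47 t/ha
```

Revenue projection is then a single multiplication of this figure by field area and the regional commodity price.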
Traditional agriculture still relies heavily on manual field scouting and crop counting, a significant friction point in modern farm operations. The legacy process is slow and does not scale across broadacre farming, and localized manual sampling is error-prone due to human fatigue and the dense, overlapping nature of wheat canopies. Without reliable spatial data, farm managers are effectively blind: accurate macro-level yield forecasting and rapid localized stress detection are nearly impossible before the physical harvest.
To eliminate manual sampling bottlenecks, the AgriVision system establishes a highly optimized, fully automated data pipeline. The model's baseline weights were trained on large-scale, domain-specific agricultural data, predominantly the multi-regional Global Wheat Dataset, ensuring robustness against varying phenotypic traits and illumination states.
The analytical flow operates as follows:
Agriculture_Decision_System/
├── app.py                      # Streamlit application entry point
├── style.css                   # Custom dark-theme UI stylesheet
├── requirements.txt            # Pinned production dependencies
├── requirements_cloud.txt      # Lightweight cloud deployment deps
│
├── src/                        # Core backend modules
│   ├── config.py               # Regional agronomic constants (10 regions)
│   ├── analytics.py            # Yield estimation, CV%, revenue engine
│   ├── inference.py            # YOLOv8 model loading & batch inference
│   ├── report.py               # Automated PDF report generation (FPDF)
│   ├── model.py                # Model wrapper utilities
│   ├── metrics.py              # Evaluation metric helpers
│   ├── dataset.py              # Dataset loading & preprocessing
│   └── utils.py                # Shared utility functions
│
├── scripts/                    # Training, evaluation & analysis tools (18 scripts)
│   ├── train.py                # Model training script
│   ├── evaluate_model.py       # mAP / Precision / Recall evaluation
│   ├── compare_models.py       # Baseline vs. tuned comparison pipeline
│   ├── extract_false_negatives.py  # FN forensic extraction
│   ├── extract_false_positives.py  # FP forensic extraction
│   ├── generate_pro_dashboard.py   # Advanced visualization dashboards
│   └── ...                     # Additional diagnostic utilities
│
├── notebooks/                  # Jupyter research notebooks
│   ├── 01_Basic_EDA.ipynb      # Exploratory Data Analysis
│   ├── 02_yolov8s_baseline_train.ipynb
│   └── 03_YOLOv8s_training_iteration2(tuned).ipynb
│
├── configs/                    # YAML training configurations
│   ├── wheat_v8.yaml           # Dataset config
│   ├── yolov8_baseline_args.yaml   # Baseline hyperparameters
│   └── yolov8_tuned.yaml       # Tuned hyperparameters
│
├── docs/                       # Technical documentation & visual assets
│   ├── error_analysis_report.md
│   ├── hyperparameter_rationale.md
│   ├── iteration_2_evaluation.md
│   └── assets/                 # Analytical charts & dashboards
│
├── data/                       # Dataset directory (gitignored)
│   ├── raw/                    # Original images & annotations
│   ├── processed/              # YOLO-formatted data
│   └── test_samples/           # Quick-test imagery
│
├── tests/                      # Unit tests
│   └── test_metrics.py
│
├── outputs/                    # Training outputs & weights (gitignored)
├── AgriVision_Paper/           # CVPR-format research paper (LaTeX)
└── LICENSE                     # MIT License
The core engineering value of the AgriVision framework lies in its deterministic analytics engine, which bridges the gap between deep learning outputs and real-world agricultural operations. Rather than simply returning raw visualization overlays, the backend intercepts the model's bounding-box tensors, aggregates detection density, and applies agronomic mathematics to translate pixel geometry into actionable farming metrics.
| Metric | Agronomic Calculation | Engineering Impact |
|---|---|---|
| Spatial Uniformity (CV%) | Calculates the statistical Coefficient of Variation across the target sampling tiles. | A high-fidelity indicator of field health: a low CV% indicates consistent growth, while a high CV% flags localized physiological crop stress or mechanical seeding failures. |
| Estimated Yield (t/ha) | Translates raw detection density into gross output using region-specific constants (e.g., Thousand Grain Weight). | Converts abstract network counts into mathematically grounded tonnage per hectare, unlocking the ability to execute highly predictive macro-level harvest forecasting. |
| Projected Revenue | Fuses the yield estimate with current localized commodity market pricing. | Delivers localized financial projections, rendered in the selected regional currency, to inform immediate strategic fiscal planning. |
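The Spatial Uniformity metric in the table above is a standard Coefficient of Variation over per-tile head counts. A minimal sketch, with hypothetical tile counts (the real per-tile values come from the inference engine):

```python
# Sketch of the Spatial Uniformity (CV%) metric over sampling tiles.
# The tile counts below are hypothetical examples, not real field data.
import statistics

def spatial_cv_percent(tile_counts: list[float]) -> float:
    """CV% = (population standard deviation / mean) * 100."""
    mean = statistics.fmean(tile_counts)
    stdev = statistics.pstdev(tile_counts)
    return (stdev / mean) * 100.0

uniform = [98, 102, 100, 101, 99]   # low CV% -> consistent growth
patchy = [40, 160, 95, 20, 185]     # high CV% -> localized stress

print(f"{spatial_cv_percent(uniform):.1f}%")
print(f"{spatial_cv_percent(patchy):.1f}%")
```

Both sample fields average 100 heads per tile, yet the CV% separates healthy uniformity (about 1.4%) from severe patchiness (about 64.6%), which is exactly the signal a mean-only metric would miss.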
When an inference run completes, the data payload is handed off to an automated document-generation subsystem. To remain compatible with read-only, stateless cloud hosts, the system renders the PDF entirely in memory as a byte stream via the FPDF library (using dest='S'), packaging the captured spatial metrics, confidence outputs, and financial projections as raw bytes. The Streamlit presentation layer then serves this dynamically generated PDF report directly, sidestepping server file-system restrictions and delivering a presentation-ready briefing to stakeholders.
The tuned YOLOv8s model was rigorously validated against a 548-image holdout partition from the Global Wheat Head Detection dataset. Complete methodology, mathematical proofs, and ablation results are documented in the research paper.
| Metric | Baseline (640px) | Tuned (1024px) | Δ |
|---|---|---|---|
| mAP@50 | 0.950 | 0.944 | −0.006 |
| mAP@50-95 | 0.569 | 0.563 | −0.006 |
| Precision | 0.877 | 0.873 | −0.004 |
| Recall | 0.974 | 0.906 | −0.068 |
| Median Confidence | 0.434 | 0.605 | +39.4% |
| Training Epochs | 50 (full run) | 35 (early stop) | −30% |
Key Insight: The tuned model trades marginal recall (−6.8%) for a +39.4% increase in prediction confidence, eliminating 1,092 zero-IoU hallucinated detections that would have inflated yield estimates by 4–8%. In precision agriculture, fewer confident predictions are operationally superior to many uncertain ones.
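Operationally, this trade-off amounts to gating detections on confidence before they reach the yield engine. A minimal sketch, assuming a simple (x1, y1, x2, y2, confidence) tuple format and an illustrative 0.50 cutoff (neither is necessarily the project's actual representation or default):

```python
# Illustrative confidence gate: drop low-confidence detections before
# they feed yield estimation. Threshold and tuples are hypothetical.

def filter_detections(detections, min_conf=0.50):
    """detections: iterable of (x1, y1, x2, y2, confidence) tuples."""
    return [d for d in detections if d[4] >= min_conf]

raw = [
    (10, 10, 40, 40, 0.91),   # confident wheat head -> kept
    (55, 12, 80, 44, 0.62),   # kept
    (90, 90, 95, 95, 0.18),   # likely hallucination -> dropped
]
kept = filter_detections(raw)
print(len(kept))  # number of detections that feed the yield engine
```

Because every surviving detection contributes directly to the t/ha figure, culling low-confidence boxes is what prevents the 4–8% yield inflation noted above.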
Left: Hallucination cull (violin). Center: Confidence density shift from 0.434 → 0.605. Right: Aggregate precision/recall/F1 comparison.
Systematic decomposition of baseline failure modes: scale degradation on micro-objects (top-left), photometric fragility in low-light zones (top-right), the critical "danger zone" of small + dark targets (bottom-left), and aspect ratio deformation (bottom-right).
Tuned model forensics: FP/FN distribution (top-left), crowd-scene immunity with near-flat FN trend slope of 0.033 (top-right), clean vs. hallucinating confidence (bottom-left), and persistent Jaccard Index > 0.50 even under critical FP stress (bottom-right).
To replicate this environment locally for development or auditing purposes, follow the strict initialization sequence below. Ensure you are running Python 3.11+ before beginning.
Establish your local workspace by cloning the source repository from GitHub and navigating into the root directory:
git clone https://github.com/VaheGdlyan/Agriculture_Decision_System.git
cd Agriculture_Decision_System
Isolate the deployment environment by initializing a custom Python virtual environment named venv311.
For Windows (PowerShell):
python -m venv venv311
.\venv311\Scripts\Activate.ps1
For Windows (CMD):
python -m venv venv311
venv311\Scripts\activate.bat
For Linux / macOS Systems:
python3 -m venv venv311
source venv311/bin/activate
Install all project dependencies:
pip install -r requirements.txt
Launch the application on localhost:
streamlit run app.py
For engineering transparency, the technical boundaries of the current iteration are documented below alongside the strategic scaling roadmap.
The next technical iterations of the AgriVision framework are prioritized as follows:
If you use this work in your research, please cite:
@misc{gdlyan2026agrivision,
title = {AgriVision: Error-Driven Hyperparameter Optimization for
High-Confidence Wheat Head Detection in UAV Imagery},
author = {Gdlyan, Vahe},
year = {2026},
howpublished = {\url{https://github.com/VaheGdlyan/Agriculture_Decision_System}},
note = {CVPR format}
}
Vahe Gdlyan
This project is open-source and released under the MIT License.