IPB University Researchers Build Swarm Drones for AI-Based Indoor Farming
Two researchers at IPB University, Dr Karlisa Priandana and Dr Medria Kusuma Dewi Hardhienata, are developing swarm drone technology to make indoor farming more reliable and data-driven. The goal is simple: continuous monitoring, faster decisions, and fewer single points of failure than static sensors alone.
Indoor farms control light, temperature, and humidity precisely, but that control depends on continuous, high-quality data. Static cameras and sensors can fail. A coordinated swarm adds redundancy, keeping the data stream alive and enabling real-time analysis of plant health and farm conditions.
What the team is building
- A coordinated swarm of small drones ("like a swarm of bees") that can monitor large indoor grow areas without collisions.
- Each drone carries a multispectral camera and environmental sensors, integrated via IoT for synchronized data capture.
- All data is sent in real time to a central system for AI-based analysis, supporting quick, accurate, and data-driven decisions.
- AI models identify plant species from drone imagery and help drones communicate, adjust formations, and hold positions even in tight spaces.
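The formation-holding behavior above can rest on a simple consensus rule: each drone repeatedly nudges itself toward the average position of its neighbors until the group agrees. A minimal 1-D sketch, with hypothetical names (`consensus_step`, `gain`) and a fully connected three-drone graph as an assumed example topology:

```python
def consensus_step(positions, neighbors, gain=0.5):
    """One discrete consensus update: each drone moves a fraction
    `gain` of the way toward the mean of its neighbors' positions."""
    updated = []
    for i, x in enumerate(positions):
        nbrs = neighbors[i]
        if not nbrs:
            updated.append(x)  # isolated drone holds position
            continue
        mean_nbr = sum(positions[j] for j in nbrs) / len(nbrs)
        updated.append(x + gain * (mean_nbr - x))
    return updated

# Three drones on a line, fully connected: positions converge to agreement.
pos = [0.0, 4.0, 8.0]
nbrs = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
for _ in range(20):
    pos = consensus_step(pos, nbrs)
# pos is now ~[4.0, 4.0, 4.0]
```

Real formation control adds offsets per drone (so they agree on a shape, not a single point) and collision-avoidance terms, but the convergence mechanism is the same.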
Why this matters for researchers
- Reliability: Swarms provide redundancy when fixed sensors degrade or go offline.
- Coverage and throughput: Coordinated flight increases sampling frequency and spatial resolution without adding fixed hardware.
- Closed-loop control: Real-time analysis can feed alerts, irrigation or lighting adjustments, and task scheduling.
- Scalability: More drones, same protocol; capacity expands without re-wiring the farm.
High-level system architecture (reference model)
- On-drone payload: Multispectral camera (for indices such as NDVI), RGB camera, temperature/humidity/CO2 sensors.
- Edge processing: Lightweight preprocessing (e.g., exposure control, denoise, frame selection) to reduce bandwidth and latency.
- Connectivity: QoS-aware messaging (e.g., MQTT/ROS 2) from drones to a central broker; local fallback when links drop.
- Localization and navigation: Visual-inertial odometry, with UWB beacons or fiducial markers (e.g., AprilTags) for drift correction in GPS-denied environments.
- AI models:
  - Plant detection/segmentation for canopy coverage, growth stage, and stress signs.
  - Spectral analytics for vigor and nutrient/water stress.
  - Anomaly detection for pests, disease spots, and sensor drift.
- Multi-agent coordination: Collision avoidance plus formation control (consensus/flocking) with priority rules for narrow aisles.
- Data store and pipeline: Time-aligned image and sensor data with experiment metadata; versioned models and reproducible runs.
- Operator interface: Live map of coverage, alerts, and a backlog of high-value inspection tasks.
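To make the multispectral item concrete: NDVI is the standard vegetation index computed from the near-infrared and red bands. A minimal sketch (the function name and reflectance values are illustrative, not from the source):

```python
def ndvi(nir, red, eps=1e-9):
    """NDVI = (NIR - Red) / (NIR + Red); eps guards against divide-by-zero."""
    return (nir - red) / (nir + red + eps)

# Healthy canopy reflects strongly in NIR and weakly in red,
# so its NDVI sits much higher than bare soil's.
healthy = ndvi(nir=0.50, red=0.08)    # ~0.72
bare_soil = ndvi(nir=0.25, red=0.20)  # ~0.11
```

In practice the same formula is applied per pixel across calibrated reflectance images, then aggregated per plant or zone.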
Key metrics to track
- Coverage and revisit time per zone.
- End-to-end latency (capture to alert) and packet loss rate.
- Collision-free flight hours and minimum separation distance.
- Battery utilization per mission and charge cycle health.
- Model performance: precision/recall on plant ID, stress detection, and anomaly classification.
- Yield impact: change in quality grades, loss reduction, and intervention lead time.
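Revisit time, the first metric above, falls straight out of the flight log: group visit timestamps by zone and take the gaps between consecutive visits. A sketch with a hypothetical log format of `(zone, timestamp_seconds)` tuples:

```python
def revisit_times(visits):
    """Return per-zone gaps (seconds) between consecutive visits.
    `visits`: iterable of (zone, timestamp_seconds) tuples."""
    by_zone = {}
    for zone, t in sorted(visits, key=lambda v: v[1]):
        by_zone.setdefault(zone, []).append(t)
    return {
        zone: [later - earlier for earlier, later in zip(ts, ts[1:])]
        for zone, ts in by_zone.items()
    }

log = [("A", 0), ("B", 30), ("A", 120), ("B", 150), ("A", 260)]
gaps = revisit_times(log)
# gaps == {"A": [120, 140], "B": [120]}
```

Alerting on the maximum gap per zone (rather than the mean) catches the zones the scheduler is quietly starving.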
Design notes for indoor environments
- Lighting: LED flicker can affect multispectral/RGB capture; lock exposure and sync captures where possible.
- Airflow: HVAC patterns cause micro-gusts in aisles; tune PID gains and speed caps for stability.
- RF reality: Dense metal racks and water content attenuate signals; plan beacon placement and channel hopping.
- Charging: Hot-swap batteries or contact-based docking to sustain continuous coverage.
- Fail-safes: Geofencing, altitude limits, prop guards, and graceful degradation to single-drone mode if comms drop.
- Data alignment: Time-sync cameras and sensors (PTP or NTP with hardware time stamps) for clean multimodal fusion.
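Once clocks are synchronized, multimodal fusion still needs to pair each camera frame with the sensor reading closest in time, discarding pairs whose skew exceeds a tolerance. A minimal sketch (the function name and the 50 ms tolerance are assumptions, not from the source):

```python
import bisect

def align_nearest(frame_times, sensor_times, max_skew=0.05):
    """Pair each frame timestamp with the nearest sensor timestamp,
    dropping pairs whose skew exceeds `max_skew` seconds.
    Both inputs must be sorted ascending."""
    pairs = []
    for ft in frame_times:
        i = bisect.bisect_left(sensor_times, ft)
        # Nearest reading is either just before or just after the frame.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(sensor_times)]
        best = min(candidates, key=lambda j: abs(sensor_times[j] - ft))
        if abs(sensor_times[best] - ft) <= max_skew:
            pairs.append((ft, sensor_times[best]))
    return pairs

frames = [0.0, 0.1, 0.2, 0.3]
readings = [0.01, 0.12, 0.28]
pairs = align_nearest(frames, readings)
# The frame at 0.2 s has no reading within 50 ms, so it is dropped.
```

Dropping over-skewed pairs, rather than interpolating silently, keeps bad alignments out of the training and analytics data.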
Workflow example
- Plan: The system schedules a sweep based on zones with stale data or recent anomalies.
- Capture: Drones fan out, collect multispectral and environmental data, and stream to the broker.
- Analyze: The AI stack estimates vigor, flags stress patterns, and classifies plant species where needed.
- Act: The operator gets prioritized alerts with confidence scores, suggested actions, and affected rows.
- Learn: Labels from operator feedback and ground truth feed model updates on a set cadence.
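The "Act" step above implies a ranking policy for operator alerts. One simple option, sketched here with hypothetical field names and a severity-times-confidence score that is an assumption rather than the team's documented rule:

```python
def prioritize(alerts, min_confidence=0.6):
    """Filter out low-confidence detections, then rank the rest by
    severity * confidence (illustrative scoring rule)."""
    kept = [a for a in alerts if a["confidence"] >= min_confidence]
    return sorted(kept, key=lambda a: a["severity"] * a["confidence"],
                  reverse=True)

alerts = [
    {"row": 3, "type": "water_stress", "severity": 2, "confidence": 0.9},
    {"row": 7, "type": "pest_spot",    "severity": 3, "confidence": 0.7},
    {"row": 1, "type": "sensor_drift", "severity": 1, "confidence": 0.4},
]
ranked = prioritize(alerts)
# Pest spot in row 7 outranks water stress in row 3; the 0.4-confidence
# sensor-drift alert is filtered before it reaches the operator.
```

The operator feedback in the "Learn" step can then serve double duty: relabeling detections for the models and tuning the confidence threshold.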
What the researchers emphasize
They developed the swarm drones first for monitoring and surveillance in indoor farms. When fixed sensors fail or degrade, the swarm keeps monitoring running so operators don't miss critical changes.
Each drone carries multispectral and environmental sensors tied into an IoT backbone, with data analyzed in real time by AI. The system can identify plant types from imagery and coordinate drone positions and formations without collisions, even in confined spaces.
The expected outcome: higher efficiency, more sustainable cultivation practices, and better outcomes for farmers through data-driven operations.
Related resources
- Swarm robotics - foundational concepts for multi-agent coordination.
- NDVI - a common vegetation index derived from multispectral data.
Build your team's AI capability
If you're designing similar systems or validating models for autonomous sensing, see the AI Learning Path for Research Scientists for structured training on data pipelines, model evaluation, and experiment design.