End-to-End Crop Row Navigation via LiDAR-Based Deep Reinforcement Learning
Mineiro, Ana Luiza, Affonso, Francisco, Becker, Marcelo
Abstract-- Reliable navigation in under-canopy agricultural environments remains a challenge due to GNSS unreliability, cluttered rows, and variable lighting. To address these limitations, we present an end-to-end learning-based navigation system that maps raw 3D LiDAR data directly to control commands using a deep reinforcement learning policy trained entirely in simulation. Our method includes a voxel-based downsampling strategy that reduces LiDAR input size by 95.83%, enabling efficient policy learning without relying on labeled datasets or manually designed control interfaces. The policy was validated in simulation, achieving a 100% success rate in straight-row plantations and showing a gradual decline in performance as row curvature increased across varying sinusoidal frequencies and amplitudes.

Autonomous robots have seen significant growth in modern agriculture, particularly for under-canopy tasks such as plant phenotyping, crop row harvesting, and disease scouting. These applications require platforms that are not only compact and agile but also capable of accurately navigating between dense crop rows (Figure 1) [1]. However, reliable navigation in such environments remains an active area of research due to several challenges, including clutter and occlusions caused by narrow row spacing and the high visual variability introduced by different plant growth stages [2].
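The voxel-based downsampling step mentioned in the abstract can be illustrated with a short sketch. This is not the paper's implementation; the voxel size, the centroid-per-voxel representative, and the plain-NumPy approach are illustrative assumptions.

```python
import numpy as np

def voxel_downsample(points: np.ndarray, voxel_size: float) -> np.ndarray:
    """Keep one representative point (the centroid) per occupied voxel.

    points: (N, 3) array of LiDAR returns; voxel_size: edge length in meters.
    """
    # Map each point to integer voxel coordinates.
    voxel_idx = np.floor(points / voxel_size).astype(np.int64)
    # Group points that share a voxel and average them.
    _, inverse, counts = np.unique(voxel_idx, axis=0,
                                   return_inverse=True, return_counts=True)
    centroids = np.zeros((counts.size, 3))
    np.add.at(centroids, inverse, points)   # sum points per voxel
    return centroids / counts[:, None]      # divide by occupancy -> centroid

# Example: a dense cloud collapses to at most one point per voxel.
rng = np.random.default_rng(0)
cloud = rng.uniform(0.0, 1.0, size=(10_000, 3))   # 10k points in a 1 m cube
down = voxel_downsample(cloud, voxel_size=0.25)   # 4x4x4 = at most 64 voxels
print(len(cloud), "->", len(down))
```

The reduction ratio depends directly on the chosen voxel size relative to the point density; a coarser grid trades spatial detail for a smaller policy input.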
- North America > United States > Illinois > Champaign County > Urbana (0.04)
- Europe > Netherlands > South Holland > Delft (0.04)
CropNav: a Framework for Autonomous Navigation in Real Farms
Gasparino, Mateus Valverde, Higuti, Vitor Akihiro Hisano, Sivakumar, Arun Narenthiran, Velasquez, Andres Eduardo Baquero, Becker, Marcelo, Chowdhary, Girish
Small robots that can operate under the plant canopy can enable new possibilities in agriculture. However, unlike larger autonomous tractors, autonomous navigation for such under-canopy robots remains an open challenge because the Global Navigation Satellite System (GNSS) is unreliable under the plant canopy. We present a hybrid navigation system that autonomously switches between different sets of sensing modalities to enable full-field navigation, both inside and outside of crop rows. By choosing the appropriate path reference source, the robot can compensate for loss of GNSS signal quality and leverage row-crop structure to navigate autonomously. However, such switching can be difficult to execute reliably at scale. Our system provides a solution by automatically switching between exteroceptive sensing, such as Light Detection And Ranging (LiDAR) based row-following navigation, and waypoint path tracking. In addition, we show how our system can detect when navigation fails and recover automatically, extending autonomous operation time and mitigating the need for human intervention. Our system shows an improvement of about 750 m per intervention over GNSS-based navigation and 500 m over row-following navigation.
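The switching logic between path-reference sources can be sketched as a small supervisor. The thresholds, signal names, and hold-last-mode fallback below are illustrative assumptions, not CropNav's actual switching criteria.

```python
from enum import Enum, auto

class NavMode(Enum):
    WAYPOINT = auto()    # GNSS waypoint tracking (outside the crop)
    ROW_FOLLOW = auto()  # LiDAR row-following (under the canopy)

def select_mode(gnss_quality: float, rows_detected: bool,
                current: NavMode, gnss_ok: float = 0.8) -> NavMode:
    """Pick a path-reference source from sensor health.

    A toy supervisor: prefer GNSS waypoints when fix quality is good,
    fall back to LiDAR row-following when the canopy degrades GNSS and
    row structure is visible, and otherwise hold the current mode.
    """
    if gnss_quality >= gnss_ok:
        return NavMode.WAYPOINT
    if rows_detected:
        return NavMode.ROW_FOLLOW
    return current  # no reliable reference: keep the last mode

# Entering the canopy: GNSS quality drops and row structure appears.
mode = NavMode.WAYPOINT
mode = select_mode(gnss_quality=0.3, rows_detected=True, current=mode)
print(mode.name)
```

A real system would also need the failure-detection and recovery behavior the abstract describes, plus hysteresis so the mode does not chatter near the quality threshold.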
- South America > Brazil (0.04)
- North America > United States > Oregon (0.04)
- North America > United States > Illinois > Champaign County > Champaign (0.04)
Learning to Turn: Diffusion Imitation for Robust Row Turning in Under-Canopy Robots
Sivakumar, Arun N., Thangeda, Pranay, Fang, Yixiao, Gasparino, Mateus V., Cuaran, Jose, Ornik, Melkior, Chowdhary, Girish
Under-canopy agricultural robots require robust navigation capabilities to enable full autonomy but struggle with tight row turning between crop rows due to degraded GPS reception, visual aliasing, occlusion, and complex vehicle dynamics. We propose an imitation learning approach using diffusion policies to learn row turning behaviors from demonstrations provided by human operators or privileged controllers. Simulation experiments in a corn field environment show potential in learning this task with only visual observations and velocity states. However, challenges remain in maintaining control within rows and handling varied initial conditions, highlighting areas for future improvement.
Lessons from Deploying CropFollow++: Under-Canopy Agricultural Navigation with Keypoints
Sivakumar, Arun N., Gasparino, Mateus V., McGuire, Michael, Higuti, Vitor A. H., Akcal, M. Ugur, Chowdhary, Girish
We present a vision-based navigation system for under-canopy agricultural robots using semantic keypoints. Autonomous under-canopy navigation is challenging due to the tight spacing between the crop rows ($\sim 0.75$ m), degradation in RTK-GPS accuracy due to multipath error, and noise in LiDAR measurements from the excessive clutter. Our system, CropFollow++, introduces a modular and interpretable perception architecture with a learned semantic keypoint representation. We deployed CropFollow++ in multiple under-canopy cover crop planting robots at a large scale (25 km in total) in various field conditions, and we discuss the key lessons learned from these deployments.
- Information Technology > Artificial Intelligence > Vision (1.00)
- Information Technology > Artificial Intelligence > Robots (1.00)
Unmatched uncertainty mitigation through neural network supported model predictive control
Gasparino, Mateus V., Mishra, Prabhat K., Chowdhary, Girish
This paper presents a deep learning-based model predictive control (MPC) algorithm for systems with unmatched and bounded state-action dependent uncertainties of unknown structure. We utilize a deep neural network (DNN) as an oracle in the underlying optimization problem of learning-based MPC (LBMPC) to estimate unmatched uncertainties. Generally, non-parametric oracles such as DNNs are considered difficult to employ with LBMPC due to the technical difficulties associated with estimating their coefficients in real time. We employ a dual-timescale adaptation mechanism, where the weights of the last layer of the neural network are updated in real time while the inner layers are trained on a slower timescale using the training data collected online and selectively stored in a buffer. Our results are validated through a numerical experiment on the compression system model of a jet engine. These results indicate that the proposed approach is implementable in real time and carries the theoretical guarantees of LBMPC.
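The dual-timescale idea (fast last-layer updates over a frozen feature map) can be sketched in a toy form. This is not the paper's LBMPC formulation: the random-feature stand-in for the inner layers, the normalized LMS update, and the disturbance function are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# "Inner layers": a frozen random-feature map standing in for the part of
# the DNN retrained on the slow timescale (never updated here, for brevity).
W_in = rng.normal(size=(2, 32))

def features(x):
    return np.tanh(x @ W_in)

# "Last layer": weights adapted in real time from each new observation.
w = np.zeros(32)

def fast_update(x, d_observed, lr=0.5):
    """One fast-timescale update: a normalized-LMS step on the last layer."""
    global w
    phi = features(x)
    err = d_observed - phi @ w               # uncertainty the oracle missed
    w = w + lr * err * phi / (phi @ phi + 1e-8)
    return err

# Online estimation of a fixed, unknown disturbance pattern.
disturbance = lambda x: 0.5 * np.sin(x[0]) - 0.2 * x[1]
errs = [abs(fast_update(x, disturbance(x)))
        for x in rng.uniform(-1, 1, size=(500, 2))]
print(f"mean |err|: first 50 = {np.mean(errs[:50]):.3f}, "
      f"last 50 = {np.mean(errs[-50:]):.3f}")
```

The separation matters because the fast linear-in-parameters update is cheap enough to run inside a control loop, while the expensive inner-layer retraining happens off the critical path.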
Will Artificial Intelligence and robotics usher in an era of sustainable precision agriculture?
Across midwestern farms, if Girish Chowdhary has his way, farmers will someday release beagle-sized robots into their fields like a pack of hounds flushing pheasant. The robots, he says, will scurry in the cool shade beneath a wide diversity of plants, pulling weeds, planting cover crops, diagnosing plant infections, and gathering data to help farmers optimize their farms. Chowdhary, a researcher at the University of Illinois, works surrounded by corn, one of the most productive monocultures in the world. In the United States, the corn industry was valued at $82.6 billion in 2021, but it -- like almost every other segment of the agricultural economy -- faces daunting problems, including changing weather patterns, environmental degradation, severe labor shortages, and the rising cost of key supplies, or inputs: herbicides, pesticides, and seed. Agribusiness as a whole is betting that the world has reached the tipping point where desperate need caused by a growing population, the economic realities of conventional farming, and advancing technology converge to require something called precision agriculture, which aims to minimize inputs and the costs and environmental problems that go with them. No segment of agriculture is without its passionate advocates of robotics and artificial intelligence as solutions to, basically, all the problems facing farmers today.
- North America > United States > Illinois (0.25)
- North America > United States > Iowa (0.05)
- North America > United States > Wisconsin (0.04)
- (6 more...)
- Materials > Chemicals > Agricultural Chemicals (0.70)
- Food & Agriculture > Agriculture > Pest Control (0.55)
Farming Drives Toward 'Precision Agriculture' Technologies
This story originally appeared on Undark and is part of the Climate Desk collaboration.
- North America > United States > Illinois (0.26)
- North America > Canada (0.06)
- Materials > Chemicals > Agricultural Chemicals (0.75)
- Food & Agriculture > Agriculture > Pest Control (0.59)
B.Tech student designs AI-powered smart traffic signal
It is a common experience that the traffic on one side of an intersection is crowded while the opposite side is almost empty, yet the green signal lasts the same period for every direction, without assigning any priority. Caught in one such situation while traveling for a medical emergency, Deepraj Chowdhary, a B.Tech student from the International Institute of Information Technology-Naya Raipur, hit upon the idea of using artificial intelligence to speed up traffic more smartly. Christened the Smart Traffic Signal Management System for Roadways, his system uses algorithms written in the Python programming language to give longer green signals to the direction with the highest density of vehicles. Currently, the waiting duration for vehicles at traffic signals is divided equally irrespective of density, though traffic police can adjust it manually. Moreover, almost all major traffic signals are fitted with cameras that send a live feed to the control room.
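A density-proportional split of the signal cycle, as the article describes, can be sketched in a few lines. The cycle length, minimum green time, and proportional rule here are illustrative assumptions, not the student's actual algorithm.

```python
def green_times(densities, cycle=120, min_green=10):
    """Split a fixed cycle among approaches in proportion to vehicle density.

    densities: vehicles counted per approach (e.g. from camera feeds);
    every approach keeps at least `min_green` seconds of green.
    """
    n = len(densities)
    budget = cycle - n * min_green       # seconds left after the minimums
    total = sum(densities)
    if total == 0:                       # empty intersection: split evenly
        return [cycle // n] * n
    return [min_green + budget * d / total for d in densities]

# Four approaches; the busiest direction gets the longest green phase.
print(green_times([40, 5, 10, 5]))
```

The minimum-green floor matters in practice: without it, a nearly empty approach could be starved indefinitely during sustained heavy flow on the cross street.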
- North America > United States (0.19)
- Asia > India (0.19)
- Transportation > Infrastructure & Services (1.00)
- Transportation > Ground > Road (1.00)
High precision control and deep learning-based corn stand counting algorithms for agricultural robot
Zhang, Zhongzhong, Kayacan, Erkan, Thompson, Benjamin, Chowdhary, Girish
This paper presents high precision control and deep learning-based corn stand counting algorithms for a low-cost, ultra-compact 3D printed and autonomous field robot for agricultural operations. Currently, plant traits, such as emergence rate, biomass, vigor, and stand counting, are measured manually. This is highly labor-intensive and prone to errors. The robot, termed TerraSentia, is designed to automate the measurement of plant traits for efficient phenotyping as an alternative to manual measurements. In this paper, we formulate a Nonlinear Moving Horizon Estimator (NMHE) that identifies key terrain parameters using onboard robot sensors and a learning-based Nonlinear Model Predictive Control (NMPC) that ensures high precision path tracking in the presence of unknown wheel-terrain interaction. Moreover, we develop a machine vision algorithm designed to enable an ultra-compact ground robot to count corn stands by driving through the fields autonomously. The algorithm leverages a deep network to detect corn plants in images, and a visual tracking model to re-identify detected objects at different time steps. We collected data from 53 corn plots in various fields for corn plants around 14 days after emergence (stage V3 - V4). The robot predictions have agreed well with the ground truth with $C_{robot}=1.02 \times C_{human}-0.86$ and a correlation coefficient $R=0.96$. The mean relative error given by the algorithm is $-3.78\%$, and the standard deviation is $6.76\%$. These results indicate a first and significant step towards autonomous robot-based real-time phenotyping using low-cost, ultra-compact ground robots for corn and potentially other crops.
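The agreement metrics reported above (the linear fit between robot and human counts, the correlation coefficient, and the relative-error statistics) can be computed with a short sketch. The function name and the toy data are illustrative; only the metric definitions follow the abstract.

```python
import numpy as np

def counting_agreement(robot, human):
    """Compare robot stand counts against manual ground truth per plot.

    Returns the least-squares fit robot = a*human + b, the Pearson
    correlation R, and the mean/std of the per-plot relative error.
    """
    robot = np.asarray(robot, dtype=float)
    human = np.asarray(human, dtype=float)
    a, b = np.polyfit(human, robot, 1)        # slope and intercept
    r = np.corrcoef(human, robot)[0, 1]       # correlation coefficient
    rel_err = (robot - human) / human         # per-plot relative error
    return a, b, r, rel_err.mean(), rel_err.std()

# Toy data: a robot that slightly undercounts each plot.
human = np.array([100, 120, 90, 110, 95], dtype=float)
robot = human * 0.98 - 1.0
a, b, r, mu, sd = counting_agreement(robot, human)
print(f"fit: robot = {a:.2f}*human + {b:.2f}, R = {r:.2f}")
```

A slope near 1 and intercept near 0, as the paper reports, indicate the robot's counts track the manual counts without a systematic scale bias.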
- Oceania > Australia > Queensland (0.04)
- North America > United States > Pennsylvania > Allegheny County > Pittsburgh (0.04)
- North America > United States > Illinois > Champaign County > Urbana (0.04)
- (3 more...)
- Food & Agriculture > Agriculture (1.00)
- Energy (1.00)
- Government > Regional Government > North America Government > United States Government (0.68)
A Growing Presence on the Farm: Robots
In a research field off Highway 54 last autumn, corn stalks shimmered in rows 40-feet deep. Girish Chowdhary, an agricultural engineer at the University of Illinois at Urbana-Champaign, bent to place a small white robot at the edge of a row marked 103. The robot, named TerraSentia, resembled a souped up version of a lawn mower, with all-terrain wheels and a high-resolution camera on each side. In much the same way that self-driving cars "see" their surroundings, TerraSentia navigates a field by sending out thousands of laser pulses to scan its environment. A few clicks on a tablet were all that were needed to orient the robot at the start of the row before it took off, squeaking slightly as it drove over ruts in the field.