
An Adaptive Coverage Control Approach for Multiple Autonomous Off-road Vehicles in Dynamic Agricultural Fields

Ahmadi, Sajad, Davoodi, Mohammadreza, Velni, Javad Mohammadpour

arXiv.org Artificial Intelligence

This paper presents an adaptive coverage control method for a fleet of off-road Unmanned Ground Vehicles (UGVs) operating in dynamic (time-varying) agricultural environments. Traditional coverage control approaches often assume static conditions, making them unsuitable for real-world farming scenarios where obstacles, such as moving machinery and uneven terrain, create continuous challenges. To address this, we propose a real-time path planning framework that integrates Unmanned Aerial Vehicles (UAVs) for obstacle detection and terrain assessment, allowing UGVs to dynamically adjust their coverage paths. The environment is modeled as a weighted directed graph, whose edge weights are continuously updated based on the UAV observations to reflect obstacle motion and terrain variations. The proposed approach incorporates Voronoi-based partitioning, adaptive edge weight assignment, and cost-based path optimization to enhance navigation efficiency. Simulation results demonstrate the effectiveness of the proposed method in improving path planning, reducing traversal costs, and maintaining robust coverage in the presence of dynamic obstacles and muddy terrain.
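The replanning step described above can be illustrated with a minimal sketch: a weighted directed graph whose edge weights are inflated when a (hypothetical) UAV observation reports an obstacle, after which a shortest-path query returns an updated route. This is an illustrative least-cost-path example in Python, not the paper's full method; the Voronoi partitioning and the adaptive weight law are omitted, and the tiny graph is invented.

```python
import heapq

def dijkstra(adj, start, goal):
    """Least-cost path on a weighted directed graph given as
    {node: {neighbor: weight}}. Returns (cost, path) or (inf, [])."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:                      # reconstruct route backwards
            path = [u]
            while u in prev:
                u = prev[u]
                path.append(u)
            return d, path[::-1]
        if d > dist.get(u, float("inf")):  # stale queue entry
            continue
        for v, w in adj.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    return float("inf"), []

# Hypothetical four-node field graph; a UAV observation inflates one
# edge weight (e.g., mud detected), so the replanned route avoids it.
adj = {"A": {"B": 1.0, "C": 1.0}, "B": {"D": 1.0}, "C": {"D": 1.0}}
print(dijkstra(adj, "A", "D"))   # baseline route
adj["A"]["B"] = 10.0             # UAV reports an obstacle on A -> B
print(dijkstra(adj, "A", "D"))   # replanned route via C
```

Repeating the query after each weight update is the core of the adaptive behavior: the route shifts away from edges the UAV marks as costly.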


Evaluating Path Planning Strategies for Efficient Nitrate Sampling in Crop Rows

Liu, Ruiji, Breitfeld, Abigail, Vijayarangan, Srinivasan, Kantor, George, Yandun, Francisco

arXiv.org Artificial Intelligence

This paper presents a pipeline that combines high-resolution orthomosaic maps generated from UAS imagery with GPS-based global navigation to guide a skid-steered ground robot. We evaluated three path planning strategies: A* graph search, a Deep Q-learning (DQN) model, and heuristic search, benchmarking them on planning time and success rate in realistic simulation environments. Experimental results reveal that the heuristic search achieves the fastest planning times (0.28 ms) and a 100% success rate, while the A* approach delivers near-optimal performance, and the DQN model, despite its adaptability, incurs longer planning delays and occasional suboptimal routing. These results highlight the advantages of deterministic rule-based methods in geometrically constrained crop-row environments and lay the groundwork for future hybrid strategies in precision agriculture. Keywords: path planning, autonomous control, crop rows, autonomous nitrate sampling.
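As a rough illustration of the A* strategy benchmarked above, here is a minimal 4-connected grid version in Python with a Manhattan-distance heuristic. The grid, start, and goal are invented for the example and do not come from the paper's crop-row maps.

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid (0 = free, 1 = blocked),
    using Manhattan distance as the admissible heuristic."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    pq = [(h(start), 0, start, [start])]   # (f, g, cell, path-so-far)
    seen = set()
    while pq:
        f, g, pos, path = heapq.heappop(pq)
        if pos == goal:
            return path
        if pos in seen:
            continue
        seen.add(pos)
        r, c = pos
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                heapq.heappush(pq, (g + 1 + h((nr, nc)), g + 1,
                                    (nr, nc), path + [(nr, nc)]))
    return None  # goal unreachable

# Toy "crop row" map: a row of plants blocks lateral moves mid-field,
# so the planner must route around the row end.
grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))
print(len(path) - 1)  # number of moves in the detour
```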


Agricultural Field Boundary Detection through Integration of "Simple Non-Iterative Clustering (SNIC) Super Pixels" and "Canny Edge Detection Method"

Gayibov, Artughrul

arXiv.org Artificial Intelligence

Efficient use of cultivated areas is a necessary factor for sustainable development of agriculture and ensuring food security. Alongside the rapid development of satellite technologies in developed countries, new methods are being sought for accurate and timely identification of cultivated areas. In this context, identification of cropland boundaries based on spectral analysis of data obtained from satellite images is considered one of the most optimal and accurate methods in modern agriculture. This article proposes a new approach to determine the suitability and green index of cultivated areas using satellite data obtained through the Google Earth Engine (GEE) platform. The approach combines two powerful algorithms: SNIC (Simple Non-Iterative Clustering) superpixels and the Canny edge detection method. The SNIC algorithm groups pixels in a satellite image into larger regions (superpixels) with similar characteristics, thereby enabling better image analysis. The Canny edge detection method detects sharp changes (edges) in the image to determine the precise boundaries of agricultural fields. This study, carried out using high-resolution multispectral data from the Sentinel-2 satellite and the Google Earth Engine JavaScript API, has shown that the proposed method is effective in accurately and reliably classifying randomly selected agricultural fields. The combined use of these two tools allows for more accurate determination of the boundaries of agricultural fields by minimizing the effects of outliers in satellite images. As a result, more accurate and reliable maps can be created for agricultural monitoring and resource management over large areas, expanding the application capabilities of cloud-based platforms and artificial intelligence methods in agriculture.
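The edge-detection step can be illustrated with a deliberately stripped-down sketch: flag pixels whose local gradient magnitude exceeds a threshold on a toy vegetation-index grid. A full Canny implementation adds Gaussian smoothing, non-maximum suppression, and hysteresis, and the SNIC superpixel stage is omitted entirely here; the NDVI values below are synthetic.

```python
def edge_mask(img, thresh=0.2):
    """Toy stand-in for the Canny step: mark pixels whose central-
    difference gradient magnitude exceeds `thresh`. `img` is a 2D
    list of floats (e.g., an NDVI band); border neighbors are
    clamped to the image edge."""
    rows, cols = len(img), len(img[0])
    mask = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            gx = img[r][min(c + 1, cols - 1)] - img[r][max(c - 1, 0)]
            gy = img[min(r + 1, rows - 1)][c] - img[max(r - 1, 0)][c]
            if (gx * gx + gy * gy) ** 0.5 > thresh:
                mask[r][c] = 1
    return mask

# Synthetic field: vegetated plot (NDVI ~0.8) beside bare soil (~0.1);
# the sharp transition between them is the "field boundary".
ndvi = [[0.8, 0.8, 0.1, 0.1],
        [0.8, 0.8, 0.1, 0.1],
        [0.8, 0.8, 0.1, 0.1]]
mask = edge_mask(ndvi)
```

In the actual pipeline the thresholding runs on GEE's server side over Sentinel-2 bands; this sketch only shows why a sharp NDVI transition produces a boundary response.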


Radar Meets Vision: Robustifying Monocular Metric Depth Prediction for Mobile Robotics

Job, Marco, Stastny, Thomas, Kazik, Tim, Siegwart, Roland, Pantic, Michael

arXiv.org Artificial Intelligence

Mobile robots require accurate and robust depth measurements to understand and interact with the environment. While existing sensing modalities address this problem to some extent, recent research on monocular depth estimation has leveraged the information richness, yet low cost and simplicity of monocular cameras. These works have shown significant generalization capabilities, mainly in automotive and indoor settings. However, robots often operate in environments with limited scale cues, self-similar appearances, and low texture. In this work, we encode measurements from a low-cost mmWave radar into the input space of a state-of-the-art monocular depth estimation model. Despite the radar's extreme point cloud sparsity, our method demonstrates generalization and robustness across industrial and outdoor experiments. Our approach reduces the absolute relative error of depth predictions by 9-64% across a range of unseen, real-world validation datasets. Importantly, we maintain consistency of all performance metrics across all experiments and scene depths where current vision-only approaches fail. We further address the present deficit of training data in mobile robotics environments by introducing a novel methodology for synthesizing rendered, realistic learning datasets based on photogrammetric data that simulate the radar sensor observations for training. Our code, datasets, and pre-trained networks are made available at https://github.com/ethz-asl/radarmeetsvision.
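One plausible way to encode sparse radar returns into the input space of a monocular model, sketched under assumed pinhole intrinsics, is to splat each return into an extra image-aligned depth channel that is concatenated with the RGB input. The intrinsics and points below are invented, and the paper's actual encoding may differ.

```python
def radar_to_depth_channel(points, fx, fy, cx, cy, width, height):
    """Project sparse radar returns (x, y, z in the camera frame,
    z forward) into an image-sized depth channel. Pixels with no
    return stay 0. A simplified sketch: a real pipeline would also
    handle extrinsic calibration and multiple hits per pixel."""
    channel = [[0.0] * width for _ in range(height)]
    for x, y, z in points:
        if z <= 0:                  # behind the camera
            continue
        u = int(fx * x / z + cx)    # pinhole projection
        v = int(fy * y / z + cy)
        if 0 <= u < width and 0 <= v < height:
            channel[v][u] = z       # store metric depth at the pixel
    return channel

# Hypothetical intrinsics and two radar returns.
ch = radar_to_depth_channel([(0.0, 0.0, 5.0), (1.0, 0.5, 10.0)],
                            fx=100, fy=100, cx=32, cy=24,
                            width=64, height=48)
```

Even with only a handful of non-zero pixels, such a channel gives the network metric anchors that a monocular image alone cannot provide.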


LiDAR-Based Crop Row Detection Algorithm for Over-Canopy Autonomous Navigation in Agriculture Fields

Liu, Ruiji, Yandun, Francisco, Kantor, George

arXiv.org Artificial Intelligence

Autonomous navigation is crucial for various robotics applications in agriculture. However, many existing methods depend on RTK-GPS systems, which are expensive and susceptible to poor signal coverage. This paper introduces a state-of-the-art LiDAR-based navigation system that can achieve over-canopy autonomous navigation in row-crop fields, even when the canopy fully blocks the inter-row spacing. Our crop row detection algorithm can detect crop rows across diverse scenarios, encompassing various crop types, growth stages, weed presence, and discontinuities within the crop rows. Without utilizing the global localization of the robot, our navigation system can perform autonomous navigation in these challenging scenarios, detect the end of the crop rows, and navigate to the next crop row autonomously, providing a crop-agnostic approach to navigating the whole row-crop field. This navigation system has undergone tests in various simulated agricultural fields, achieving an average of 2.98 cm autonomous driving accuracy without human intervention on the custom Amiga robot. In addition, the qualitative results of our crop row detection algorithm from actual soybean fields validate our LiDAR-based crop row detection algorithm's potential for practical agricultural applications.
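A toy version of the row-detection idea: histogram the lateral coordinates of over-canopy LiDAR returns and keep bins with enough support as row center lines. This is only a sketch of the concept; the paper's algorithm additionally handles growth stages, weeds, and row discontinuities, and all points below are synthetic.

```python
from collections import Counter

def detect_rows(points, bin_width=0.25, min_hits=3):
    """Estimate crop-row center lines from (x, y) LiDAR returns by
    histogramming the lateral (y) coordinate and keeping bins with
    at least `min_hits` points. Returns sorted row offsets in
    meters. Sparse outliers (e.g., stray weeds) fall below the hit
    threshold and are discarded."""
    bins = Counter(round(y / bin_width) for _, y in points)
    return sorted(b * bin_width for b, n in bins.items() if n >= min_hits)

# Synthetic returns from two rows near y = 0.0 m and y = 0.76 m,
# plus one stray weed point at y = 0.4 m.
pts = [(x * 0.1, 0.02) for x in range(5)] + \
      [(x * 0.1, 0.76) for x in range(5)] + [(0.3, 0.4)]
print(detect_rows(pts))
```

Because only relative lateral offsets are used, nothing here requires the robot's global position, which mirrors the paper's localization-free premise.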


Overcome the Fear Of Missing Out: Active Sensing UAV Scanning for Precision Agriculture

Krestenitis, Marios, Raptis, Emmanuel K., Kapoutsis, Athanasios Ch., Ioannidis, Konstantinos, Kosmatopoulos, Elias B., Vrochidis, Stefanos

arXiv.org Artificial Intelligence

This paper deals with the problem of informative path planning for a UAV deployed for precision agriculture applications. First, we observe that the "fear of missing out" on data leads to uniform, conservative scanning policies over the whole agricultural field. Consequently, employing a non-uniform scanning approach can mitigate the expenditure of time in areas with minimal or negligible real value, while ensuring heightened precision in information-dense regions. Turning to the available informative path planning methodologies, we discern that certain methods entail intensive computational requirements, while others necessitate training on an ideal-world simulator. To address these issues, we propose an active sensing coverage path planning approach, named OverFOMO, that regulates the speed of the UAV in accordance with both the relative quantity of the identified classes, i.e., crops and weeds, and the confidence level of such detections. To identify these instances, a robust deep learning segmentation model is deployed. The computational needs of the proposed algorithm are independent of the size of the agricultural field, rendering its applicability on modern UAVs quite straightforward. The proposed algorithm was evaluated with a simu-realistic pipeline, combining data from real UAV missions and the high-fidelity dynamics of the AirSim simulator, showcasing its performance improvements over the established state of affairs for this type of mission. An open-source implementation of the algorithm and the evaluation pipeline is available at https://github.com/emmarapt/OverFOMO.
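The speed-regulation idea can be sketched as a simple rule that slows the UAV where detected classes are dense and detection confidence is low, and speeds it up elsewhere. The function and its coefficients below are illustrative assumptions, not OverFOMO's actual control law.

```python
def uav_speed(v_min, v_max, density, confidence):
    """Map detection statistics to a scan speed. `density` in [0, 1]
    is the fraction of the frame occupied by crops/weeds;
    `confidence` in [0, 1] is the mean detection confidence.
    Dense, uncertain regions -> fly slowly to gather more data;
    empty or confidently-classified regions -> fly fast."""
    info_need = density * (1.0 - confidence)
    v = v_max - (v_max - v_min) * info_need
    return max(v_min, min(v_max, v))  # clamp to the speed envelope

print(uav_speed(2.0, 8.0, density=0.9, confidence=0.3))  # dense, unsure -> slow
print(uav_speed(2.0, 8.0, density=0.0, confidence=0.9))  # empty field -> fast
```

The rule's cost is constant per frame, which matches the abstract's point that the computation is independent of field size.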


Mobile robots sampling algorithms for monitoring of insects populations in agricultural fields

Yehoshua, Adi, Edan, Yael

arXiv.org Artificial Intelligence

Plant diseases are major causes of production losses and may have a significant impact on the agricultural sector. Detecting pests as early as possible can help increase crop yields and production efficiency. Several robotic monitoring systems have been developed, allowing data to be collected and providing a greater understanding of environmental processes. An agricultural robot can enable accurate and timely detection of pests by traversing the field autonomously and monitoring the entire cropped area within a field. However, in many cases it is impossible to sample all plants due to resource limitations. In this thesis, the development and evaluation of several sampling algorithms are presented to address the challenge of an agriculture-monitoring ground robot designed to locate insects in an agricultural field, where complete sampling of all the plants is infeasible. Two situations were investigated in simulation models that were specially developed as part of this thesis: one where no a-priori information on the insects is available, and one where prior information on the insect distributions within the field is known. For the first situation, seven algorithms were tested, each utilizing an approach to sample the field without prior knowledge of it. For the second situation, we present the development and evaluation of a dynamic sampling algorithm which utilizes real-time information to prioritize sampling at suspected points, locate hot spots, and adapt sampling plans accordingly. The algorithm's performance was compared to that of two existing algorithms using Tetranychidae insect data from previous research. Analyses revealed that the dynamic algorithm outperformed the others.
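The dynamic-sampling idea can be sketched with a small priority queue: whenever a sampled cell's insect count crosses a "hot spot" threshold, its unsampled neighbors are promoted to the front of the queue, so the budget concentrates on suspected clusters. Everything here (the grid model, threshold, and priorities) is a hypothetical simplification of the thesis algorithm.

```python
import heapq

def adaptive_sample(field, start, budget, hot=2):
    """Sample a grid field under a limited sampling budget.
    `field` maps (row, col) -> insect count. When a sampled cell's
    count reaches `hot`, its neighbors get a priority boost so they
    are sampled before unremarkable cells. Returns the set of
    sampled cells and the total insects found."""
    pq = [(0, start)]              # (negative-is-urgent priority, cell)
    visited, counts = set(), []
    while pq and len(visited) < budget:
        _, cell = heapq.heappop(pq)
        if cell in visited:
            continue
        visited.add(cell)
        n = field.get(cell, 0)
        counts.append(n)
        boost = -1 if n >= hot else 0   # hot spot -> neighbors first
        r, c = cell
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if nb in field and nb not in visited:
                heapq.heappush(pq, (boost, nb))
    return visited, sum(counts)

# Hypothetical 2x3 field with an infestation cluster in the top row.
field = {(0, 0): 0, (0, 1): 3, (0, 2): 3,
         (1, 0): 0, (1, 1): 3, (1, 2): 0}
cells, total = adaptive_sample(field, (0, 0), budget=4)
print(sorted(cells), total)  # the budget concentrates on the hot cluster
```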


The 10 most innovative robotics companies in 2022

#artificialintelligence

Perhaps 2021 will be seen as a tipping point, the year we suddenly noticed that the robots were everywhere. In the factory, of course, the use of industrial robots around the world is rapidly accelerating, with average global robot density in manufacturing hitting 126 robots per 10,000 employees, nearly double the number from just five years ago, according to the 2021 World Robot Report. The auto industry is the biggest employer of robots by far, accounting for 42% of all installed units in 2021; there were nearly 1,300 robots for every 10,000 human employees in the car sector. Industrial robots are becoming more versatile, too: this year's most innovative robotics companies include makers of autonomous mobile robots (AMRs), such as Denmark's Mobile Industrial Robots and Pittsburgh-based Seegrid, which use sensors and software to navigate safely through dynamic work environments, as well as collaborative "cobots"--the fastest-growing market segment--designed to work alongside humans and learn new tasks quickly. San Francisco-based Nimble uses AI and "imitation learning" to teach warehouse robots how to pick and pack products like cosmetics, apparel, and consumer electronics (once considered too delicate for robot handling) for U.S. customers including Best Buy, Victoria's Secret, and Puma.


Artificial Intelligence: Transforming Lives of People in Indian Small Towns

#artificialintelligence

Artificial intelligence has entered the domestic market of India with its smart functionalities for smart cities, industries, smart homes, consumers, and more. Consumers have started preferring artificial intelligence over traditional workloads or systems for its time-efficient and cost-efficient features. This is true of India's urban cities, where there is no digital divide and network connections are reliable. But artificial intelligence is also thriving in Indian small towns, where in recent years AI models have been transforming the lives of residents. Multiple AI-based start-ups are focused on developing AI models for Indian small towns to enhance the standard of living. Let's dive deep into how artificial intelligence is transforming Indian small towns efficiently and effectively.


Researchers Created Digital Log Tool For Drone Users In Agricultural Field

#artificialintelligence

A researcher from Purdue University developed a Web-based application that allows unmanned aircraft system (UAS) operators in agriculture to easily log their flight-related data. The globally available logbook helps record the date, time, and location of a flight; the model and registration information of the device; the type of sensors used; safety precautions taken; and other related information. "We've lacked a system to provide UAS users in agriculture with a way to record information about their flights, sensors and maintenance issues, and creation of a common protocol for UAS operations for various research and production related applications is an effort to plug that gap and bring standardization to flight data collection," says Dharmendra Saraswat, an associate professor at Purdue University.