College Station


Robotic Multimodal Data Acquisition for In-Field Deep Learning Estimation of Cover Crop Biomass

Johnson, Joe, Chalasani, Phanender, Shah, Arnav, Ray, Ram L., Bagavathiannan, Muthukumar

arXiv.org Artificial Intelligence

Effective weed management is essential for mitigating significant crop yield losses in agricultural systems. Integrating cover crops (CC) offers multiple benefits, including soil erosion reduction, weed suppression, decreased nitrogen requirements, and enhanced carbon sequestration, all of which are closely tied to the aboveground biomass (AGB) they produce. However, biomass production varies significantly due to microsite variability, making accurate estimation and mapping essential for identifying zones of poor weed suppression and optimizing targeted management strategies. To address this challenge, developing a comprehensive CC map, including its AGB distribution, will enable informed decision-making regarding weed control methods and optimal application rates. Manual visual inspection is impractical and labor-intensive, especially given the extensive field sizes and the wide diversity and variation of weed species and sizes. In this context, optical imagery and Light Detection and Ranging (LiDAR) data are two prominent sources with unique characteristics that enhance AGB estimation. This study introduces a ground robot-mounted multimodal sensor system designed for agricultural field mapping. The system integrates optical and LiDAR data, leveraging machine learning (ML) methods for data fusion to improve biomass predictions. The best ML-based model for dry AGB estimation achieved a coefficient of determination (R²) of 0.88, demonstrating robust performance in diverse field conditions. This approach offers valuable insights for site-specific management, enabling precise weed suppression strategies and promoting sustainable farming practices.
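As a rough illustration of the kind of multimodal fusion and evaluation the abstract describes, the sketch below stacks a hypothetical optical feature and a hypothetical LiDAR feature into one design matrix, fits a least-squares linear model (a stand-in for the study's ML methods, whose details are not given here), and scores it with the coefficient of determination the study reports. All feature names and data are synthetic, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-plot features: one optical (an NDVI-like vegetation
# index) and one LiDAR-derived (mean canopy height). Illustrative only.
n = 200
optical = rng.uniform(0.2, 0.9, n)        # vegetation index
lidar_height = rng.uniform(0.05, 0.6, n)  # canopy height, metres
noise = rng.normal(0.0, 0.1, n)
dry_agb = 1.5 * optical + 3.0 * lidar_height + noise  # synthetic target

# Early fusion: stack both modalities into one design matrix and fit a
# least-squares linear model (stand-in for the paper's fusion models).
X = np.column_stack([np.ones(n), optical, lidar_height])
coef, *_ = np.linalg.lstsq(X, dry_agb, rcond=None)
pred = X @ coef

# Coefficient of determination (R^2), the metric reported in the study.
ss_res = np.sum((dry_agb - pred) ** 2)
ss_tot = np.sum((dry_agb - dry_agb.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(round(r2, 3))
```

Any regressor could replace the linear fit; the fusion step is simply that features from both sensors enter the same model.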


Do Androids Dream of Anything at All?

The New Yorker

Although the literature of automatism has existed in one form or another since the late Middle Ages--with sixteenth-century folktales about a golem made of clay and summoned to life, through ritual incantation, to defend Prague's Jewish community--its modern form was set in motion by a play called "R.U.R.," by the Czech writer Karel Čapek. Its 1921 première, also in Prague, set the agenda for the next century, and it has remained an apparently ironclad convention that all critical writing about the genre begin there. The drama gave us the word "robot," a derivative of an Old Slavic root related to "serfdom," and its narrative, of a rebellion among artificial workers, provided a metaphorical template--stories about robots are stories about labor and freedom. The word "robot" is still with us, and the underlying metaphor has a generous flexibility, encompassing two related but distinct ideas. One is that the first thing we would obviously do with artificial people is enslave them--as in, say, "Westworld."


Amazon's Delivery Drones Are Grounded. The Birds and Dogs of This Texas Town Are Grateful

WIRED

As the spring planting season arrives in College Station, Texas, certified master gardener Mark Smith is thrilled that peace is in the air. This time last year, a loud buzzing noise began disrupting Smith's morning routine of checking on the peppers, tomatoes, herbs, and shrubs growing in his backyard. Several times an hour, an Amazon Prime Air delivery drone would noisily emerge about 800 feet away, just past a line of trees behind Smith's home. His neighbors began calling the fleet flying chainsaws. Smith, a retired civil engineer, preferred a different comparison: "It was like your neighbor runs their leaf blower all day long," he says.


Amazon now offers drone deliveries for prescription medications in Texas

Engadget

Amazon is now offering drone prescription deliveries in College Station, TX. Customers will be eligible for aerial deliveries of "more than 500 medications" for common conditions like the flu, asthma and pneumonia. The home of Texas A&M has enjoyed Prime Air drone deliveries of (non-medical) Amazon shipments since 2022. The company says medications will arrive within an hour of the customer placing an order, and there won't be an additional fee to use the service. The drones fly at 40 to 120 meters, an altitude Amazon says presents minimal obstacles. After arriving at the customer's home, the drone "slowly and safely" lowers itself to a delivery marker.


DeepSI: Interactive Deep Learning for Semantic Interaction

Bian, Yali, North, Chris

arXiv.org Artificial Intelligence

In this paper, we design novel interactive deep learning methods to improve semantic interactions in visual analytics applications. The ability of semantic interaction to infer analysts' precise intents during sensemaking is dependent on the quality of the underlying data representation. We propose the $\text{DeepSI}_{\text{finetune}}$ framework that integrates deep learning into the human-in-the-loop interactive sensemaking pipeline, with two important properties. First, deep learning extracts meaningful representations from raw data, which improves semantic interaction inference. Second, semantic interactions are exploited to fine-tune the deep learning representations, which then further improves semantic interaction inference. This feedback loop between human interaction and deep learning enables efficient learning of user- and task-specific representations. To evaluate the advantage of embedding the deep learning within the semantic interaction loop, we compare $\text{DeepSI}_{\text{finetune}}$ against a state-of-the-art but more basic use of deep learning as only a feature extractor pre-processed outside of the interactive loop. Results of two complementary studies, a human-centered qualitative case study and an algorithm-centered simulation-based quantitative experiment, show that $\text{DeepSI}_{\text{finetune}}$ more accurately captures users' complex mental models with fewer interactions.
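The feedback loop the abstract describes, in which user interactions fine-tune the underlying representation, can be illustrated with a toy sketch. Everything below is hypothetical: the real DeepSI framework fine-tunes a deep model, whereas this sketch uses fixed features and a learnable linear projection purely to show the loop's shape.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in: raw documents get fixed "pretrained" features, and a
# learnable linear projection into the interactive layout space is
# fine-tuned from user interactions. All names are illustrative.
features = rng.normal(size=(6, 8))  # 6 documents, 8-d pretrained features
W = np.eye(8)                        # projection into the layout space

def distance(i, j):
    z = features @ W
    return float(np.linalg.norm(z[i] - z[j]))

# User interaction: "documents 0 and 1 are similar". Take gradient steps
# on W to shrink their projected distance (the feedback half of the loop).
before = distance(0, 1)
for _ in range(50):
    z = features @ W
    diff = z[0] - z[1]                                 # error in projected space
    grad = np.outer(features[0] - features[1], diff)   # grad of ||diff||^2 w.r.t. W (up to 2x)
    W -= 0.01 * grad
after = distance(0, 1)
print(after < before)
```

After the update, re-projecting all documents through the fine-tuned `W` moves the whole layout toward the user's intent, which is the property the paper exploits for subsequent inference.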


Amazon unveils new Prime Air delivery drone that will drop packages from TWELVE feet in the air

Daily Mail - Science & tech

Amazon has unveiled its newest delivery drone, which will soon be dropping packages from 12 feet in the air in two U.S. cities. The retail giant has long wanted to solve the last leg of package delivery, especially since it launched Amazon Prime's two-day delivery offering in 2005. Jeff Bezos first announced drone delivery in 2013, but the service made only a single delivery three years after that. The drone, dubbed the MK27-2, will start making deliveries in Lockeford, California, and College Station, Texas, by the end of 2022. The autonomous craft is about five and a half feet in diameter, weighs 80 pounds and can only carry packages that weigh less than five pounds.


Amazon's redesigned Prime Air delivery drone can fly farther than its predecessor

Engadget

Amazon recently stopped testing its Scout sidewalk delivery robot and made other decisions indicating that it's scaling back its experimental projects. Its delivery drone development for Prime Air is still going strong, though: the e-commerce giant has just released a sneak peek of its next-gen machine. The MK30 was designed to be lighter than the current model, dubbed the MK27-2. Based on the images Amazon has shared, it will still have six rotors like its predecessor, but it no longer has a full hexagonal frame. The company is slated to start drone deliveries in College Station, Texas, and Lockeford, California, later this year to help it gauge people's interest in having their orders flown over and dropped into their yards.


Amazon's Prime Air drones will soon make deliveries in Texas

Engadget

Amazon has revealed the second city where it plans to start making drone deliveries later this year. The company says it will start contacting customers in College Station, Texas, to gauge their interest in receiving orders via Prime Air. Amazon says it was impressed by many elements of the city, including the research being conducted at Texas A&M University, such as work on drone technology. The US Census Bureau estimates the population of College Station was 120,000 as of last July, so while it isn't the biggest city around, it seems like a decent size for the initial rollout of Prime Air. "Amazon's new facility presents a tremendous opportunity for College Station to be at the forefront of the development of drone delivery technology," Karl Mooney, the mayor of College Station, said.


From Philosophy to Interfaces: an Explanatory Method and a Tool Inspired by Achinstein's Theory of Explanation

Sovrano, Francesco, Vitali, Fabio

arXiv.org Artificial Intelligence

We propose a new method for explanations in Artificial Intelligence (AI) and a tool to test its expressive power within a user interface. To bridge the gap between philosophy and human-computer interfaces, we present a new approach for generating interactive explanations, based on a pipeline of AI algorithms that structures natural language documents into knowledge graphs and answers questions effectively and satisfactorily. Among the mainstream philosophical theories of explanation, we identified one that in our view is most readily applicable as a practical model for user-centric tools: Achinstein's Theory of Explanation. With this work we aim to show that Achinstein's theory can be adapted and implemented in a concrete software application, as an interactive question-answering process. To this end, we found a way to handle the generic (archetypal) questions that implicitly characterise an explanatory process as preliminary overviews rather than as answers to explicit questions, as commonly understood. To demonstrate the expressive power of this approach, we designed and implemented a pipeline of AI algorithms for generating interactive explanations in the form of overviews, focusing on this aspect of explanations rather than on existing interfaces and presentation-logic layers for question answering. We tested our hypothesis on a well-known XAI-powered credit approval system by IBM, comparing CEM, a static tool for post-hoc explanations, with an extension we developed that adds interactive explanations based on our model. The results of the user study, involving more than 100 participants, showed that our solution produced a statistically significant improvement in effectiveness (Mann-Whitney U=931.0, p=0.036) over the baseline, giving evidence in favour of our theory.
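The reported result (U=931.0, p=0.036) is a Mann-Whitney U test on the two groups' effectiveness scores. As a minimal sketch of what that statistic measures, the code below computes U by pairwise comparison on made-up rating data; it is not the study's data or analysis code, and group sizes are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Made-up effectiveness ratings for two conditions (e.g. Likert scores).
baseline = rng.integers(1, 6, size=12)
interactive = rng.integers(2, 7, size=12)

def mann_whitney_u(x, y):
    # U counts, over all pairs, how often a value from x beats a value
    # from y, with ties counting as half. This is equivalent to the
    # rank-sum formulation of the statistic.
    u = 0.0
    for a in x:
        for b in y:
            if a > b:
                u += 1.0
            elif a == b:
                u += 0.5
    return u

u_stat = mann_whitney_u(interactive, baseline)
print(u_stat)  # lies between 0 and len(x) * len(y)
```

In practice one would call a library routine (e.g. `scipy.stats.mannwhitneyu`) to also obtain the p-value; the loop above only exposes what U itself counts.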


Model LineUpper: Supporting Interactive Model Comparison at Multiple Levels for AutoML

Narkar, Shweta, Zhang, Yunfeng, Liao, Q. Vera, Wang, Dakuo, Weisz, Justin D

arXiv.org Artificial Intelligence

Automated Machine Learning (AutoML) is a rapidly growing set of technologies that automate the model development pipeline by searching model space and generating candidate models. A critical final step of AutoML is the human selection of a final model from dozens of candidates. In current AutoML systems, selection is supported only by performance metrics. Prior work has shown that in practice, people evaluate ML models based on additional criteria, such as the way a model makes predictions. Comparison may happen at multiple levels, from types of errors, to feature importance, to how the model makes predictions on specific instances. We developed Model LineUpper to support interactive model comparison for AutoML by integrating multiple Explainable AI (XAI) and visualization techniques. We conducted a user study in which we both evaluated the system and used it as a technology probe to understand how users perform model comparison in an AutoML system. We discuss design implications for utilizing XAI techniques for model comparison and supporting the unique needs of data scientists in comparing AutoML models.
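The multi-level comparison argued for above can be sketched in miniature: a metric-only ranking (the status quo the abstract critiques) can disagree with a ranking that also considers error types. Model names and all numbers below are invented.

```python
# Candidate models from a hypothetical AutoML run, each with an overall
# metric and a second comparison level: a per-error-type profile.
candidates = {
    "xgboost_1": {"accuracy": 0.91, "errors": {"false_pos": 12, "false_neg": 30}},
    "logreg_3":  {"accuracy": 0.90, "errors": {"false_pos": 30, "false_neg": 35}},
    "mlp_2":     {"accuracy": 0.89, "errors": {"false_pos": 18, "false_neg": 8}},
}

# Level 1: metric-only ranking, as in current AutoML systems.
by_metric = max(candidates, key=lambda m: candidates[m]["accuracy"])

# Level 2: bring in error types. A screening task, say, may prefer the
# model with the fewest false negatives even at slightly lower accuracy.
by_false_neg = min(candidates, key=lambda m: candidates[m]["errors"]["false_neg"])

print(by_metric, by_false_neg)  # the two levels can disagree
```

A tool like the one described would surface both views (plus feature importance and instance-level predictions) side by side rather than forcing a single ranking.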