A multi-armed robot for assisting with agricultural tasks

Robohub

In their paper Force Aware Branch Manipulation To Assist Agricultural Tasks, presented at IROS 2025, the authors proposed a methodology for safely manipulating branches to aid various agricultural tasks. We interviewed Madhav to find out more. Could you give us an overview of the problem you were addressing in the paper? Our work is motivated by StickBug [1], a multi-armed robotic system for precision pollination in greenhouse environments. One of the main challenges StickBug faces is that many flowers are partially or fully hidden within the plant canopy, making them difficult to detect and reach directly for pollination.


Learning with Feature Evolvable Streams

Neural Information Processing Systems

Learning with streaming data has attracted much attention during the past few years. Though most studies consider data streams with fixed features, in practice the features may be evolvable. For example, features of data gathered by limited-lifespan sensors will change when these sensors are substituted by new ones. In this paper, we propose a novel learning paradigm: Feature Evolvable Streaming Learning, where old features vanish and new features occur. Rather than relying on only the current features, we attempt to recover the vanished features and exploit them to improve performance. Specifically, we learn two models from the recovered features and the current features, respectively. To benefit from the recovered features, we develop two ensemble methods. In the first method, we combine the predictions from the two models and theoretically show that with the assistance of old features, the performance on new features can be improved. In the second approach, we dynamically select the best single prediction and establish a better performance guarantee for when the best model switches.
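The first ensemble idea in the abstract can be sketched as an exponentially weighted combination of the two models' predictions. This is an illustrative sketch only, not the paper's exact algorithm: the learning rate `eta`, the squared loss, and the function names are assumptions for illustration.

```python
import math

def weighted_ensemble(preds_old, preds_new, labels, eta=0.5):
    """Combine two prediction streams (model on recovered old features,
    model on current new features) with exponentially weighted averaging.
    eta and the squared loss are illustrative assumptions."""
    w_old, w_new = 1.0, 1.0
    combined = []
    for p_old, p_new, y in zip(preds_old, preds_new, labels):
        # weighted prediction from both models
        p = (w_old * p_old + w_new * p_new) / (w_old + w_new)
        combined.append(p)
        # shrink the weight of whichever model incurred the larger loss
        w_old *= math.exp(-eta * (p_old - y) ** 2)
        w_new *= math.exp(-eta * (p_new - y) ** 2)
    return combined

# If the new-feature model is consistently right and the old-feature model
# consistently wrong, the combined prediction drifts toward the new model.
combined = weighted_ensemble([0.0, 0.0, 0.0], [1.0, 1.0, 1.0], [1.0, 1.0, 1.0])
```

The weight update is the classic exponential-weights rule, which is what gives regret guarantees of the kind the abstract refers to when comparing against the best single model.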


Phased LSTM: Accelerating Recurrent Network Training for Long or Event-based Sequences

Neural Information Processing Systems

Recurrent Neural Networks (RNNs) have become the state-of-the-art choice for extracting patterns from temporal sequences. Current RNN models are ill-suited to processing irregularly sampled data triggered by events generated in continuous time by sensors or other neurons. Such data can occur, for example, when the input comes from novel event-driven artificial sensors that generate sparse, asynchronous streams of events, or from multiple conventional sensors with different update intervals. In this work, we introduce the Phased LSTM model, which extends the LSTM unit by adding a new time gate. This gate is controlled by a parametrized oscillation with a frequency range that requires updates of the memory cell only during a small percentage of the cycle. Even with the sparse updates imposed by the oscillation, the Phased LSTM network achieves faster convergence than regular LSTMs on tasks that require learning of long sequences. The model naturally integrates inputs from sensors of arbitrary sampling rates, thereby opening new areas of investigation for processing asynchronous sensory events that carry timing information. It also greatly improves the performance of LSTMs in standard RNN applications, and does so with an order-of-magnitude fewer computes.
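The time gate described in the abstract can be sketched as a piecewise-linear openness function of a rhythmic phase. This is a minimal sketch following the gate's standard formulation: a period `tau`, a phase shift `s`, an open ratio `r_on`, and a small leak rate `alpha` (the default values here are illustrative assumptions, not the paper's settings).

```python
def time_gate(t, tau=100.0, s=0.0, r_on=0.05, alpha=0.001):
    """Return the gate openness k_t in [0, 1] for timestamp t.
    tau: oscillation period; s: phase shift; r_on: fraction of the
    cycle the gate is open; alpha: leak rate while closed."""
    phi = ((t - s) % tau) / tau  # phase position within the cycle
    if phi < r_on / 2:
        return 2.0 * phi / r_on          # first half of open phase: rising
    elif phi < r_on:
        return 2.0 - 2.0 * phi / r_on    # second half of open phase: falling
    else:
        return alpha * phi               # closed phase: small leak

# The memory cell only updates meaningfully while the gate is open,
# i.e. during roughly r_on of each cycle.
openness = [time_gate(t) for t in range(200)]
open_fraction = sum(k > 0.01 for k in openness) / len(openness)
```

Because the cell is effectively frozen outside the open phase, updates happen during only a small fraction of timestamps, which is the source of the sparse-update and reduced-compute behavior the abstract describes.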


Graphene-based sensor to improve robot touch

Robohub

Multiscale-structured miniaturized 3D force sensors. CC BY 4.0

Robots are becoming increasingly capable in vision and movement, yet touch remains one of their major weaknesses. Now, researchers have developed a miniature tactile sensor that could give robots something much closer to a human sense of touch. The technology, developed by researchers at the University of Cambridge, is based on liquid metal composites and graphene - a two-dimensional form of carbon. The 'skin' allows robots to detect not just how hard they are pressing on an object, but also the direction of applied forces, whether an object is slipping, and even how rough a surface is, at a scale small enough to rival the spatial resolution of human fingertips. Their results are reported in the journal .


Humanoid home robots are on the market – but do we really want them?

Robohub

Last year, Norwegian-US tech company 1X announced a strange new product: "the world's first consumer-ready humanoid robot designed to transform life at home". Standing 168 centimetres tall and weighing in at 30 kilograms, the US$20,000 Neo bot promises to automate common household chores such as folding laundry and loading the dishwasher. Neo has a built-in artificial intelligence (AI) system, but for tricky tasks it requires a 1X employee wearing a virtual reality helmet to remotely take over the robot. The operator can see whatever the bot does inside your house, and the process is recorded for future learning.


Developing an optical tactile sensor for tracking head motion during radiotherapy: an interview with Bhoomika Gandhi

Robohub

What was the topic of your PhD research and why was it an interesting area? My topic of research was developing an optical tactile sensor to track head motion during radiotherapy. I worked on both the hardware and software development of this sensor, though my focus was mostly on the software side. Its importance comes from the fact that during radiotherapy, patients undergoing head and neck cancer treatment are typically immobilised. This is usually done using a thermoplastic mask, which can feel very claustrophobic, or a stereotactic frame.


Robot Talk Episode 147 – Miniature living robots, with Maria Guix

Robohub

Claire chatted to Maria Guix from the University of Barcelona about combining electronics and biology to create biohybrid robots with emergent properties. Maria Guix is a chemist and nanotechnology researcher in the University of Barcelona's ChemInFlow lab, developing miniaturised living robots and integrating flexible sensors into microfluidic platforms to better understand biohybrid robotic platforms. She has held postdoctoral positions at IFW Dresden, Purdue University, and the Institute for Bioengineering of Catalonia, advancing biocompatible micromotors, magnetic microrobot automation, and functional living robots. Robot Talk is a weekly podcast that explores the exciting world of robotics, artificial intelligence and autonomous machines.


The Human Flatus Atlas plans to measure the explosivity of farts

New Scientist

Feedback is feeling bold, so here is a prediction: the research we are about to describe is going to win an Ig Nobel award within the next decade. The entire project feels tailor-made for the Igs. It is an effort to objectively measure human flatulence using biosensors, or "Smart Underwear". We learned of this from a press release from the University of Maryland, flagged to us by physics reporter Karmela Padavic-Callaghan with the phrase: "Surely, Feedback can do something with this." The essential problem is that we do not know the normal range for flatulence, unlike other key biomarkers like blood glucose.



An Inside Look at Lego's New Tech-Packed Smart Brick

WIRED

Lego's next release is a digital brick loaded with sensors that add new layers of interactivity to its play sets. WIRED got exclusive access to the Lego labs where the Smart Brick was born. The secretive division of 237 staff based here and in London, Boston, and Singapore is dedicated to thinking up what comes next for the world's largest toy brand. In front of me, on a plain white table, is a batch of prototypes of Lego's new Smart Brick, the final version of which is a small, sensor-laden 2-by-4 black brick with a big brain. No outsider has seen these prototypes, all of which represent stages of a journey Lego has been charting over the past eight years. Lego hopes this innovation, which lands in stores March 1, will safeguard the future of its plastic empire. The diminutive proportions of the finished Smart Brick belie the fact that the thing is exceedingly clever. Inside is a tiny custom chip running bespoke software that can communicate with onboard sensors to monitor and react to motion, orientation, and magnetic fields. It's also likely no exaggeration that the Smart Brick could represent the most radical product Lego has produced since Jens Nygaard Knudsen, the company's former longtime chief designer, created the minifigure nearly 50 years ago.