Adaptive Dual-Headway Unicycle Pose Control and Motion Prediction for Optimal Sampling-Based Feedback Motion Planning

İşleyen, Aykut, Kadu, Abhidnya, van de Molengraft, René, Arslan, Ömür

arXiv.org Artificial Intelligence

Safe, smooth, and optimal motion planning for nonholonomically constrained mobile robots and autonomous vehicles is essential for achieving reliable, seamless, and efficient autonomy in logistics, mobility, and service industries. In many such application settings, nonholonomic robots, like unicycles with restricted motion, require precise planning and control of both translational and orientational motion to approach specific locations in a designated orientation, such as when approaching charging, parking, and loading areas. In this paper, we introduce a new dual-headway unicycle pose control method by leveraging an adaptively placed headway point in front of the unicycle pose and a tailway point behind the goal pose. In summary, the unicycle robot continuously follows its headway point, which chases the tailway point of the goal pose, and the asymptotic motion of the tailway point towards the goal position guides the unicycle robot to approach the goal location with the correct orientation. The simple and intuitive geometric construction of dual-headway unicycle pose control enables an explicit convex feedback motion prediction bound on the closed-loop unicycle motion trajectory for fast and accurate safety verification. We present an application of dual-headway unicycle control for optimal sampling-based motion planning around obstacles. In numerical simulations, we show that optimal unicycle motion planning using dual-headway translation and orientation distances significantly outperforms Euclidean translation and cosine orientation distances in generating smooth motion with minimal travel and turning effort.
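
The headway/tailway construction can be illustrated with a minimal sketch. Here a fixed headway distance and a distance-proportional tailway offset stand in for the paper's adaptive placement, and `eps`, `beta`, and `k` are illustrative constants, not the authors' gains; the key idea shown is that the headway point's velocity is linear in the unicycle inputs (v, w), so driving it toward the tailway point reduces to solving a 2x2 linear system:

```python
import numpy as np

def headway_control(pose, goal, eps=0.3, beta=0.7, k=1.0):
    """One control step: steer the headway point toward the goal's tailway
    point.  pose = (x, y, theta), goal = (gx, gy, gtheta).  Constants are
    illustrative, not the paper's adaptive parameters."""
    x, y, th = pose
    gx, gy, gth = goal
    g = np.array([gx, gy])
    u_g = np.array([np.cos(gth), np.sin(gth)])        # goal heading direction
    # Headway point a distance eps ahead of the unicycle.
    ph = np.array([x, y]) + eps * np.array([np.cos(th), np.sin(th)])
    # Tailway point behind the goal; it collapses onto the goal as the
    # headway point gets close, so the final approach is along gth.
    pt = g - beta * np.linalg.norm(ph - g) * u_g
    # Headway-point velocity is linear in the inputs: d(ph)/dt = M @ [v, w].
    M = np.array([[np.cos(th), -eps * np.sin(th)],
                  [np.sin(th),  eps * np.cos(th)]])
    v, w = np.linalg.solve(M, k * (pt - ph))
    return v, w

def simulate(pose, goal, dt=0.02, steps=2500):
    """Euler-integrate the closed-loop unicycle kinematics."""
    pose = np.array(pose, float)
    for _ in range(steps):
        v, w = headway_control(pose, goal)
        pose += dt * np.array([v * np.cos(pose[2]), v * np.sin(pose[2]), w])
    return pose
```

Running `simulate((0, 0, 0), (2, 1, 0))` drives the headway point to the goal position while the heading settles near the goal orientation; exact pose convergence and the convex motion-prediction bound rely on the adaptive placement developed in the paper.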


Wirelessly-Controlled Untethered Piezoelectric Planar Soft Robot Capable of Bidirectional Crawling and Rotation

Zheng, Zhiwu, Cheng, Hsin, Kumar, Prakhar, Wagner, Sigurd, Chen, Minjie, Verma, Naveen, Sturm, James C.

arXiv.org Artificial Intelligence

Electrostatic actuators provide a promising approach to creating soft robotic sheets, due to their flexible form factor, modular integration, and fast response speed. However, their control requires kilovolt signals and an understanding of the complex dynamics resulting from on-board and environmental force interactions. In this work, we demonstrate an untethered planar five-actuator piezoelectric robot powered by batteries and on-board high-voltage circuitry, and controlled through a wireless link. The scalable fabrication approach is based on bonding different functional layers on top of each other (steel foil substrate, actuators, flexible electronics). The robot exhibits a range of controllable motions, including bidirectional crawling (up to ~0.6 cm/s), turning, and in-place rotation (at ~1 degree/s). High-speed videos and control experiments show that the richness of the motion results from the interaction of an asymmetric mass distribution in the robot and the associated dependence of the dynamics on the driving frequency of the piezoelectrics. The robot's speed can reach 6 cm/s with a specific payload distribution.


Forward Motion: The Fourth Industrial Revolution Is Happening Now

#artificialintelligence

Current smart technology has ushered in the Fourth Industrial Revolution, a new era integrating communications with the automation of industrial practices and traditional manufacturing. Through this improved communication, smart devices make human intervention unnecessary as machines communicate, self-diagnose, and solve problems. While these new products and services may increase efficiency, analysts say they should be as ethical as possible, given their impact on our lives. Advances in AI, the internet of things (IoT), 3-D printing, robotics, genetic engineering, and quantum computing will blur the boundaries between the digital, physical, and biological worlds, and with them usher in a whole new set of complex challenges for business leaders to negotiate.


Stumble-proof robot adapts to challenging terrain in real time – TechCrunch

#artificialintelligence

Robots have a hard time improvising, and encountering an unusual surface or obstacle usually means an abrupt stop or hard fall. But researchers have created a new model for robotic locomotion that adapts in real time to any terrain it encounters, changing its gait on the fly to keep trucking when it hits sand, rocks, stairs and other sudden changes. Although robotic movement can be versatile and exact, and robots can "learn" to climb steps, cross broken terrain and so on, these behaviors are more like individual trained skills that the robot switches between. Although robots like Spot famously can spring back from being pushed or kicked, the system is really just working to correct a physical anomaly while pursuing an unchanged policy of walking. There are some adaptive movement models, but some are very specific (for instance this one based on real insect movements) and others take long enough to work that the robot will certainly have fallen by the time they take effect.


Automated, predictive, and interpretable inference of C. elegans escape dynamics

Daniels, Bryan C., Ryu, William S., Nemenman, Ilya

arXiv.org Machine Learning

The roundworm C. elegans exhibits robust escape behavior in response to rapidly rising temperature. The behavior lasts for a few seconds, shows history dependence, involves both sensory and motor systems, and is too complicated to model mechanistically using currently available knowledge. Instead we model the process phenomenologically, and we use the Sir Isaac dynamical inference platform to infer the model in a fully automated fashion directly from experimental data. The inferred model requires incorporation of an unobserved dynamical variable, and is biologically interpretable. The model makes accurate predictions about the dynamics of the worm behavior, and it can be used to characterize the functional logic of the dynamical system underlying the escape response. This work illustrates the power of modern artificial intelligence to aid in discovery of accurate and interpretable models of complex natural systems.
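
The core idea, inferring a phenomenological dynamical model with an unobserved variable directly from data, can be reduced to a toy sketch. The two-variable model and the grid search below are hypothetical stand-ins for Sir Isaac's automated search over model classes; only the shape of the problem (fit a hidden timescale so the simulated observable matches the data) is taken from the abstract:

```python
import numpy as np

def simulate(tau, stim, dt=0.01):
    """Toy phenomenological model: observed response x driven by a stimulus,
    with an unobserved adaptation variable a (hypothetical, not the paper's
    inferred model).  dx/dt = stim - a - x,  da/dt = (x - a) / tau."""
    x, a, xs = 0.0, 0.0, []
    for s in stim:
        x += dt * (s - a - x)
        a += dt * (x - a) / tau
        xs.append(x)
    return np.array(xs)

# Synthetic "experimental" data generated with a hidden timescale tau = 0.5.
t = np.arange(0, 5, 0.01)
stim = (t > 1).astype(float)          # step stimulus at t = 1 s
data = simulate(0.5, stim)

# Automated inference reduced to its simplest form: choose the hidden
# timescale whose simulated trajectory best reproduces the observations.
taus = np.linspace(0.1, 2.0, 39)
errs = [np.sum((simulate(tau, stim) - data) ** 2) for tau in taus]
best = taus[int(np.argmin(errs))]
```

The inferred `tau` recovers the generating value, illustrating how a hidden dynamical variable can be identified purely from the observable's trajectory; the actual platform additionally selects the model structure itself, not just its parameters.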


Artificial Snakeskin Helps Robots Get Their Slither On

IEEE Spectrum Robotics

Snakes have got to be some of the most creatively mobile animals ever evolved. They can squeeze into very small holes. Some of them can even fly, a little bit. And all of this despite looking like a lizard that's missing 100 percent of the limbs that it's supposed to have. Roboticists have been working on snake robots for a long time, primarily with a focus on versatile mobility in constrained spaces.


DeployBot could be used on future space missions

Daily Mail - Science & tech

Researchers have built the world's first soft robot that can move without the need for a motor or any mechanical components. The robot, which the team has named 'DeployBot', moves like an inchworm when an electric current is applied to its frame. The team believes that it could have a range of uses, including on future space missions, where access to motors is limited. DeployBot is assembled from eight modules – four for the body, and one on each leg. The modules are made of both rigid and flexible materials, and contain magnets that connect and lock them together.


Towards Human-Induced Vision-Guided Robot Behavior

Ferrer, Gabriel John (Hendrix College)

AAAI Conferences

An appealing alternative to tediously specifying robot behaviors in response to particular image features is to have the robot’s behavior be induced by human decisions made when piloting the robot. This paper presents one promising approach to creating this alternative. A human pilots a camera-equipped robot, which builds a representation of its target environment using Growing Neural Gas (GNG). The robot associates an action with each GNG node based on what the human pilot was doing while the node was active. When running autonomously, the robot chooses the action associated with the node that is the closest match to the current input image. Preliminary results suggest that the approach has potential, but that subsequent alteration of the actions induced for some of the GNG nodes is important for acceptable performance.
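
The associate-an-action-with-each-node scheme can be sketched compactly. In place of a full Growing Neural Gas (which also grows nodes and maintains a topology), the stand-in below uses a fixed set of prototype vectors as "nodes"; each node records a vote count for the actions the pilot issued while it was the best match, and autonomous operation picks the majority action of the nearest node:

```python
import numpy as np

class InducedBehavior:
    """Simplified stand-in for the GNG pipeline: fixed prototypes play the
    role of GNG nodes.  Node growth and edge topology are omitted."""

    def __init__(self, prototypes):
        self.protos = np.asarray(prototypes, dtype=float)
        self.votes = [{} for _ in self.protos]   # per-node action counts

    def _nearest(self, image_vec):
        # Index of the prototype closest to the (flattened) input image.
        return int(np.argmin(np.linalg.norm(self.protos - image_vec, axis=1)))

    def record(self, image_vec, action):
        # Training: credit the pilot's action to the currently active node.
        i = self._nearest(image_vec)
        self.votes[i][action] = self.votes[i].get(action, 0) + 1

    def act(self, image_vec):
        # Autonomy: majority action of the best-matching node (None if unseen).
        i = self._nearest(image_vec)
        return max(self.votes[i], key=self.votes[i].get) if self.votes[i] else None
```

A short training phase of `record(image, action)` calls followed by `act(image)` reproduces the pilot's decisions for inputs near seen prototypes; the paper's note about post-hoc correction corresponds to editing a node's action table by hand.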