Lee explains his research on marsupial robots, or carrier-passenger pairs of heterogeneous robot systems. They discuss possible applications of marsupial robots, including the DARPA Subterranean Challenge, and some of the technical challenges, including optimal deployment formulated as a stochastic assignment problem. Chris Lee is pursuing a Master of Science in Robotics at Oregon State University, having received a Bachelor of Science in Mechanical Engineering from the University at Buffalo. His research is in robotic exploration, frontier extraction, and stochastic assignment.
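To give a flavour of the kind of assignment problem mentioned above, here is a minimal, hedged sketch (hypothetical numbers and penalty model, not Lee's actual formulation): deployment under uncertainty can be cast as matching each passenger robot to one candidate deployment site so as to minimize expected cost, given a success probability per site.

```python
# Hedged sketch (hypothetical data, not Lee's actual formulation):
# optimal deployment as an assignment problem where each passenger robot
# is matched to one candidate site, minimizing expected cost under
# per-site success probabilities.
from itertools import permutations

# travel_cost[i][j]: cost for passenger robot i to reach site j.
travel_cost = [[4.0, 1.0, 3.0],
               [2.0, 0.0, 5.0],
               [3.0, 2.0, 2.0]]
# p_success[j]: probability that site j turns out to be reachable.
p_success = [0.9, 0.5, 0.8]
FAIL_PENALTY = 10.0  # assumed extra cost if a chosen site fails

def expected_cost(i, j):
    # Pay the travel cost if the site works, a fixed penalty otherwise.
    return p_success[j] * travel_cost[i][j] + (1 - p_success[j]) * FAIL_PENALTY

# Brute-force over all robot-to-site matchings (fine for small teams;
# the Hungarian algorithm handles larger ones in polynomial time).
n = len(travel_cost)
best = min(permutations(range(n)),
           key=lambda sites: sum(expected_cost(i, sites[i]) for i in range(n)))
print(best)  # best[i] is the site assigned to robot i
```

With these made-up numbers, the cheap-but-risky middle site is still worth taking for robot 0, while robots 1 and 2 go to their reliable nearby sites; changing `FAIL_PENALTY` shifts that trade-off.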
Apart from the IEEE/RSJ IROS 2020 (International Conference on Intelligent Robots and Systems) original series Real Roboticist that we have been featuring in recent weeks, another series of three videos was produced together with Black in Robotics and with the support of the Toyota Research Institute. In this series, Black roboticists give personal examples of why diversity matters in robotics, showcase their research and explain what made them build a career in robotics. Here's a list of all the speakers and organisations who took part in the videos:
On August 8th, 2021, a team of four graduate students from the University of Toronto presented their ethical design in the world's first ever roboethics competition, the RO-MAN 2021 Roboethics to Design & Development Competition. During the competition, design teams tackled a challenging yet relatable scenario: introducing a robot helper to the household. The students' solution, entitled "Jeeves, the Ethically Designed Interface (JEDI)", demonstrated how home robots can act safely and according to social and cultural norms. Click here to watch their video submission. JEDI acted as an extension of the mother, and its interface rules accommodated her priorities.
This week you'll be able to listen to the talks of Jonathan Hurst (Professor of Robotics at Oregon State University, and Chief Technology Officer at Agility Robotics) and Andrea Thomaz (Associate Professor of Robotics at the University of Texas at Austin, and CEO of Diligent Robotics) as part of this series that brings you the plenary and keynote talks from the IEEE/RSJ IROS2020 (International Conference on Intelligent Robots and Systems). Jonathan's talk is on the topic of humanoids, while Andrea's is about human-robot interaction. Bio: Jonathan W. Hurst is Chief Technology Officer and co-founder of Agility Robotics, and Professor and co-founder of the Oregon State University Robotics Institute. He holds a B.S. in mechanical engineering and an M.S. and Ph.D. in robotics, all from Carnegie Mellon University. His university research focuses on understanding the fundamental science and engineering best practices for robotic legged locomotion and physical interaction.
Gaze is an extremely powerful and important signal during human-human communication and interaction, conveying intentions and informing others' decisions. What happens when a robot and a human look at each other while they interact? Researchers at IIT-Istituto Italiano di Tecnologia (Italian Institute of Technology) investigated whether a humanoid robot's gaze influences the way people reason in a social decision-making context. They found that mutual gaze with a robot affects human neural activity and influences decision-making processes, in particular by delaying them. In other words, people perceive a robot's gaze as a social signal.
This week you'll meet Michelle Johnson, Associate Professor of Physical Medicine and Rehabilitation at the University of Pennsylvania. Michelle is also the Director of the Rehabilitation Robotics Lab at the University of Pennsylvania, whose aim is to use rehabilitation robotics and neuroscience to investigate brain plasticity and motor function after non-traumatic brain injuries, for example in stroke survivors or persons diagnosed with cerebral palsy. If you'd like to know more about her professional journey, her work with affordable robots for low- and middle-income countries and her next frontier in robotics, among many other things, check out her video below!
As part of our series showcasing the plenary and keynote talks from the IEEE/RSJ IROS2020 (International Conference on Intelligent Robots and Systems), this week we bring you Nikolaus Correll (Associate Professor at the University of Colorado at Boulder) and Cynthia Breazeal (Professor of Media Arts and Sciences at MIT). Nikolaus' talk is on the topic of robot manipulation, while Cynthia's is about social robots. Bio: Nikolaus Correll is an Associate Professor at the University of Colorado at Boulder. He obtained his MS in Electrical Engineering from ETH Zürich and his PhD in Computer Science from EPF Lausanne in 2007. From 2007 to 2009 he was a postdoc at MIT's Computer Science and Artificial Intelligence Lab (CSAIL).
Over the last decades, robots have been transforming from simple machines into cognitive collaborators. The distance that has been covered is long, but challenges remain, as well as opportunities that lie ahead. That was also the main topic of discussion in the agROBOfood event 'Visioning the future of agri-food robotics', held by a panel of experts in the domain. The topic was introduced by two inspiring presentations. The first one was by Jérôme Bandry, who shared the vision of CEMA (European Agricultural Machinery).
In this fourth release of our series dedicated to the IEEE/RSJ IROS 2020 (International Conference on Intelligent Robots and Systems) original series Real Roboticist, we bring you Peter Corke. He is a Distinguished Professor of Robotic Vision at Queensland University of Technology, Director of the QUT Centre for Robotics, and Director of the ARC Centre of Excellence for Robotic Vision. If you've ever studied a robotics or computer vision course, you might have read a classic book: Peter Corke's Robotics, Vision and Control. Moreover, Peter has also released several open-source robotics resources and free courses, all available on his website. If you'd like to hear more about his career in robotics and education, his main challenges, what he learnt from them, and his advice for current robotics students, check out his video below.
Anthony DiMare and Charles Chiau take a deep dive into how Bedrock Ocean is innovating in the world of marine surveys. At Bedrock Ocean, they are developing an Autonomous Underwater Vehicle (AUV) that can map the seafloor autonomously and at high resolution. They are also developing a data platform to access, process, and visualize seafloor data captured by other companies. Bedrock Ocean is solving two problems in the marine surveying industry: 1. The vast majority of the seafloor is completely unmapped. 2. The data that is captured from the seafloor is not standardized or centralized.