A versatile assembly system, using TV cameras and a computer-controlled arm and moving table, is described. It makes simple assemblies such as a peg and rings and a toy car. It separates parts from a heap, recognising them with an overhead camera, then assembles them by feel. It can be instructed to perform a new task with different parts by spending an hour showing it the parts and a day or two programming the assembly manipulations. A hierarchical description of parts, views, outlines etc. is used to construct models, and a structure matching algorithm is used in recognition. A later version appeared in Artificial Intelligence, Vol. 6, pp. 129 (1975) (available for a fee). In IJCAI-73: Third International Joint Conference on Artificial Intelligence, 20-23 August 1973, Stanford University, Stanford, California.
In this paper, we summarize a new approach for the dissemination of robotics technologies. In a manner analogous to the personal computer movement of the early 1980s, we propose that a productive niche for robotic technologies is as a creative outlet for human expression and discovery. This paper describes our ongoing efforts to design, prototype, and test a low-cost, highly competent personal rover for the domestic environment.
When designing a robot to interact with people, the decision to incorporate a robot arm may arise. In this paper, we investigate adding an inexpensive, functional arm to our mobile CoBot service robots. Specifically, we examine two-dimensional extendable pantograph arms for CoBot. Pantograph arms have intuitive forward and inverse kinematics, and they are modular: adding linkages corresponds to simple changes in the kinematic calculations. These arms have several advantages (and disadvantages) compared to traditional robot arms. A prototype pantograph arm is currently in development, and our goal is to attach a modular pantograph arm to CoBot to perform simple needed tasks, such as knocking on doors and pressing elevator buttons.
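The abstract highlights that such arms admit simple closed-form kinematics. As an illustration of what closed-form planar kinematics looks like, here is a minimal sketch for a serial two-link planar arm; this is a generic stand-in, not the authors' pantograph linkage, whose geometry the abstract does not specify:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Inverse kinematics for a planar two-link arm (elbow-down solution).

    Returns (shoulder, elbow) joint angles in radians for a target (x, y).
    """
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle directly.
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def two_link_fk(shoulder, elbow, l1, l2):
    """Forward kinematics: end-effector position from joint angles."""
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y
```

A round trip (IK followed by FK) recovers the original target, which is the sanity check one would apply before driving real hardware. Adding linkages, as the abstract notes for the pantograph design, would extend these closed-form expressions rather than require numerical solvers.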
Autonomous landing is an essential operation for unmanned aerial vehicles such as drones. In this paper we describe an approach for fully autonomous landing of AR.Drones. The approach consists of two components: identifying and recognizing a pattern that marks the landing area, and reaching the landing area (and landing) even when small disturbances occur. We discuss possible patterns for identifying the landing area, justify why we selected two coloured circles as the landing pattern, and describe an algorithm to recognize this pattern. In the second part we explain the PID controller used to control the drone during landing. The presented techniques were implemented and are available as an extension of Control Tower, software for controlling the AR.Drone from a computer.
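To make the control component concrete, here is a minimal sketch of a discrete PID controller of the kind the abstract mentions, driving one axis of the drone toward the detected pattern centre. The gains, time step, and the velocity-command interpretation are illustrative assumptions, not the paper's tuning:

```python
class PID:
    """Discrete PID controller (illustrative gains, not the paper's tuning)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement):
        """Return a control command from the current tracking error."""
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (0.0 if self.prev_error is None
                      else (error - self.prev_error) / self.dt)
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy 1-D simulation: treat the PID output as a velocity command that
# moves the drone toward the landing pattern at position 1.0.
pid = PID(kp=1.0, ki=0.0, kd=0.2, dt=0.1)
position = 0.0
for _ in range(200):
    position += pid.update(1.0, position) * 0.1
```

In the full system one such controller would run per axis (horizontal alignment over the pattern, plus descent rate), with the setpoint supplied by the circle-recognition stage.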