Artificial Intelligence (AI) is not just a buzzword; it is a crucial part of the technology landscape. AI is transforming every industry and business function, driving growing interest in its applications, subdomains, and related fields, and making AI companies the leaders of this technological shift. AI helps us optimise and automate crucial business processes, gather essential data, and transform the world one step at a time. From Google and Amazon to Apple and Microsoft, every major tech company is dedicating resources to breakthroughs in artificial intelligence. While big enterprises are busy acquiring or merging with emerging innovators, small AI companies are also working hard to develop their own intelligent technologies and services. By leveraging artificial intelligence, organizations gain an innovative edge in the digital age. AI consultancies are also providing companies with expertise that can help them grow. In this digital era, AI is a significant area for investment, and AI companies are constantly developing new products that offer simpler solutions. Hence, Analytics Insight brings you this list of the top 100 AI companies leading the technology drive towards a better tomorrow.

AEye develops advanced vision hardware, software, and algorithms, and is an artificial perception pioneer and the creator of iDAR, a new form of intelligent data collection that acts as the eyes and visual cortex of autonomous vehicles. Since demonstrating its solid-state LiDAR scanner in 2013, AEye has pioneered breakthroughs in intelligent sensing. Its mission is to acquire the most information with the fewest ones and zeros, allowing AEye to drive the automotive industry into the next realm of autonomy.

Algorithmia invented the AI Layer.
These Visionary companies have a big idea and are well on their way to achieving it, although the road is never easy for truly innovative technology. In the case of Cruise, that meant testing self-driving vehicles on the streets of San Francisco, one of the hardest driving environments in the world. Some of our Visionary Awards go to companies that are opening up new market applications for robotics, such as Built Robotics in construction, Dishcraft in food services, Embark in self-driving trucks, Iron Ox in urban agriculture, and Zipline in drone delivery. Others are building tools or platforms that the entire robotics industry can benefit from, such as Agility Robotics, Covariant, Formant, RobustAI, and Zoox. The companies in our Good Robot Awards also show that 'technologies built for us have to be built by us'.
General Atomics Aeronautical Systems, Inc. (GA-ASI) has demonstrated the DARPA-developed Collaborative Operations in Denied Environment (CODE) autonomy engine on the company's Avenger Unmanned Aircraft System (UAS). CODE was used to gain further understanding of cognitive Artificial Intelligence (AI) processing on larger UAS platforms for air-to-air targeting. Using a network-enabled Tactical Targeting Network Technology (TTNT) radio for mesh network mission communications, GA-ASI was able to demonstrate the integration of emerging Advanced Tactical Data Links (ATDL), as well as separation between flight-critical and mission-critical systems. During the autonomous flight, CODE software controlled the manoeuvring of the Avenger UAS for more than two hours without human pilot input. GA-ASI extended the base software's behavioural functions for a coordinated air-to-air search with up to six aircraft, using five virtual aircraft for the demonstration.
A private rocket-launch startup has unveiled a fully autonomous drone designed to drop a rocket in midair that shoots small satellites into orbit without a launchpad. Alabama-based company Aevum rolled out its Ravn X Autonomous Launch Vehicle at the Cecil Spaceport launch facility in Jacksonville, Fla., on Thursday. The 80-foot aircraft has a wingspan of 60 feet, stands 18 feet tall and, weighing 55,000 pounds, is the world's largest Unmanned Aircraft System (UAS) by mass.
General Atomics Aeronautical Systems, Inc. (GA-ASI) has been awarded a contract by the U.S. Department of Defense's Joint Artificial Intelligence Center (JAIC) to develop enhanced autonomous sensing capabilities for unmanned aerial vehicles (UAVs). The JAIC Smart Sensor project aims to advance drone-based AI technology by demonstrating object recognition algorithms and employing onboard AI to automatically control UAV sensors and direct autonomous flight. GA-ASI will deploy these new capabilities on an MQ-9 Reaper UAV equipped with a variety of sensors, including GA-ASI's Reaper Defense Electronic Support System (RDESS) and Lynx Synthetic Aperture Radar (SAR). GA-ASI's Metis Intelligence, Surveillance and Reconnaissance (ISR) tasking and intelligence-sharing application, which enables operators to specify effects-based mission objectives and receive automatic notification of actionable intelligence, will be used to command the unmanned aircraft. J.R. Reid, GA-ASI Vice President of Strategic Development, commented: "GA-ASI is excited to leverage the considerable investment we have made to advance the JAIC's autonomous sensing objective. This will bring a tremendous increase in unmanned systems capabilities for applications across the full range of military operations."
Due to the advancements in cellular technologies and the dense deployment of cellular infrastructure, integrating unmanned aerial vehicles (UAVs) into the fifth-generation (5G) and beyond cellular networks is a promising solution to achieve safe UAV operation as well as enabling diversified applications with mission-specific payload data delivery. In particular, 5G networks need to support three typical usage scenarios, namely, enhanced mobile broadband (eMBB), ultra-reliable low-latency communications (URLLC), and massive machine-type communications (mMTC). On the one hand, UAVs can be leveraged as cost-effective aerial platforms to provide ground users with enhanced communication services by exploiting their high cruising altitude and controllable maneuverability in three-dimensional (3D) space. On the other hand, providing such communication services simultaneously for both UAV and ground users poses new challenges due to the need for ubiquitous 3D signal coverage as well as the strong air-ground network interference. Besides the requirement of high-performance wireless communications, the ability to support effective and efficient sensing as well as network intelligence is also essential for 5G-and-beyond 3D heterogeneous wireless networks with coexisting aerial and ground users. In this paper, we provide a comprehensive overview of the latest research efforts on integrating UAVs into cellular networks, with an emphasis on how to exploit advanced techniques (e.g., intelligent reflecting surface, short packet transmission, energy harvesting, joint communication and radar sensing, and edge intelligence) to meet the diversified service requirements of next-generation wireless systems. Moreover, we highlight important directions for further investigation in future work.
Two menacing men stand next to a white van in a field, holding remote controls. They open the van's back doors, and the whining sound of quadcopter drones crescendos. They flip a switch, and the drones swarm out like bats from a cave. A few seconds later, we cut to a college classroom. The students scream in terror, trapped inside, as the drones attack with deadly force. The lesson the film Slaughterbots is trying to impart is clear: tiny killer robots are either here already or only a small technological advance away. And existing defences are weak or nonexistent.
This paper presents a novel algorithm named motion-encoded particle swarm optimization (MPSO) for finding a moving target with unmanned aerial vehicles (UAVs). From Bayesian theory, the search problem can be converted to the optimization of a cost function that represents the probability of detecting the target. The proposed MPSO solves that problem by encoding the search trajectory as a series of UAV motion paths that evolve over the generations of particles in a PSO algorithm. This motion-encoded approach preserves important properties of the swarm, including cognitive and social coherence, and thus results in better solutions. Results from extensive simulations show that the proposed MPSO improves detection performance by 24% and time performance by 4.71 times compared to the original PSO, and also outperforms other state-of-the-art metaheuristic optimization algorithms, including the artificial bee colony (ABC), ant colony optimization (ACO), genetic algorithm (GA), differential evolution (DE), and tree-seed algorithm (TSA), in most search scenarios. Experiments have been conducted with real UAVs searching for a dynamic target in different scenarios to demonstrate the merits of MPSO in a practical application.
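The motion-encoded idea can be illustrated with a minimal sketch. Here each particle stores a sequence of heading angles rather than raw waypoints, so the standard PSO velocity update always produces a decodable flight path. The grid size, belief map, and PSO coefficients below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

GRID, STEPS, N_PARTICLES, N_ITERS = 20, 10, 30, 50

# Synthetic belief map: probability of the target being in each grid cell.
belief = rng.random((GRID, GRID))
belief /= belief.sum()

def decode(motions, start=(0, 0)):
    """Turn a sequence of heading angles into a clipped grid path."""
    path, (x, y) = [start], start
    for a in motions:
        x = int(np.clip(x + round(np.cos(a)), 0, GRID - 1))
        y = int(np.clip(y + round(np.sin(a)), 0, GRID - 1))
        path.append((x, y))
    return path

def detection_probability(motions):
    """Fitness: probability mass of the cells visited by the decoded path."""
    return sum(belief[c] for c in set(decode(motions)))

# Standard PSO update, but velocities act on the motion encoding itself,
# so perturbed particles still decode to feasible flight paths.
pos = rng.uniform(-np.pi, np.pi, (N_PARTICLES, STEPS))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([detection_probability(p) for p in pos])
gbest = pbest[pbest_val.argmax()].copy()

for _ in range(N_ITERS):
    r1, r2 = rng.random((2, N_PARTICLES, STEPS))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([detection_probability(p) for p in pos])
    improved = vals > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmax()].copy()

print(f"best detection probability: {pbest_val.max():.4f}")
```

The key design choice mirrored here is that the search space is the space of motion sequences, not of coordinates, which is what keeps the swarm's cognitive and social updates meaningful for trajectories.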
Be prepared, in the near future, when you gaze into the blue skies, to perceive a whole series of strange-looking things. No, they will not be birds, nor planes, nor even Superman. They may temporarily, and in some cases startlingly, be mistaken for UFOs, given their bizarre and ominous appearance. But in due course they will become recognized as valuable objects of a new era of human-made flying machines, intended to serve a broad range of missions and objectives. Many such applications are already well entrenched, serving essential functions in our vital infrastructures such as transportation, utilities, the electric grid, agriculture, and emergency services. Rapidly advancing technologies have given unmanned aerial vehicles (UAVs, or drones) the dramatic capability to perform functions that were inconceivable just a few years ago.
We show that automated person detection under occlusion can be significantly improved by combining multi-perspective images before classification. Here, we employed image integration by Airborne Optical Sectioning (AOS), a synthetic-aperture imaging technique that uses camera drones to capture unstructured thermal light fields, achieving a precision/recall of 96%/93%. Finding lost or injured people in dense forests is generally not feasible with conventional thermal recordings, but becomes practical with AOS integral images. Our findings lay the foundation for effective future search-and-rescue technologies that can be applied in combination with autonomous or manned aircraft. They can also benefit other fields that currently suffer from inaccurate classification of partially occluded people, animals, or objects.
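The core intuition, that averaging many registered perspectives suppresses random occlusion before a classifier ever sees the image, can be sketched with a toy simulation. The scene size, occlusion rate, and noise level below are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy AOS-style integration: a warm "person" block is randomly occluded in
# each single perspective, but averaging many registered views restores
# most of the signal while averaging down the sensor noise.
H, W, N_VIEWS = 64, 64, 30
scene = np.zeros((H, W))
scene[30:34, 30:34] = 1.0  # thermal signature of a person

views = []
for _ in range(N_VIEWS):
    occluded = rng.random((H, W)) < 0.7          # 70% of pixels hidden by foliage
    view = np.where(occluded, 0.0, scene)
    views.append(view + rng.normal(0.0, 0.05, (H, W)))  # sensor noise

integral = np.mean(views, axis=0)  # synthetic-aperture integration

def contrast_to_noise(img):
    """Target response relative to background noise."""
    return img[30:34, 30:34].mean() / img[0:20, 0:20].std()

print(f"CNR single view: {contrast_to_noise(views[0]):.1f}, "
      f"integral image: {contrast_to_noise(integral):.1f}")
```

Because occlusions differ between perspectives while the target stays registered, the integral image's background noise shrinks with the number of views, which is why a classifier fed integral images sees a far cleaner target than one fed any single recording.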