Autonomous a2z, a member company of Born2Global Centre and Sejong Technopark, has recently received US$1.9 million in seed investment from individual angel investors and partner corporations. Autonomous a2z is a company specializing in autonomous mobility solutions. The startup announced that it has begun commercializing an ongoing project, and said the infusion of capital is already serving as a stepping stone toward future development of autonomous mobility solutions and a leading position in the relevant markets. Founded in 2018, Autonomous a2z has tested its cutting-edge self-driving technologies throughout Korea. As its name implies, the firm develops "everything from a to z" in the arena of self-driving cars, including its own systems and algorithms.
HYPR is testing its self-learning autonomous driving system in a modified Daimler Smart Car. As Zoox, the secretive robotaxi developer recently acquired by Amazon, gets ready to unveil its futuristic fleet vehicle, the former CEO who dreamed up the company is re-emerging with a new startup that is designing AI-enabled software he hopes will allow cars to "teach themselves" to drive. Early-stage HYPR, created by Zoox cofounder Tim Kentley-Klay, says it is using reinforcement learning, a branch of machine learning that takes a reward-based approach, to train driving algorithms dynamically -- ideally with no need for direct human instruction or supervision. The Alameda, California-based startup has raised a $10 million seed round and begun testing its approach with a modified Daimler Smart Car. Backers include R7 Ventures and Australian billionaire Andrew Forrest.
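The reward-based loop behind reinforcement learning can be illustrated with a minimal sketch. The example below uses tabular Q-learning on a toy lane-keeping task; the states, actions, and reward function are illustrative assumptions, not HYPR's actual system or reward design.

```python
import random

# Toy lane-keeping task: states are discretized offsets from the lane
# center, actions nudge the car left, hold, or right. Purely illustrative.
STATES = [-2, -1, 0, 1, 2]
ACTIONS = [-1, 0, 1]

def reward(state):
    # Reward staying centered; penalize drifting toward the road edges.
    return 1.0 if state == 0 else -abs(state)

def step(state, action):
    # Apply the steering nudge, clamped to the road edges.
    return max(-2, min(2, state + action))

q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
alpha, gamma, epsilon = 0.1, 0.9, 0.1

random.seed(0)
state = 2
for _ in range(5000):
    # Epsilon-greedy: mostly exploit the best-known action, sometimes explore.
    if random.random() < epsilon:
        action = random.choice(ACTIONS)
    else:
        action = max(ACTIONS, key=lambda a: q[(state, a)])
    nxt = step(state, action)
    best_next = max(q[(nxt, a)] for a in ACTIONS)
    # Q-learning update: move the estimate toward the observed reward
    # plus the discounted value of the best next action.
    q[(state, action)] += alpha * (reward(nxt) + gamma * best_next - q[(state, action)])
    state = nxt

# The learned policy steers back toward the lane center from every state,
# with no hand-written driving rule -- only the reward signal.
policy = {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in STATES}
print(policy)
```

No human labels the "correct" steering action anywhere; the reward signal alone shapes the behavior, which is the sense in which such a system "teaches itself" to drive.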
Artificial intelligence (AI) is impacting the future of virtually every industry and every human being on the planet. AI has been established as the main driver of emerging technologies such as big data, robotics, and the Internet of Things (IoT). Moving into 2021, AI will continue to act as the main technological innovator for the foreseeable future. The next decade will see unprecedentedly fast development and adoption of existing and emerging technologies, driven by the push toward digital acceleration.
A race is on to accelerate artificial intelligence (AI) at the edge of the network and reduce the need to transmit huge amounts of data to the cloud. The edge, or edge computing, brings data processing resources closer to the data and devices that need them, reducing data latency, which is important for many time-sensitive processes, such as video streaming or self-driving cars. Development of specialized silicon and enhanced machine learning (ML) models is expected to drive greater automation and autonomy at the edge for new offerings, from industrial robots to self-driving vehicles. Vast computing resources in centralized clouds and enterprise data centers are adept at processing large volumes of data to spot patterns and create machine learning training models that "teach" devices to infer what actions to take when they detect similar patterns. But when those models detect something out of the ordinary, they are forced to seek intervention from human operators or get revised models from data-crunching systems.
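That division of labor -- act locally on familiar patterns, escalate the unfamiliar -- can be sketched in a few lines. The stand-in model, confidence threshold, and labels below are illustrative assumptions, not any vendor's actual edge stack.

```python
# Minimal sketch of the edge-inference pattern described above: run a local
# model, act immediately on confident predictions, and escalate anomalous
# readings upstream for human review or model retraining.
CONFIDENCE_THRESHOLD = 0.8

def edge_model(reading):
    """Stand-in for an on-device ML model: returns (label, confidence)."""
    if reading < 50:
        return "normal", 0.95
    if reading < 80:
        return "warning", 0.85
    return "unknown", 0.40          # out-of-distribution -> low confidence

def handle(reading, escalations):
    label, confidence = edge_model(reading)
    if confidence >= CONFIDENCE_THRESHOLD:
        return f"act locally: {label}"            # low-latency edge path
    escalations.append(reading)                   # ship raw data upstream
    return "escalate to cloud / human operator"

escalations = []
print(handle(30, escalations))   # confident -> handled at the edge
print(handle(95, escalations))   # anomalous -> deferred to the cloud
print(escalations)
```

The key design point is that only the low-confidence cases cross the network, which is what cuts both latency and bandwidth relative to shipping every reading to the cloud.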
Edge AI is here to stay. Artificial intelligence (AI) powers many real-world applications that we see in our daily lives. AI, once seen as an emerging technology, has now successfully penetrated every industry, both B2B and B2C: banking, logistics, healthcare, defence, manufacturing, retail, automotive, and consumer electronics. Smart speakers such as the Amazon Echo and Google Nest are one example of edge AI solutions in the consumer electronics sector. AI technology is powerful, and humankind has set its eye on harnessing its potential to the fullest. Bringing intelligence to the device itself can be very useful and creative.
Deep neural networks are capable of seemingly sophisticated results, but they can also be fooled in ways that range from relatively harmless -- misidentifying one animal as another -- to potentially deadly, if the network guiding a self-driving car misinterprets a stop sign as one indicating it is safe to proceed. A philosopher at the University of Houston suggests in a paper published in Nature Machine Intelligence that common assumptions about the cause of these supposed malfunctions may be mistaken -- information that is crucial for evaluating the reliability of these networks. As machine learning and other forms of artificial intelligence become more embedded in society, used in everything from automated teller machines to cybersecurity systems, Cameron Buckner, associate professor of philosophy at UH, said it is critical to understand the source of apparent failures caused by what researchers call "adversarial examples" -- cases in which a deep neural network misjudges images or other data when confronted with information outside the training inputs used to build the network. Such examples are rare and are called "adversarial" because they are often created or discovered by another machine learning network -- a sort of brinkmanship in the machine learning world between more sophisticated methods to create adversarial examples and more sophisticated methods to detect and avoid them. "Some of these adversarial events could instead be artifacts, and we need to better know what they are in order to know how reliable these networks are," Buckner said.
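The mechanics of an adversarial example can be shown on a model far simpler than a deep network. The sketch below perturbs the input of a toy linear classifier in the direction that most reduces its score, in the spirit of gradient-sign attacks; the weights, input, and "stop sign" label are illustrative assumptions, not drawn from the paper.

```python
import math

# Toy adversarial example: a small, targeted perturbation flips a linear
# classifier's decision even though the input barely changes.
w = [2.0, -3.0, 1.5]      # weights of a "trained" linear model (made up)
b = -0.5

def predict(x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 / (1 + math.exp(-z))   # probability of the "stop sign" class

x = [0.4, 0.1, 0.2]                 # original input: classified as a stop sign
eps = 0.15                          # per-feature perturbation budget

# Gradient-sign-style attack: nudge each feature *against* its weight's
# sign, which maximally lowers the score within the budget.
sign = lambda v: 1.0 if v > 0 else -1.0
x_adv = [xi - eps * sign(wi) for xi, wi in zip(x, w)]

print(round(predict(x), 3))      # above 0.5: "stop sign"
print(round(predict(x_adv), 3))  # below 0.5: decision has flipped
```

Every feature moves by at most 0.15, yet the classification flips -- the attack exploits the model's decision geometry rather than making the input look genuinely different, which is why Buckner's question about whether such events reflect artifacts matters for judging network reliability.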
A Cambridge driverless car start-up that has emerged as one of Britain's brightest prospects in the cutting-edge sector has secured backing from Sir Richard Branson's Virgin Group as it seeks to accelerate its plans. Wayve Technologies, founded by 28-year-old Alex Kendall with Amar Shah, is building artificial intelligence technology that uses machine learning techniques pioneered by DeepMind to improve self-driving cars. Its latest funding round saw it secure a further $20m (£15m) from current and new investors. According to Mr Kendall, its chief executive, Wayve's technology could leapfrog US giants such as Google's Waymo and Uber. Mr Kendall said: "The incumbents started off the back of DARPA [the US defence agency] challenges in the mid 2000s. I think those challenges set the industry back about 10 years."
A chilling drone video shows a hammerhead shark circling a seemingly oblivious swimmer off a Miami beach. The video was posted to Instagram by drone operator and photographer Jason McIntosh. The Miami Herald reports that the close encounter was captured off South Beach on Sunday. McIntosh captioned the video, "Hammer Time," and used MC Hammer's famous song "U Can't Touch This" as the soundtrack. The video has been viewed more than 29,000 times since it was posted last week.
This article is part of KrASIA's partnership with Web Summit. The last 12 months have seen decisive change in the way we spend our free time. Mobility solutions are becoming increasingly popular, with driverless vehicles popping up across the world, while our urban spaces are evolving into smart city projects. Web Summit's lifestyle content covers it all. What CNN calls "Europe's largest tech event" gathers experts from the industries that play vital roles in our lifestyles.
Salesforce is backing an AI project called SharkEye, which aims to save the lives of beachgoers from one of the sea's deadliest predators. Shark attacks are, fortunately, quite rare; however, they do happen, and most cases are either fatal or result in life-changing injuries. Just last week, a fatal shark attack in Australia marked the eighth of the year -- the highest annual death toll in almost 100 years. Shark sightings, once rare at Southern California beaches, are becoming increasingly common as sharks favor the warmer waters close to shore.