Are you thinking of learning programming languages like C, Python, or R to work on machine learning projects? AutoML could save you much of that time and effort. Lately, automated machine learning, or AutoML, has become a popular way to build computer vision systems. Tech communities are abuzz with conversations about how AutoML will change the way machine learning is done, with limited or no coding knowledge required. From autonomous vehicles to handwritten text recognition, face recognition, personalised recommendations, and diagnosis from X-ray images, computer vision is transforming industries globally.
Last week we discussed Level 1 and 2 autonomy, and this week we move on to L3-L5, which is considered "true" autonomous driving. With L3, in certain situations (e.g., highway driving) the car can fully take over all driving tasks, including lane changing, but the driver must keep his or her hands near the steering wheel at all times, pay constant attention, and not be distracted by other tasks such as watching TV, staring at a phone, or sleeping. The reason the driver must always pay attention is that if the autonomous system finds itself in a situation it cannot handle (e.g., an unexpected detour or highway construction), it will provide a warning (e.g., the seat vibrates or an alarm sounds) and then hand control back to the driver. L4 is a fully autonomous car that can perform all driving functions and never requires intervention from the driver, though the driver has the option to take over at any time. The caveat, however, is that the autonomous function can ONLY be used in certain prescribed situations (e.g., proper weather conditions with adequate visibility) or locations (e.g., in a well-mapped city or vicinity).
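The distinction the article draws between L3 and L4 can be captured as a tiny data model. The names and helper below are an illustrative sketch based only on the summary above, not an official SAE definition:

```python
from enum import IntEnum

class AutonomyLevel(IntEnum):
    """Driving automation levels as summarized in the text (illustrative, not SAE-official)."""
    L3 = 3  # Car drives itself in limited situations (e.g., highways); driver must stay alert to take over on warning
    L4 = 4  # Fully autonomous, but only within prescribed conditions or well-mapped locations
    L5 = 5  # Fully autonomous everywhere, under all conditions

def driver_attention_required(level: AutonomyLevel) -> bool:
    # Per the article, only L3 still demands constant driver attention:
    # the system may hand control back at any moment.
    return level == AutonomyLevel.L3

print(driver_attention_required(AutonomyLevel.L3))  # True
print(driver_attention_required(AutonomyLevel.L4))  # False
```

The key design point mirrored here is that the L3/L4 boundary is about the fallback: at L3 the human is the fallback, while at L4 and above the system must handle its own failures within its operating domain.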
Artificial intelligence is already impacting virtually every industry and every human being. This incredible technology has brought many good and questionable things into our lives, and it will create an even bigger impact in the next two decades. According to Ray Kurzweil, one of the best-known futurists, computers will have the same level of intelligence as humans by 2029. Kurzweil stated to Futurism, "2029 is the consistent date I have predicted for when an AI will pass a valid Turing test and therefore achieve human levels of intelligence. I have set the date 2045 for the 'Singularity' which is when we will multiply our effective intelligence a billionfold by merging with the intelligence we have created."
When a self-driving car passes by, you tend to notice. The towering sensors whirling around on the top of the car more than stand out. But Chinese autonomous vehicle company Pony.ai is reimagining the roofline for its next generation of autonomous taxicabs, as part of a partnership with autonomous vehicle sensor maker Luminar announced Monday. Typical LiDAR sensors like those from Velodyne, Intel's Mobileye, and Waymo's own Laser Bear Honeycomb are mostly cone-shaped to help pull in a full 360-degree view from the top of and around the car.
Plus plans to merge with Hennessy Capital Investment Corp. V in a transaction that would bring the company, which is based in California and China, about $500 million in gross proceeds and a market capitalization of roughly $3.3 billion. The agreement is expected to close in the third quarter, the companies said Monday. The deal would provide "a significant cash infusion for us to expand our commercialization efforts," Plus Chief Executive and co-founder David Liu said, as the company steps up production and aims to fill thousands of contracted orders and vehicle reservations from Chinese and U.S. fleets. The transaction would include a $150 million private placement of shares with BlackRock Inc., D.E.
We already know we can teach machines to see. Sensors enable autonomous cars to take in visual information and make decisions about what to do next when they're on the road. But did you know machines can smell, too? Aryballe, a startup that uses artificial intelligence and digital olfaction technology to mimic the human sense of smell, helps its business customers turn odor data into actionable information.
The history of artificial intelligence has been marked by repeated cycles of extreme optimism and promise followed by disillusionment and disappointment. Today's AI systems can perform complicated tasks in a wide range of areas, such as mathematics, games, and photorealistic image generation. But some of the early goals of AI, like housekeeping robots and self-driving cars, continue to recede as we approach them. Part of this continued cycle of missed goals stems from incorrect assumptions about AI and natural intelligence, according to Melanie Mitchell, Davis Professor of Complexity at the Santa Fe Institute and author of Artificial Intelligence: A Guide for Thinking Humans. In a new paper titled "Why AI is Harder Than We Think," Mitchell lays out four common fallacies about AI that cause misunderstandings not only among the public and the media, but also among experts.
If we are not actively engaged in industries related to technology, we may fail to fully appreciate how much we are already influenced by artificial intelligence in our day-to-day world. Everyone is talking about self-driving cars, seemingly inanimate objects converse with you about your personal preferences, and someone somewhere already seems to compile your shopping list armed with the knowledge of what you like or dislike. From the viewpoint of the business world, all companies today are looking to adopt AI in some form or another to improve business processes, achieve efficiency, and so on. I recently read an article about SoftBank's Masayoshi Son and his vision "for an AI-powered utopia where machines control how we live". While this may sound like an unreal possibility, one could relate to this thought better by pondering the words of David Fano (Chief Growth Officer, WeWork): "Basically, every object will have the potential to be a computer".