Are you thinking of learning programming languages like C, Python, or R to work on machine learning projects? AutoML could save you much of that time and effort. Lately, automated machine learning, or AutoML, has become a popular way to build computer vision systems, and tech communities are awash with conversations about how it will change the way machine learning is done by people with limited or no coding knowledge. From autonomous vehicles to handwritten text recognition, face recognition, personalised recommendations, and diagnosis from X-ray images, computer vision is transforming industries globally.
Ahead of the 2021-22 Budget being handed down on Tuesday, the federal government has announced a new digital economy strategy, which it described as an investment into the settings, infrastructure, and incentives to grow Australia's digital economy. The strategy, costing just shy of AU$1 billion, is set to include work on "emerging aviation technologies". The government will be making a two-year, AU$32.6 million investment in an Emerging Aviation Technology Partnerships program to "support the use of emerging aviation technologies to address priority community, mobility, and cargo needs in regional Australia". The program will see the government partner with industry to look into tech such as electric engines, drones, and electric vertical take-off and landing aircraft. "This program will support the digital transformation of Australian businesses, increase business efficiency, and reduce carbon emissions through new technology," the government said.
A Tesla engineer has informed California regulators that the electric vehicle company might not have a fully self-driving vehicle ready this year. The information comes from documents dated May 6 covering exchanges between the California Department of Motor Vehicles and several Tesla employees, including CJ Moore, the company's Autopilot engineer. The documents were released by the legal transparency group PlainSite, which obtained them under the Freedom of Information Act (FOIA). In January, Tesla chief Elon Musk said he was "highly confident the car will be able to drive itself with reliability in excess of human this year." "Tesla is at Level 2 currently. The ratio of driver interaction would need to be in the magnitude of 1 or 2 million miles per driver interaction to move into higher levels of automation," the California DMV noted in the memo.
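The DMV's benchmark is expressed as miles driven per driver interaction (often called a disengagement rate), so higher is better. A minimal sketch of that calculation, with function and variable names of my own invention rather than anything from the memo:

```python
def miles_per_interaction(miles_driven, driver_interactions):
    """Average miles driven between driver interventions (higher is better)."""
    if driver_interactions == 0:
        return float("inf")  # no interventions logged over the test miles
    return miles_driven / driver_interactions

# A hypothetical fleet logging 5,000,000 test miles with 4 interventions
# averages 1.25 million miles per interaction -- within the 1-2 million
# mile range the memo cites as the bar for higher levels of automation.
print(miles_per_interaction(5_000_000, 4))  # 1250000.0
```

The figures above are illustrative only; real disengagement reports filed with the California DMV break interventions down by cause and location.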
Federal investigators said Monday they were able to glean some insights into what might have happened after a fire erupted from a Tesla crash that killed two people in the Houston area in April and destroyed the vehicle's data recorder. The National Transportation Safety Board released preliminary findings from its probe into the crash, which raised speculation about whether the vehicle's partially self-driving system, Autopilot, was to blame. The speculation stemmed from local authorities saying they were nearly positive that no one was behind the wheel when the vehicle crashed. The NTSB, in its preliminary report, said video footage from the vehicle owner's home security system showed him getting behind the wheel of the Tesla Model S and then slowly exiting the driveway. The vehicle traveled about 550 feet "before departing the road on a curve, driving over the curb, and hitting a drainage culvert, a raised manhole and a tree," according to the NTSB.
Last week we discussed Level 1 and 2 autonomy; this week we move on to L3-L5, which is considered "true" autonomous driving. With L3, in certain situations (e.g., highway driving) the car can fully take over all driving tasks, including lane changes, but the driver must keep his or her hands near the steering wheel at all times and must stay attentive, not distracted by other tasks such as watching TV, staring at a phone, or sleeping. The driver must always pay attention because if the autonomous system finds itself in a situation it cannot handle (e.g., an unexpected detour or highway construction), it will issue a warning (e.g., the seat vibrates or an alarm sounds) and then hand control back to the driver. L4 is a fully autonomous car that can perform all driving functions without fail and never requires intervention from the driver, though the driver has the option to take over at any time. The caveat, however, is that the autonomous function can ONLY be used in certain prescribed situations (e.g., suitable weather conditions with adequate visibility) or locations (e.g., a well-mapped city or vicinity).
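The handover rules above can be sketched as a simple decision function. This is my own illustration of the logic described in the paragraph, not an official SAE implementation; the level descriptions and condition flags are assumptions made for the example:

```python
SAE_LEVELS = {
    2: "partial automation: driver supervises at all times",
    3: "conditional automation: car drives, driver must stay ready",
    4: "high automation: no driver needed within a defined domain",
    5: "full automation: no driver needed anywhere",
}

def who_drives(level, inside_operational_domain, system_alarm):
    """Return 'car' or 'driver' under a simplified L2-L5 model."""
    if level <= 2:
        return "driver"  # driver is always responsible
    if level == 3:
        # car drives until it raises a warning, then hands control back
        return "driver" if system_alarm else "car"
    if level == 4:
        # car drives only inside its prescribed domain (weather, geofence)
        return "car" if inside_operational_domain else "driver"
    return "car"  # level 5: car drives everywhere
```

For example, `who_drives(3, True, True)` returns `"driver"`, capturing the L3 case where the system sounds an alarm and hands control back, while `who_drives(4, False, False)` returns `"driver"` because an L4 car cannot operate outside its prescribed conditions.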
Artificial intelligence is already impacting virtually every industry and every human being. This incredible technology has brought many good and questionable things into our lives, and it will create an even bigger impact in the next two decades. According to Ray Kurzweil, one of the best-known futurists, computers will have the same level of intelligence as humans by 2029. Kurzweil stated to Futurism, "2029 is the consistent date I have predicted for when an AI will pass a valid Turing test and therefore achieve human levels of intelligence. I have set the date 2045 for the 'Singularity' which is when we will multiply our effective intelligence a billion fold by merging with the intelligence we have created."
NASA's Perseverance Mars rover has achieved yet another first after capturing the sounds of another spacecraft hovering on the red planet. Using the microphone on its rock-zapping SuperCam instrument, the six-wheeled robot listened to the sounds of the Ingenuity helicopter on April 30 and recorded the whirring of its fast-spinning rotors. This marked the first time a spacecraft has recorded audio of another probe on a world beyond Earth. This was the chopper's fourth flight since Perseverance and Ingenuity landed together on Feb. 18 on the floor of Mars' Jezero Crater, NASA said in a statement. A video recently released by NASA combined the footage from Perseverance's Mastcam-Z imager of the solar-powered helicopter with the recorded audio, allowing scientists to know how the robot is performing just by tuning in to the sound it makes.
When a self-driving car passes by, you tend to notice. The towering sensors whirling around on the top of the car more than stand out. But Chinese autonomous vehicle company Pony.ai is reimagining the roofline for its next generation of autonomous taxicabs, as part of a partnership with autonomous vehicle sensor maker Luminar announced Monday. Typical LiDAR sensors, like those from Velodyne, Intel's Mobileye, and Waymo's own Laser Bear Honeycomb, are mostly cone-shaped to help pull in a full 360-degree view from the top and around the car.
Plus plans to merge with Hennessy Capital Investment Corp. V in a transaction that would bring the company, which is based in California and China, about $500 million in gross proceeds and a market capitalization of roughly $3.3 billion. The agreement is expected to close in the third quarter, the companies said Monday. The deal would provide "a significant cash infusion for us to expand our commercialization efforts," Plus Chief Executive and co-founder David Liu said, as the company steps up production and aims to fill thousands of contracted orders and vehicle reservations from Chinese and U.S. fleets. The transaction would include a $150 million private placement of shares with BlackRock Inc., D.E.