If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
The Electronic Entertainment Expo, better known as E3, finished its last day of presentations yesterday. For the first time in its 26-year history, E3 was an all-virtual event due to the COVID-19 pandemic, but that didn't stop the major game companies from delivering some (mostly) electrifying news. Tiny Tina's Wonderlands was one of the games players caught a first glimpse of at this year's E3. Starting with the Summer Games Fest's Kickoff Live event, audiences saw developer Gearbox Software's Tiny Tina's Wonderlands, a genial-sounding game that appears to be anything but; the trailer opens with a dreadlocked warrior blasting a machine gun at a dragon shooting electricity.
Goodyear is among several tire makers to bet on tech-embedded tires. German automaker Continental has sold digital tire monitoring systems for medium-duty trucks for years. The sensors mounted inside the tire can also be purchased separately and retrofitted. French manufacturer Michelin and Japan-based Bridgestone have similar tech to measure strain on tires. Pirelli is responsible for the sensors inside wheels on the latest McLaren Artura sports car.
The idea of using Artificial Intelligence to understand consumer behavior has been around for a while now. From speculating about its accuracy to debating its methods, researchers have always had a keen interest in discussing AI's future in consumer research. Emotion AI is the most noticeable development in this regard. Emotion AI is the subset of Artificial Intelligence that tries to understand human expressions, both verbal and non-verbal. Also known as Affective Computing, Emotion AI is the science of recognizing, interpreting, processing, and simulating human expressions. The term "Affective Computing" was coined in Rosalind Picard's 1995 paper of the same name, published by the MIT Press.
Jack Morrison and Isaac Roberts (far left and right) previously cofounded and sold 3D scanning company Replica Labs to Occipital. There they met electrical engineer Davis Foster (center), with whom they went on to cofound Scythe Robotics. Self-driving cars get all the hype. But while the category continues to face a long and uncertain path to commercialization, a burgeoning crop of autonomous vehicles is already hitting the market. The latest is Scythe Robotics, a Boulder, Colorado-based company that announced today it is launching a zero-emission, autonomous lawn mower backed by $18.6 million from Inspired Capital, True Ventures and more.
AI models are only as good as the algorithms and data they are trained on. When an AI system fails, it is usually due to one of three factors: 1) the algorithm has been trained incorrectly, 2) there is bias in the system's training data, or 3) there is developer bias in the model-building process. The focus of this article is on the bias in training data and the bias that is coded directly into AI systems by model developers. "I think today, the AI community at large has a self-selecting bias simply because the people who are building such systems are still largely white, young and male. I think there is a recognition that we need to get beyond it, but the reality is that we haven't necessarily done so yet."
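The training-data bias described above can often be surfaced before any model is trained by auditing how groups are represented in the dataset. The sketch below is illustrative only and not a method from the article: the `audit_distribution` function, the `group` attribute, and the 5% threshold are all assumptions.

```python
from collections import Counter

def audit_distribution(records, attribute, threshold=0.05):
    """Flag values of `attribute` whose share of the training set
    falls below `threshold` -- a crude proxy for under-representation
    (one common source of training-data bias)."""
    counts = Counter(r[attribute] for r in records)
    total = sum(counts.values())
    return {v: c / total for v, c in counts.items() if c / total < threshold}

# Hypothetical training set: 90% group "a", 8% group "b", 2% group "c"
data = [{"group": "a"}] * 90 + [{"group": "b"}] * 8 + [{"group": "c"}] * 2
print(audit_distribution(data, "group"))  # {'c': 0.02}
```

A check like this only catches sampling imbalance; it says nothing about label quality or developer bias, which need separate review.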
Today, Stripe is stepping in to fill that need itself: The company is launching a new product called Stripe Identity -- a self-serve tool that companies can use to verify user identities, with Stripe managing the customer data in an encrypted format, using computer vision and machine learning to "read" and match up government IDs with live selfies. Stripe says the service works in as little as 15 seconds. The service is launching in beta starting today in 30 countries, the company said, but in the meantime, it's already quietly been in use by select partners. They include Discord (as part of its ID verification feature); Peerspace (which runs ID verification when onboarding users and merchants); Shippo (when it identifies high-risk users and asks them to verify themselves); and other unnamed customers using it to prevent account takeovers. Developers can request access, and it sounds like Stripe will be talking more about the service in general during Stripe Sessions, its developer conference, later this week.
Before I go any further, it's probably worth establishing what a Deepfake is and isn't: a technique by which a digital image or video can be superimposed onto another while maintaining the appearance of an unedited image or video. The term is often misinterpreted, potentially as a result of definitions like this. Manipulating images and video in this way is certainly not a new concept. Visual effects artists working on Hollywood films back in the '90s would probably describe parts of their job as something very similar.
The MiR250 Hook can autonomously tow carts that weigh up to 1,100 lb. Mobile Industrial Robots (MiR) today launched the improved MiR Hook for automatically collecting and towing carts through dynamic and constricted industrial spaces. The new MiR250 Hook is built around the MiR250, MiR's fastest and most compact AMR. The solution can transport loaded carts of various sizes weighing up to 500 kg (1,100 lb). Compared with the earlier MiR Hook, it increases the payload it can tow and features an improved cart gripper that can interface with almost any existing cart on your floor.
UK researchers have secured government funding to study the use of artificial intelligence for breast cancer screening in NHS hospitals. The work builds on previous research which showed that artificial intelligence could be as effective as human radiologists in spotting breast cancer from X-ray images. Backed by funding through the Artificial Intelligence in Health and Care Award, the next stages of the project aim to further assess the feasibility of the AI system and how the technology could be integrated into the national screening programme in the future to support clinicians. The partnership, which includes Imperial College London, Google Health, Imperial College Healthcare NHS Trust, St George's Hospitals NHS Foundation Trust, and the Royal Surrey NHS Foundation Trust, builds on previous work in which the researchers trained the algorithm on depersonalised patient data and mammograms from patients in the UK and US. The findings, published in Nature in January 2020, showed the AI system was able to correctly identify cancers from the images with a similar degree of accuracy to expert radiologists, and demonstrated potential to assist clinical staff in practice.