Why 90% of machine learning models never hit the market

#artificialintelligence

Corporations are going through rough times. The times are uncertain, and the pressure to make customer experiences ever more seamless and immersive isn't letting up. In that light, it's understandable that they're pouring billions of dollars into the development of machine learning models to improve their products. But companies can't just throw money at data scientists and machine learning engineers and hope that magic happens. The data speaks for itself.


The Pentagon Is Bolstering Its AI Systems--by Hacking Itself

WIRED

The Pentagon sees artificial intelligence as a way to outfox, outmaneuver, and dominate future adversaries. But the brittle nature of AI means that, without due care, the technology could hand enemies a new way to attack. The Joint Artificial Intelligence Center, created by the Pentagon to help the US military make use of AI, recently formed a unit to collect, vet, and distribute open source and industry machine learning models to groups across the Department of Defense. A machine learning "red team," known as the Test and Evaluation Group, will probe pretrained models for weaknesses. Another cybersecurity team examines AI code and data for hidden vulnerabilities.
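The kind of weakness probing described above can be sketched in miniature. This is a toy illustration, not the Test and Evaluation Group's actual methodology: the "pretrained model" here is an invented linear classifier, and the probe simply measures how often bounded random noise flips its prediction.

```python
import random

# Toy "pretrained model": a fixed linear classifier over 3 features.
# Hypothetical stand-in for the kinds of models a red team would probe.
WEIGHTS = [2.0, -1.0, 0.5]
BIAS = -0.25

def predict(x):
    """Return 1 if the linear score is positive, else 0."""
    score = sum(w * xi for w, xi in zip(WEIGHTS, x)) + BIAS
    return 1 if score > 0 else 0

def probe_robustness(x, noise=0.3, trials=200, seed=0):
    """Red-team-style probe: randomly perturb an input and report
    how often the model's prediction flips under bounded noise."""
    rng = random.Random(seed)
    base = predict(x)
    flips = 0
    for _ in range(trials):
        perturbed = [xi + rng.uniform(-noise, noise) for xi in x]
        if predict(perturbed) != base:
            flips += 1
    return flips / trials

# An input near the decision boundary is fragile; one far from it is not.
fragile = probe_robustness([0.1, 0.0, 0.1])
stable = probe_robustness([3.0, -3.0, 3.0])
```

A high flip rate flags inputs where small, attacker-controlled perturbations could change the model's decision.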


An intelligent future? How AI is improving construction

#artificialintelligence

Big road projects will often uncover historic finds. During the £1.5bn upgrade of the A14 in Cambridgeshire, an archaeologist found what was believed to be the earliest evidence of beer brewing in Britain, dating back around 2,000 years. Generating as much excitement, for different reasons, was the introduction of a very modern concept on the same scheme. The project team pioneered artificial intelligence (AI) and machine-learning technology to successfully predict times when an accident was more likely to happen – and to take action to stop it. By collecting swathes of information and applying AI, data scientists were able to spot problems before they occurred.


Computer vision hasn't passed 'awareness phase,' survey shows

#artificialintelligence

All the sessions from Transform 2021 are available on-demand now. The majority of organizations agree that computer vision has the potential to transform key areas of business, but only 10% are using it today. That's according to an IDG survey commissioned by Insight, a business-to-business technology consultancy, which asked 200 IT leaders about their awareness, adoption, and perceptions of computer vision. Computer vision is a type of AI technology that allows machines to understand, categorize, and differentiate between images. Using photos from cameras and videos as well as deep learning components, computer vision can identify and classify objects and then react to what it "sees."
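The "identify and classify objects" step can be illustrated with a deliberately tiny sketch. Everything here is invented for illustration: the "images" are flattened 2x2 grayscale patches and the classifier is a simple nearest-centroid rule, a far cry from the deep learning components production systems use.

```python
# Minimal sketch of image classification: a nearest-centroid
# classifier over tiny grayscale "images" (flattened pixel lists).

def centroid(images):
    """Mean pixel vector of a list of equally sized images."""
    n = len(images)
    return [sum(img[i] for img in images) / n for i in range(len(images[0]))]

def distance(a, b):
    """Euclidean distance between two pixel vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def classify(image, centroids):
    """Assign the label whose class centroid is closest to the image."""
    return min(centroids, key=lambda label: distance(image, centroids[label]))

# Toy training data: "bright" vs "dark" 2x2 patches.
train = {
    "bright": [[0.9, 0.8, 0.9, 1.0], [0.8, 0.9, 1.0, 0.9]],
    "dark": [[0.1, 0.0, 0.2, 0.1], [0.0, 0.1, 0.1, 0.2]],
}
centroids = {label: centroid(imgs) for label, imgs in train.items()}
label = classify([0.85, 0.9, 0.95, 0.8], centroids)  # → "bright"
```

Real computer vision replaces the hand-built centroids with features learned by deep networks, but the classify-by-closest-representation idea is the same.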


Caltech: New Algorithm Helps Autonomous Vehicles Find Themselves, Summer Or Winter

#artificialintelligence

"The rule of thumb is that both images--the one from the satellite and the one from the autonomous vehicle--have to have identical content for current techniques to work. The differences that they can handle are about what can be accomplished with an Instagram filter that changes an image's hues," says Anthony Fragoso (MS '14, PhD '18), lecturer and staff scientist, and lead author of the Science Robotics paper. "In real systems, however, things change drastically based on season because the images no longer contain the same objects and cannot be directly compared." The process--developed by Chung and Fragoso in collaboration with graduate student Connor Lee (BS '17, MS '19) and undergraduate student Austin McCoy--uses what is known as "self-supervised learning." While most computer-vision strategies rely on human annotators who carefully curate large data sets to teach an algorithm how to recognize what it is seeing, this one instead lets the algorithm teach itself.


Hungryroot delivers AI-powered grocery experience

#artificialintelligence

All the sessions from Transform 2021 are available on-demand now. Hungryroot, an AI-powered delivery service, hopes to occupy a similar niche for online groceries in the United States. The recommender system uses a collaborative filtering, supervised learning model to match consumer preferences to foods. Customers answer questions about their dietary habits, the kinds of foods they (and family members) like, the family size, budget, and more. On a weekly basis, the Hungryroot algorithm predicts the groceries the customer might like.
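The collaborative filtering idea behind such a recommender can be sketched in a few lines. The users, grocery items, and ratings below are invented for illustration, and Hungryroot's actual model is not public; this is a generic user-based variant using cosine similarity.

```python
import math

# Minimal sketch of user-based collaborative filtering.
ratings = {
    "alice": {"kale": 5, "tofu": 4, "salmon": 1},
    "bob":   {"kale": 4, "tofu": 5, "granola": 4},
    "carol": {"salmon": 5, "steak": 4, "kale": 1},
}

def cosine(u, v):
    """Cosine similarity over the items two users have both rated."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[i] * v[i] for i in shared)
    nu = math.sqrt(sum(u[i] ** 2 for i in shared))
    nv = math.sqrt(sum(v[i] ** 2 for i in shared))
    return dot / (nu * nv)

def recommend(user, ratings):
    """Score unseen items by similarity-weighted ratings of other users."""
    scores = {}
    for other, prefs in ratings.items():
        if other == user:
            continue
        sim = cosine(ratings[user], prefs)
        for item, rating in prefs.items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)

picks = recommend("alice", ratings)  # granola ranks above steak
```

Because alice's tastes align with bob's far more than with carol's, bob's granola outranks carol's steak; the questionnaire answers described above would seed exactly this kind of preference profile.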


Seizure detection using wearable sensors and machine learning: Setting a benchmark

#artificialintelligence

Epilepsy is a common cause of morbidity and mortality, especially among children, despite advances in management regimens.1, 2 Accurate monitoring and tracking of seizures are important to evaluate seizure burden, recurrence risk, and response to treatment. Outside the hospital, seizure tracking relies on patients' and families' self-reporting, which is often unreliable due to underreporting, seizures missed by caregivers, and patients' difficulties recalling seizures.3-6 Although long-term video-electroencephalography (EEG) in the epilepsy monitoring unit (EMU) is the gold standard for accurately diagnosing and evaluating epilepsy,7 it is also time-consuming and costly, can be perceived as stigmatizing, and places a greater burden on patients and caregivers than seizure monitoring with wearable devices. Based on prior studies, there exists a large clinical gap and urgent medical need to detect a broad range of seizures, beyond focal to bilateral tonic–clonic seizures (FBTCSs) and generalized tonic–clonic seizures (GTCSs), with wearable devices.3 Recent advances in the use and development of non-EEG-based seizure detection devices utilizing a variety of sensors and modalities have provided innovative opportunities to fill this gap and to monitor patients continuously in the outpatient setting.


Hackers Got Past Windows Hello by Tricking a Webcam

WIRED

Biometric authentication is a key piece of the tech industry's plans to make the world passwordless. But a new method for duping Microsoft's Windows Hello facial recognition system shows that a little hardware fiddling can trick the system into unlocking when it shouldn't. Services like Apple's FaceID have made facial recognition authentication more commonplace in recent years, with Windows Hello driving adoption even further. Apple only lets you use FaceID with the cameras embedded in recent iPhones and iPads, and it's still not supported on Macs at all. But because Windows hardware is so diverse, Hello facial recognition works with an array of third-party webcams.


Hyperparameter Optimization: Foundations, Algorithms, Best Practices and Open Challenges

#artificialintelligence

Most machine learning algorithms are configured by one or several hyperparameters that must be carefully chosen and often considerably impact performance. To avoid a time-consuming and unreproducible manual trial-and-error process for finding well-performing hyperparameter configurations, various automatic hyperparameter optimization (HPO) methods can be employed, e.g., based on resampling error estimation for supervised machine learning. The paper gives practical recommendations regarding important choices to be made when conducting HPO, including the HPO algorithms themselves, performance evaluation, how to combine HPO with ML pipelines, runtime improvements, and parallelization.
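The resampling-based selection the abstract refers to can be sketched with the simplest possible setup. This is a toy illustration under stated assumptions: a hand-rolled 1-D k-nearest-neighbour classifier, an invented two-cluster dataset, and a single holdout split standing in for the cross-validation a real HPO run would use.

```python
import random

# Minimal HPO sketch: choose k for a k-nearest-neighbour classifier
# by holdout validation error (a basic resampling error estimate).

def knn_predict(train, x, k):
    """Majority label among the k training points nearest to x."""
    nearest = sorted(train, key=lambda p: abs(p[0] - x))[:k]
    votes = sum(label for _, label in nearest)
    return 1 if votes * 2 > k else 0

def holdout_error(train, valid, k):
    """Fraction of validation points the k-NN classifier gets wrong."""
    wrong = sum(1 for x, y in valid if knn_predict(train, x, k) != y)
    return wrong / len(valid)

# Toy 1-D dataset: class 0 clusters near 0, class 1 near 10.
rng = random.Random(0)
data = [(rng.gauss(0, 1), 0) for _ in range(30)] + \
       [(rng.gauss(10, 1), 1) for _ in range(30)]
rng.shuffle(data)
train, valid = data[:40], data[40:]

# Evaluate each hyperparameter configuration and keep the best one.
search_space = [1, 3, 5, 7, 9]
best_k = min(search_space, key=lambda k: holdout_error(train, valid, k))
```

Swapping the exhaustive loop for random search or Bayesian optimization, and the holdout split for k-fold cross-validation, yields the HPO variants the paper surveys; the estimate-then-select structure stays the same.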


3D-printed robotic hand powered by water can play Super Mario Bros

New Scientist

A 3D-printed robotic hand controlled by pressurised water can complete the first level of classic computer game Super Mario Bros in less than 90 seconds. Ryan Sochol and his team at the University of Maryland were able to 3D print the hand in a single operation using a machine that can deposit hard plastic, a rubber-like polymer and a water-soluble "sacrificial" material.