

MONA: Myopic Optimization with Non-myopic Approval Can Mitigate Multi-step Reward Hacking

Farquhar, Sebastian, Varma, Vikrant, Lindner, David, Elson, David, Biddulph, Caleb, Goodfellow, Ian, Shah, Rohin

arXiv.org Artificial Intelligence

Future advanced AI systems may learn sophisticated strategies through reinforcement learning (RL) that humans cannot understand well enough to safely evaluate. We propose a training method that prevents agents from learning undesired multi-step plans that receive high reward (multi-step "reward hacks"), even if humans are not able to detect that the behaviour is undesired. The method, Myopic Optimization with Non-myopic Approval (MONA), works by combining short-sighted optimization with far-sighted reward. We demonstrate that MONA can prevent multi-step reward hacking that ordinary RL causes, even without being able to detect the reward hacking and without any extra information that ordinary RL does not have access to. We study MONA empirically in three settings that model different misalignment failure modes: 2-step environments with LLMs representing delegated oversight and encoded reasoning, and longer-horizon gridworld environments representing sensor tampering.
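The core contrast the abstract draws, between myopic optimization of a far-sighted approval signal and ordinary multi-step RL, can be sketched in a toy value-update form. This is a minimal illustration of the idea as described above, not the authors' implementation; all function and variable names are hypothetical.

```python
# Sketch of the MONA idea from the abstract: the agent is trained on an
# immediate, far-sighted approval reward with no bootstrapping from future
# states, so multi-step plans cannot be reinforced through credit assignment.

def myopic_update(q, state, action, approval_reward, lr=0.1):
    """Myopic Q-update: gamma is effectively 0, so the target is only the
    overseer's approval of this single step."""
    old = q.get((state, action), 0.0)
    q[(state, action)] = old + lr * (approval_reward - old)
    return q

def ordinary_update(q, state, action, reward, next_state, actions,
                    gamma=0.99, lr=0.1):
    """Ordinary RL update for contrast: it bootstraps future value, which is
    the mechanism that lets multi-step reward hacks get reinforced."""
    best_next = max((q.get((next_state, a), 0.0) for a in actions), default=0.0)
    old = q.get((state, action), 0.0)
    q[(state, action)] = old + lr * (reward + gamma * best_next - old)
    return q
```

The only difference is the target: approval of the current step versus reward plus discounted future value.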


JESTR: Joint Embedding Space Technique for Ranking Candidate Molecules for the Annotation of Untargeted Metabolomics Data

Kalia, Apurva, Krishnan, Dilip, Hassoun, Soha

arXiv.org Artificial Intelligence

Motivation: A major challenge in metabolomics is annotation: assigning molecular structures to mass spectral fragmentation patterns. Despite recent advances in molecule-to-spectra and in spectra-to-molecular-fingerprint (FP) prediction, annotation rates remain low. Results: In this paper, we introduce JESTR, a novel paradigm for annotation. Unlike prior approaches that explicitly construct molecular fingerprints or spectra, JESTR leverages the insight that molecules and their corresponding spectra are views of the same data and effectively embeds their representations in a joint space. Candidate structures are ranked by cosine similarity between the embeddings of the query spectrum and each candidate. We evaluate JESTR against mol-to-spec and spec-to-FP annotation tools on three datasets. On average, for rank@[1-5], JESTR outperforms other tools by 23.6%-71.6%. We further demonstrate the strong value of regularization with candidate molecules during training, which boosts rank@1 performance by 11.4% and enhances the model's ability to discern between target and candidate molecules. Through JESTR, we offer a promising avenue towards accurate annotation, thereby unlocking valuable insights into the metabolome.
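The ranking step the abstract describes, cosine similarity between a query-spectrum embedding and candidate-molecule embeddings in the joint space, is straightforward to sketch. This assumes the embeddings have already been produced by trained encoders; the function name is illustrative, not from the paper.

```python
import numpy as np

def rank_candidates(spectrum_emb, candidate_embs):
    """Rank candidate molecules by cosine similarity to the query spectrum
    in a shared embedding space. Returns indices (best first) and scores."""
    s = spectrum_emb / np.linalg.norm(spectrum_emb)
    c = candidate_embs / np.linalg.norm(candidate_embs, axis=1, keepdims=True)
    sims = c @ s                      # cosine similarity per candidate
    order = np.argsort(-sims)         # descending similarity
    return order, sims[order]
```

rank@k then measures how often the true structure appears among the top k indices returned.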


MONAS: Efficient Zero-Shot Neural Architecture Search for MCUs

Qiao, Ye, Xu, Haocheng, Zhang, Yifan, Huang, Sitao

arXiv.org Artificial Intelligence

Neural Architecture Search (NAS) has proven effective in discovering new Convolutional Neural Network (CNN) architectures, particularly for scenarios with well-defined accuracy optimization goals. However, previous approaches often involve time-consuming training on super networks or intensive architecture sampling and evaluations. Although various zero-cost proxies correlated with CNN model accuracy have been proposed for efficient architecture search without training, their lack of hardware consideration makes it challenging to target highly resource-constrained edge devices such as microcontroller units (MCUs). To address these challenges, we introduce MONAS, a novel hardware-aware zero-shot NAS framework specifically designed for MCUs in edge computing. MONAS incorporates hardware optimality considerations into the search process through our proposed MCU hardware latency estimation model. By combining this with specialized performance indicators (proxies), MONAS identifies optimal neural architectures without incurring heavy training and evaluation costs, optimizing for both hardware latency and accuracy under resource constraints. MONAS achieves up to a 1104x improvement in search efficiency over previous work targeting MCUs and can discover CNN models with over 3.23x faster inference on MCUs while maintaining similar accuracy compared to more general NAS approaches.
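The search procedure described above, combining a latency estimation model with training-free performance proxies, can be outlined as a filter-then-rank loop. This is a hedged sketch of the general hardware-aware zero-shot pattern, not MONAS's actual scoring; `proxy_fn` and `latency_fn` stand in for the paper's proxies and MCU latency model.

```python
def hardware_aware_search(architectures, proxy_fn, latency_fn, latency_budget_ms):
    """Zero-shot NAS sketch: discard candidates that exceed the MCU latency
    budget, then rank the rest by a training-free accuracy proxy.
    No candidate is ever trained, which is the source of the efficiency gain."""
    feasible = [a for a in architectures if latency_fn(a) <= latency_budget_ms]
    return sorted(feasible, key=proxy_fn, reverse=True)
```

Because both the proxy and the latency estimate are cheap to evaluate, the whole search reduces to scoring candidates rather than training them.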


The Terrifying A.I. Scam That Uses Your Loved One's Voice

The New Yorker

On a recent night, a woman named Robin was asleep next to her husband, Steve, in their Brooklyn home, when her phone buzzed on the bedside table. Robin is in her mid-thirties with long, dirty-blond hair. She works as an interior designer, specializing in luxury homes. The couple had gone out to a natural-wine bar in Cobble Hill that evening, and had come home a few hours earlier and gone to bed. Their two young children were asleep in bedrooms down the hall.


A Survey on Multi-Objective Neural Architecture Search

Shariatzadeh, Seyed Mahdi, Fathy, Mahmood, Berangi, Reza, Shahverdy, Mohammad

arXiv.org Artificial Intelligence

Recently, expert-crafted neural architectures have been increasingly overtaken by neural architecture search (NAS), the automatic generation and tuning of network structures, which is closely related to hyperparameter optimization and automated machine learning (AutoML). After earlier NAS attempts that optimized only for prediction accuracy, Multi-Objective Neural Architecture Search (MONAS) has been attracting attention; it considers additional goals such as computational complexity, power consumption, and network size, reaching a trade-off between accuracy and other properties such as computational cost. In this paper, we present an overview of the principal and state-of-the-art works in the field of MONAS. Starting from a well-categorized taxonomy and formulation of NAS, we address and correct some miscategorizations in previous surveys of the field. We also provide a list of all known objectives in use, add a number of new ones, and elaborate their specifications. We provide analyses of the most important objectives and show that the stochastic properties of some of them should be treated differently from deterministic ones in the multi-objective optimization procedure of NAS. We conclude with a number of future directions and open topics in the field of MONAS.
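The multi-objective optimization the survey covers typically rests on Pareto dominance: an architecture is kept only if no other candidate is at least as good on every objective and strictly better on one. A minimal sketch (assuming all objectives are to be minimized, e.g. error, latency, energy):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized):
    a is no worse everywhere and strictly better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of candidate objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]
```

MONAS methods differ mainly in how they search for this front: scalarizing the objectives into one reward, evolving populations toward non-dominated sets, or constraining some objectives while optimizing others.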


Beau Is Afraid Is Already the Year's Most Infamous Movie. Here's What It's Really All About.

Slate

In this article, Beau is a-spoiled. In an Ari Aster movie, the best thing that can happen is losing your head. Not literally, of course, although the Midsommar auteur is notoriously fond of literally cutting his characters off at the neck. In 2019, he said that "head trauma will always have a place in my movies," and his latest, Beau Is Afraid, holds true to that promise. Early on, just after Beau Wasserman (Joaquin Phoenix) cancels a planned visit to his mother, she is decapitated by a falling chandelier. But alongside the characters who get their skulls crushed and faces smashed are ones who desperately need a respite from the buzzing of their brains--who would give anything if they could, even for a minute, just stop thinking. Toni Collette's character in Hereditary comes from a family with a long history of mental illness--a mother with dissociative identity disorder, a father with psychotic depression, a brother with schizophrenia--and is plagued by the feeling that she and her family are the object of a sinister conspiracy.


OpenXLA, Mona's free Data Analysis Tool, 🤖 OpenAI GPT-4

#artificialintelligence

Try Mona's Free Automated Exploratory Data Analysis Tool Say goodbye to endless manual multivariate data exploration! Mona's new automated exploratory data analysis tool eliminates the need for manual data cleaning, transformation and visualization. Simply upload a CSV and follow a simple wizard. Mona will automatically surface granular insights on patterns and anomalies in your dataset, alongside possible explanations. Join a global community of analysts using this one-of-its-kind free tool to streamline exploratory analysis and make better data-driven decisions faster.


Google Cloud Certified Professional Machine Learning Study Guide: Mona, Mona, Ramamurthy, Pratap: 9781119944461: Amazon.com: Books

#artificialintelligence

The book walks readers through the machine learning process from start to finish, covering data, feature engineering, model training, and deployment on Google Cloud. It also discusses best practices on when to pick a custom model versus AutoML or pretrained models with the Vertex AI platform. All technologies, such as TensorFlow, Kubeflow, and Vertex AI, are presented by way of real-world scenarios to help you apply the theory to practical examples and show you how IT professionals design, build, and operate secure ML cloud environments.


MONAS: Multi-Objective Neural Architecture Search using Reinforcement Learning

Hsu, Chi-Hung, Chang, Shu-Huan, Juan, Da-Cheng, Pan, Jia-Yu, Chen, Yu-Ting, Wei, Wei, Chang, Shih-Chieh

arXiv.org Artificial Intelligence

Recent studies on neural architecture search have shown that automatically designed neural networks perform as well as human-designed architectures. However, most existing works on neural architecture search aim at finding architectures that optimize only for prediction accuracy; these methods may generate complex architectures with excessively high energy consumption, which is not suitable for computing environments with limited power budgets. We propose MONAS, a Multi-Objective Neural Architecture Search with novel reward functions that consider both prediction accuracy and power consumption when exploring neural architectures. MONAS effectively explores the design space and searches for architectures satisfying the given requirements. The experimental results demonstrate that the architectures found by MONAS achieve accuracy comparable to or better than state-of-the-art models, while having better energy efficiency.
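A reward that "considers both prediction accuracy and power consumption" is commonly built as a weighted trade-off of the two signals. The sketch below illustrates that general shape; the weighting scheme and the name `mixed_reward` are assumptions for illustration, not the paper's exact reward functions.

```python
def mixed_reward(accuracy, energy, alpha=0.5):
    """Illustrative multi-objective RL reward for NAS: trade validation
    accuracy against normalized energy consumption (both in [0, 1]).
    alpha near 1 favors accuracy; alpha near 0 favors energy efficiency."""
    return alpha * accuracy - (1.0 - alpha) * energy
```

The search controller then receives this scalar as its reward, so architectures that buy small accuracy gains with large energy costs are penalized rather than preferred.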


Artificial intelligence beats a path to eCommerce - THINK Marketing

#artificialintelligence

Artificial intelligence (AI) has made its way into many aspects of our lives, even into toys for kids like Anki's Cozmo, which resembles a roboticized Ewok. But as things go, AI isn't just for devices; it has made, and continues to make, its way into eCommerce, where it is out there working to determine what to sell to you and how you shop, and to ensure you have a good shopping experience. According to Gartner, by 2020, 85% of customer interactions will be managed without a human, and by the close of 2018, customer digital assistants will recognize customers by face and voice across channels. Investment-wise, in 2014 more than $300 million in venture capital was invested in AI startups, according to Bloomberg. Brands are on board and are using AI to build smarter platforms they hope will create a better online shopping experience for the consumer.