Leak reveals what Sam Altman and Jony Ive are cooking up: 100 million AI companion devices

Mashable

OpenAI and Jony Ive's vision for their AI device is a screenless companion that knows everything about you. Details leaked to the Wall Street Journal give us a clearer picture of OpenAI's acquisition of io, co-founded by Ive, the iconic iPhone designer. The ChatGPT maker reportedly plans to ship 100 million AI devices designed to fit into users' everyday lives. "The product will be capable of being fully aware of a user's surroundings and life, will be unobtrusive, able to rest in one's pocket or on one's desk," according to a recording of an OpenAI staff meeting reviewed by the Journal. The device "will be a third core device a person would put on a desk after a MacBook Pro and an iPhone," per the meeting, which took place the same day (Wednesday) that OpenAI announced its acquisition of Ive's company.


News/Media Alliance says Google's AI takes content by force

Mashable

Is Google's new AI Mode feature theft? The News/Media Alliance, a trade association representing news media organizations in the U.S. and Canada, certainly thinks so. At Google's I/O showcase earlier this week, the tech company announced the public release of AI Mode in Google Search. AI Mode expands AI Overviews in search and signifies a pivot away from Google's traditional search. Users will see a tab at the top of their Google Search page that takes them to a chatbot interface much like, say, ChatGPT, instead of the typical list of Google Search results.


AI could account for nearly half of datacentre power usage 'by end of year'

The Guardian

Artificial intelligence systems could account for nearly half of datacentre power consumption by the end of this year, analysis has revealed. The estimates by Alex de Vries-Gao, the founder of the Digiconomist tech sustainability website, came as the International Energy Agency forecast that AI would require almost as much energy by the end of this decade as Japan uses today. De Vries-Gao's calculations, to be published in the sustainable energy journal Joule, are based on the power consumed by chips made by Nvidia and Advanced Micro Devices that are used to train and operate AI models. The paper also takes into account the energy consumption of chips used by other companies, such as Broadcom. The IEA estimates that all data centres – excluding mining for cryptocurrencies – consumed 415 terawatt hours (TWh) of electricity last year.


Exclusive: New Claude Model Triggers Stricter Safeguards at Anthropic

TIME - Tech

This moment is a crucial test for Anthropic, a company that claims it can mitigate AI's dangers while still competing in the market. Claude is a direct competitor to ChatGPT, and brings in over $2 billion in annualized revenue. Anthropic argues that its RSP thus creates an economic incentive for itself to build safety measures in time, lest it lose customers as a result of being prevented from releasing new models. "We really don't want to impact customers," Kaplan told TIME earlier in May while Anthropic was finalizing its safety measures. "We're trying to be proactively prepared." But Anthropic's RSP--and similar commitments adopted by other AI companies--are all voluntary policies that could be changed or cast aside at will.


I let Google's Jules AI agent into my code repo and it did four hours of work in an instant

ZDNet

I added an entire new feature to my software, including UI and functionality, just by typing four paragraphs of instructions. I have screenshots, and I'll try to make sense of it in this article. I can't tell if we're living in the future or we've just descended to a new plane of hell (or both). Let's take a step back. Google's Jules is the latest in a flood of new coding agents released just this week. I wrote about OpenAI Codex and Microsoft's GitHub Copilot Coding Agent at the beginning of the week, and ZDNET's Webb Wright wrote about Google's Jules. All of these coding agents will perform coding operations on a GitHub repository.


Matter-enabled SwitchBot Hub 3 smart home controller is now available

PCWorld

The SwitchBot Hub 3 smart home controller is now available for purchase. The Matter-capable device is quite different from other smart home hubs we've tested, starting with its rotary knob that can adjust the target temperature on a smart thermostat, the brightness of smart lighting devices, or the volume level of a connected speaker. Another feature that makes the $120 controller so interesting is the USB-C cable that connects it to its power supply: the cable senses the ambient temperature and relative humidity in the room where the Hub 3 is installed. These readings are shown on the hub's display. We have a hands-on review of the all-new SwitchBot Ultra, which is also shipping today.


ColdGANs: Taming Language GANs with Cautious Sampling Strategies Thomas Scialom, Paul-Alexis Dray

Neural Information Processing Systems

Training regimes based on Maximum Likelihood Estimation (MLE) suffer from known limitations, often leading to poorly generated text sequences. At the root of these limitations is the mismatch between training and inference, i.e. the so-called exposure bias, exacerbated by considering only the reference texts as correct, while in practice several alternative formulations could be as good. Generative Adversarial Networks (GANs) can mitigate those limitations but the discrete nature of text has hindered their application to language generation: the approaches proposed so far, based on Reinforcement Learning, have been shown to underperform MLE. Departing from previous works, we analyze the exploration step in GANs applied to text generation, and show how classical sampling results in unstable training. We propose to consider alternative exploration strategies in a GAN framework that we name ColdGANs, where we force the sampling to be close to the distribution modes to get smoother learning dynamics. For the first time, to the best of our knowledge, the proposed language GANs compare favorably to MLE, and obtain improvements over the state-of-the-art on three generative tasks, namely unconditional text generation, question generation, and abstractive summarization.
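The core "cold" idea, sampling close to the distribution modes, amounts to temperature-scaled sampling with a temperature below 1. A minimal NumPy sketch of that mechanism follows; the function name and toy logits are illustrative, not from the paper, which applies this inside a full GAN training loop:

```python
import numpy as np

def cold_sample(logits, temperature=0.5, rng=None):
    """Sample a token id from temperature-scaled logits.

    Temperatures below 1 sharpen the softmax, concentrating samples
    near the distribution modes -- the "cold" exploration strategy
    that ColdGANs uses to stabilize GAN training on text.
    """
    rng = rng or np.random.default_rng()
    scaled = np.asarray(logits, dtype=float) / temperature
    scaled -= scaled.max()          # subtract max for numerical stability
    probs = np.exp(scaled)
    probs /= probs.sum()            # normalize to a valid distribution
    return int(rng.choice(len(probs), p=probs))
```

With `temperature=1.0` this reduces to ordinary multinomial sampling; lowering the temperature trades exploration for smoother, lower-variance learning signals.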


Apple iPhone designer Jony Ive joins OpenAI in $6.5bn deal

BBC News

Sir Jony worked for Apple for 27 years, helping to revive the company with groundbreaking products including the iPhone and iPod. He also designed the iMac in 1998 and the iPad in 2010. When Sir Jony left the company in 2019, Apple's CEO Tim Cook described him as "a singular figure in the design world and his role in Apple's revival cannot be overstated". He went on to found his own company, LoveFrom, which has worked with companies such as Airbnb and Moncler. Shares in Apple fell more than 2% following the news of his partnership with OpenAI.


Who's to Blame When AI Agents Screw Up?

WIRED

Over the past year, veteran software engineer Jay Prakash Thakur has spent his nights and weekends prototyping AI agents that could, in the near future, order meals and engineer mobile apps almost entirely on their own. His agents, while surprisingly capable, have also exposed new legal questions that await companies trying to capitalize on Silicon Valley's hottest new technology. Agents are AI programs that can act mostly independently, allowing companies to automate tasks such as answering customer questions or paying invoices. While ChatGPT and similar chatbots can draft emails or analyze bills upon request, Microsoft and other tech giants expect that agents will tackle more complex functions--and most importantly, do it with little human oversight. The tech industry's most ambitious plans involve multi-agent systems, with dozens of agents someday teaming up to replace entire workforces.


C2FAR: Coarse-to-Fine Autoregressive Networks for Precise Probabilistic Forecasting

Neural Information Processing Systems

C2FAR generates a hierarchical, coarse-to-fine discretization of a variable autoregressively; progressively finer intervals of support are generated from a sequence of binned distributions, where each distribution is conditioned on previously-generated coarser intervals. Unlike prior (flat) binned distributions, C2FAR can represent values with exponentially higher precision, for only a linear increase in complexity. We use C2FAR for probabilistic forecasting via a recurrent neural network, thus modeling time series autoregressively in both space and time. C2FAR is the first method to simultaneously handle discrete and continuous series of arbitrary scale and distribution shape. This flexibility enables a variety of time series use cases, including anomaly detection, interpolation, and compression. C2FAR achieves improvements over the state-of-the-art on several benchmark forecasting datasets.
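The coarse-to-fine discretization can be illustrated with a plain-Python sketch. Names and parameters here are my own, and the sketch uses fixed uniform bins on [0, 1), whereas C2FAR conditions a learned binned distribution on each previously chosen coarser interval; it only shows why precision grows exponentially in depth while representation length grows linearly:

```python
def coarse_to_fine_encode(x, n_bins=10, depth=3):
    """Encode x in [0, 1) as a sequence of bin indices, coarse to fine.

    Each level subdivides the previously chosen interval into n_bins
    equal sub-intervals, so resolution improves by a factor of n_bins
    per level (n_bins**depth total) for only depth indices.
    """
    lo, hi = 0.0, 1.0
    indices = []
    for _ in range(depth):
        width = (hi - lo) / n_bins
        idx = min(int((x - lo) / width), n_bins - 1)  # clamp at the top bin
        indices.append(idx)
        lo = lo + idx * width       # shrink to the chosen sub-interval
        hi = lo + width
    return indices

def coarse_to_fine_decode(indices, n_bins=10):
    """Return the midpoint of the interval named by the bin sequence."""
    lo, hi = 0.0, 1.0
    for idx in indices:
        width = (hi - lo) / n_bins
        lo = lo + idx * width
        hi = lo + width
    return (lo + hi) / 2
```

Decoding to the interval midpoint bounds the reconstruction error by half the final interval width, which is the sense in which a deeper hierarchy buys "exponentially higher precision, for only a linear increase in complexity."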