Nonlinear dynamics of localization in neural receptive fields
Localized receptive fields--neurons that are selective for certain contiguous spatiotemporal features of their input--populate early sensory regions of the mammalian brain. Unsupervised learning algorithms that optimize explicit sparsity or independence criteria replicate features of these localized receptive fields, but fail to explain directly how localization arises through learning without efficient coding, as occurs in early layers of deep neural networks and might occur in early sensory regions of biological systems. We consider an alternative model in which localized receptive fields emerge without explicit top-down efficiency constraints--a feedforward neural network trained on a data model inspired by the structure of natural images. Previous work identified the importance of non-Gaussian statistics to localization in this setting but left open questions about the mechanisms driving dynamical emergence. We address these questions by deriving the effective learning dynamics for a single nonlinear neuron, making precise how higher-order statistical properties of the input data drive emergent localization, and we demonstrate that the predictions of these effective dynamics extend to the many-neuron setting. Our analysis provides an alternative explanation for the ubiquity of localization as resulting from the nonlinear dynamics of learning in neural circuits.
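As a toy illustration of the single-neuron setting the abstract describes, the sketch below trains one nonlinear unit on synthetic data built from localized bumps, using a projection-pursuit-style update that is sensitive to higher-order (non-Gaussian) input statistics. This is a hedged stand-in, not the paper's data model or effective dynamics: the data generator, the fourth-moment objective, and all parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 32  # input dimension ("pixels"); an arbitrary choice for this sketch

def make_batch(n):
    # Toy non-Gaussian data: each sample is a Gaussian bump at a random
    # position, scaled by a random amplitude -- loosely mimicking localized
    # structure in natural-image patches (not the paper's actual model).
    centers = rng.integers(0, d, size=n)
    idx = np.arange(d)
    X = np.exp(-0.5 * ((idx[None, :] - centers[:, None]) / 2.0) ** 2)
    return X * rng.standard_normal((n, 1))

# Single neuron with weight vector w on the unit sphere; gradient ascent on
# the fourth moment E[(w.x)^4], an objective driven by non-Gaussian
# statistics rather than covariance alone.
w = rng.standard_normal(d)
w /= np.linalg.norm(w)
lr = 0.05
for _ in range(500):
    X = make_batch(64)
    grad = (((X @ w) ** 3)[:, None] * X).mean(axis=0)  # d/dw of E[(w.x)^4]/4
    w += lr * grad
    w /= np.linalg.norm(w)  # project back to the unit sphere
```

Because the objective depends on moments beyond the second, directions of high kurtosis — here, individual bumps — can dominate the learned filter, which is the qualitative flavor of the mechanism the paper analyzes rigorously.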
Semialgebraic Optimization for Lipschitz Constants of ReLU Networks
The Lipschitz constant of a network plays an important role in many applications of deep learning, such as robustness certification and Wasserstein Generative Adversarial Networks. We introduce a semidefinite programming hierarchy to estimate the global and local Lipschitz constants of a multi-layer deep neural network. The novelty is to combine a polynomial lifting for ReLU function derivatives with a weak generalization of Putinar's positivity certificate. This idea could also apply to other nearly sparse polynomial optimization problems in machine learning. We empirically demonstrate that our method provides a trade-off with respect to the state-of-the-art linear programming approach, and in some cases we obtain better bounds in less time.
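To make concrete what such methods improve on, the sketch below brackets the global Lipschitz constant of a small random ReLU network between two cheap baselines: the trivial upper bound (product of layer spectral norms) and an empirical lower bound from sampled Jacobian norms. This is not the paper's SDP hierarchy — the network weights and sample counts are arbitrary assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small 2-layer ReLU network f(x) = W2 @ relu(W1 @ x)
W1 = rng.standard_normal((8, 4))
W2 = rng.standard_normal((1, 8))

# Trivial upper bound: product of the layers' operator (spectral) norms.
upper = np.linalg.norm(W1, 2) * np.linalg.norm(W2, 2)

# Empirical lower bound: largest Jacobian spectral norm over random inputs.
# For a fixed ReLU activation pattern p, the Jacobian is W2 @ diag(p) @ W1.
lower = 0.0
for _ in range(1000):
    x = rng.standard_normal(4)
    pattern = (W1 @ x > 0).astype(float)
    J = W2 @ (W1 * pattern[:, None])
    lower = max(lower, np.linalg.norm(J, 2))

# The true global Lipschitz constant lies in [lower, upper]; SDP/LP
# relaxations aim to tighten the upper end of this interval.
```

The gap between `lower` and `upper` is exactly the slack that semidefinite and linear programming relaxations compete to close.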
The Download: the desert data center boom, and how to measure Earth's elevations
In the high desert east of Reno, Nevada, construction crews are flattening the golden foothills of the Virginia Range, laying the foundations of a data center city. Google, Tract, Switch, EdgeCore, Novva, Vantage, and PowerHouse are all operating, building, or expanding huge facilities nearby. Meanwhile, Microsoft has acquired more than 225 acres of undeveloped property, and Apple is expanding its existing data center just across the Truckee River from the industrial park. The corporate race to amass computing resources to train and run artificial intelligence models and store information in the cloud has sparked a data center boom in the desert--and it's just far enough away from Nevada's communities to elude wide notice and, some fear, adequate scrutiny. This story is part of Power Hungry: AI and our energy future--our new series shining a light on the energy demands and carbon costs of the artificial intelligence revolution.
Improved Coresets and Sublinear Algorithms for Power Means in Euclidean Spaces Vincent Cohen-Addad David Saulpic Chris Schwiegelshohn
Special cases of this problem include the well-known Fermat-Weber problem, or geometric median problem, where z = 1; the mean or centroid, where z = 2; and the Minimum Enclosing Ball problem, where z = ∞. We consider these problems in the big-data regime. Here, we are interested in sampling as few points as possible such that we can accurately estimate m. More specifically, we consider sublinear algorithms as well as coresets for these problems. Sublinear algorithms have random query access to the set A, and the goal is to minimize the number of queries.
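The simplest instance of the sublinear idea is estimating the power-mean cost of a fixed candidate center from a uniform sample of points and rescaling. The sketch below does this for z = 2 on a toy point set; the point set, center, and sample size are illustrative assumptions, and real sublinear algorithms and coresets come with worked-out sample-complexity guarantees rather than this ad hoc choice.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical point set A in the plane and a candidate center c.
A = rng.standard_normal((100_000, 2))
c = np.zeros(2)

def power_mean_cost(points, center, z):
    # cost_z(A, c) = sum over a in A of ||a - c||^z
    return np.sum(np.linalg.norm(points - center, axis=1) ** z)

# Sublinear estimate: query s uniform points and rescale by n/s.
s = 2_000
sample = A[rng.choice(len(A), size=s, replace=False)]
estimate = (len(A) / s) * power_mean_cost(sample, c, z=2)
exact = power_mean_cost(A, c, z=2)

rel_err = abs(estimate - exact) / exact
```

The estimator is unbiased, and its variance — hence the number of queries needed for a given accuracy — is what the paper's improved bounds control.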
Chicago paper publishes AI-generated 'summer reading list' with books that don't exist
Texas high school student Elliston Berry joins 'Fox & Friends' to discuss the House's passage of a new bill that criminalizes the sharing of non-consensual intimate images, including content created with artificial intelligence. The Chicago Sun-Times admitted on Tuesday that it published an AI-generated list of books that don't exist for its summer reading list. On Sunday, the publication released a special 64-page section titled "Heat Index: Your Guide to the Best of Summer," which featured a list of 15 recommended books for summer. However, upon closer inspection, it turned out that 10 of the 15 books on the list were not real. One example was a book called "Nightshade Market" by Min Jin Lee, described as a "riveting tale set in Seoul's underground economy" that follows "three women whose paths intersect in an illegal night market" and explores "class, gender and the shadow economies beneath prosperous societies."
Boost your workflow for life with this $60 AI assistant
How often do you wish you had an assistant at work? Let Swatle be your AI-powered partner, helping you tackle your projects efficiently. And luckily, a premium lifetime subscription can be yours now for just $59.99. Think of Swatle as your right-hand tool, serving as an AI-powered productivity partner ready to help you manage projects, automate repetitive tasks, and even organize your whole team's workflow. Thanks to Swatle's artificial intelligence, it even adapts to your individual needs as you go.
The Dyson Supersonic Nural hair dryer is finally on sale at Amazon -- get it for its lowest-ever price
SAVE OVER $100: As of May 22, the Dyson Supersonic Nural hair dryer is on sale for $399 at Amazon. Dyson has a dedicated bunch of fans out there, so when they release a limited edition jasper plum colorway, it causes a big stir. So what do these fans do when presented with the opportunity to get their hands on this stylish new color? They wait for that first deal to drop.
Panchromatic and Multispectral Image Fusion via Alternating Reverse Filtering Network (Supplementary Materials)
The best results are highlighted in bold. It can be clearly seen that our alternating reverse filtering network performs best compared with other state-of-the-art methods on all indexes, indicating the superiority of our proposed method. Images in the last row are the MSE residues between the fused results and the ground truth. Compared with other competing methods, our model has minor spatial and spectral distortions, as can be easily concluded from the MSE maps.
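The residue maps described above are straightforward to compute: a per-pixel squared error between the fused output and the ground truth, averaged over spectral bands. The sketch below shows the computation on synthetic arrays — the image sizes, band count, and noise level are placeholders, not the supplementary material's actual data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical fused result and ground-truth multispectral image
# (H x W x bands); the real figures use actual fusion outputs.
gt = rng.random((64, 64, 4))
fused = gt + 0.01 * rng.standard_normal(gt.shape)

# Per-pixel MSE residue map, averaged over spectral bands, as visualized
# in the last row of the supplementary figures.
mse_map = ((fused - gt) ** 2).mean(axis=-1)

# A smaller mean residue indicates smaller spatial/spectral distortion.
mean_mse = mse_map.mean()
```

Visualizing `mse_map` as a heatmap is what makes the comparison between competing methods easy to read at a glance.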
ColdGANs: Taming Language GANs with Cautious Sampling Strategies Thomas Scialom, Paul-Alexis Dray
Training regimes based on Maximum Likelihood Estimation (MLE) suffer from known limitations, often leading to poorly generated text sequences. At the root of these limitations is the mismatch between training and inference, i.e. the so-called exposure bias, exacerbated by considering only the reference texts as correct, while in practice several alternative formulations could be as good. Generative Adversarial Networks (GANs) can mitigate those limitations but the discrete nature of text has hindered their application to language generation: the approaches proposed so far, based on Reinforcement Learning, have been shown to underperform MLE. Departing from previous works, we analyze the exploration step in GANs applied to text generation, and show how classical sampling results in unstable training. We propose to consider alternative exploration strategies in a GAN framework that we name ColdGANs, where we force the sampling to be close to the distribution modes to get smoother learning dynamics. For the first time, to the best of our knowledge, the proposed language GANs compare favorably to MLE, and obtain improvements over the state-of-the-art on three generative tasks, namely unconditional text generation, question generation, and abstractive summarization.
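The core "cold" idea — forcing samples toward the distribution modes — can be illustrated with plain temperature scaling of next-token logits, where a temperature below 1 sharpens the distribution. This is a minimal sketch of the sampling principle only, with toy logits as an assumption; it is not the ColdGANs training procedure itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy next-token distribution from hypothetical generator logits.
logits = np.array([3.0, 2.0, 1.0, 0.0, -1.0])

def cold_distribution(logits, temperature):
    # Temperature < 1 concentrates probability mass on the modes
    # ("cold" sampling); temperature = 1 recovers the model's own
    # distribution, and temperature > 1 flattens it.
    scaled = logits / temperature
    p = np.exp(scaled - scaled.max())  # subtract max for stability
    return p / p.sum()

p_standard = cold_distribution(logits, temperature=1.0)
p_cold = cold_distribution(logits, temperature=0.5)

# Sampling a token id from the cold distribution:
token = rng.choice(len(logits), p=p_cold)
```

Sampling from `p_cold` rather than `p_standard` keeps generated sequences closer to high-probability modes, which is the source of the smoother learning dynamics the abstract describes.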
Apple iPhone designer Jony Ive joins OpenAI in $6.5bn deal
Sir Jony worked for Apple for 27 years, helping to revive the company with groundbreaking products including the iPhone and iPod. He also designed the iMac in 1998 and the iPad in 2010. When Sir Jony left the company in 2019, Apple's CEO Tim Cook described him as "a singular figure in the design world and his role in Apple's revival cannot be overstated". He left to found his own company, LoveFrom, which has worked with companies such as Airbnb and Moncler. Shares in Apple fell more than 2% following the news of his partnership with OpenAI.