

When do random forests fail?

Neural Information Processing Systems

Random forests are learning algorithms that build large collections of random trees and make predictions by averaging the individual tree predictions. In this paper, we consider various tree constructions and examine how the choice of parameters affects the generalization error of the resulting random forests as the sample size goes to infinity. We show that subsampling of data points during the tree construction phase is important: forests can become inconsistent with either no subsampling or too severe subsampling. As a first consequence, even highly randomized trees can lead to inconsistent forests if no subsampling is used, which implies that some of the commonly used setups for random forests can be inconsistent. As a second consequence, we show that trees that perform well in nearest-neighbor search can be a poor choice for random forests.
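
To make the subsampling effect concrete, here is a minimal sketch using scikit-learn, where the max_samples fraction stands in for the paper's subsampling rate; the dataset, the parameter grid, and the correspondence between max_samples and the paper's tree constructions are illustrative assumptions, not the authors' setup.

```python
# Vary how many data points each tree sees and watch the test error.
# scikit-learn's `max_samples` (bootstrap subsample size) is used as a
# stand-in for the paper's subsampling parameter -- an assumption.
from sklearn.datasets import make_friedman1
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

X, y = make_friedman1(n_samples=2000, noise=1.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# None = no subsampling at all (each tree sees every point);
# small fractions = increasingly severe subsampling.
for frac in [None, 0.5, 0.1, 0.01]:
    forest = RandomForestRegressor(
        n_estimators=200,
        bootstrap=frac is not None,
        max_samples=frac,  # fraction of training points per tree
        random_state=0,
    )
    forest.fit(X_tr, y_tr)
    mse = mean_squared_error(y_te, forest.predict(X_te))
    print(f"subsample fraction {frac}: test MSE = {mse:.3f}")
```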


Diff-eRank: A Novel Rank-Based Metric for Evaluating Large Language Models

Neural Information Processing Systems

Large Language Models (LLMs) have transformed natural language processing and extended their powerful capabilities to multi-modal domains. As LLMs continue to advance, it is crucial to develop diverse and appropriate metrics for their evaluation. In this paper, we introduce a novel rank-based metric, Diff-eRank, grounded in principles from information theory and geometry. Diff-eRank assesses LLMs by analyzing their hidden representations, providing a quantitative measure of how efficiently they eliminate redundant information during training. We demonstrate the applicability of Diff-eRank in both single-modal (e.g., language) and multi-modal settings. For language models, our results show that Diff-eRank increases with model size and correlates well with conventional metrics such as loss and accuracy. In the multi-modal context, we propose an alignment evaluation method based on the eRank and verify that contemporary multi-modal LLMs exhibit strong alignment performance under this method.
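
A rough sketch of the quantity in play, assuming that eRank denotes the entropy-based effective rank of the (centered) hidden-representation spectrum and that Diff-eRank is the drop in eRank from an untrained to a trained model; both readings are inferred from the abstract, and the paper's exact normalization may differ.

```python
# Sketch of an entropy-based effective rank ("eRank") of hidden states,
# and a Diff-eRank as the change between two checkpoints. The exact
# definitions are assumptions inferred from the abstract.
import numpy as np

def erank(H: np.ndarray) -> float:
    """Effective rank of representations H (n_tokens x d_model):
    exp(Shannon entropy of the normalized singular-value spectrum
    of the mean-centered representation matrix)."""
    H = H - H.mean(axis=0, keepdims=True)   # center the representations
    s = np.linalg.svd(H, compute_uv=False)  # singular values
    p = s / s.sum()                         # normalized spectrum
    p = p[p > 0]
    return float(np.exp(-(p * np.log(p)).sum()))

def diff_erank(H_untrained: np.ndarray, H_trained: np.ndarray) -> float:
    # How much redundant dimensionality training removed.
    return erank(H_untrained) - erank(H_trained)

# Toy usage with random "hidden states" in place of real activations.
rng = np.random.default_rng(0)
H0 = rng.normal(size=(512, 64))                # untrained: near-isotropic
H1 = H0 @ np.diag(np.linspace(1.0, 0.01, 64))  # trained: compressed spectrum
print(f"Diff-eRank = {diff_erank(H0, H1):.2f}")
```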


Dendritic cortical microcircuits approximate the backpropagation algorithm

Neural Information Processing Systems

Deep learning has seen remarkable developments over recent years, many of them inspired by neuroscience. However, the main learning mechanism behind these advances, error backpropagation, appears to be at odds with neurobiology. Here, we introduce a multilayer neuronal network model with simplified dendritic compartments in which error-driven synaptic plasticity adapts the network towards a global desired output. In contrast to previous work, our model does not require separate phases, and synaptic learning is driven by local dendritic prediction errors continuously in time. Such errors originate at apical dendrites and arise from a mismatch between predictive input from lateral interneurons and actual top-down feedback. Through the use of simple dendritic compartments and different cell types, our model can represent both error and normal activity within a pyramidal neuron. We demonstrate the learning capabilities of the model in regression and classification tasks, and show analytically that it approximates the error backpropagation algorithm. Moreover, our framework is consistent with recent observations of learning between brain areas and with the architecture of cortical microcircuits. Overall, we introduce a novel view of learning in dendritic cortical circuits and of how the brain may solve the long-standing synaptic credit assignment problem.
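
A toy caricature of the local apical error at the heart of this idea, with scalar rates and a single pyramidal-interneuron pair standing in for the paper's multi-compartment microcircuit; the names, dynamics, and learning rules below are illustrative simplifications, not the authors' model.

```python
# An interneuron learns to predict (and so cancel) the pyramidal neuron's
# own output at the apical dendrite; any remaining apical mismatch then
# reflects top-down feedback deviating from what is already predicted.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=5)        # bottom-up input to the basal compartment
W = 0.1 * rng.normal(size=5)  # basal (feedforward) weights
V = 0.1 * rng.normal(size=5)  # lateral interneuron weights
eta = 0.05                    # learning rate
target = 1.0                  # top-down teaching signal at the apical tuft

for step in range(500):
    out = W @ x                          # somatic rate from basal drive
    prediction = V @ x                   # interneuron's feedback estimate
    apical_error = target - prediction   # local mismatch at the apical dendrite
    # Plasticity runs continuously; no separate forward/backward phases.
    W += eta * apical_error * x          # forward weights follow the local error
    V += eta * (out - prediction) * x    # interneuron tracks pyramidal activity

print(f"output after learning: {W @ x:.3f} (target {target})")
```

Once the interneuron cancels the pyramidal neuron's expected feedback, the apical error reduces to the classic delta-rule error (target minus output), which is how this kind of local circuit can end up approximating backpropagation.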



Trump signs executive orders to spur US 'nuclear energy renaissance'

The Guardian > Energy

Donald Trump signed a series of executive orders on Friday intended to spur a "nuclear energy renaissance" through the construction of new reactors he said would satisfy the electricity demands of data centers for artificial intelligence and other emerging industries. The orders represented the president's latest foray into the policy underlying America's electricity supply. Trump declared a national energy emergency on his first day in office, moved to undo a ban implemented by Joe Biden on new natural gas export terminals, and moved to expand oil and gas drilling in Alaska. Nuclear power does not carry oil and gas's carbon emissions, but it produces radioactive waste that the United States lacks a facility to permanently store. Some environmental groups have safety concerns over the reactors and their supply chain. Trump signed four orders intended to speed up the approval of nuclear reactors for defense and AI purposes, reform the Nuclear Regulatory Commission with the goal of quadrupling nuclear electricity production over the next 25 years, revamp the regulatory process to have three experimental reactors operating by 4 July 2026, and boost investment in the technology's industrial base.


D-Wave revives 'quantum supremacy' claims for new Advantage2 computer

ZDNet

Quantum computing pioneer D-Wave Quantum on Tuesday announced the general availability of its sixth-generation quantum computer, the Advantage2. The company said the Advantage2 offers orders-of-magnitude greater performance than its prior system, expanding the range of optimization problems the machine can tackle. The company says the machine even achieves the long-sought goal of quantum "supremacy," despite that term's highly controversial past. "This is a really historic moment for both D-Wave and the quantum computing industry," said D-Wave CEO Alan Baratz in an interview via Zoom. "Fundamentally, our technology is doing something that can't be touched classically."


Globally Convergent Newton Methods for Ill-conditioned Generalized Self-concordant Losses

Neural Information Processing Systems

In this paper, we study large-scale convex optimization algorithms based on the Newton method applied to regularized generalized self-concordant losses, which include logistic regression and softmax regression. We first prove that our new simple scheme, based on a sequence of problems with decreasing regularization parameters, is globally convergent, and that this convergence is linear with a constant factor that scales only logarithmically with the condition number. In the parametric setting, we obtain an algorithm with the same scaling as regular first-order methods but with improved behavior, in particular on ill-conditioned problems.
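
A minimal sketch of such a decreasing-regularization (continuation) scheme for ℓ2-regularized logistic regression, with a geometric schedule and warm starts; the schedule, step counts, and data below are illustrative assumptions, not the paper's constants or algorithm details.

```python
# Solve a sequence of l2-regularized logistic regressions whose
# regularization parameter shrinks geometrically, warm-starting each
# Newton solve at the previous stage's solution.
import numpy as np

def newton_logreg(X, y, lam, w, steps=5):
    """A few plain Newton steps on the l2-regularized logistic loss."""
    n, d = X.shape
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))         # sigmoid predictions
        grad = X.T @ (p - y) / n + lam * w
        S = p * (1.0 - p)                        # per-sample curvature
        H = (X.T * S) @ X / n + lam * np.eye(d)  # regularized Hessian
        w = w - np.linalg.solve(H, grad)         # exact Newton step
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))
y = (X @ rng.normal(size=20) + 0.5 * rng.normal(size=500) > 0).astype(float)

w = np.zeros(20)
for lam in [10.0 ** (-k) for k in range(7)]:  # 1, 0.1, ..., 1e-6
    w = newton_logreg(X, y, lam, w)           # warm start from last stage

p = 1.0 / (1.0 + np.exp(-X @ w))
grad = X.T @ (p - y) / len(y) + 1e-6 * w
print(f"gradient norm at final lambda: {np.linalg.norm(grad):.2e}")
```

The heavily regularized early stages are well-conditioned and cheap for Newton's method, and each solution is a good warm start for the next, which is the intuition behind running the sequence instead of attacking the ill-conditioned target problem directly.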