Optimal Algorithms for Non-Smooth Distributed Optimization in Networks

Neural Information Processing Systems

In this work, we consider the distributed optimization of non-smooth convex functions using a network of computing units. We investigate this problem under two regularity assumptions: (1) the Lipschitz continuity of the global objective function, and (2) the Lipschitz continuity of local individual functions. Under the local regularity assumption, we provide the first optimal first-order decentralized algorithm called multi-step primal-dual (MSPD) and its corresponding optimal convergence rate. A notable aspect of this result is that, for non-smooth functions, while the dominant term of the error is in O(1/√t), the structure of the communication network only impacts a second-order term in O(1/t), where t is time.
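
The rate separation above is the key takeaway: the optimization error of a non-smooth decentralized method decays as O(1/√t), while the network structure only enters at O(1/t). As a hedged illustration (a generic decentralized subgradient method with gossip averaging, not the paper's MSPD algorithm), the sketch below runs such a scheme over a ring network; the gossip matrix W and the toy objective are assumptions made purely for the example.

```python
import numpy as np

# Minimal sketch of a generic decentralized subgradient method with gossip
# averaging (NOT the paper's MSPD algorithm): each node holds a local iterate,
# averages it with its neighbors through a gossip matrix W, then takes a
# subgradient step on its local non-smooth objective.

def decentralized_subgradient(subgrads, W, x0, n_iters=1000):
    """subgrads: list of per-node subgradient oracles g_i(x) -> ndarray.
    W: doubly stochastic gossip matrix matching the network graph.
    x0: (n_nodes, dim) initial iterates."""
    x = x0.copy()
    avg = np.zeros_like(x)
    for t in range(1, n_iters + 1):
        step = 1.0 / np.sqrt(t)           # O(1/sqrt(t)) schedule for non-smooth objectives
        x = W @ x                          # one round of communication (gossip averaging)
        grads = np.stack([g(xi) for g, xi in zip(subgrads, x)])
        x = x - step * grads               # local subgradient step
        avg += x
    return avg / n_iters                   # averaged iterates

# Toy example: n nodes minimizing sum_i |x - b_i| over a ring network.
n, dim = 8, 1
b = np.linspace(-1.0, 1.0, n).reshape(n, 1)
subgrads = [lambda x, bi=bi: np.sign(x - bi) for bi in b]
W = np.eye(n) * 0.5 + (np.roll(np.eye(n), 1, 0) + np.roll(np.eye(n), -1, 0)) * 0.25
x_final = decentralized_subgradient(subgrads, W, np.zeros((n, dim)))
print(x_final.mean())  # nodes approach the median of b, the minimizer
```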


A Benchmark Suite for Evaluating Neural Mutual Information Estimators on Unstructured Datasets

Neural Information Processing Systems

Mutual Information (MI) is a fundamental metric for quantifying the dependency between two random variables. When we can access only samples, but not the underlying distribution functions, we can evaluate MI using sample-based estimators. Assessments of such MI estimators, however, have almost always relied on analytical datasets such as multivariate Gaussians. Such datasets allow analytical calculation of the true MI values, but they do not reflect the complexities of real-world data. This study introduces a comprehensive benchmark suite for evaluating neural MI estimators on unstructured datasets, specifically focusing on images and texts. By leveraging same-class sampling for positive pairing and introducing a binary symmetric channel trick, we show that we can accurately manipulate the true MI values of real-world datasets. Using the benchmark suite, we investigate seven challenging scenarios, shedding light on the reliability of neural MI estimators for unstructured datasets.
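
The binary symmetric channel (BSC) trick mentioned above admits a closed form: if a uniform bit X is flipped with probability p to produce Y, then I(X;Y) = 1 - H_b(p) bits, where H_b is the binary entropy. The sketch below, a minimal illustration rather than the benchmark's actual code, dials in a target MI by inverting this formula and checks it with a plug-in estimate on samples.

```python
import numpy as np

# BSC trick: pass a uniform bit X through a binary symmetric channel with
# flip probability p, so Y = X XOR noise. The true mutual information is
# then known in closed form, I(X;Y) = 1 - H_b(p) bits, which lets a
# benchmark dial in exact MI targets for paired samples.

def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def bsc_true_mi(p):
    return 1.0 - binary_entropy(p)        # bits, for uniform input

def solve_flip_prob(target_mi_bits, tol=1e-9):
    # Invert I(X;Y) = 1 - H_b(p) on p in [0, 0.5] by bisection.
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if bsc_true_mi(mid) > target_mi_bits:
            lo = mid                       # MI decreases as p -> 0.5
        else:
            hi = mid
    return (lo + hi) / 2

rng = np.random.default_rng(0)
p = solve_flip_prob(0.5)                   # flip prob giving exactly 0.5 bits of MI
x = rng.integers(0, 2, 100_000)
y = x ^ (rng.random(100_000) < p)          # simulate the channel
# Plug-in MI estimate from the empirical 2x2 joint distribution:
joint = np.histogram2d(x, y, bins=2)[0] / len(x)
px, py = joint.sum(1, keepdims=True), joint.sum(0, keepdims=True)
mi_hat = np.nansum(joint * np.log2(joint / (px * py)))
print(p, mi_hat)                           # empirical MI should be close to 0.5 bits
```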


Goal Conditioned Reinforcement Learning for Photo Finishing Tuning

Neural Information Processing Systems

Photo finishing tuning aims to automate the manual tuning process of a photo finishing pipeline, such as Adobe Lightroom or Darktable. Previous works either use zeroth-order optimization, which becomes slow as the number of parameters grows, or rely on a differentiable proxy of the target finishing pipeline, which is hard to train. To overcome these challenges, we propose a novel goal-conditioned reinforcement learning framework for efficiently tuning parameters using a goal image as a condition. Unlike previous approaches, our tuning framework does not rely on any proxy and treats the photo finishing pipeline as a black box. Using a trained reinforcement learning policy, it can efficiently find the desired set of parameters within just 10 queries, while optimization-based approaches typically take 200. Furthermore, our architecture uses a goal image to guide the iterative tuning of pipeline parameters, allowing flexible conditioning on pixel-aligned target images, style images, or any other visually representable goals. We conduct detailed experiments on photo finishing tuning and photo stylization tuning tasks, demonstrating the advantages of our method.
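
To make the query loop concrete, here is a heavily hedged sketch of the inference-time tuning loop the abstract describes. Both `pipeline` (standing in for a black-box photo finishing pipeline) and `policy` (standing in for a trained goal-conditioned policy) are hypothetical stand-ins, not the paper's components; the heuristic inside `policy` exists only so the loop runs end to end.

```python
import numpy as np

def pipeline(image, params):
    # Stand-in black box: brightness gain and contrast as "finishing" params.
    gain, contrast = params
    return np.clip((image - 0.5) * contrast + 0.5 + gain, 0.0, 1.0)

def policy(current, goal, params):
    # Hypothetical trained policy: maps (current image, goal image, params)
    # to a parameter update. Replaced here by a crude gradient-free heuristic
    # purely so the loop executes; a real policy would be a learned network.
    delta_gain = 0.5 * (goal.mean() - current.mean())
    delta_contrast = 0.5 * (goal.std() - current.std())
    return params + np.array([delta_gain, delta_contrast])

raw = np.random.default_rng(0).random((64, 64))
goal = pipeline(raw, np.array([0.1, 1.4]))         # target made with hidden params

params = np.array([0.0, 1.0])                       # neutral initialization
for step in range(10):                              # ~10 queries, per the abstract
    current = pipeline(raw, params)                 # one black-box query
    params = policy(current, goal, params)          # goal-conditioned update
print(params, np.abs(pipeline(raw, params) - goal).mean())
```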


Private Geometric Median

Neural Information Processing Systems

We study differentially private (DP) algorithms for estimating the geometric median (GM) of a dataset. Our main contribution is a pair of polynomial-time DP algorithms for private GM with an excess error guarantee that scales with the effective diameter of the datapoints. Additionally, we propose an inefficient algorithm based on the inverse smooth sensitivity mechanism, which satisfies the more restrictive notion of pure DP. We complement our results with a lower bound and demonstrate the optimality of our polynomial-time algorithms in terms of sample complexity.
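
For context, the geometric median of points x_1, ..., x_n minimizes the sum of Euclidean distances to the datapoints and is robust to outliers, which is what makes a diameter-scaled excess error guarantee meaningful. The sketch below is the standard (non-private) Weiszfeld iteration, included only to make the estimation target concrete; the paper's DP mechanisms are separate constructions not reproduced here.

```python
import numpy as np

def geometric_median(X, n_iters=100, eps=1e-8):
    """X: (n, d) datapoints. Returns an approximate minimizer of
    sum_i ||z - x_i||_2 via Weiszfeld's fixed-point iteration."""
    z = X.mean(axis=0)                        # initialize at the centroid
    for _ in range(n_iters):
        d = np.linalg.norm(X - z, axis=1)
        d = np.maximum(d, eps)                # guard against division by zero
        w = 1.0 / d
        z = (w[:, None] * X).sum(axis=0) / w.sum()
    return z

X = np.random.default_rng(0).normal(size=(200, 3))
X[:10] += 50.0                                # a few gross outliers
print(geometric_median(X))                    # stays near the origin: GM is robust
print(X.mean(axis=0))                         # the mean is dragged by the outliers
```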


SparseLLM: Towards Global Pruning of Pre-trained Language Models

Neural Information Processing Systems

The transformative impact of large language models (LLMs) like LLaMA and GPT on natural language processing is countered by their prohibitive computational demands. Pruning has emerged as a pivotal compression strategy, introducing sparsity to improve both memory and computational efficiency. Yet traditional global pruning is impractical for LLMs due to scalability issues, while local pruning, despite its efficiency, leads to suboptimal solutions. To address these challenges, we propose SparseLLM, a novel framework that decomposes the global pruning process into manageable, coordinated subproblems, allowing resource-efficient optimization while retaining global optimality. SparseLLM conceptualizes LLMs as a chain of modular functions and leverages auxiliary variables for problem decomposition; this not only makes global pruning practical for LLMs but also yields significant performance improvements, particularly in high-sparsity regimes, surpassing current state-of-the-art methods.
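
For contrast with the global approach, the sketch below shows the local pruning baseline the abstract argues against: each layer is pruned independently by weight magnitude to a target sparsity, with no coordination across layers. SparseLLM's auxiliary-variable decomposition of the global objective is more involved and is not reproduced here; the layer shapes and sparsity level are arbitrary assumptions for illustration.

```python
import numpy as np

def magnitude_prune(weight, sparsity):
    """Zero out the smallest-magnitude entries of one layer's weight matrix."""
    k = int(sparsity * weight.size)
    if k == 0:
        return weight.copy()
    threshold = np.partition(np.abs(weight).ravel(), k - 1)[k - 1]
    mask = np.abs(weight) > threshold          # keep only large-magnitude weights
    return weight * mask

rng = np.random.default_rng(0)
layers = [rng.normal(size=(256, 256)) for _ in range(4)]   # stand-in LLM layers
pruned = [magnitude_prune(w, sparsity=0.7) for w in layers]
for w in pruned:
    print(1.0 - np.count_nonzero(w) / w.size)              # ~0.7 sparsity per layer
```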


Learning to Understand: Identifying Interactions via the Möbius Transform

Neural Information Processing Systems

One of the key challenges in machine learning is to find interpretable representations of learned functions. The Möbius transform is essential for this purpose, as its coefficients correspond to unique importance scores for sets of input variables. This transform is closely related to widely used game-theoretic notions of importance like the Shapley and Banzhaf values, but it also captures crucial higher-order interactions.
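
Concretely, for a set function f over n inputs, the Möbius coefficient of a subset S is a(S) = Σ_{T⊆S} (-1)^{|S|-|T|} f(T), so that f(S) = Σ_{T⊆S} a(T); singleton coefficients act as first-order importances and larger subsets capture higher-order interactions. The brute-force sketch below (an illustration, not the paper's method) verifies this on a function containing a single pairwise interaction.

```python
from itertools import combinations

def mobius_transform(f_vals, n):
    """f_vals: dict mapping frozenset(S) -> f(S) for all S subset of {0..n-1}.
    Returns the Möbius coefficients a(S) by direct inclusion-exclusion."""
    coeffs = {}
    ground = list(range(n))
    for r in range(n + 1):
        for S in combinations(ground, r):
            a = 0.0
            for q in range(len(S) + 1):
                for T in combinations(S, q):
                    a += (-1) ** (len(S) - len(T)) * f_vals[frozenset(T)]
            coeffs[frozenset(S)] = a
    return coeffs

# f(S) = 1 iff both inputs 0 and 1 are present: a pure pairwise interaction.
n = 3
f = {frozenset(S): float({0, 1} <= set(S))
     for r in range(n + 1) for S in combinations(range(n), r)}
a = mobius_transform(f, n)
print({tuple(sorted(S)): v for S, v in a.items() if abs(v) > 1e-12})
# -> {(0, 1): 1.0}: the transform isolates the interaction exactly.
```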


Tired of lag? Get Windows 11 Pro for life for just $15.

Mashable

TL;DR: Get a lifetime license to Windows 11 Pro for just $14.97 and unlock a sleek interface, pro-level security, and AI-powered productivity. Need a new laptop but don't have the budget to buy one? We've found the next best thing: updating your operating system. If you've got an old PC that could use an upgrade, a lifetime license to Microsoft Windows 11 Pro is now just $14.97. But you'll want to act fast, because this deal ends June 1 at 11:59 p.m. PT.


Mac users: Grab Microsoft Office 2021 for life for 68% off

Mashable

TL;DR: Give your productivity a big boost with a lifetime license to Microsoft Office Home & Business for Mac 2021, now $69.97. There's a reason Microsoft Office is still around in 2025 -- these apps really work. They range from classics like Word and Excel, which have been around since the days of chunky desktop computers, to newer additions like Teams and OneNote. Right now, you can get them forever with this lifetime license to Microsoft Office Home & Business for Mac 2021, on sale for $69.97. Give your Apple computer an upgrade for under $75 with this Microsoft Office Home & Business for Mac 2021 license.


Ditch streaming fees with this $15 lifetime content finder

Mashable

TL;DR: Never run out of things to watch with this lifetime subscription to BitMar Streaming Content-Finder, now just $14.99. BitMar is a streaming content finder ready to help you discover free entertainment all over the web -- from movies and shows to songs and more -- and right now, a lifetime subscription to this helpful tool is just $14.99 with code BITMAR5 until June 1. If you're sick of wasting hours hunting down things to watch, it's time to swap to BitMar. It's your very own AI-powered free content finder -- it scours the web to find you millions of free movies, TV shows, videos, music, and more. And unlike all the expensive streaming services you pay for monthly, you pay once for BitMar and enjoy it forever.