Data Sharing and Compression for Cooperative Networked Control
Sharing forecasts of network time-series data, such as cellular or electricity load patterns, can improve independent control applications ranging from traffic scheduling to power generation. Typically, forecasts are designed without knowledge of a downstream controller's task objective and thus simply optimize for mean prediction error. However, such task-agnostic representations are often too large to stream over a communication network and do not emphasize salient temporal features for cooperative control. This paper presents a solution to learn succinct, highly compressed forecasts that are co-designed with a modular controller's task objective. Our simulations with real cellular, Internet-of-Things (IoT), and electricity load data show we can improve a model predictive controller's performance by at least 25% while transmitting 80% less data than the competing method.
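As a rough illustration of the co-design idea (not the paper's actual architecture; the module names, dimensions, and surrogate cost below are hypothetical), one can backpropagate a differentiable controller cost, rather than plain prediction error, through a learned forecast codec:

```python
import torch
import torch.nn as nn

class ForecastCodec(nn.Module):
    """Compress a load forecast into a small latent code and decode it."""
    def __init__(self, horizon=48, latent_dim=4):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(horizon, 32), nn.ReLU(), nn.Linear(32, latent_dim))
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, horizon))

    def forward(self, forecast):
        z = self.encoder(forecast)
        return self.decoder(z), z

def controller_cost(decoded, target):
    # Hypothetical differentiable surrogate for an MPC stage cost:
    # near-term forecast errors are weighted more heavily than distant ones.
    weights = torch.linspace(1.0, 0.1, decoded.shape[-1])
    return ((decoded - target) ** 2 * weights).mean()

def task_aware_loss(decoded, target, z, beta=1e-3):
    # Train the codec against the controller's cost rather than plain MSE,
    # with an L1 penalty on the latent code as a crude rate proxy.
    return controller_cost(decoded, target) + beta * z.abs().mean()

codec = ForecastCodec()
x = torch.randn(16, 48)            # batch of 48-step load forecasts
decoded, z = codec(x)
task_aware_loss(decoded, x, z).backward()
```

The latent code `z` (here 4 numbers instead of 48) is what would be streamed over the network, which is where the bandwidth savings come from.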
CycleNet: Rethinking Cycle Consistency in Text-Guided Diffusion for Image Manipulation
Diffusion models (DMs) have enabled breakthroughs in image synthesis tasks but lack an intuitive interface for consistent image-to-image (I2I) translation. Various methods have been explored to address this issue, including mask-based methods, attention-based methods, and image conditioning. However, enabling unpaired I2I translation with pre-trained DMs while maintaining satisfactory consistency remains a critical challenge. This paper introduces CycleNet, a novel yet simple method that incorporates cycle consistency into DMs to regularize image manipulation.
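A minimal sketch of the cycle-consistency idea in this setting, assuming a text-conditioned translation function; the `translate` placeholder below stands in for a full diffusion I2I sampling pass and is purely illustrative, not CycleNet's actual training objective:

```python
import torch
import torch.nn.functional as F

def cycle_consistency_loss(translate, x_src, cond_fwd, cond_bwd):
    """Round-trip an image through forward and backward translation and
    penalize deviation from the original, CycleGAN-style."""
    x_tgt = translate(x_src, cond_fwd)   # e.g. "summer" -> "winter"
    x_rec = translate(x_tgt, cond_bwd)   # e.g. "winter" -> "summer"
    return F.l1_loss(x_rec, x_src)

# Placeholder for a text-conditioned diffusion pass, so the sketch runs.
translate = lambda x, cond: x + 0.01 * torch.randn_like(x)
x = torch.rand(1, 3, 64, 64)
loss = cycle_consistency_loss(translate, x, "winter photo", "summer photo")
```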
DataComp-LM: In search of the next generation of training sets for language models
Jeffrey Li*, Alex Fang*
We introduce DataComp for Language Models (DCLM), a testbed for controlled dataset experiments with the goal of improving language models. As part of DCLM, we provide a standardized corpus of 240T tokens extracted from Common Crawl, effective pretraining recipes based on the OpenLM framework, and a broad suite of 53 downstream evaluations.
A Basic Facts about Quantum Walk
In Appendix A, we provide more details of quantum walks and present our user-friendly framework. In Appendix B, we introduce the classical method for optimizing approximately convex functions in a self-contained way. In Appendix C, we prove our main result on quantum approximately convex optimization. In this section, we first define the quantum walk operators and introduce some of their spectral properties. Then, we show how to implement a quantum walk efficiently.
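The appendix body is not reproduced above; as a reference point, the following is the standard Szegedy construction, which we assume is (up to details) the walk operator the paper defines. For a reversible, ergodic Markov chain with transition matrix $P$ on a state space $\Omega$, set

$$|p_x\rangle = \sum_{y \in \Omega} \sqrt{P_{xy}}\,|y\rangle, \qquad \Pi = \sum_{x \in \Omega} |x\rangle\langle x| \otimes |p_x\rangle\langle p_x|, \qquad S\,|x\rangle|y\rangle = |y\rangle|x\rangle,$$

and define the walk operator

$$W = S\,(2\Pi - I).$$

On the relevant invariant subspace, $W$ has eigenvalues $e^{\pm i\theta_j}$ with $\cos\theta_j = \lambda_j$, where the $\lambda_j$ are eigenvalues of the discriminant matrix $D_{xy} = \sqrt{P_{xy} P_{yx}}$. A spectral gap $\delta$ of the chain therefore becomes a phase gap of order $\sqrt{\delta}$ for $W$, which is the source of the quadratic speedup in mixing-based quantum algorithms.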
Checklist
For all authors...
(a) Do the main claims made in the abstract and introduction accurately reflect the paper's contributions and scope?
If you used crowdsourcing or conducted research with human subjects...
(a) Did you include the full text of instructions given to participants and screenshots, if applicable? [N/A]
(b) Did you describe any potential participant risks, with links to Institutional Review Board (IRB) approvals, if applicable? [N/A]
(c) Did you include the estimated hourly wage paid to participants and the total amount spent on participant compensation?

A.1 Group, Coset and Quotient Space

A group $G$ is a set of elements equipped with a binary operation (denoted as $\cdot$) that satisfies the following group axioms:
1. (Closure) For all $a, b \in G$, $a \cdot b \in G$.
2. (Associativity) For all $a, b, c \in G$, $(a \cdot b) \cdot c = a \cdot (b \cdot c)$.
3. (Identity element) There exists an identity element $e \in G$ such that, for any $a \in G$, we have $e \cdot a = a \cdot e = a$.
4. (Inverse element) For each $a \in G$, there exists an element $b \in G$ such that $a \cdot b = b \cdot a = e$, where $e$ is the identity element.
The centered dot can sometimes be omitted if there is no ambiguity. In this work, we are mainly interested in symmetry groups, where each group element is associated with a symmetry of a pattern, which is a transformation that leaves the pattern invariant. A concrete worked example follows the definition below.
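As a concrete worked example (our own illustration, not from the paper), the four axioms can be checked exhaustively for the cyclic group $\mathbb{Z}_4$ under addition modulo 4, which is also the group of rotation symmetries of a square:

```python
from itertools import product

# Exhaustively verify the four group axioms for Z_4 = {0, 1, 2, 3}
# under addition modulo 4.
G = range(4)
op = lambda a, b: (a + b) % 4
e = 0

assert all(op(a, b) in G for a, b in product(G, G))          # closure
assert all(op(op(a, b), c) == op(a, op(b, c))
           for a, b, c in product(G, G, G))                  # associativity
assert all(op(e, a) == a == op(a, e) for a in G)             # identity
assert all(any(op(a, b) == e == op(b, a) for b in G)
           for a in G)                                       # inverses
print("Z_4 satisfies all group axioms")
```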