Real-time Core-Periphery Guided ViT with Smart Data Layout Selection on Mobile Devices

Neural Information Processing Systems

Mobile devices have become essential enablers for AI applications, particularly in scenarios that require real-time performance. Vision Transformer (ViT) has become a fundamental cornerstone in this regard due to its high accuracy. Recent efforts have been dedicated to developing various transformer architectures that offer improved accuracy while reducing the computational requirements. However, existing research primarily focuses on reducing the theoretical computational complexity through methods such as local attention and model pruning, rather than considering realistic performance on mobile hardware. Although these optimizations reduce computational demands, they either introduce additional overheads related to data transformation (e.g., Reshape and Transpose) or lead to irregular computation and data-access patterns.
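
The abstract's point about data-transformation overhead can be made concrete with a minimal sketch (not taken from the paper): window-based local attention lowers FLOPs, but the window partitioning itself requires Reshape and Transpose (view/permute) passes over memory. The tensor shapes and the 7x7 window size below are illustrative assumptions.

```python
# Minimal sketch: why window-based local attention adds Reshape/Transpose
# overhead even though it reduces theoretical FLOPs.
import torch

def window_partition(x: torch.Tensor, window: int = 7) -> torch.Tensor:
    """Split a (B, H, W, C) feature map into (B*num_windows, window*window, C)."""
    B, H, W, C = x.shape
    # Each view/permute is cheap in FLOPs, but it is an extra data relayout
    # (memory traffic), which often dominates latency on mobile hardware.
    x = x.view(B, H // window, window, W // window, window, C)
    x = x.permute(0, 1, 3, 2, 4, 5).contiguous()
    return x.view(-1, window * window, C)

feat = torch.randn(1, 56, 56, 96)   # ViT-style feature map (illustrative size)
windows = window_partition(feat)     # -> shape (64, 49, 96)
print(windows.shape)
```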


SparCL: Sparse Continual Learning on the Edge Zifeng Wang

Neural Information Processing Systems

Existing work in continual learning (CL) focuses on mitigating catastrophic forgetting, i.e., model performance deterioration on past tasks when learning a new task. However, the training efficiency of a CL system is under-investigated, which limits the real-world application of CL systems under resource-limited scenarios. In this work, we propose a novel framework called Sparse Continual Learning (SparCL), which is the first study that leverages sparsity to enable cost-effective continual learning on edge devices. SparCL achieves both training acceleration and accuracy preservation through the synergy of three aspects: weight sparsity, data efficiency, and gradient sparsity. Specifically, we propose task-aware dynamic masking (TDM) to learn a sparse network throughout the entire CL process, dynamic data removal (DDR) to remove less informative training data, and dynamic gradient masking (DGM) to sparsify the gradient updates. Each of them not only improves efficiency, but also further mitigates catastrophic forgetting. SparCL consistently improves the training efficiency of existing state-of-the-art (SOTA) CL methods by at most 23× less training FLOPs, and, surprisingly, further improves the SOTA accuracy by at most 1.7%. SparCL also outperforms competitive baselines obtained from adapting SOTA sparse training methods to the CL setting in both efficiency and accuracy. We also evaluate the effectiveness of SparCL on a real mobile phone, further indicating the practical potential of our method.
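
To illustrate the flavor of gradient sparsification described above, here is a minimal sketch, assuming PyTorch; it is not SparCL's actual implementation, and the magnitude-based selection rule and the keep_ratio value are assumptions made for illustration only.

```python
# Illustrative sketch of sparsifying gradient updates with a binary mask,
# in the spirit of dynamic gradient masking (DGM).
import torch

def apply_gradient_mask(model: torch.nn.Module, masks: dict, keep_ratio: float = 0.2):
    """Zero out all but the largest-magnitude `keep_ratio` fraction of each gradient.

    `masks` caches the binary mask per parameter name; the 0.2 keep_ratio is
    an assumed value, not a number from the paper.
    """
    for name, param in model.named_parameters():
        if param.grad is None:
            continue
        g = param.grad
        k = max(1, int(keep_ratio * g.numel()))
        # Threshold at the k-th largest absolute gradient value.
        threshold = g.abs().flatten().kthvalue(g.numel() - k + 1).values
        masks[name] = (g.abs() >= threshold).float()
        param.grad = g * masks[name]   # masked entries receive no update

# Usage: call after loss.backward() and before optimizer.step().
```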


The best travel deals to shop during the Amazon Big Spring Sale

Mashable

Amazon's Big Spring Sale has officially commenced and is filled to the brim with deals on everything from laptops and TVs to robot vacuums and coffee makers. One seasonally appropriate category that's particularly popping off is travel deals. Whether you're looking to upgrade your luggage or in-flight entertainment, there are plenty of discounts to help you do so on a budget. Mashable's shopping experts are keeping track of all the best travel deals the Big Spring Sale has to offer, including luggage, noise-cancelling headphones, Kindles, charging accessories, and more. Don't see what you're looking for?


Waze is officially stopping support for Google Assistant on iPhones

Engadget

The navigation app Waze is dropping Google Assistant support for iPhones, citing "ongoing difficulties" with integrating the service. The company says it plans on replacing it with an "enhanced voice integration solution" at some point in the future. Google Assistant will still work for Android users. This is happening a full year after iPhone users began reporting issues related to Google Assistant, with many people noting that voice commands were totally broken. Waze says that it has "not been working as intended for over a year" and that it would rather "phase out Google Assistant on iOS" instead of "patching a feature that has faced ongoing difficulties." As previously stated, Google Assistant for Waze will continue to work on Android phones.


Three sensitive messages from full Signal chat explained

BBC News

In his message, Waltz congratulates Pete - referring to Hegseth - as well as the IC, shorthand for "intelligence community", and Kurilla, a reference to Michael Kurilla, a US Army general who oversees Central Command, a regional combatant command with responsibility over the Middle East and parts of Central and South Asia. The messages do not reveal how the target's whereabouts or movements were tracked. A military expert contacted by the BBC - who wished to remain nameless - suggested that aerial platforms, technological tracking capabilities, human intelligence on the ground, or a combination of these sources could have been used. At least 53 people were killed in the initial wave of US airstrikes on Houthi targets in Yemen, which struck more than 30 targets including training facilities, drone infrastructure, weapons manufacturing and storage sites, and command and control centres, including one in which the Pentagon said several unmanned aerial vehicle experts were located. It is unclear which of the targets Waltz was referring to in the group chat.


Supplementary Materials for On the Effects of Data Scale on Computer Control Agents

Neural Information Processing Systems

For completeness, in the following we include a datasheet based on the format of [1]. For what purpose was the dataset created? Was there a specific task in mind? Who created the dataset (e.g., which team, research group) and on behalf of which entity? What do the instances that comprise the dataset represent (e.g., documents, photos, people)? How many instances are there in total (of each type, if appropriate)? What data does each instance consist of?


On the Effects of Data Scale on UI Control Agents

Neural Information Processing Systems

Autonomous agents that control user interfaces to accomplish human tasks are emerging. Leveraging LLMs to power such agents has been of special interest, but unless fine-tuned on human-collected task demonstrations, performance is still relatively low. In this work we study whether fine-tuning alone is a viable approach for building real-world UI control agents.


Synatra: Turning Indirect Knowledge into Direct Demonstrations for Digital Agents at Scale

Neural Information Processing Systems

LLMs can now act as autonomous agents that interact with digital environments and complete specific objectives (e.g., arranging an online meeting). However, accuracy is still far from satisfactory, partly due to a lack of large-scale, direct demonstrations for digital tasks. Obtaining supervised data from humans is costly, and automatic data collection through exploration or reinforcement learning relies on complex environmental and content setup, resulting in datasets that lack comprehensive coverage of various scenarios. On the other hand, there is abundant knowledge that may indirectly assist task completion, such as online tutorials that were created for human consumption. In this work, we present Synatra, an approach that effectively transforms this indirect knowledge into direct supervision at scale. We define different types of indirect knowledge, carefully study the available sources to obtain it, methods to encode the structure of direct demonstrations, and finally methods to transform indirect knowledge into direct demonstrations. We use 100k such synthetically-created demonstrations to finetune a 7B CodeLlama, and demonstrate that the resulting agent surpasses all comparably sized models on three web-based task benchmarks: Mind2Web, MiniWoB++, and WebArena, as well as surpassing GPT-3.5 on WebArena and Mind2Web. In addition, while synthetic demonstrations prove to be only 3% the cost of human demonstrations (at $0.031 each), we show that the synthetic demonstrations can be more effective than an identical number of human demonstrations collected from limited domains.
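
As a rough illustration of what "turning indirect knowledge into direct demonstrations" could look like in data terms, here is a minimal sketch; the record fields, the example action string, and the transformation step are hypothetical assumptions, not Synatra's actual format.

```python
# Illustrative sketch: re-encoding a free-form tutorial step (indirect
# knowledge) as a grounded (observation, instruction, action) triple that
# could be used as a fine-tuning demonstration.
from dataclasses import dataclass

@dataclass
class Demonstration:
    observation: str   # e.g., serialized page state such as an HTML snippet
    instruction: str   # the task the agent is asked to complete
    action: str        # executable action string the model should emit

tutorial_step = "Click the 'New meeting' button, then enter the attendees."

# In practice an LLM would perform this rewrite; here the triple is written
# out by hand to show the target structure.
demo = Demonstration(
    observation="<button id='new-meeting'>New meeting</button>",
    instruction="Arrange an online meeting",
    action="click(element_id='new-meeting')",
)
print(demo)
```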


Amazon Spring Sale 2025: The best tech deals from Apple, Bose, Sonos, Beats, Anker and others

Engadget

This year's Amazon Spring Sale is in full swing, and as promised, a ton of household, fashion and outdoor gear has dropped to record-low prices. Tech isn't a huge focus for this sale, but there are a decent number of devices on sale right now for some of the best prices we've seen all year. The selection may not be as good as that of Amazon Prime Day in July, but it still provides a good opportunity to save on things like headphones, robot vacuums, air purifiers and more. We've collected the best Amazon Spring Sale deals on tech gear here so you don't have to go searching for them. The Spring Sale runs through March 31, so check back here for all of the latest deals as they drop.