epic
Surveillance and ICE Are Driving Patients Away From Medical Care, Report Warns
A new EPIC report says data brokers, ad-tech surveillance, and ICE enforcement are among the factors leading to a "health privacy crisis" that is eroding trust and deterring people from seeking care. When immigration agents enter hospitals and private companies are allowed to buy and sell data that reveals who seeks medical care, patients retreat, treatment is delayed, and health outcomes worsen, according to a new report that describes a growing "health privacy crisis" in the United States driven by surveillance and weak limits on law enforcement. The report, published by the Electronic Privacy Information Center (EPIC), attributes the problem to outdated privacy laws and rapidly expanding digital systems that allow health-related information to be tracked, analyzed, breached, and accessed by both private companies and government agencies. EPIC, a Washington-based nonprofit focused on privacy and civil liberties, based its findings on a review of federal and state laws, court rulings, agency policies, technical research, and documented case studies examining how health data is collected, shared, and used across government and commercial systems. "Unregulated digital technologies, mass surveillance, and weak privacy laws have created a health privacy crisis," the report says.
- North America > United States > California (0.05)
- South America > Venezuela (0.04)
- North America > United States > Texas (0.04)
- (6 more...)
- Information Technology > Security & Privacy (1.00)
- Information Technology > Artificial Intelligence (1.00)
- Information Technology > Communications > Social Media (0.73)
- Information Technology > Communications > Mobile (0.47)
EPIC: Effective Prompting for Imbalanced-Class Data Synthesis in Tabular Data Classification via Large Language Models
Large language models (LLMs) have demonstrated remarkable in-context learning capabilities across diverse applications. In this work, we explore the effectiveness of LLMs for generating realistic synthetic tabular data, identifying key prompt design elements to optimize performance. We introduce EPIC, a novel approach that leverages balanced, grouped data samples and consistent formatting with unique variable mapping to guide LLMs in generating accurate synthetic data across all classes, even for imbalanced datasets. Evaluations on real-world datasets show that EPIC achieves state-of-the-art machine learning classification performance, significantly improving generation efficiency.
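The abstract's key prompt-design ideas (class-balanced, grouped samples with consistent formatting and a uniform variable mapping) can be illustrated with a small sketch. This is not the paper's implementation; the function and example rows below are hypothetical, intended only to show what a balanced, consistently formatted few-shot prompt might look like.

```python
import random
from collections import defaultdict

def build_epic_style_prompt(rows, label_key, group_size=3, seed=0):
    """Sketch of a class-balanced, grouped few-shot prompt.

    Groups an equal number of examples per class and renders each row
    with one consistent "var=value" format, loosely mirroring the
    balanced grouping and unique variable mapping the abstract describes.
    """
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for row in rows:
        by_class[row[label_key]].append(row)

    lines = []
    for label, members in sorted(by_class.items()):
        # Take the same number of samples from every class, even rare ones.
        sample = rng.sample(members, min(group_size, len(members)))
        for row in sample:
            feats = ", ".join(f"{k}={row[k]}" for k in sorted(row) if k != label_key)
            lines.append(f"{feats} -> {label_key}={label}")
    lines.append("Generate one new row per class in the same format:")
    return "\n".join(lines)

# Hypothetical imbalanced toy dataset: two "no" rows, one "yes" row.
rows = [
    {"age": 34, "income": 52000, "default": "no"},
    {"age": 51, "income": 31000, "default": "yes"},
    {"age": 29, "income": 47000, "default": "no"},
]
print(build_epic_style_prompt(rows, "default", group_size=2))
```

Sending such a prompt to an LLM and parsing the completions back into rows would yield synthetic samples for every class, including the minority one.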
Elon Musk Said Grok's Roasts Would Be 'Epic' at Parties--So I Tried It on My Coworkers
It went about as well as you'd expect. We can debate the worthiness of Elon Musk's accomplishments--building up Tesla, hollowing out the government, shooting for Mars--but we can all agree that his insistence on being seen as funny is his most grating quality. From the constant 4:20 references to his quote-tweet "dunks" to awarding "Certified Bangers" badges to silly X posts, Musk's desperation for validation knows no bounds. It can get pretty annoying when the richest guy on Earth makes a joke and then awkwardly eyes the room waiting for everyone to laugh. But over the weekend, I was intrigued when a clip emerged of Musk telling Joe Rogan that using Grok's Unhinged Mode to deliver an "epic vulgar roast" is a surefire way to "make people really laugh at a party."
- Asia > Nepal (0.15)
- North America > United States > California (0.05)
- Europe > Slovakia (0.05)
- Europe > Czechia (0.05)
- Government (0.68)
- Law (0.48)
- Information Technology > Artificial Intelligence (0.97)
- Information Technology > Communications > Mobile (0.71)
Supplementary Material
This section contains supplementary material to support the main paper text. The video shows the process of creating and building the activity-context memory described in the main paper, and compares the baseline approaches to our method. Algorithm 1 presents the algorithm for creating and maintaining the ACO memory; note that in practice, we normalize the quantity in Equation 5. We show additional detection results to supplement Figure 1 (left): the remaining columns show similar "states" from THOR identified by our agents, and the last column shows failure cases.
Reasoning Planning for Language Models
Nguyen, Bao, Nguyen, Hieu Trung, She, Ruifeng, Fu, Xiaojin, Nguyen, Viet Anh
Selecting an appropriate reasoning method for a given query remains a key challenge in language model generation. Existing approaches typically generate multiple candidate responses and use an aggregation strategy to select the output answer, often assuming that more candidate answers yield higher accuracy. We revisit this assumption through a rigorous theoretical analysis, deriving accuracy bounds for standard aggregation methods under fixed generation distributions and candidate sizes. Building on these insights, we introduce EPIC, an Ensemble Planning with Contrastive learning framework to learn a shared representation space that captures both model reasoning abilities and query-method compatibility. EPIC incorporates our probability bounds as a regularizer in a utility-driven optimization that balances accuracy and computational cost. Experiments on diverse mathematical reasoning tasks show that EPIC consistently selects optimal reasoning methods, improving accuracy while reducing computational overhead. Our code can be found at https://github.com/nguyenngocbaocmt02/EPIC.
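The utility-driven trade-off the abstract describes (balancing predicted accuracy against computational cost when choosing a reasoning method) can be sketched in a few lines. The scores, costs, method names, and penalty weight below are all illustrative assumptions, not values from the paper, and the learned representation space is abstracted away into a precomputed score table.

```python
def select_reasoning_method(scores, costs, lam=1e-4):
    """Pick the method maximizing predicted accuracy minus a cost
    penalty -- a toy stand-in for the utility-driven optimization
    described in the abstract."""
    utilities = {m: scores[m] - lam * costs[m] for m in scores}
    return max(utilities, key=utilities.get)

# Hypothetical per-method compatibility scores and token costs for one query.
scores = {"direct": 0.62, "cot": 0.78, "self_consistency@8": 0.81}
costs = {"direct": 50, "cot": 400, "self_consistency@8": 3200}

# With a nonzero cost penalty, plain chain-of-thought beats the slightly
# more accurate but far more expensive self-consistency ensemble.
print(select_reasoning_method(scores, costs, lam=1e-4))  # cot
```

Setting `lam=0` recovers the accuracy-only choice (`self_consistency@8`), which is exactly the "more candidates is always better" assumption the paper revisits.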
- Europe > Austria > Vienna (0.14)
- North America > United States > New Mexico > Bernalillo County > Albuquerque (0.04)
- Asia > Thailand > Bangkok > Bangkok (0.04)
- (2 more...)
- Information Technology > Artificial Intelligence > Natural Language > Large Language Model (0.95)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Search (0.68)
- Information Technology > Artificial Intelligence > Cognitive Science > Problem Solving (0.66)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (0.46)
EPIC: Generative AI Platform for Accelerating HPC Operational Data Analytics
Karimi, Ahmad Maroof, Shin, Woong, Hines, Jesse, Ghosal, Tirthankar, Sattar, Naw Safrin, Wang, Feiyi
We present EPIC, an AI-driven platform designed to augment operational data analytics. EPIC employs a hierarchical multi-agent architecture where a top-level large language model provides query processing, reasoning and synthesis capabilities. These capabilities orchestrate three specialized low-level agents for information retrieval, descriptive analytics, and predictive analytics. This architecture enables EPIC to perform HPC operational analytics on multi-modal data, including text, images, and tabular formats, dynamically and iteratively. EPIC addresses the limitations of existing HPC operational analytics approaches, which rely on static methods that struggle to adapt to evolving analytics tasks and stakeholder demands. Through extensive evaluations on the Frontier HPC system, we demonstrate that EPIC effectively handles complex queries. Using descriptive analytics as a use case, fine-tuned smaller models outperform large state-of-the-art foundation models, achieving up to 26% higher accuracy. Additionally, we achieved 19x savings in LLM operational costs compared to proprietary solutions by employing a hybrid approach that combines large foundational models with fine-tuned local open-weight models.
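The hierarchical architecture described above (a top-level LLM orchestrating retrieval, descriptive, and predictive agents) can be sketched as a simple dispatcher. The agent functions and keyword-based routing below are hypothetical placeholders; in the actual platform the routing and synthesis would be performed by the top-level language model, not a keyword match.

```python
from typing import Callable, Dict

# Hypothetical stand-ins for the three specialized low-level agents.
def retrieval_agent(q: str) -> str:
    return f"[retrieval] relevant docs for: {q}"

def descriptive_agent(q: str) -> str:
    return f"[descriptive] summary statistics for: {q}"

def predictive_agent(q: str) -> str:
    return f"[predictive] forecast for: {q}"

AGENTS: Dict[str, Callable[[str], str]] = {
    "retrieve": retrieval_agent,
    "describe": descriptive_agent,
    "predict": predictive_agent,
}

def top_level_orchestrator(query: str) -> str:
    """Toy router standing in for the top-level LLM: decide which
    low-level agent should handle the query, then return its answer."""
    q = query.lower()
    if any(w in q for w in ("forecast", "predict", "will")):
        plan = "predict"
    elif any(w in q for w in ("average", "summary", "utilization")):
        plan = "describe"
    else:
        plan = "retrieve"
    return AGENTS[plan](query)

print(top_level_orchestrator("What was average GPU utilization last week?"))
```

A real deployment would also loop: the top-level model could call several agents in sequence and synthesize their outputs, matching the "dynamically and iteratively" behavior the abstract claims.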
- North America > United States > Minnesota > Hennepin County > Minneapolis (0.14)
- Asia > Singapore (0.04)
- Asia > Indonesia > Bali (0.04)
- Africa > Ethiopia > Addis Ababa > Addis Ababa (0.04)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Agents (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Large Language Model (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Chatbot (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning > Generative AI (0.51)
Toward Lifelong-Sustainable Electronic-Photonic AI Systems via Extreme Efficiency, Reconfigurability, and Robustness
Yin, Ziang, Zhou, Hongjian, Sudarshan, Chetan Choppali, Chhabria, Vidya, Gu, Jiaqi
The relentless growth of large-scale artificial intelligence (AI) has created unprecedented demand for computational power, straining the energy, bandwidth, and scaling limits of conventional electronic platforms. Electronic-photonic integrated circuits (EPICs) have emerged as a compelling platform for next-generation AI systems, offering inherent advantages in ultra-high bandwidth, low latency, and energy efficiency for computing and interconnection. Beyond performance, EPICs also hold unique promises for sustainability. Fabricated in relaxed process nodes with fewer metal layers and lower defect densities, photonic devices naturally reduce embodied carbon footprint (CFP) compared to advanced digital electronic integrated circuits, while delivering orders-of-magnitude higher computing performance and interconnect bandwidth. To further advance the sustainability of photonic AI systems, we explore how electronic-photonic design automation (EPDA) and cross-layer co-design methodologies can amplify these inherent benefits. We present how advanced EPDA tools enable more compact layout generation, reducing both chip area and metal layer usage. We will also demonstrate how cross-layer device-circuit-architecture co-design unlocks new sustainability gains for photonic hardware: ultra-compact photonic circuit designs that minimize chip area cost, reconfigurable hardware topology that adapts to evolving AI workloads, and intelligent resilience mechanisms that prolong lifetime by tolerating variations and faults. By uniting intrinsic photonic efficiency with EPDA- and co-design-driven gains in area efficiency, reconfigurability, and robustness, we outline a vision for lifelong-sustainable electronic-photonic AI systems. This perspective highlights how EPIC AI systems can simultaneously meet the performance demands of modern AI and the urgent imperative for sustainable computing.
- North America > United States > California > San Francisco County > San Francisco (0.14)
- North America > United States > Massachusetts > Middlesex County > Cambridge (0.04)
- North America > United States > Arizona (0.04)
- Energy (1.00)
- Information Technology (0.95)
- Semiconductors & Electronics (0.88)
Hymn of Babylon is pieced together after 2,100 YEARS: Scientists use AI to reconstruct ancient song
A hymn dedicated to the ancient city of Babylon has been discovered after 2,100 years. Sung to the god Marduk, patron deity of the great city, the poem describes Babylon's flowing rivers, jewelled gates, and 'bathed priests' in stunning detail. Although the song was lost to time after Alexander the Great captured the city, fragments of clay tablets survived in the ruins of Sippar, a city 40 miles to the north. In a process that would have taken 'decades' to complete by hand, researchers used AI to piece together 30 different tablet pieces and recover the lost hymn. Originally 250 lines long, scientists have been able to translate a third of the original cuneiform text.
EPiC: Towards Lossless Speedup for Reasoning Training through Edge-Preserving CoT Condensation
Jia, Jinghan, Reisizadeh, Hadi, Fan, Chongyu, Baracaldo, Nathalie, Hong, Mingyi, Liu, Sijia
Large language models (LLMs) have shown remarkable reasoning capabilities when trained with chain-of-thought (CoT) supervision. However, the long and verbose CoT traces, especially those distilled from large reasoning models (LRMs) such as DeepSeek-R1, significantly increase training costs during the distillation process, where a non-reasoning base model is taught to replicate the reasoning behavior of an LRM. In this work, we study the problem of CoT condensation for resource-efficient reasoning training, aimed at pruning intermediate reasoning steps (i.e., thoughts) in CoT traces, enabling supervised model training on length-reduced CoT data while preserving both answer accuracy and the model's ability to generate coherent reasoning. Our rationale is that CoT traces typically follow a three-stage structure: problem understanding, exploration, and solution convergence. Through empirical analysis, we find that retaining the structure of the reasoning trace, especially the early stage of problem understanding (rich in reflective cues) and the final stage of solution convergence, is sufficient to achieve lossless reasoning supervision. To this end, we propose an Edge-Preserving Condensation method, EPiC, which selectively retains only the initial and final segments of each CoT trace while discarding the middle portion. This design draws an analogy to preserving the "edge" of a reasoning trajectory, capturing both the initial problem framing and the final answer synthesis, to maintain logical continuity. Experiments across multiple model families (Qwen and LLaMA) and benchmarks show that EPiC reduces training time by over 34% while achieving lossless reasoning accuracy on MATH500, comparable to full CoT supervision. To the best of our knowledge, this is the first study to explore thought-level CoT condensation for efficient reasoning model distillation.
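The core condensation rule (keep the leading problem-understanding segment and the trailing solution-convergence segment of a CoT trace, drop the middle exploration) is simple enough to sketch directly. The keep fraction and the trace below are illustrative assumptions, not the paper's actual settings.

```python
def condense_cot(thoughts, keep_frac=0.2):
    """Edge-preserving condensation sketch: retain the initial and
    final segments of a chain-of-thought trace and discard the middle,
    preserving the "edges" of the reasoning trajectory."""
    n = len(thoughts)
    k = max(1, int(n * keep_frac))
    if 2 * k >= n:
        return list(thoughts)  # trace too short to condense
    return thoughts[:k] + thoughts[-k:]

# Hypothetical 10-step trace; with keep_frac=0.2 we keep 2 steps per edge.
trace = [f"thought_{i}" for i in range(10)]
print(condense_cot(trace, keep_frac=0.2))
# → ['thought_0', 'thought_1', 'thought_8', 'thought_9']
```

Training on such length-reduced traces is what yields the reported >34% reduction in training time; the sketch only shows the pruning step, not the supervised fine-tuning itself.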
- South America > Chile > Santiago Metropolitan Region > Santiago Province > Santiago (0.04)
- North America > United States > Minnesota (0.04)
- North America > United States > Michigan (0.04)
- (2 more...)