microprocessor
Bespoke Co-processor for Energy-Efficient Health Monitoring on RISC-V-based Flexible Wearables
Vergos, Theofanis, Vergos, Polykarpos, Tahoori, Mehdi B., Zervakis, Georgios
Flexible electronics offer unique advantages for conformable, lightweight, and disposable healthcare wearables. However, their limited gate count, large feature sizes, and high static power consumption make on-body machine learning classification highly challenging. While existing bendable RISC-V systems provide compact solutions, they lack the required energy efficiency. We present a mechanically flexible RISC-V microprocessor that integrates a bespoke multiply-accumulate co-processor with fixed coefficients to maximize energy efficiency and minimize latency. Our approach formulates a constrained programming problem to jointly determine co-processor constants and optimally map Multi-Layer Perceptron (MLP) inference operations, enabling compact, model-specific hardware by leveraging the low fabrication and non-recurring engineering costs of flexible technologies. Post-layout results demonstrate near-real-time performance across several healthcare datasets, with our circuits operating within the power budget of existing flexible batteries and occupying only 2.42 mm^2, offering a promising path toward accessible, sustainable, and conformable healthcare wearables. Our microprocessors achieve an average 2.35x speedup and 2.15x lower energy consumption compared to the state of the art.
- Europe > Switzerland > Zürich > Zürich (0.14)
- North America > United States (0.04)
- Europe > Greece > West Greece > Patra (0.04)
- Europe > Germany > Baden-Württemberg > Karlsruhe Region > Karlsruhe (0.04)
- Health & Medicine > Therapeutic Area (0.95)
- Health & Medicine > Consumer Health (0.85)
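The abstract's key trick is that the MLP weights are baked into the co-processor as fixed constants rather than read from memory. As an illustration only (not the authors' hardware, which is a constrained-programming co-design), a fixed positive coefficient can be decomposed into a few signed power-of-two terms, so each hardwired multiply collapses into shifts and adds:

```python
# Illustrative sketch of a fixed-coefficient multiply-accumulate, emulated in
# software. The decomposition (canonical-signed-digit style) is a standard
# technique; the authors' actual constant-selection method is not shown here.

def shift_add_terms(coeff: int):
    """Decompose a fixed positive integer coefficient into signed
    power-of-two terms, so x * coeff becomes a few shifts and adds."""
    terms, bit = [], 0
    while coeff:
        if coeff & 1:
            # pick a +1 or -1 digit so the remainder stays divisible by 2,
            # keeping the number of nonzero terms small
            digit = -1 if (coeff & 3) == 3 else 1
            terms.append((digit, bit))
            coeff -= digit
        coeff >>= 1
        bit += 1
    return terms

def fixed_coeff_mac(acc: int, x: int, terms) -> int:
    """Multiply-accumulate with a hardwired coefficient: acc += x * coeff."""
    for digit, shift in terms:
        acc += digit * (x << shift)
    return acc

# Example: 7 = 8 - 1, so x*7 needs only one shift and one subtract.
terms = shift_add_terms(7)
acc = fixed_coeff_mac(0, 3, terms)  # 3 * 7 = 21
```

In hardware, each such term is just wiring plus an adder, which is why model-specific constants pay off so strongly on gate-starved flexible substrates.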
Enriching Patent Claim Generation with European Patent Dataset
Jiang, Lekang, Li, Chengzu, Goetz, Stephan
Drafting patent claims is time-intensive, costly, and requires professional skill. Therefore, researchers have investigated large language models (LLMs) to assist inventors in writing claims. However, existing work has largely relied on datasets from the United States Patent and Trademark Office (USPTO). To broaden the research scope across jurisdictions, drafting conventions, and legal standards, we introduce EPD, a European patent dataset. EPD presents rich textual data and structured metadata to support multiple patent-related tasks, including claim generation. This dataset enriches the field in three critical aspects: (1) Jurisdictional diversity: Patents from different offices vary in legal and drafting conventions. EPD fills a critical gap by providing a benchmark for European patents to enable more comprehensive evaluation. (2) Quality improvement: EPD offers high-quality granted patents with finalized and legally approved texts, whereas other datasets consist of patent applications that are unexamined or provisional. Experiments show that LLMs fine-tuned on EPD significantly outperform those trained on previous datasets, and even GPT-4o, in claim quality and cross-domain generalization. (3) Real-world simulation: We propose a difficult subset of EPD to better reflect the real-world challenges of claim generation. Results reveal that all tested LLMs perform substantially worse on these challenging samples, which highlights the need for future research.
- North America > United States > New Mexico > Bernalillo County > Albuquerque (0.04)
- Europe > United Kingdom > England > Cambridgeshire > Cambridge (0.04)
- Europe > Italy > Tuscany > Florence (0.04)
- (2 more...)
Intel's Fall from Grace
In August 2000, Intel briefly had a market value of $509 billion (more than $930 billion in 2024 dollars). It was the most valuable public company and the "platform leader" in the personal computer industry along with Microsoft. At the start of December 2024, Intel's value stood at $104 billion (after falling under $100 billion), far below Microsoft ($3.1 trillion) and Apple ($3.6 trillion). Nvidia ($3.4 trillion) became the new leader in semiconductors, rivaling Apple in market value. Intel also had fallen behind long-time rival AMD ($222 billion) as well as Broadcom ($176 billion), Qualcomm ($174 billion), and ARM ($141 billion).
- Semiconductors & Electronics (1.00)
- Banking & Finance > Trading (0.62)
- Information Technology > Hardware (0.47)
Lo-MARVE: A Low Cost Autonomous Underwater Vehicle for Marine Exploration
This paper presents Low-cost Marine Autonomous Robotic Vehicle Explorer (Lo-MARVE), a novel autonomous underwater vehicle (AUV) designed to provide a low-cost solution for underwater exploration and environmental monitoring in shallow water environments. Lo-MARVE offers a cost-effective alternative to existing AUVs, featuring a modular design, low-cost sensors, and wireless communication capabilities. The total cost of Lo-MARVE is approximately EUR 500. Lo-MARVE is developed using the Raspberry Pi 4B, with control software written in Python. The proposed AUV was validated through field testing in the freshwater environment of the River Corrib in Galway, Ireland, demonstrating its ability to navigate autonomously, collect data, and communicate effectively outside of a controlled laboratory setting. The successful deployment of Lo-MARVE in a real-world environment validates its proof of concept.
- Europe > Ireland > Connaught > County Galway > Galway (0.25)
- North America > United States > Nevada > Clark County > Las Vegas (0.04)
- North America > United States > Massachusetts > Middlesex County > Cambridge (0.04)
- (4 more...)
- Research Report (0.64)
- Overview (0.47)
A Micro Architectural Events Aware Real-Time Embedded System Fault Injector
Magliano, Enrico, Carpegna, Alessio, Savino, Alessandro, Di Carlo, Stefano
The increasing complexity of modern systems poses significant challenges to the reliability, trustworthiness, and security of safety-critical real-time embedded systems (SACRES). Key issues include susceptibility to phenomena such as instantaneous voltage spikes, electromagnetic interference, neutron strikes, and out-of-range temperatures. These factors can induce switch-state changes in transistors, resulting in bit flips, soft errors, and transient corruption of data stored in memory. Soft errors, in turn, may lead to system faults that can propel the system into a hazardous state. Particularly in critical sectors like automotive, avionics, or aerospace, such malfunctions can have real-world consequences, potentially causing harm to individuals. This paper introduces a novel fault injector designed to facilitate the monitoring, aggregation, and examination of micro-architectural events. This is achieved by harnessing the microprocessor's performance monitoring unit (PMU) and debugging interface, with a specific focus on ensuring the repeatability of fault injections. The fault injection methodology targets bit flips within the memory system, affecting CPU registers and RAM. The outcomes of these fault injections enable a thorough analysis of the impact of soft errors and establish a robust correlation between the identified faults and the essential timing predictability demanded by SACRES.
- Europe > United Kingdom > England > Cambridgeshire > Cambridge (0.04)
- Europe > Italy > Piedmont > Turin Province > Turin (0.04)
- Europe > Belgium > Flanders > Antwerp Province > Antwerp (0.04)
- Aerospace & Defense (0.54)
- Semiconductors & Electronics (0.51)
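The paper's injector drives a real PMU and debug interface; purely as an illustration of the core idea, a repeatable single-bit flip into a modelled memory can be emulated in software by seeding the random choice of fault location, so an injection campaign can be replayed exactly:

```python
# Software emulation of a repeatable bit-flip fault injection. This is a
# toy model only -- the actual tool operates on live CPU registers and RAM
# through the debug interface, not on a Python bytearray.

import random

def inject_bit_flip(memory: bytearray, seed: int) -> tuple:
    """Flip one pseudo-randomly chosen bit. Seeding the RNG makes each
    injection exactly repeatable, mirroring the paper's emphasis on
    repeatability of fault injections."""
    rng = random.Random(seed)
    byte_idx = rng.randrange(len(memory))
    bit_idx = rng.randrange(8)
    memory[byte_idx] ^= 1 << bit_idx      # the simulated soft error
    return byte_idx, bit_idx

ram = bytearray(16)                 # toy model of target RAM, all zeros
loc = inject_bit_flip(ram, seed=42)
# re-running with the same seed flips the same bit -> repeatable experiments
```

Repeatability is what lets the observed fault effects be correlated with timing behaviour across runs, since the same fault can be replayed under different workloads.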
Robots for deep-sea recovery missions in sci-fi and reality
My new science fiction/science fact article for Science Robotics is out on why deep ocean robotics is hard. Especially when trying to bring up a sunken submarine 3 miles underwater, which the CIA actually did in 1974. It's even harder if you're trying to bring up an alien spaceship, which is the plot of Harry Turtledove's new sci-fi novel Three Miles Under. Though the expedition was 50 years before the OceanGate Titan tragedy, the same challenges exist for today's robots. The robotics science in the book is very real; the aliens, not so much.
Learning domain-specific causal discovery from time series
Wang, Xinyue, Kording, Konrad Paul
Causal discovery (CD) from time-varying data is important in neuroscience, medicine, and machine learning. Techniques for CD include randomized experiments, which are generally unbiased but expensive, and algorithms such as Granger causality and conditional-independence-based, structural-equation-based, and score-based methods, which are accurate only under strong assumptions made by human designers. However, as demonstrated in other areas of machine learning, human expertise is often not entirely accurate and tends to be outperformed in domains with abundant data. In this study, we examine whether we can enhance domain-specific causal discovery for time series using a data-driven approach. Our findings indicate that this procedure significantly outperforms human-designed, domain-agnostic causal discovery methods, such as Mutual Information, VAR-LiNGAM, and Granger Causality, on the MOS 6502 microprocessor, the NetSim fMRI dataset, and the Dream3 gene dataset. We argue that, when feasible, the causality field should consider a supervised approach in which domain-specific CD procedures are learned from extensive datasets with known causal relationships, rather than being designed by human specialists. Our findings promise a new approach toward improving CD in neural and medical data and for the broader machine learning community.
- North America > United States > Pennsylvania (0.04)
- Europe > United Kingdom > England > Cambridgeshire > Cambridge (0.04)
- Research Report > Strength High (1.00)
- Research Report > New Finding (1.00)
- Research Report > Experimental Study (1.00)
- Health & Medicine > Pharmaceuticals & Biotechnology (1.00)
- Health & Medicine > Therapeutic Area > Neurology (0.88)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Performance Analysis > Accuracy (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models > Directed Networks > Bayesian Learning (0.92)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (0.67)
Bill Gates predicts how artificial intelligence will evolve - AS USA
Microsoft co-founder Bill Gates has published a 7-page letter in which he talked about the importance that Artificial Intelligence will have in the future. Gates says that the development of this technology is "as fundamental as the creation of the microprocessor, the personal computer, the Internet and the mobile phone". "In my life, I have seen two technology demos that I thought were revolutionary. The first time was in 1980, when I was introduced to a graphical user interface, the forerunner of all modern operating systems, including Windows [...] The second came last year. I had been meeting with the OpenAI team since 2016 and was impressed by their steady progress."
Research Bits: April 19
Processor power prediction Researchers from Duke University, Arm Research, and Texas A&M University developed an AI method for predicting the power consumption of a processor, returning results more than a trillion times per second while consuming very little power itself. "This is an intensively studied problem that has traditionally relied on extra circuitry to address," said Zhiyao Xie, a PhD candidate at Duke. "But our approach runs directly on the microprocessor in the background, which opens many new opportunities. I think that's why people are excited about it." The approach, called APOLLO, uses an AI algorithm to identify and select just 100 of a processor's millions of signals that correlate most closely with its power consumption. It then builds a power model from those 100 signals and monitors them to predict the entire chip's power consumption in real time.
- North America > United States > Texas (0.25)
- North America > United States > Illinois (0.05)
- Asia > China > Shanghai > Shanghai (0.05)
- Asia > China > Hong Kong (0.05)
- Information Technology > Artificial Intelligence (0.71)
- Information Technology > Hardware (0.52)
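The two steps the APOLLO summary describes, selecting the few most power-correlated signals and then fitting a model on just those, can be sketched in miniature. A plain correlation ranking and ordinary least squares stand in here for APOLLO's actual selection algorithm, which is not detailed in the excerpt:

```python
# Minimal sketch of the select-then-model idea: rank signal traces by
# correlation with measured power, keep the top k, and fit a linear power
# model on them via the normal equations (Gaussian elimination).

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb) if va and vb else 0.0

def select_signals(signals, power, k):
    """signals: dict name -> per-cycle trace. Keep the k most power-correlated."""
    ranked = sorted(signals, key=lambda s: abs(pearson(signals[s], power)),
                    reverse=True)
    return ranked[:k]

def fit_power_model(signals, power, chosen):
    """Least-squares weights for power ~ w0 + sum(w_i * signal_i)."""
    X = [[1.0] + [signals[s][t] for s in chosen] for t in range(len(power))]
    m = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(m)] for i in range(m)]
    b = [sum(r[i] * p for r, p in zip(X, power)) for i in range(m)]
    for col in range(m):                      # Gauss-Jordan with pivoting
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(m):
            if r != col and A[col][col]:
                f = A[r][col] / A[col][col]
                A[r] = [x - f * y for x, y in zip(A[r], A[col])]
                b[r] -= f * b[col]
    return [b[i] / A[i][i] for i in range(m)]  # [w0, w1, ...]
```

The payoff described in the article is that monitoring 100 signals instead of millions is cheap enough to run on-chip in the background.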
Develop IoT artificial intelligence holistically to prosper
Even the best electrical engineers and IoT practitioners might not be able to figure out the AI component of IoT artificial intelligence without some guidance. IoT practitioners and data scientists who want to build IoT-based AI don't have to work it out on their own. In fact, they must often partner with other experts or else they risk missing critical factors needed to ensure their project succeeds. In Artificial Intelligence for IoT Cookbook, author and senior staff enterprise architect Michael Roshak discusses techniques with detailed instructions to build AI for IoT deployments and resolve common problems. After establishing the basic setup for IoT and AI, Roshak digs into advanced skills such as computer vision and natural language processing.