Unsupervised Learning From Incomplete Measurements for Inverse Problems
In many real-world inverse problems, only incomplete measurement data are available for training, which poses a problem for learning a reconstruction function. Indeed, unsupervised learning using a single fixed incomplete measurement process is impossible in general, as there is no information in the nullspace of the measurement operator. This limitation can be overcome by using measurements from multiple operators. While this idea has been successfully applied in various applications, a precise characterization of the conditions for learning is still lacking. In this paper, we fill this gap by presenting necessary and sufficient conditions for learning the underlying signal model needed for reconstruction, which reveal the interplay between the number of distinct measurement operators, the number of measurements per operator, the dimension of the model, and the dimension of the signals. Furthermore, we propose a novel and conceptually simple unsupervised learning loss which only requires access to incomplete measurement data and achieves performance on par with supervised learning when the sufficient condition is satisfied. We validate our theoretical bounds and demonstrate the advantages of the proposed unsupervised loss compared to previous methods via a series of experiments on various imaging inverse problems, such as accelerated magnetic resonance imaging, compressed sensing, and image inpainting.
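The nullspace argument above can be made concrete with a small NumPy sketch. This is a toy illustration of the general idea, not the paper's construction: with one fixed incomplete operator, any signal shifted by a nullspace vector produces identical measurements, while a second operator with a different nullspace breaks the ambiguity.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 8, 5                       # signal dimension, measurements per operator (m < n)
x = rng.standard_normal(n)        # unknown signal

A1 = rng.standard_normal((m, n))  # a single fixed incomplete operator
A2 = rng.standard_normal((m, n))  # a second, distinct operator

# A nullspace direction of A1, read off from its SVD: A1 @ v ~ 0
v = np.linalg.svd(A1)[2][-1]

# With one operator, x and x + v are indistinguishable from the data:
print(np.allclose(A1 @ x, A1 @ (x + v)))   # True

# The second operator has a different nullspace, so it tells them apart:
print(np.allclose(A2 @ x, A2 @ (x + v)))   # False (generically)

# Stacking both operators yields a full-rank system (2m >= n here),
# so x is recoverable by least squares from the combined measurements:
A = np.vstack([A1, A2])
x_hat = np.linalg.lstsq(A, A @ x, rcond=None)[0]
print(np.allclose(x_hat, x))               # True
```

The learning setting in the paper is harder — each signal is seen under only one operator — but the same intuition drives the conditions on the number of operators and measurements.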
Another Trump Casualty: A Tiny Office That Keeps Measurements of the World Accurate
Dru Smith, chief geodesist of the National Geodetic Survey, stands near a measurement device used to survey the height of the Washington Monument in 2017. Susan Walsh/AP This story was originally published by Wired and is reproduced here as part of the Climate Desk collaboration. Cuts made by the Trump administration are threatening the function of a tiny but crucial office within the National Oceanic and Atmospheric Administration that maintains the US framework of spatial information: latitudes, longitudes, vertical measurements like elevation, and even measurements of Earth's gravitational field. Staff losses at the National Geodetic Survey (NGS), the oldest scientific agency in the US, could further cripple its mission and activities, including a long-awaited project to update the accuracy of these measurements, former employees and experts say. As the world turns more and more toward operations that need precise coordinate systems like the ones NGS provides, the science that underpins this office's activities, these experts say, is becoming even more crucial. The work of NGS, says Tim Burch, the executive director of the National Society of Professional Surveyors, "is kind of like oxygen. You don't know you need it until it's not there."
Zero-shot Medical Event Prediction Using a Generative Pre-trained Transformer on Electronic Health Records
Redekop, Ekaterina, Wang, Zichen, Kulkarni, Rushikesh, Pleasure, Mara, Chin, Aaron, Hassanzadeh, Hamid Reza, Hill, Brian L., Emami, Melika, Speier, William, Arnold, Corey W.
Longitudinal data in electronic health records (EHRs) represent an individual's clinical history through a sequence of codified concepts, including diagnoses, procedures, medications, and laboratory tests. Foundational models, such as generative pre-trained transformers (GPT), can leverage this data to predict future events. While fine-tuning of these models enhances task-specific performance, it is costly, complex, and unsustainable for every target. We show that a foundation model trained on EHRs can perform predictive tasks in a zero-shot manner, eliminating the need for fine-tuning. This study presents the first comprehensive analysis of zero-shot forecasting with GPT-based foundational models in EHRs, introducing a novel pipeline that formulates medical concept prediction as a generative modeling task. Unlike supervised approaches requiring extensive labeled data, our method enables the model to forecast the next medical event purely from pretraining knowledge. We evaluate performance across multiple time horizons and clinical categories, demonstrating the model's ability to capture latent temporal dependencies and complex patient trajectories without task supervision. Model performance for predicting the next medical concept was evaluated using precision and recall metrics, achieving an average top-1 precision of 0.614 and recall of 0.524. For 12 major diagnostic conditions, the model demonstrated strong zero-shot performance, achieving high true positive rates while maintaining low false positives. We demonstrate the power of a foundational EHR GPT model in capturing diverse phenotypes and enabling robust, zero-shot forecasting of clinical outcomes. This capability enhances the versatility of predictive healthcare models and reduces the need for task-specific training, enabling more scalable applications in clinical settings.
- Information Technology > Artificial Intelligence > Natural Language > Large Language Model (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Performance Analysis > Accuracy (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning (0.93)
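The zero-shot framing described in the abstract — next-event prediction as next-token generation, with no task-specific head or fine-tuning — can be sketched with a toy stand-in for the foundation model. Here a simple bigram model estimated from "pretraining" trajectories plays the role of the GPT; the code strings and trajectories are invented for illustration and are not from the study.

```python
from collections import Counter, defaultdict

# Hypothetical codified EHR trajectories standing in for the pretraining corpus
pretraining = [
    ["DX:hypertension", "RX:lisinopril", "LAB:creatinine"],
    ["DX:hypertension", "RX:lisinopril", "DX:ckd"],
    ["DX:diabetes", "RX:metformin", "LAB:hba1c"],
]

# "Pretraining": estimate next-event counts (a bigram stand-in for a causal LM)
bigram = defaultdict(Counter)
for traj in pretraining:
    for prev, nxt in zip(traj, traj[1:]):
        bigram[prev][nxt] += 1

def predict_next(code, k=1):
    """Zero-shot prediction: rank candidate next events by generative
    probability under the pretrained model, with no fine-tuning step."""
    return [c for c, _ in bigram[code].most_common(k)]

print(predict_next("DX:hypertension"))  # ['RX:lisinopril']
```

Swapping the bigram for a GPT-style model over the same code vocabulary leaves the pipeline unchanged, which is the point of the generative formulation: top-k precision and recall are then computed against the patient's actual next event.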
AI-Based Energy Transportation Safety: Pipeline Radial Threat Estimation Using Intelligent Sensing System
Zhu, Chengyuan, Yang, Yiyuan, Yang, Kaixiang, Zhang, Haifeng, Yang, Qinmin, Chen, C. L. Philip
The application of artificial intelligence technology has greatly enhanced and fortified the safety of energy pipelines, particularly in safeguarding against external threats. The predominant methods involve the integration of intelligent sensors to detect external vibration, enabling the identification of event types and locations, thereby replacing manual detection methods. However, practical implementation has exposed a limitation of current methods: their constrained ability to accurately discern the spatial dimensions of external signals, which complicates the authentication of threat events. Our research endeavors to overcome these issues by harnessing deep learning techniques to achieve a more fine-grained recognition and localization process. This refinement is crucial in effectively identifying genuine threats to pipelines, thus enhancing the safety of energy transportation. This paper proposes a radial threat estimation method for energy pipelines based on distributed optical fiber sensing technology. Specifically, we introduce a continuous multi-view and multi-domain feature fusion methodology to extract comprehensive signal features and construct a threat estimation and recognition network. The utilization of collected acoustic signal data is optimized, and the underlying principle is elucidated. Moreover, we incorporate the concept of transfer learning through a pre-trained model, enhancing both recognition accuracy and training efficiency. Empirical evidence gathered from real-world scenarios underscores the efficacy of our method, notably in its substantial reduction of false alarms and remarkable gains in recognition accuracy. More generally, our method exhibits versatility and can be extrapolated to a broader spectrum of recognition tasks and scenarios.
- Europe > United Kingdom > England > Oxfordshire > Oxford (0.28)
- North America > United States (0.28)
- Asia > China > Zhejiang Province (0.14)
Elasticity Measurements of Expanded Foams using a Collaborative Robotic Arm
Beber, Luca, Lamon, Edoardo, Palopoli, Luigi, Fambri, Luca, Saveriano, Matteo, Fontanelli, Daniele
Medical applications of robots are increasingly popular to objectivise and speed up the execution of several types of diagnostic and therapeutic interventions. Particularly important is a class of diagnostic activities that require physical contact between the robotic tool and the human body, such as palpation examinations and ultrasound scans. The practical application of these techniques can greatly benefit from an accurate estimation of the biomechanical properties of the patient's tissues. In this paper, we evaluate the accuracy and precision of a robotic device used for medical purposes in estimating the elastic parameters of different materials. The measurements are evaluated against a ground truth consisting of a set of expanded foam specimens with different elasticity that are characterised using a high-precision device. The experimental results in terms of precision are comparable with the ground truth and suggest future ambitious developments.
- Europe > Italy (0.14)
- North America > United States (0.14)
- Health & Medicine > Diagnostic Medicine (0.66)
- Energy > Oil & Gas > Upstream (0.46)
Topological Reconstruction of Particle Physics Processes using Graph Neural Networks
Ehrke, Lukas, Raine, John Andrew, Zoch, Knut, Guth, Manuel, Golling, Tobias
We present a new approach, the Topograph, which reconstructs underlying physics processes, including the intermediary particles, by leveraging underlying priors from the nature of particle physics decays and the flexibility of message passing graph neural networks. The Topograph not only solves the combinatoric assignment of observed final state objects, associating them to their original mother particles, but directly predicts the properties of intermediate particles in hard scatter processes and their subsequent decays. In comparison to standard combinatoric approaches or modern approaches using graph neural networks, which scale exponentially or quadratically, the complexity of Topographs scales linearly with the number of reconstructed objects. We apply Topographs to top quark pair production in the all hadronic decay channel, where we outperform the standard approach and match the performance of the state-of-the-art machine learning technique.
Forget Police Sketches: Researchers Perfectly Reconstruct Faces by Reading Brainwaves
Using brain scans and direct neuron recording from macaque monkeys, the team found specialized "face patches" that respond to specific combinations of facial features. In the early 2000s, while recording from epilepsy patients with electrodes implanted into their brains, Quian Quiroga and colleagues found that face cells are particularly picky. In a stroke of luck, Tsao and team blew open the "black box" of facial recognition while working on a different problem: how to describe a face mathematically, with a matrix of numbers. In macaque monkeys with electrodes implanted into their brains, the team recorded from three "face patches"--brain areas that respond especially to faces--while showing the monkeys the computer-generated faces.
Comparing Distance Measurements with Python and SciPy
At the core of cluster analysis is the concept of measuring distances between a variety of different data point dimensions. For example, when considering k-means clustering, there is a need to measure a) distances between individual data point dimensions and the corresponding cluster centroid dimensions of all clusters, and b) distances between cluster centroid dimensions and all resulting cluster member data point dimensions. While k-means, the simplest and most prominent clustering algorithm, generally uses Euclidean distance as its similarity distance measurement, contriving innovative or variant clustering algorithms which, among other alterations, utilize different distance measurements is not a stretch. Cosine similarity, another common choice, is thus a judgment of orientation and not magnitude: two vectors with the same orientation have a cosine similarity of 1, two vectors at 90° have a similarity of 0, and two vectors diametrically opposed have a similarity of -1, independent of their magnitude.
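Both measures are available out of the box in `scipy.spatial.distance`. A minimal example, using two toy vectors where the second is a scaled copy of the first, makes the orientation-versus-magnitude distinction concrete:

```python
import numpy as np
from scipy.spatial import distance

# Two sample data points; b is a scaled copy of a (same orientation)
a = np.array([1.0, 2.0, 3.0])
b = np.array([2.0, 4.0, 6.0])

# Euclidean distance: magnitude of the difference vector,
# the default measure in k-means
print(distance.euclidean(a, b))   # sqrt(1 + 4 + 9) ≈ 3.7417

# Cosine distance = 1 - cosine similarity; identical orientation
# means similarity 1, so the distance is 0 regardless of magnitude
print(distance.cosine(a, b))      # ≈ 0.0
```

Scaling `b` changes the Euclidean result but leaves the cosine result untouched, which is exactly why cosine similarity is preferred when only direction matters (e.g., text vectors of different lengths).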
Measuring Biological Dust in the Upper Atmosphere (Dan Cziczo and Maria Zawadowicz)
When applied to previously collected atmospheric samples and data, their findings support evidence that on average these bioaerosols globally make up less than 1 percent of the particles in the upper troposphere -- where they could influence cloud formation and by extension, the climate -- and not around 25 to 50 percent as some previous research suggests. While atmospheric and climate modeling suggests that bioaerosols, globally averaged, are not abundant and efficient enough at freezing to significantly influence cloud formation, research findings have varied significantly. The group leveraged the presence of phosphorus in the mass spectra to train the classification machine learning algorithm on known samples and then, primed, applied it to field data acquired from Desert Research Institute's Storm Peak Laboratory in Steamboat Springs, Colorado, and from the Carbonaceous Aerosol and Radiative Effects Study based in the town of Cool, California. Knowing that the principal atmospheric emissions of phosphorus are from mineral dust, combustion products, and biological particles, they exploited the presence of phosphate and organic nitrogen ions and their characteristic ratios in known samples to classify the particles.
- Research Report > Experimental Study (0.88)
- Research Report > New Finding (0.69)
Artificial intelligence replaces physicists
The experiment, developed by physicists from ANU, University of Adelaide and UNSW ADFA, created an extremely cold gas trapped in a laser beam, known as a Bose-Einstein condensate, replicating the experiment that won the 2001 Nobel Prize. The artificial intelligence system's ability to set itself up quickly every morning and compensate for any overnight fluctuations would make this fragile technology much more useful for field measurements, said co-lead researcher Dr Michael Hush from UNSW ADFA. The team cooled the gas to around 1 microkelvin, and then handed control of the three laser beams over to the artificial intelligence to cool the trapped gas down to nanokelvin. "It may be able to come up with complicated ways humans haven't thought of to get experiments colder and make measurements more precise."