The announcement in September took the world by storm: In radio emissions from Venus's atmosphere, researchers found signs of phosphine, a toxic compound that on Earth is made in significant amounts only by microbes and chemists. The unexpected detection could point to a microbial biosphere floating in the venusian clouds, the researchers suggested in Nature Astronomy. But almost immediately, other astronomers began to point out questionable methods or said they couldn't reproduce the results.

Now, after reanalyzing their data, the original proponents are downgrading their claims. Phosphine levels are at least seven times lower than first claimed, the authors reported in a preprint posted on 17 November to arXiv. But the team still believes the gas is there, Jane Greaves, an astronomer at Cardiff University who led the work, said in a talk last week to a NASA Venus science group. "We have again a phosphine line."

The original observations were made in 2017 at the James Clerk Maxwell Telescope (JCMT) in Hawaii, and in 2019 at the Atacama Large Millimeter/submillimeter Array (ALMA) in Chile. In Venus's radio spectrum, Greaves and her colleagues detected an absorption line they attributed to phosphine. The researchers went to great lengths to remove confounding effects such as absorption by Earth's own atmosphere. But critics said such aggressive fixes made a false positive more likely. ALMA scientists have since found a new noise source: telescope calibration errors.

After reanalyzing the ALMA data, Greaves said her team now finds phosphine at just 1 part per billion (ppb). That's still above levels that can be explained by natural processes such as volcanic eruptions or lightning strikes, Greaves said.

A study published last month in Astronomy & Astrophysics, led by Therese Encrenaz, an astronomer at the Paris Observatory, ruled out higher phosphine levels. Her team analyzed observations made in 2015 by NASA's Infrared Telescope Facility in Hawaii. Phosphine should have popped out if it had existed at levels above 5 ppb. "It's easy to see there's no phosphine line," Encrenaz says.

If the line does exist, it might not be due to phosphine, according to a critique submitted to Nature Astronomy. It argues that the dip in the JCMT spectrum can be explained by an overlapping absorption line from sulfur dioxide (SO2), the gas that makes up most venusian clouds. The Greaves team concedes the point in its reanalysis. "We emphasize that there could be a contribution from SO2," they write. But the width of the absorption line in the ALMA data suggests the feature isn't "solely SO2," they write.

Just where any signal is coming from is also in dispute. ALMA is only sensitive to absorption from substances at altitudes above 70 kilometers (km), Encrenaz says. But the Nature Astronomy paper suggested the signal originated some 55 km up, in warmer, more hospitable cloud layers. "This is very difficult to conceive," Encrenaz says. Greaves and her co-authors argue in their reanalysis that ALMA is unable to capture the full width—and therefore depth—of the signal. "There is no empirical evidence that [phosphine] lies only above 70 km."

Colin Wilson, a co-author of the Nature Astronomy critique, says it's too early to say where the "phosphine roller coaster will end up." More observations at ALMA might settle the issue, he says. "Whether or not we find phosphine, we're likely to find something new."
Yara rules are a ubiquitous tool among cybersecurity practitioners and analysts. Developing high-quality Yara rules to detect a malware family of interest can be labor- and time-intensive, even for expert users. Few tools exist and relatively little work has been done on how to automate the generation of Yara rules for specific families. In this paper, we leverage large n-grams ($n \geq 8$) combined with a new biclustering algorithm to construct simple Yara rules more effectively than currently available software. Our method, AutoYara, is fast, allowing for deployment on low-resource equipment for teams that deploy to remote networks. Our results demonstrate that AutoYara can help reduce analyst workload by producing rules with useful true-positive rates while maintaining low false-positive rates, sometimes matching or even outperforming human analysts. In addition, real-world testing by malware analysts indicates AutoYara could reduce analyst time spent constructing Yara rules by 44-86%, allowing them to spend their time on the more advanced malware that current tools can't handle. Code will be made available at https://github.com/NeuromorphicComputationResearchProgram .
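A minimal sketch of the core idea, extracting large byte n-grams shared across samples of a family and emitting them as Yara string patterns, might look like the following (the function names, the support threshold, and the `any of them` condition are illustrative assumptions, not AutoYara's actual implementation):

```python
from collections import Counter

def ngrams(data: bytes, n: int = 8):
    """All distinct byte n-grams of length n in data."""
    return {data[i:i + n] for i in range(len(data) - n + 1)}

def build_rule(samples, rule_name="auto_rule", min_support=1.0):
    """Emit a Yara-style rule from byte 8-grams shared by the samples."""
    counts = Counter()
    for s in samples:
        for g in ngrams(s):
            counts[g] += 1
    threshold = min_support * len(samples)
    common = sorted(g for g, c in counts.items() if c >= threshold)
    strings = "\n".join(
        f'    $s{i} = {{ {g.hex(" ")} }}' for i, g in enumerate(common)
    )
    return (f"rule {rule_name} {{\n  strings:\n{strings}\n"
            f"  condition:\n    any of them\n}}")

samples = [b"MZ\x90\x00payload_alpha_common_marker",
           b"MZ\x90\x00other_bits_common_marker!!"]
print(build_rule(samples))
```

Lowering `min_support` below 1.0 keeps n-grams that appear in only a subset of samples, which is where a biclustering step (grouping subsets of samples sharing subsets of features) becomes useful.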
Particulate matter pollution is one of the deadliest types of air pollution worldwide due to its significant impacts on the global environment and human health. Particulate Matter (PM2.5) is one of the important pollutants used to measure the Air Quality Index (AQI). The conventional instruments used by air quality monitoring stations to monitor PM2.5 are costly, bulky, time-consuming, and power-hungry. Furthermore, due to limited data availability and non-scalability, these stations cannot provide high spatial and temporal resolution in real time. To overcome the disadvantages of the existing methodology, this article presents an analytical-equation-based prediction approach for PM2.5 using an Artificial Neural Network (ANN). Since the derived analytical equations can be computed on a Wireless Sensor Node (WSN) or another low-cost processing tool, the approach is practical for field deployment. Moreover, a correlation study between PM2.5 and other pollutants is performed to select appropriate predictors. A large authenticated dataset from a Central Pollution Control Board (CPCB) online station in India is used for the proposed approach. The RMSE and coefficient of determination (R²) obtained for the proposed prediction approach using eight predictors are 1.7973 µg/m³ and 0.9986, respectively, while with three predictors the approach yields an RMSE of 7.5372 µg/m³ and an R² of 0.9708. Therefore, the results demonstrate that the proposed approach is a promising way to monitor PM2.5 without power-hungry gas sensors and bulky analyzers.
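To illustrate how such "analytical equations" can run on low-resource hardware, the sketch below hard-codes a hypothetical one-hidden-layer network and evaluates PM2.5 in closed form. The weights, the layer sizes, and the three-predictor input are made-up placeholders, not the paper's fitted values:

```python
import numpy as np

# Hypothetical weights for a tiny trained network (3 predictors -> 4 hidden
# tanh units -> PM2.5). Real values would come from fitting to CPCB data.
W1 = np.array([[ 0.8, -0.2,  0.5],
               [-0.3,  0.6,  0.1],
               [ 0.2,  0.4, -0.7],
               [ 0.5, -0.5,  0.3]])
b1 = np.array([0.1, -0.2, 0.05, 0.0])
W2 = np.array([12.0, -8.0, 5.0, 9.0])
b2 = 35.0

def predict_pm25(x):
    """Closed-form ANN prediction: x holds scaled predictor values."""
    h = np.tanh(W1 @ x + b1)    # hidden layer
    return float(W2 @ h + b2)   # linear output, ug/m3

print(predict_pm25(np.array([0.4, 0.7, 0.2])))
```

Once trained, the network reduces to a handful of fixed matrix products and tanh evaluations, which is why it fits on a WSN-class processor without any learning framework.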
By accurately predicting industrial aging processes (IAPs), it is possible to schedule maintenance events further in advance, thereby ensuring cost-efficient and reliable operation of the plant. So far, these degradation processes have usually been described by mechanistic models or simple empirical prediction models. In this paper, we evaluate a wider range of data-driven models for this task, comparing some traditional stateless models (linear and kernel ridge regression, feed-forward neural networks) to more complex recurrent neural networks (echo state networks and LSTMs). To examine how much historical data is needed to train each of the models, we first evaluate their performance on a synthetic dataset with known dynamics. Next, the models are tested on real-world data from a large-scale chemical plant. Our results show that LSTMs produce near-perfect predictions when trained on a large enough dataset, while linear models may generalize better on small datasets with changing conditions.
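As a toy version of the stateless-baseline setup, the sketch below fits a linear model to a synthetic aging process (degradation driven by cumulative load) and evaluates it on a held-out future window. The synthetic dynamics and the feature choice are assumptions for illustration, not the paper's dataset or models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic aging process: a quality indicator degrades with cumulative load.
t = np.arange(200, dtype=float)
load = rng.uniform(0.5, 1.5, size=200)
degradation = 0.02 * np.cumsum(load) + rng.normal(0.0, 0.1, size=200)

# Stateless linear model: predict degradation from time and cumulative load,
# trained on the first 150 steps, tested on the remaining 50.
X = np.column_stack([t, np.cumsum(load), np.ones_like(t)])
w, *_ = np.linalg.lstsq(X[:150], degradation[:150], rcond=None)

pred = X[150:] @ w
rmse = float(np.sqrt(np.mean((pred - degradation[150:]) ** 2)))
print(f"hold-out RMSE: {rmse:.3f}")
```

Because the synthetic dynamics here are genuinely linear in the features, the linear model extrapolates well; the paper's point is that this robustness can persist on small real datasets where heavily parameterized recurrent models overfit.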
CHICAGO & LONDON--(BUSINESS WIRE)--Artificial Intelligence (AI) is widely expected to drive important benefits across the health system, from increasing efficiency to improving patient outcomes, but it also may be key to making healthcare more human. Benefits range from increasing the amount of time clinicians can spend with patients and on cross-care team collaboration to enhancing the ability to deliver preventative care. According to a new study of more than 900 healthcare professionals in the U.S. and U.K. conducted by MIT Technology Review Insights with GE Healthcare, nearly half of medical professionals surveyed said AI is already increasing their ability to spend time with and provide care to patients. Additionally, more than 78 percent of healthcare business leaders who reported they have deployed AI in their operations also reported that AI has helped drive workflow improvements, streamlining operational and administrative activities and delivering significant efficiencies toward transforming the future of healthcare. "Of any industry, AI could have the most profound benefits on human lives if we can effectively harness it across the healthcare system," said Kieran Murphy, President and CEO, GE Healthcare.
In recent years, interest in monitoring air quality has been growing. Traditional environmental monitoring stations are very expensive, both to acquire and to maintain, therefore their deployment is generally very sparse. This is a problem when trying to generate air quality maps with a fine spatial resolution. Given the general interest in air quality monitoring, low-cost air quality sensors have become an active area of research and development. Low-cost air quality sensors can be deployed at a finer level of granularity than traditional monitoring stations. Furthermore, they can be portable and mobile. Low-cost air quality sensors, however, present some challenges: they suffer from cross-sensitivities between different ambient pollutants; they can be affected by external factors such as traffic, weather changes, and human behavior; and their accuracy degrades over time. Some promising machine learning approaches can help us obtain highly accurate measurements with low-cost air quality sensors. In this article, we present low-cost sensor technologies, and we survey and assess machine-learning-based techniques for their calibration. We conclude by presenting open questions and directions for future research.
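A common baseline among the calibration techniques surveyed is multivariate regression against a co-located reference instrument, with temperature and humidity included to absorb cross-sensitivities. The sketch below simulates a hypothetical low-cost sensor with gain, offset, and T/RH cross-sensitivity errors, then fits a least-squares calibration (the sensor model and coefficients are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Reference analyzer readings (ppb) plus ambient conditions.
true_no2 = rng.uniform(10, 80, n)
temp = rng.uniform(5, 35, n)
rh = rng.uniform(20, 90, n)

# Hypothetical low-cost sensor: gain error, offset, T/RH cross-sensitivity.
raw = 1.4 * true_no2 + 0.8 * temp - 0.3 * rh + 5 + rng.normal(0, 2, n)

# Least-squares calibration: map (raw, T, RH) back to reference concentration.
X = np.column_stack([raw, temp, rh, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, true_no2, rcond=None)
calibrated = X @ coef
rmse = float(np.sqrt(np.mean((calibrated - true_no2) ** 2)))
print(f"calibration RMSE: {rmse:.2f} ppb")
```

More advanced approaches in the survey replace the linear map with random forests or neural networks, and recalibrate periodically to track sensor drift.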
A bio-inspired robot can use water from the environment to launch itself into the air, British researchers revealed. The robot can travel 85 feet through the air after taking off, and researchers believe it could be used to collect samples in hazardous or otherwise cluttered environments, such as during a major flood. Researchers from the Aerial Robotics Laboratory at Imperial College London devised a system that requires only 0.2 grams of calcium carbide powder in a combustion chamber, with the only moving part being a small pump that delivers water from the environment where the robot sits.
Abstract: Recent automated crop mapping via supervised learning-based methods has demonstrated unprecedented improvement over classical techniques. However, most crop mapping studies are limited to same-year crop mapping, in which the present year's labeled data is used to predict the same year's crop map. Cross-year crop mapping is more useful, as it allows the prediction of the following years' crop maps using previously labeled data. We propose Vector Dynamic Time Warping (VDTW), a novel multi-year classification approach based on warping of angular distances between phenological vectors. The results prove that the proposed VDTW method is robust to temporal and spectral variations, compensating for different farming practices, climate and atmospheric effects, and measurement errors between years. We also describe a method for determining the most discriminative time window that allows high classification accuracies with limited data. We carried out tests of our approach with Landsat 8 time-series imagery from 2013 to 2016 for classification of corn and cotton in the Harran Plain, and corn, cotton, and soybean in the Bismil Plain of Southeastern Turkey. In addition, we tested VDTW on corn and soybean in Kansas, US, for 2017 and 2018 with the Harmonized Landsat Sentinel data. The VDTW method achieved 99.85% and 99.74% overall accuracies for the same and cross years, respectively, with fewer training samples compared to other state-of-the-art approaches, i.e., spectral angle mapper (SAM), dynamic time warping (DTW), time-weighted DTW (TWDTW), random forest (RF), support vector machine (SVM), and deep long short-term memory (LSTM) methods. The proposed method could be expanded to other crop types and/or geographical areas.

Keywords: Time series; phenology; multi-year classification; dynamic programming; Landsat; crop mapping; land use; corn; cotton; soybean

1. Introduction

The world population is expected to exceed nine billion by 2050. Providing adequate nutrition for the increasing human population is a significant concern. Advanced agricultural technologies, such as precision agriculture and precision irrigation, are rapidly emerging to optimize water, fertilizers, and pesticides, thereby enabling higher crop yield. Accurate crop maps are the first requirement of advanced agriculture applications such as yield forecasting. Early-season crop yield estimates are a crucial factor for food security and for monitoring agricultural subventions. Crop maps are also an essential tool for statistical purposes to analyze annual changes in agricultural production. However, there are a variety of field crops with similar phenologies and spectral signatures.
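The dynamic-programming core underlying the VDTW abstract above, warping sequences under an angular distance between per-date feature vectors, can be sketched as follows. This is textbook DTW with a spectral-angle-style cost, not the authors' full VDTW implementation:

```python
import numpy as np

def angular_distance(u, v):
    """Spectral-angle-style distance between two feature vectors."""
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def dtw(seq_a, seq_b, dist=angular_distance):
    """Classic dynamic-programming DTW over two sequences of vectors."""
    n, m = len(seq_a), len(seq_b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            c = dist(seq_a[i - 1], seq_b[j - 1])
            # Best of match, insertion, deletion from the previous cells.
            D[i, j] = c + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])
```

Because the warping path can stretch or compress time, two fields whose phenological curves are shifted by planting date still align at low cost, which is what makes DTW variants attractive for cross-year mapping.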
We describe a chemical robotic assistant equipped with a curiosity algorithm (CA) that can efficiently explore the states a complex chemical system can exhibit. The CA-robot is designed to explore formulations in an open-ended way with no explicit optimization target. By applying the CA-robot to the study of self-propelling multicomponent oil-in-water droplets, we are able to observe an order of magnitude more variety in droplet behaviours than is possible with a random parameter search given the same budget. We demonstrate that the CA-robot enabled the discovery of a sudden and highly specific response of droplets to slight temperature changes. Six modes of self-propelled droplet motion were identified and classified using a time-temperature phase diagram and probed using a variety of techniques, including NMR. This work illustrates how target-free search can significantly increase the rate of unpredictable observations, leading to new discoveries with potential applications in formulation chemistry.
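A simplified version of such target-free exploration is novelty search: instead of optimizing a fitness value, each new experiment is chosen to maximize distance from previously observed behaviours. The sketch below uses a toy behaviour function in place of a real droplet experiment, and scores candidates by evaluating that function directly, whereas the actual CA scores candidates with a learned predictive model; both simplifications are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

def behaviour(params):
    """Toy stand-in for one droplet experiment: maps a two-parameter
    formulation recipe to an observed behaviour descriptor."""
    x, y = params
    return np.array([np.sin(3 * x) * y, np.cos(2 * y) + x])

def novelty(b, archive, k=3):
    """Mean distance from b to its k nearest archived behaviours."""
    if not archive:
        return float("inf")
    d = sorted(float(np.linalg.norm(b - a)) for a in archive)
    return float(np.mean(d[:k]))

archive = []
params = rng.uniform(0, 1, 2)
for _ in range(30):
    archive.append(behaviour(params))
    # Propose candidate recipes and run the one expected to be most novel.
    candidates = rng.uniform(0, 1, (20, 2))
    params = max(candidates, key=lambda p: novelty(behaviour(p), archive))
```

The archive of behaviours grows to cover regions a random search would rarely revisit, which is the mechanism behind the order-of-magnitude gain in observed behavioural variety.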
Electronic Medical Records (EMR) are a rich source of patient information, including measurements reflecting physiologic signs and administered therapies. Identifying which variables are useful in predicting clinical outcomes can be challenging. Advanced algorithms such as deep neural networks were designed to process high-dimensional inputs containing variables in their measured form, thus bypassing separate feature selection or engineering steps. We investigated the effect of extraneous input variables on the predictive performance of Recurrent Neural Networks (RNN) by including in the input vector extraneous variables randomly drawn from theoretical and empirical distributions. RNN models using different input vectors (EMR variables; EMR and extraneous variables; extraneous variables only) were trained to predict three clinical outcomes: in-ICU mortality, 72-hour ICU re-admission, and 30-day ICU-free days. The measured degradation of the RNN's predictive performance with the addition of extraneous variables to EMR variables was negligible.
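The input-augmentation step of such an experiment is simple to reproduce: append columns of random draws to the measured feature matrix before training each model variant. The helper below is an illustrative sketch (the function name, distributions, and dimensions are assumptions, not the study's code):

```python
import numpy as np

rng = np.random.default_rng(3)

def add_extraneous(X, n_extra, dist="normal"):
    """Append n_extra extraneous columns, drawn from a theoretical
    distribution, to the feature matrix X (rows = time steps)."""
    n = X.shape[0]
    if dist == "normal":
        extra = rng.normal(size=(n, n_extra))
    elif dist == "uniform":
        extra = rng.uniform(size=(n, n_extra))
    else:
        raise ValueError(f"unknown distribution: {dist}")
    return np.hstack([X, extra])

emr = rng.normal(size=(100, 12))   # stand-in for measured EMR variables
augmented = add_extraneous(emr, 8)
print(augmented.shape)
```

Training identical RNNs on `emr`, `augmented`, and the extraneous columns alone then isolates how much the irrelevant inputs degrade (or fail to degrade) predictive performance.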