Tiny robots made using pollen could one day be used to clean contaminated water. Waste water from some factories contains mercury, a metal that can cause illness if consumed. There are techniques to remove mercury in water treatment plants, but they are time consuming and expensive. Martin Pumera at the University of Chemistry and Technology, Prague, in the Czech Republic, and his colleagues are working on a low-cost alternative.
The adoption of artificial intelligence and machine learning technologies has never been more critical. Due to COVID-19, many organizations have had to find new ways of working: keeping production rates reliable, if not increased, while limiting on-site personnel, in some cases to 50% of normal levels. Many asset-heavy industries, such as water, transportation, and energy, are considered critical infrastructure, and every effort must be made to keep them operational.
Water management issues are at the center of environmental debates taking place across the globe. Inefficient distribution, leakages, contamination, and overuse of groundwater are some of the biggest challenges facing the water management industry. Today, industry leaders are exploring AI development services for water management systems to mitigate the water crisis using AI and IoT devices. Together, these technologies provide effective mechanisms to monitor water quality, detect leakages, analyze demand, and streamline global water management. This blog post explores and highlights some AI use cases across the diverse water industry.
The "Innovations in Renewable Energy Generation, Desalination, Artificial Intelligence, LEDs, and Vaccines" report has been added to ResearchAndMarkets.com's offering. This edition of the Inside R&D TechVision Opportunity Engine (TOE) features an innovation for enhancing digital imaging in deep learning and an innovation based on using novel receptors for mitigating vector-borne diseases. The TOE also provides intelligence on the efficient conversion of carbon dioxide into value-added products and the use of passive solar power for desalination. The TOE also features innovations based on the use of sustainable materials for oil-water separation and environmentally friendly materials that can be used in the construction industry. The TOE additionally provides insights on numerous AI-based solutions for the detection of cyberattacks, the accurate assessment of diseases, and the improvement of haptic feedback during telerobotic surgeries.
Multiclass classification problems are those where a label must be predicted, but there are more than two labels that may be predicted. These are challenging predictive modeling problems because a sufficiently representative number of examples of each class is required for a model to learn the problem. The problem becomes harder still when the number of examples in each class is imbalanced, or skewed toward one or a few of the classes, with very few examples of the other classes. Problems of this type are referred to as imbalanced multiclass classification problems, and they require both the careful design of an evaluation metric and test harness and a careful choice of machine learning models. The E. coli protein localization sites dataset is a standard dataset for exploring the challenge of imbalanced multiclass classification.
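The need for a carefully designed evaluation metric can be shown with a small sketch. Plain accuracy rewards a model that only ever predicts the majority class, while a macro-averaged recall, which weights every class equally, exposes the failure. The toy labels below loosely echo two of the E. coli localization class names ("cp", "im") but are invented for illustration, not drawn from the actual dataset.

```python
def macro_recall(y_true, y_pred):
    """Macro-averaged recall: the mean of per-class recall, so each
    class counts equally regardless of how many examples it has."""
    classes = sorted(set(y_true))
    recalls = []
    for c in classes:
        true_pos = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        support = sum(1 for t in y_true if t == c)
        recalls.append(true_pos / support)
    return sum(recalls) / len(classes)

# Skewed toy labels: 8 majority-class ("cp") vs 2 minority-class ("im").
y_true = ["cp"] * 8 + ["im"] * 2
y_pred = ["cp"] * 10          # a degenerate model: always predict "cp"

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
print(accuracy)                      # 0.8 -- looks deceptively good
print(macro_recall(y_true, y_pred))  # 0.5 -- reveals the ignored minority class
```

The same reasoning motivates metrics such as balanced accuracy or the macro F1-score when comparing models on imbalanced multiclass datasets.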
Due to the growing amount of data from in-situ sensors in wastewater systems, it becomes necessary to automatically identify abnormal behaviours and ensure high data quality. This paper proposes an anomaly detection method based on a deep autoencoder for monitoring data from in-situ wastewater systems. The autoencoder architecture is based on 1D Convolutional Neural Network (CNN) layers, where the convolutions are performed over the inputs along the temporal axis of the data. Anomaly detection is then performed based on the reconstruction error of the decoding stage. The approach is validated on multivariate time series from in-sewer process monitoring data. We discuss the results and the challenge of labelling anomalies in complex time series. We suggest that our proposed approach can support domain experts in the identification of anomalies.
Recent drought and population growth are placing unprecedented demand on limited available water resources. Irrigated agriculture is one of the major consumers of freshwater. A large amount of water in irrigated agriculture is wasted due to poor water management practices. To improve water management in irrigated areas, models for estimating future water requirements are needed. Developing a model for forecasting irrigation water demand can improve water management practices and maximise water productivity. Data mining can be used effectively to build such models. In this study, we prepare a dataset containing information on suitable attributes for forecasting irrigation water demand. The data is obtained from three different sources, namely meteorological data, remote sensing images, and water delivery statements. In order to make the prepared dataset useful for demand forecasting and pattern extraction, we pre-process the dataset using a novel approach based on a combination of irrigation and data mining knowledge. We then apply and compare the effectiveness of different data mining methods, namely decision tree (DT), artificial neural networks (ANNs), systematically developed forest (SysFor) for multiple trees, support vector machine (SVM), logistic regression, and the traditional evapotranspiration (ETc) method, and evaluate the performance of these models in predicting irrigation water demand. Our experimental results indicate the usefulness of data pre-processing and the effectiveness of the different classifiers. Among the six methods we used, SysFor produces the best prediction with 97.5% accuracy, followed by the decision tree with 96% and the ANN with 95%, closely matching the predictions with actual water usage. Therefore, we recommend using SysFor and DT models for irrigation water demand forecasting.
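A decision tree for this task ultimately reduces to threshold rules over attributes like those in the study's dataset. The hand-built stump below illustrates the shape of such learned rules; the feature names, thresholds, and demand classes are hypothetical examples, not rules extracted from the study's trained DT or SysFor models.

```python
def predict_demand(temp_c, rain_mm):
    """Hand-built decision stump illustrating the kind of rules a trained
    decision tree (or SysFor ensemble) might learn from meteorological
    attributes. Thresholds and classes here are hypothetical."""
    if rain_mm > 10.0:       # recent rainfall suppresses irrigation demand
        return "low"
    if temp_c > 30.0:        # hot, dry conditions drive demand up
        return "high"
    return "medium"

# Hypothetical weekly records: (mean temperature in deg C, rainfall in mm)
records = [(35.0, 0.0), (22.0, 15.0), (27.0, 2.0)]
print([predict_demand(t, r) for t, r in records])
# ['high', 'low', 'medium']
```

A real model would learn such thresholds from the pre-processed dataset, and SysFor would combine many trees of this kind rather than a single stump.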
Artificial intelligence (AI) has proved to be a useful ally in the battle against antibiotic resistance. A powerful antibiotic that is even able to kill superbugs has been discovered thanks to a machine-learning algorithm. Researchers from MIT used a novel computer algorithm to sift through a vast digital archive of over 100 million chemical compounds and spot those that were able to kill bacteria using different mechanisms from existing drugs. Reported in the journal Cell, this method highlighted a molecule that appeared to possess some truly remarkable antibiotic properties. The team named the molecule halicin, a hat tip to the sentient AI system "Hal" from Stanley Kubrick's film 2001: A Space Odyssey. When tested in mice, halicin was able to effectively treat tuberculosis and drug-resistant Enterobacteriaceae, the family of bacteria that includes E. coli and Salmonella.
Water is an indispensable part of life and a must-have resource. For many countries and areas, having clean, non-polluted water is a luxury. HTG founder Lin Tseng‐Hsian [Frank Lin] has over 30 years of research under his belt and is a qualified APEC international professional environmental engineer. Over the years, the company has accrued a significant database through analyses of biological life forms in bodies of water and assessments of the level of pollution in those waters, leading to ever more rapid diagnosis of the condition of a water system. The development of remote monitoring of effluent processing plants has allowed them to maximize the efficiency of chemical dosages used for water sanitation, as well as to make changes to wastewater processing projects on the fly.
Infrastructure monitoring is critical for safe operations and sustainability. Water distribution networks (WDNs) are large-scale networked critical systems with complex cascade dynamics which are difficult to predict. Ubiquitous monitoring is expensive, and a key challenge is to infer the contaminant dynamics from partial, sparse monitoring data. Existing approaches use multi-objective optimisation to find the minimum set of essential monitoring points, but lack performance guarantees and a theoretical framework. Here, we first develop Graph Fourier Transform (GFT) operators to compress networked contamination spreading dynamics and identify the essential principal data collection points with inference performance guarantees. We then build autoencoder (AE) inspired neural networks (NN) to generalize the GFT sampling process and under-sample further from the initial sampling set, allowing a very small set of data points to largely reconstruct the contamination dynamics over real and artificial WDNs. Various sources of contamination are tested, and we obtain high-accuracy reconstruction using around 5-10% of the sample set. This general approach of compression and under-sampled recovery via neural networks can be applied to a wide range of networked infrastructures to enable digital twins.
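The compression idea behind the GFT step can be illustrated on a toy graph. A signal that varies smoothly over a network concentrates its energy in the low-frequency eigenvectors of the graph Laplacian, so a few coefficients (hence a few sampling points) suffice for reconstruction. The 3-node path graph below is a hypothetical toy network, not one of the paper's WDNs; its Laplacian eigenbasis is written out by hand.

```python
import math

# Graph Fourier basis for a 3-node path graph: eigenvectors of its
# Laplacian, ordered by eigenvalue (0, 1, 3), i.e. low to high
# "graph frequency".
basis = [
    (1 / math.sqrt(3),  1 / math.sqrt(3),  1 / math.sqrt(3)),   # frequency 0
    (1 / math.sqrt(2),  0.0,              -1 / math.sqrt(2)),   # frequency 1
    (1 / math.sqrt(6), -2 / math.sqrt(6),  1 / math.sqrt(6)),   # frequency 3
]

def gft(signal):
    """Project a node signal onto the Laplacian eigenbasis."""
    return [sum(u[i] * signal[i] for i in range(3)) for u in basis]

def reconstruct(coeffs, keep):
    """Inverse transform using only the first `keep` coefficients."""
    return [sum(coeffs[k] * basis[k][i] for k in range(keep)) for i in range(3)]

smooth = [1.0, 2.0, 3.0]        # slowly varying concentration across nodes
coeffs = gft(smooth)
print(coeffs)                   # highest-frequency coefficient is ~0
print(reconstruct(coeffs, 2))   # exact recovery from 2 of 3 coefficients
```

The paper's AE-inspired networks push this further, learning to under-sample below what the linear GFT cut-off alone would allow while still recovering the dynamics.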