LONDON: Understanding the hierarchical structure of biological networks like the human brain -- a network of neurons -- could be useful in creating more complex, intelligent computational brains in the fields of artificial intelligence and robotics, says a study. Like large businesses, many biological networks -- such as gene, protein, neural, and metabolic networks -- are hierarchically organised. This means they have separate units that can each be repeatedly divided into smaller and smaller subunits. To understand why biological networks evolve to be hierarchical, researchers from the University of Wyoming and the French Institute for Research in Computer Science and Automation (INRIA) simulated the evolution of computational brain models, known as artificial neural networks, both with and without a cost for network connections.
Abstract: Photo-induced processes are fundamental in nature, but accurate simulations are severely limited by the cost of the underlying quantum chemical calculations, hampering their application over long time scales. Here we introduce a method based on machine learning to overcome this bottleneck and enable accurate photodynamics on nanosecond time scales, which are otherwise out of reach with contemporary approaches. Instead of expensive quantum chemistry during molecular dynamics simulations, we use deep neural networks to learn the relationship between a molecular geometry and its high-dimensional electronic properties. As an example, the time evolution of the methylenimmonium cation for one nanosecond is used to demonstrate that machine learning algorithms can outperform standard excited-state molecular dynamics approaches in their computational efficiency while delivering the same accuracy.

Introduction: Machine learning (ML) is revolutionizing the most diverse domains, like image recognition, playing board games, or the societal integration of refugees. Also in chemistry, an increasing range of applications is being tackled with ML, for example, the design and discovery of new molecules and materials [4, 5, 6]. In the present study, we show how ML enables efficient photodynamics simulations. Photodynamics is the study of photo-induced processes that occur after a molecule is exposed to light. Photosynthesis and DNA photodamage leading to skin cancer are only two examples of phenomena that involve molecules interacting with light [7, 8, 9, 10, 11]. The simulation of such processes has been key to learning structure-dynamics-function relationships that can be used to guide the design of photonic materials, such as photosensitive drugs, photocatalysts, and photovoltaics [13, 14].
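The surrogate idea in the abstract -- replacing quantum chemical energy evaluations with a neural network regression from geometry to electronic properties -- can be sketched as follows. This is purely illustrative: the "geometries" and target "energies" are synthetic NumPy data, not the paper's quantum chemical training set, and the one-hidden-layer network is a stand-in for the deep architectures actually used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: "geometries" are flattened Cartesian coordinates of a
# 3-atom system (9 numbers); the scalar target stands in for an
# electronic energy that would normally come from quantum chemistry.
X = rng.normal(size=(512, 9))
y = np.sin(X).sum(axis=1, keepdims=True)  # synthetic stand-in energy

# One-hidden-layer network trained by plain gradient descent.
W1 = rng.normal(scale=0.3, size=(9, 64)); b1 = np.zeros(64)
W2 = rng.normal(scale=0.3, size=(64, 1)); b2 = np.zeros(1)

lr = 0.05
for step in range(2000):
    h = np.tanh(X @ W1 + b1)        # hidden activations
    pred = h @ W2 + b2              # predicted "energy"
    err = pred - y
    loss = (err ** 2).mean()
    # Backpropagation through the two layers
    g_pred = 2 * err / len(X)
    gW2 = h.T @ g_pred
    gb2 = g_pred.sum(axis=0)
    g_h = (g_pred @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ g_h
    gb1 = g_h.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

print(f"final training MSE: {loss:.4f}")
```

Once trained, such a surrogate is evaluated in microseconds per geometry, which is what makes nanosecond-scale dynamics affordable.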
We develop a Bayesian Poisson matrix factorization model for forming recommendations from sparse user behavior data. These data are large user/item matrices where each user has provided feedback on only a small subset of items, either explicitly (e.g., through star ratings) or implicitly (e.g., through views or purchases). In contrast to traditional matrix factorization approaches, Poisson factorization implicitly models each user's limited attention to consume items. Moreover, because of the mathematical form of the Poisson likelihood, the model needs only to explicitly consider the observed entries in the matrix, leading to both scalable computation and good predictive performance. We develop a variational inference algorithm for approximate posterior inference that scales up to massive data sets. This is an efficient algorithm that iterates over the observed entries and adjusts an approximate posterior over the user/item representations. We apply our method to large real-world user data containing users rating movies, users listening to songs, and users reading scientific papers. In all these settings, Bayesian Poisson factorization outperforms state-of-the-art matrix factorization methods.
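The computational trick mentioned above -- that the mathematical form of the Poisson likelihood lets the model explicitly touch only the observed (nonzero) entries -- can be checked in a few lines. This is an illustrative sketch with made-up Gamma shape parameters, not the paper's variational inference algorithm; it verifies that the sparse evaluation of the log-likelihood matches the dense one.

```python
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(1)
U, I, K = 50, 80, 5  # users, items, latent dimensions

# Gamma-Poisson generative model (illustrative hyperparameters):
theta = rng.gamma(shape=0.3, scale=1.0, size=(U, K))  # user preferences
beta = rng.gamma(shape=0.3, scale=1.0, size=(I, K))   # item attributes
rate = theta @ beta.T
Y = rng.poisson(rate)  # mostly-zero implicit-feedback counts

# Dense log-likelihood: iterates over every user/item pair.
dense_ll = (Y * np.log(rate) - rate - gammaln(Y + 1)).sum()

# Sparse trick: a zero entry contributes only -rate, and the total rate
# factorizes through the latent dimensions, so only nonzero entries
# need to be visited explicitly.
u_idx, i_idx = np.nonzero(Y)
y_nz, r_nz = Y[u_idx, i_idx], rate[u_idx, i_idx]
total_rate = theta.sum(axis=0) @ beta.sum(axis=0)  # equals rate.sum()
sparse_ll = (y_nz * np.log(r_nz) - gammaln(y_nz + 1)).sum() - total_rate

print(np.allclose(dense_ll, sparse_ll))  # prints True
```

Because `total_rate` costs only O((U + I)K), the overall likelihood evaluation scales with the number of observed entries rather than with U x I, which is the source of the scalability claim.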
Amgen's drug discovery group is a few blocks beyond that. Until recently, Barzilay, one of the world's leading researchers in artificial intelligence, hadn't given much thought to these nearby buildings full of chemists and biologists. But as AI and machine learning began to perform ever more impressive feats in image recognition and language comprehension, she began to wonder: could it also transform the task of finding new drugs? The problem is that human researchers can explore only a tiny slice of what is possible. It's estimated that there are as many as 10^60 potentially drug-like molecules -- more than the number of atoms in the solar system. But traversing seemingly unlimited possibilities is what machine learning is good at. Trained on large databases of existing molecules and their properties, the programs can explore all possible related molecules.
Future grid scenario analysis requires a major departure from conventional power system planning, where only a handful of the most critical conditions is typically analyzed. Capturing the inter-seasonal variations in the renewable generation of a future grid scenario necessitates computationally intensive time-series analysis. In this paper, we propose a planning framework for fast stability scanning of future grid scenarios using a novel feature selection algorithm and a novel self-adaptive PSO-k-means clustering algorithm. To achieve the computational speed-up, the stability analysis is performed only on a small number of representative cluster centroids instead of on the full set of operating conditions. As a case study, we perform small-signal stability and steady-state voltage stability scanning of a simplified model of the Australian National Electricity Market with significant penetration of renewable generation. The simulation results show the effectiveness of the proposed approach: compared to exhaustive time-series scanning, the proposed framework reduces the computational burden by up to a factor of ten, with an acceptable level of accuracy.
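The centroid-scanning idea can be sketched as follows. This is a toy illustration under loud assumptions: plain k-means stands in for the paper's self-adaptive PSO-k-means variant, the operating conditions are synthetic, and `stability_margin` is a hypothetical cheap stand-in for an actual small-signal stability scan.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic hourly operating conditions for one year:
# columns are [demand (GW), wind output (GW)].
conditions = np.column_stack([
    25 + 5 * rng.standard_normal(8760),                   # demand
    np.clip(6 + 4 * rng.standard_normal(8760), 0, None),  # wind
])

def kmeans(X, k, iters=50):
    """Plain k-means -- a stand-in for self-adaptive PSO-k-means."""
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each operating condition to its nearest centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

def stability_margin(point):
    """Hypothetical cheap proxy for a stability scan of one condition."""
    demand, wind = point
    return 1.0 - 0.02 * demand - 0.01 * wind

centroids, labels = kmeans(conditions, k=10)

# Scan only the 10 representative centroids instead of all 8760 hours,
# weighting each by the number of hours it represents.
margins = np.array([stability_margin(c) for c in centroids])
weights = np.bincount(labels, minlength=10) / len(conditions)
avg = margins @ weights
print(f"weighted average stability margin: {avg:.3f}")
```

The speed-up comes from replacing 8760 expensive scans with 10: the clustering cost is negligible next to a full stability analysis, and the cluster weights preserve how often each representative condition actually occurs.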