Modern technology makes it possible to sequence individual cells and to identify which genes are being expressed in each cell. These methods are highly sensitive and consequently error prone: devices, the environment, and biology itself can cause failures and differences between measurements. Researchers at Helmholtz Zentrum München, together with colleagues from the Technical University of Munich (TUM) and the British Wellcome Sanger Institute, have developed algorithms that make it possible to predict and correct such sources of error. The work was published in 'Nature Methods' and 'Nature Communications'.
In the business world, partnerships on an equal footing like these are key to success in competitive and fast-moving environments: they bring access to expertise, more effective products and services, and greater potential for innovation and stability. Earlier this year, SAP and NVIDIA expanded their collaboration to create business applications based on artificial intelligence. Now, as NVIDIA's GPU Technology Conference kicks off in Munich, Germany, the partnership has gained even further substance. SAP installed its first NVIDIA DGX-1 systems – the world's first AI supercomputer – in Israel and Potsdam in 2016.
Oktoberfest is the world's largest beer festival and has been held annually in Munich since 1810. It lasts between 16 and 18 days, running from mid or late September to the first Sunday in October, and attracts more than 6 million visitors every year. Munchen.de, the official portal of the city of Munich, contains more than 140 datasets covering a wide range of topics such as economy, transport, tourism, and culture. Currently, more and more European cities provide an open data portal, allowing companies, citizens, researchers, and other public institutions to make use of the data generated. For this article, we employ one of the datasets available on Munchen.de, containing information about Oktoberfest from 1985 to the present.
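To make the workflow concrete, a minimal sketch of loading such a portal export with the Python standard library is shown below. The column names and the sample values are hypothetical placeholders, not the real Munchen.de schema, which should be checked on the portal itself.

```python
import csv
import io

# Hypothetical inline sample standing in for the portal's CSV export.
# Column names (year, visitors_million, beer_price_eur) are assumptions.
SAMPLE_CSV = """year,visitors_million,beer_price_eur
1985,7.1,3.20
2000,6.9,6.45
2019,6.3,11.80
"""

def load_rows(text):
    """Parse the CSV export into a list of typed dicts."""
    rows = []
    for rec in csv.DictReader(io.StringIO(text)):
        rows.append({
            "year": int(rec["year"]),
            "visitors_million": float(rec["visitors_million"]),
            "beer_price_eur": float(rec["beer_price_eur"]),
        })
    return rows

rows = load_rows(SAMPLE_CSV)
# A simple question the dataset can answer: average visitors per year.
avg_visitors = sum(r["visitors_million"] for r in rows) / len(rows)
```

For a real analysis one would download the full CSV from the portal and point `csv.DictReader` (or a dataframe library) at that file instead of the inline sample.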
To improve evaluation efficiency, a team of researchers at Helmholtz Zentrum München and the University Hospital, LMU Munich, trained a deep neural network with almost 20,000 single-cell images to classify them. Dr. med. Karsten Spiekermann and Simone Schwarz from the Department of Medicine III, University Hospital, LMU Munich, used images extracted from blood smears of 100 patients suffering from the aggressive blood disease AML and 100 controls. The new AI-driven approach was then evaluated by comparing its performance with the accuracy of human experts. The results showed that the AI-driven solution identifies diagnostic blast cells at least as well as a trained expert cytologist. Deep learning algorithms for image processing require two things: first, an appropriate convolutional neural network architecture with hundreds of thousands of parameters; second, a sufficiently large amount of training data.
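To see how a convolutional architecture reaches "hundreds of thousands of parameters", the sketch below counts the weights of a small, entirely hypothetical network (not the authors' actual model): each convolution contributes one weight per kernel entry and input channel, plus a bias, per output channel.

```python
# Minimal parameter-counting sketch; all layer sizes are hypothetical.

def conv_params(k, c_in, c_out):
    """A k x k convolution: (k*k*c_in) weights plus one bias per output channel."""
    return (k * k * c_in + 1) * c_out

def dense_params(n_in, n_out):
    """Fully connected layer: one weight per input-output pair plus biases."""
    return (n_in + 1) * n_out

# Hypothetical network: three conv layers on RGB cell images,
# then a small classification head (e.g. blast cell vs. control).
total = (
    conv_params(3, 3, 32)            # 3x3 conv, 3 -> 32 channels
    + conv_params(3, 32, 64)         # 3x3 conv, 32 -> 64 channels
    + conv_params(3, 64, 128)        # 3x3 conv, 64 -> 128 channels
    + dense_params(128 * 4 * 4, 64)  # flatten a 4x4x128 feature map
    + dense_params(64, 2)            # two output classes
)
```

Even this modest stack already exceeds 200,000 parameters, which is why a sufficiently large training set, here almost 20,000 images, is the second essential ingredient.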
The field of probabilistic numerics (PN), loosely speaking, attempts to provide a statistical treatment of the errors and/or approximations that are made en route to the output of a deterministic numerical method, e.g. the approximation of an integral by quadrature, or the discretised solution of an ordinary or partial differential equation. This decade has seen a surge of activity in this field. In comparison with historical developments that can be traced back over more than a hundred years, the most recent developments are particularly interesting because they have been characterised by simultaneous input from multiple scientific disciplines: mathematics, statistics, machine learning, and computer science. The field has, therefore, advanced on a broad front, with contributions ranging from the building of overarching general theory to practical implementations in specific problems of interest. Over the same period of time, and because of increased interaction among researchers coming from different communities, the extent to which these developments were -- or were not -- presaged by twentieth-century researchers has also come to be better appreciated. Thus, the time appears to be ripe for an update of the 2014 Tübingen Manifesto on probabilistic numerics [Hennig, 2014, Osborne, 2014d,c,b,a] and the position paper [Hennig et al., 2015] to take account of the developments between 2014 and 2019, an improved awareness of the history of this field, and a clearer sense of its future directions. In this article, we aim to summarise some of the history of probabilistic perspectives on numerics (Section 2), to place more recent developments into context (Section 3), and to articulate a vision for future research in, and use of, probabilistic numerics (Section 4).
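As a toy illustration of the PN viewpoint, placing a distribution over the error of a deterministic rule, the sketch below replaces a fixed-node Riemann sum with randomly shifted nodes; repeated runs then yield a spread that serves as a statistical error estimate for the integral. This is simple stratified Monte Carlo, not the Gaussian-process-based Bayesian quadrature discussed in the PN literature, and the integrand and node counts are chosen arbitrarily.

```python
import math
import random

def shifted_riemann(f, a, b, n, rng):
    """Riemann sum with one uniformly random node in each of n subintervals."""
    h = (b - a) / n
    return h * sum(f(a + (i + rng.random()) * h) for i in range(n))

def probabilistic_quadrature(f, a, b, n=50, repeats=200, seed=0):
    """Return a mean estimate of the integral and a standard deviation
    quantifying the numerical error of the randomised rule."""
    rng = random.Random(seed)
    samples = [shifted_riemann(f, a, b, n, rng) for _ in range(repeats)]
    mean = sum(samples) / repeats
    var = sum((s - mean) ** 2 for s in samples) / (repeats - 1)
    return mean, math.sqrt(var)

mean, std = probabilistic_quadrature(math.sin, 0.0, math.pi)
# The true integral of sin on [0, pi] is exactly 2; std gives a
# statistical handle on the discretisation error of the rule.
```

The point of the exercise is the second return value: instead of a single number, the method reports its own uncertainty, which is the shift in perspective that PN develops rigorously.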