Goto

Collaborating Authors

Accelerating data-driven discoveries

#artificialintelligence

As technologies like single-cell genomic sequencing, enhanced biomedical imaging, and medical "internet of things" devices proliferate, key discoveries about human health are increasingly found within vast troves of complex life science and health data. But drawing meaningful conclusions from that data is a difficult problem that can involve piecing together different data types and manipulating huge data sets in response to varying scientific inquiries. The problem is as much about computer science as it is about other areas of science. That's where Paradigm4 comes in. The company, founded by Marilyn Matz SM '80 and Turing Award winner and MIT Professor Michael Stonebraker, helps pharmaceutical companies, research institutes, and biotech companies turn data into insights.


Prerequisites for understanding RNN at a more mathematical level – Data Science Blog

#artificialintelligence

Writing the article series A Gentle Introduction to the Tiresome Part of Understanding RNN on recurrent neural networks (RNNs) is nothing like a creative or ingenious idea. It is quite an ordinary topic. But I am still going to write my own new articles on this ordinary topic, because I have been frustrated by the lack of sufficient explanations of RNNs for slow learners like me. I think many readers of articles on this website at least know that an RNN is a type of neural network used for AI tasks such as time series prediction, machine translation, and voice recognition. But if you do not understand how RNNs work, especially during backpropagation, this blog series is for you.
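The defining feature of an RNN, before any backpropagation enters the picture, is that the same weights are reused at every time step while a hidden state carries information forward. A minimal sketch of that forward recurrence (all dimensions and names here are illustrative, not from the article series):

```python
import numpy as np

# Vanilla RNN forward pass: h_t = tanh(W_xh @ x_t + W_hh @ h_{t-1} + b_h)
rng = np.random.default_rng(0)
input_dim, hidden_dim, seq_len = 3, 4, 5

W_xh = rng.normal(scale=0.1, size=(hidden_dim, input_dim))
W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
b_h = np.zeros(hidden_dim)

xs = rng.normal(size=(seq_len, input_dim))  # one input vector per time step
h = np.zeros(hidden_dim)                    # initial hidden state
states = []
for x_t in xs:
    # The same three weight arrays are reused at every step; only h changes.
    h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
    states.append(h)

states = np.array(states)
print(states.shape)  # one hidden state per time step
```

Backpropagation through time then runs this loop in reverse, accumulating gradients for the shared weights across all steps, which is exactly the part the series sets out to explain.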


How will data be managed and transferred in autonomous cars?

#artificialintelligence

As the development of autonomous cars continues, the challenges around how data from those vehicles is managed need to be addressed, according to Dell Technologies' Florian Baumann. There's a lot of buzz around the development of autonomous cars, from discussions about the software that goes into them to the time it will take to have fully autonomous vehicles on the road. However, an area less commonly discussed in relation to autonomous vehicles is the data they involve. The sheer amount of data storage they require raises questions about how that data will be safely managed, held and transferred when self-driving cars start appearing on our roads. Florian Baumann is the global CTO for automotive and AI at Dell Technologies.


How to Program UMAP from Scratch

#artificialintelligence

And how to improve UMAP. This is the thirteenth article of my column Mathematical Statistics and Machine Learning for Life Sciences, where I try to explain, in a simple way, some mysterious analytical techniques used in Bioinformatics, Biomedicine, Genetics etc. In the previous post, How Exactly UMAP Works, I started with an intuitive explanation of the math behind UMAP. The best way to learn it is to program UMAP from scratch, and that is what we are going to do today. The idea of this post is to show that it is relatively easy for everyone to create their own neighbor graph dimension reduction technique that can provide even better visualization than UMAP. It is going to be lots of coding, buckle up!
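Any neighbor graph dimension reduction method, UMAP included, starts the same way: find each point's k nearest neighbors in the high-dimensional space. A minimal sketch of that first step (a brute-force version with illustrative names, not the article's actual code):

```python
import numpy as np

def knn_graph(X, k):
    """Return, for each row of X, the indices of its k nearest neighbors."""
    # Brute-force squared Euclidean distances between all pairs of points.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)          # a point is not its own neighbor
    return np.argsort(d2, axis=1)[:, :k]  # k closest indices per point

rng = np.random.default_rng(1)
X = rng.normal(size=(10, 3))   # 10 points in 3 dimensions
nbrs = knn_graph(X, k=3)
print(nbrs.shape)              # one row of 3 neighbor indices per point
```

UMAP proper then converts these neighbor distances into edge weights and optimizes a low-dimensional layout of the resulting graph; the brute-force distance matrix above is the piece you would replace with an approximate search for large datasets.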


EETimes - Will Neuromorphic Computing Compete with Today's AI Accelerators?

#artificialintelligence

At first glance, the new breed of neuromorphic chips has several things in common with the similarly cutting-edge field of AI accelerators. Both are designed to process artificial neural networks, both offer improvements in performance compared to CPUs, and both claim to be more power efficient. That's where the similarity ends, though: neuromorphic chips are designed only for special neural networks called spiking networks, and their structure is fundamentally different from anything seen in traditional computing (nothing so conventional as multiply-accumulate units). It is perhaps too soon to say what the market for these devices will look like, as new applications and technologies continue to emerge. Everything You Need to Know about Neuromorphic Computing.
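The "spiking networks" mentioned above differ from conventional neural networks in that neurons communicate through discrete spikes over time rather than continuous activations. A minimal sketch of the classic leaky integrate-and-fire neuron model often used in such networks (all constants here are illustrative, not from the article):

```python
def lif(currents, tau=10.0, v_thresh=1.0, v_reset=0.0, dt=1.0):
    """Leaky integrate-and-fire neuron: returns a 0/1 spike train."""
    v, spikes = 0.0, []
    for i in currents:
        v += dt * (-v / tau + i)  # membrane potential leaks and integrates input
        if v >= v_thresh:         # threshold crossed: emit a spike and reset
            spikes.append(1)
            v = v_reset
        else:
            spikes.append(0)
    return spikes

train = lif([0.3] * 20)   # constant input current over 20 time steps
print(sum(train))         # total number of spikes emitted
```

Because computation happens only when spikes occur, hardware built around this model can be event-driven, which is the source of the power-efficiency claims, and why its structure looks nothing like a grid of multiply-accumulate units.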


Banks and insurers expect 86% rise in AI tech investment by 2025

#artificialintelligence

Banks and insurance firms are planning to increase their artificial intelligence-related technology investment by 2025, according to research from The Economist Intelligence Unit. The report, commissioned by AI-analytics and search firm ThoughtSpot, surveyed 200 business executives and c-suite leaders at investment banks, retail banks and insurance companies in North America, Europe and Asia Pacific. It found that while a large majority (86 per cent) of respondents had a strong degree of confidence in the ability of AI to shape the future of financial institutions, more than half said the technology was not yet in use in their business's processes and offerings, with just 15 per cent saying it is used extensively across the organisation. However, despite relatively low levels of implementation, the research found that many institutions are beginning to invest in AI over the next five years, with 27 per cent saying it will spur new products and services, a quarter believing it will open up new markets or industries, and the same proportion saying it is paving the way for innovation in their industry. Looking to the future, 29 per cent of respondents expect between 51 per cent and 75 per cent of their workloads to be supported by AI technologies in five years' time, as processes become increasingly automated.


Top 3 Artificial Intelligence Research Papers – May 2020

#artificialintelligence

The results from all the categories are mind-blowing. For example, for traditional language modeling tasks, GPT-3 sets a new SOTA on the Penn Tree Bank dataset by a margin of 15 points based on zero-shot perplexity. GPT-3 also showed amazing results on question answering tests. In general, these tests are separated into open-book and closed-book tests. Due to the number of possible queries, open-book tests use an information retrieval system to find relevant text, and the model then learns to generate the answer from the question and the retrieved text. Closed-book tests don't have this retrieval system.
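For readers unfamiliar with the metric behind the Penn Tree Bank result: perplexity is the exponential of the average negative log-likelihood the model assigns to each token, so lower is better. A toy illustration with made-up probabilities (not GPT-3's actual numbers):

```python
import math

def perplexity(token_probs):
    """Perplexity = exp of the mean negative log-probability per token."""
    nll = -sum(math.log(p) for p in token_probs) / len(token_probs)
    return math.exp(nll)

# A model that is uniformly unsure among 4 choices has perplexity 4:
print(perplexity([0.25, 0.25, 0.25, 0.25]))  # 4.0
# A model that is certain of every token has the minimum perplexity, 1:
print(perplexity([1.0, 1.0, 1.0]))           # 1.0
```

"Zero-shot" here means GPT-3 was scored on Penn Tree Bank text without any fine-tuning on it, which makes the 15-point improvement all the more striking.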


Artificial Intelligence: A Complete Introduction

#artificialintelligence

As you already know, AI is one of the leading technologies in the world today, and people are talking about it more than ever. We can now find AI applications everywhere: from finance, marketing and healthcare to autonomous vehicles, security and robotics. However, the field of AI still lacks qualified employees, even as investment in AI increases rapidly. This opens a great opportunity for people with a background in the domain. After several years of researching and working in AI, I would now like to share my knowledge and experience with people who want to learn about AI, because I really hope that my small contribution can help many of them find a fast and easy way to learn AI.


Using Artificial Intelligence in Big Data

#artificialintelligence

Simply put, Artificial Intelligence (AI) is intelligence exhibited by machines, in contrast to the natural intelligence exhibited by human beings and animals. It is therefore sometimes referred to as machine intelligence. Once taught, a machine can effectively perceive its environment and take actions that improve its chances of achieving its goals. How can a machine be taught? At its root, machine learning involves writing code or commands, using a programming language that the machine understands.
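To make "teaching" a machine concrete, here is about the smallest possible example: fitting a single parameter w so that y ≈ w * x, by repeatedly nudging w to reduce the squared error (all numbers are illustrative):

```python
xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]   # the hidden rule we want the machine to learn: y = 2x

w = 0.0                # start knowing nothing
lr = 0.05              # learning rate: how big each correction step is
for _ in range(200):
    # Gradient of mean squared error with respect to w.
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad     # step w in the direction that reduces the error

print(round(w, 2))     # w converges close to the true value 2.0
```

The machine was never told the rule y = 2x; it recovered it from examples, which is the essence of the "taught" behavior the paragraph describes.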


3D printing and artificial intelligence: how they are working

#artificialintelligence

Here at Crendon Insurance Ltd we often cover topics on 3D printing and artificial intelligence. Reporting on the progress of the 3D printing industry and how it is modernising many sectors, including manufacturing, construction and automotive, has held our interest for some years now. In addition, over more recent months, we have begun highlighting the expansion of AI (artificial intelligence) and how it, too, is changing the way humans will engage with the products of the future. Over the last few years, 3D printing has proved to be a real 'game changer' in the world of manufacturing. The ability to produce several copies of the same component at a much lower cost, to cut out the middleman and save transportation cost and time, and to allow new and innovative entrepreneurs to realise their designs more independently by installing much lower-cost 3D printers on site: these are just some of the benefits driving the growth of the 3D printing industry.