
Machine Learning for Future System Designs

#artificialintelligence

As an engineering director leading research projects into the application of machine learning (ML) and deep learning (DL) to computational software for electronic design automation (EDA), I believe I have a unique perspective on the future of the electronics and electronic design industries. The next leap in design productivity for semiconductor chips and the systems built around them will come from the fusion of fully integrated EDA computational software tool flows, the application of distributed and multi-core computing on a broader scale, and ML/DL. The current wave of artificial intelligence (AI) and ML innovation began with improved GPU computing capacity and the smart engineers who figured out how to harness it to accelerate deep neural network training. AI/ML will play a key role in the design of next-generation platforms, enabling the proliferation of today's technology drivers, including 5G, hyperscale computing and others. In my role, the fun comes from the numerous non-deterministic polynomial (NP)-hard and NP-complete problems that exist at every stage of the design and verification process.


Monte Carlo Simulations for Predicting American Stock Prices

#artificialintelligence

First of all, I'm sorry to say that predicting the exact market structure that will occur at a given future time is simply not possible :( . This is due to factors that are unknown in advance: the future price depends not only on the past price, but also on macroeconomic changes and concrete business decisions. Even advanced recurrent neural networks (RNNs) or LSTMs will give us a lagged prediction; although at first sight they seem good, they carry a certain delay. We would have to approach the market with another strategy if we wanted to implement neural networks, but that will be the subject of another article. In this article we are interested in explaining how we can establish a maximum and minimum price for any asset (here we will work with American equities) with a certain probability.
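The idea of bounding a future price with a stated probability can be sketched with a simple Monte Carlo simulation of geometric Brownian motion. This is a minimal illustration, not the article's actual method; the drift and volatility values below (`mu=0.08`, `sigma=0.25`) are placeholders, not estimates from real market data:

```python
import math
import random

def mc_price_bounds(s0, mu, sigma, days, n_paths=10_000, conf=0.95, seed=42):
    """Simulate geometric Brownian motion paths and return the (low, high)
    terminal-price bounds that hold with probability `conf`."""
    rng = random.Random(seed)
    dt = 1 / 252  # one trading day, in years
    finals = []
    for _ in range(n_paths):
        s = s0
        for _ in range(days):
            z = rng.gauss(0.0, 1.0)
            # Exact GBM step: drift plus a random diffusion shock.
            s *= math.exp((mu - 0.5 * sigma**2) * dt + sigma * math.sqrt(dt) * z)
        finals.append(s)
    finals.sort()
    # Take the central `conf` band of the simulated terminal prices.
    lo_idx = int((1 - conf) / 2 * n_paths)
    hi_idx = int((1 + conf) / 2 * n_paths) - 1
    return finals[lo_idx], finals[hi_idx]

# Hypothetical asset at $100, one trading month (21 days) ahead.
low, high = mc_price_bounds(s0=100.0, mu=0.08, sigma=0.25, days=21)
```

In practice, `mu` and `sigma` would be estimated from the asset's historical log returns before running the simulation.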


The History of Deep Learning: Top Moments That Shaped the Technology

#artificialintelligence

The origins of deep learning and neural networks date back to the 1950s, when British mathematician and computer scientist Alan Turing predicted the future existence of a supercomputer with human-like intelligence and scientists began trying to rudimentarily simulate the human brain. Here's an excellent summary of how that process worked, courtesy of the very smart MIT Technology Review: A program maps out a set of virtual neurons and then assigns random numerical values, or "weights," to connections between them. These weights determine how each simulated neuron responds--with a mathematical output between 0 and 1--to a digitized feature such as an edge or a shade of blue in an image, or a particular energy level at one frequency in a phoneme, the individual unit of sound in spoken syllables. Programmers would train a neural network to detect an object or phoneme by blitzing the network with digitized versions of images containing those objects or sound waves containing those phonemes. If the network didn't accurately recognize a particular pattern, an algorithm would adjust the weights.
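The training loop the MIT Technology Review summary describes (random initial weights, outputs between 0 and 1, weights adjusted whenever the network gets a pattern wrong) can be illustrated with a single simulated neuron. This is a toy sketch with invented data, not a reconstruction of the historical systems in the article:

```python
import math
import random

random.seed(0)

def sigmoid(x):
    # Squashes any input to a value between 0 and 1.
    return 1.0 / (1.0 + math.exp(-x))

# One simulated neuron: random initial "weights" on its two input connections.
weights = [random.uniform(-1, 1) for _ in range(2)]
bias = 0.0

# Toy training set: the target is 1 when the inputs sum to a positive value.
data = [([1.0, 2.0], 1), ([-2.0, -1.0], 0), ([3.0, -1.0], 1), ([-1.5, 0.5], 0)]

for _ in range(500):  # "blitz" the neuron with the digitized examples
    for x, target in data:
        out = sigmoid(sum(w * xi for w, xi in zip(weights, x)) + bias)
        err = target - out
        # If the output is wrong, nudge each weight toward the target.
        weights = [w + 0.5 * err * xi for w, xi in zip(weights, x)]
        bias += 0.5 * err
```

After training, the neuron's output lands above 0.5 for the positive examples and below 0.5 for the negative ones; a deep network repeats this adjustment across millions of weights.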


Edge to Core AI Futures for OEMs

#artificialintelligence

The ability of computers to autonomously learn, predict, and adapt using massive datasets is driving innovation and competitive advantage across many industries and applications. Artificial intelligence (AI) is advancing rapidly, prompting businesses to hop aboard the next big wave of computing to uncover deeper insights, quickly resolve their most difficult problems, and differentiate their products and services. Whether the goal is to build a smarter city, power an intelligent car, or deliver personalized medicine, we've only just begun to understand the real potential of AI. For the implementation of AI, HPE OEM has the expertise, edge-to-core technologies and partner ecosystem to help explore different use cases, experiment with AI and data technologies, and build solutions that are enterprise-ready. OEM customers benefit at all stages of the journey, from formulating a roadmap through implementation and data migration.


Detecting Alzheimer's Earlier with the Help of Machine-Learning Algorithm

#artificialintelligence

Functional magnetic resonance imaging (fMRI) is a noninvasive diagnostic technique for brain disorders, such as Alzheimer's disease (AD). It measures minute changes in blood oxygen levels within the brain over time, giving insight into the local activity of neurons; however, fMRI has not been widely used in clinical diagnosis. Its limited use is due to the fact that fMRI data are highly susceptible to noise and that the fMRI data structure is very complicated compared with a traditional X-ray or MRI scan. Scientists from Texas Tech University now report they developed a type of deep-learning algorithm known as a convolutional neural network (CNN) that can differentiate among the fMRI signals of healthy people, people with mild cognitive impairment, and people with AD. Their findings, "Spatiotemporal feature extraction and classification of Alzheimer's disease using deep learning 3D-CNN for fMRI data," are published in the Journal of Medical Imaging; the work was led by Harshit Parmar, a doctoral student at Texas Tech University.
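The study's actual 3D-CNN architecture is not reproduced here, but the core operation such a network relies on, sliding a small cubic kernel through a volume and applying an activation to extract spatial features, can be sketched on a toy "scan" (all values below are invented):

```python
def conv3d(volume, kernel):
    """Valid 3-D convolution (cross-correlation, as in CNNs) of a volume
    with a cubic kernel, both given as nested lists, followed by ReLU."""
    D, H, W = len(volume), len(volume[0]), len(volume[0][0])
    k = len(kernel)
    out = []
    for z in range(D - k + 1):
        plane = []
        for y in range(H - k + 1):
            row = []
            for x in range(W - k + 1):
                # Weighted sum of the k*k*k neighborhood under the kernel.
                s = sum(
                    volume[z + dz][y + dy][x + dx] * kernel[dz][dy][dx]
                    for dz in range(k) for dy in range(k) for dx in range(k)
                )
                row.append(max(s, 0.0))  # ReLU activation
            plane.append(row)
        out.append(plane)
    return out

# A 4x4x4 toy "brain volume" and a 2x2x2 averaging kernel.
volume = [[[float(z + y + x) for x in range(4)] for y in range(4)] for z in range(4)]
kernel = [[[0.125] * 2 for _ in range(2)] for _ in range(2)]
feature_map = conv3d(volume, kernel)  # a 3x3x3 feature map
```

A real 3D-CNN stacks many such learned kernels (plus pooling and fully connected layers) and, for fMRI, extends the idea across the time dimension as well.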


Google AI executive sees a world of trillions of devices untethered from human care

ZDNet

If artificial intelligence is going to spread to trillions of devices, those devices will have to operate in a way that doesn't need a human to run them, a Google executive who leads a key part of the search giant's machine learning software told a conference of chip designers this week. "The only way to scale up to the kinds of hundreds of billions or trillions of devices we are expecting to emerge into the world in the next few years is if we take people out of the care and maintenance loop," said Pete Warden, who runs Google's effort to bring deep learning to even the simplest embedded devices. "You need to have peel-and-stick sensors," said Warden, referring to ultra-simple, dirt-cheap devices that require only tiny amounts of power and cost pennies. "And the only way to do that is to make sure that you don't need to have people going around and doing maintenance." Warden was the keynote speaker Tuesday at a microprocessor conference held virtually, The Linley Fall Processor Conference, hosted by chip analysts The Linley Group.


Save hundreds on these Python, AI and data science courses

Engadget

In this age of big data, companies worldwide need to sift through the avalanche of information at their disposal to enhance their products, services and overall profitability. Many companies rely on programming languages like Python and the advancements made in artificial intelligence (AI) and data science to get that job done. Right now, you can save hundreds on The Ultimate Python & Artificial Intelligence Certification Bundle, featuring nine in-depth courses and 38 hours of video content that brings you up to speed on everything Python, AI and data science.


What is Reinforcement Learning and 9 examples of what you can do with it.

#artificialintelligence

Reinforcement learning is a subset of machine learning. It enables an agent to learn through the consequences of its actions in a specific environment; it can be used to teach a robot new tricks, for example. Reinforcement learning is a behavioral learning model in which the algorithm learns from feedback on its own actions, steering it toward the best result. It differs from supervised learning in that no labeled sample data set trains the machine; instead, the agent learns by trial and error, guided by rewards.
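Learning purely from the consequences of actions, with no labeled training set, can be illustrated with tabular Q-learning in a tiny made-up environment (this example is not from the article):

```python
import random

random.seed(1)

# A corridor of 5 states: the agent starts at 0, the reward sits at state 4.
N_STATES, GOAL = 5, 4
ACTIONS = (-1, +1)  # step left, step right

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.2  # learning rate, discount, exploration

def greedy(s):
    """Pick the highest-valued action in state s, breaking ties at random."""
    best = max(Q[(s, a)] for a in ACTIONS)
    return random.choice([a for a in ACTIONS if Q[(s, a)] == best])

for _episode in range(200):
    s = 0
    while s != GOAL:
        # Mostly exploit what has been learned; sometimes explore.
        a = random.choice(ACTIONS) if random.random() < epsilon else greedy(s)
        s2 = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s2 == GOAL else 0.0  # the consequence of the action
        # Q-learning update: move Q(s, a) toward reward + discounted future value.
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, a2)] for a2 in ACTIONS) - Q[(s, a)])
        s = s2

# The learned behavior: in every non-goal state, step right toward the reward.
policy = {s: greedy(s) for s in range(GOAL)}
```

No one ever tells the agent "right is correct"; it discovers the policy solely from the rewards its own actions produce.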


The Next Generation Of Artificial Intelligence (Part 2)

#artificialintelligence

Deep learning pioneer Yoshua Bengio has provocative ideas about the future of AI. For the first part of this article series, see here. It has only been 8 years since the modern era of deep learning began at the 2012 ImageNet competition. Progress in the field since then has been breathtaking and relentless. If anything, this breakneck pace is only accelerating. Five years from now, the field of AI will look very different than it does today.


My failed startup: Lessons I learned by not becoming a millionaire

#artificialintelligence

Let's start with the one-minute version: I was part of the EF12 London cohort in 2019, where I met my co-founder. Together we built a privacy-preserving medical-data marketplace and AI platform based on federated deep learning. The purpose of the platform would have been to allow data scientists to train deep learning models on highly sensitive healthcare data without that data ever leaving the hospitals. At the same time, thanks to a novel data monetization strategy and marketplace component, hospitals would have been empowered to make money from the data they were generating. We received pre-seed funding at a $1 million valuation. Then the race for demo day began, with frantic product building and non-stop business development.
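The "models travel, data stays" idea behind federated deep learning can be sketched with federated averaging on a toy linear model. Everything below (the hospitals, their data, the learning rate) is invented for illustration and is not the startup's actual platform:

```python
import random

random.seed(0)

# Hypothetical setup: each "hospital" holds private (x, y) pairs drawn from
# the same underlying rule y = 2x + 1 plus noise; raw data never leaves a site.
def make_hospital_data(n):
    return [(x, 2 * x + 1 + random.gauss(0, 0.1))
            for x in (random.uniform(-1, 1) for _ in range(n))]

hospitals = [make_hospital_data(50) for _ in range(3)]
w, b = 0.0, 0.0  # global model parameters, held by the coordinator

for _round in range(100):
    local_models = []
    for data in hospitals:
        lw, lb = w, b  # each site starts from the current global model
        for x, y in data:  # local gradient steps on private data only
            pred = lw * x + lb
            lw -= 0.1 * 2 * (pred - y) * x
            lb -= 0.1 * 2 * (pred - y)
        local_models.append((lw, lb))
    # Only model parameters are shared and averaged, never the patient data.
    w = sum(m[0] for m in local_models) / len(local_models)
    b = sum(m[1] for m in local_models) / len(local_models)
```

After the rounds complete, the global model has recovered the shared rule (w near 2, b near 1) even though the coordinator never saw a single data point.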