Results


Andrew Ng sees an eternal springtime for AI

ZDNet

"We may be in the eternal spring of AI," says Andrew Ng, a luminary in the field of machine learning. Ng, a co-founder and former head of the Google Brain team, sat down for an interview with ZDNet to discuss his just-published "playbook" for how to use the technology, which is available as a free download. He dismissed worries that artificial intelligence may be entering another of its periodic "winters," when interest and funding drop off sharply. Machine learning, in the form of "connectionist" theories that model computing loosely on the neurons of the brain, has gone through boom-and-bust cycles: it flowered initially with Frank Rosenblatt's "perceptron" in the late 1950s, cooled in the late 1960s, emerged again in the late 1980s only to fall out of favor once more, and has come suddenly back into vogue in the last several years. Those periodic coolings have been termed "AI winters."


New climate model to be built from the ground up

MIT News

The following news article is adapted from a press release issued by Caltech, in partnership with the MIT School of Science, the Naval Postgraduate School, and the Jet Propulsion Laboratory. Facing the certainty of a changing climate coupled with the uncertainty that remains in predictions of how it will change, scientists and engineers from across the country are teaming up to build a new type of climate model that is designed to provide more precise and actionable predictions. Leveraging recent advances in the computational and data sciences, the comprehensive effort capitalizes on vast amounts of data that are now available and on increasingly powerful computing capabilities both for processing data and for simulating the Earth system. The new model will be built by a consortium of researchers led by Caltech, in partnership with MIT; the Naval Postgraduate School (NPS); and the Jet Propulsion Laboratory (JPL), which Caltech manages for NASA. The consortium, dubbed the Climate Modeling Alliance (CliMA), plans to fuse Earth observations and high-resolution simulations into a model that represents important small-scale features, such as clouds and turbulence, more reliably than existing climate models.


Dual 8-bit breakthroughs bring AI to the edge

#artificialintelligence

This week, at the International Electron Devices Meeting (IEDM) and the Conference on Neural Information Processing Systems (NeurIPS), IBM researchers will showcase new hardware that will take AI further than it's been before: right to the edge. Our novel approaches for digital and analog AI chips boost speed and slash energy demand for deep learning, without sacrificing accuracy. On the digital side, we're setting the stage for a new industry standard in AI training with an approach that achieves full accuracy with eight-bit precision, accelerating training time by two to four times over today's systems. On the analog side, we report eight-bit precision--the highest yet--for an analog chip, roughly doubling accuracy compared with previous analog chips while consuming 33x less energy than a digital architecture of similar precision. These achievements herald a new era of computing hardware designed to unleash the full potential of AI.
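IBM's actual training techniques are described in its papers; as a rough illustration of what "eight-bit precision" means in practice, here is a minimal sketch (hypothetical helper names, plain Python) of symmetric linear quantization, the general idea behind representing values in eight bits:

```python
def quantize_8bit(values):
    """Map floats onto signed 8-bit integers (-127..127) with a shared scale."""
    scale = max(abs(v) for v in values) / 127 or 1.0
    return [round(v / scale) for v in values], scale

def dequantize(qvalues, scale):
    """Recover approximate floats from the 8-bit representation."""
    return [q * scale for q in qvalues]

weights = [0.52, -1.3, 0.004, 0.98]
q, s = quantize_8bit(weights)
approx = dequantize(q, s)
# Each recovered value is within half a quantization step (s/2) of the original.
```

The appeal for edge hardware is that each value needs a quarter of the storage and bandwidth of a 32-bit float; the research challenge IBM addresses is keeping training accurate despite the coarser resolution.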


Nvidia AI research points to an evolution of the chip business

ZDNet

What happens as more of the world's computer tasks get handed over to neural networks? That's an intriguing prospect, of course, for Nvidia, a company selling a whole heck of a lot of chips to train neural networks. The prospect cheers Bryan Catanzaro, who is the head of applied deep learning research at Nvidia. "We would love for model-based to be more of the workload," Catanzaro told ZDNet this week during an interview at Nvidia's booth at the NeurIPS machine learning conference in Montreal. Catanzaro was the first person doing neural network work at Nvidia when he took a job there in 2011 after receiving his PhD from the University of California at Berkeley in electrical engineering and computer science.


The Fourth Industrial Revolution Built On Blockchain And Advanced With AI

#artificialintelligence

Blockchain technology is so synonymous with cryptocurrencies, and especially Bitcoin, that it is almost as if the financial sector has usurped its potential. In times like these, when a bear market has befallen the cryptocurrency space, it is easy to lose sight of the revolutionary possibilities of blockchain technology. Since the original blockchain, Bitcoin, emerged, there has been a considerable focus on transactional blockchains, which have been at the forefront of mainstream understanding of the technology. Bitcoin is often the layman's first port of call, with stories of investing success obscuring the view of other possibilities. However, blockchain technology is moving along in an undercurrent, separate from the comings and goings of the cryptocurrency market and the financial interest it has garnered in just a few short years.


IBM boosts AI chip speed, bringing deep learning to the edge

#artificialintelligence

IBM is unveiling new hardware that brings power efficiency and improved training times to artificial intelligence (AI) projects this week at the International Electron Devices Meeting (IEDM) and the Conference on Neural Information Processing Systems (NeurIPS), with 8-bit precision for both its analog and digital AI chips. Over the last decade, computing performance for AI has improved at a rate of 2.5x per year, due in part to the use of GPUs to accelerate deep learning tasks, the company noted in a press release. However, this improvement is not sustainable: a general-purpose computing design tailored to AI will not be able to keep pace with hardware designed exclusively for AI training and development. Per the press release, "Scaling AI with new hardware solutions is part of a wider effort at IBM Research to move from narrow AI, often used to solve specific, well-defined tasks, to broad AI, which reaches across disciplines to help humans solve our most pressing problems." While traditional computing has been on a decades-long path of increasing address width--with most consumer, professional, and enterprise-grade hardware using 64-bit processors--AI is going in the opposite direction.


What You Need to Know About TensorFlow

#artificialintelligence

There are some detection problems in the world that only experts can solve, and by doing so are saving lives every day. Radiologists looking for intracerebral hemorrhage (ICH) save lives, but their time is scarce and expensive. But what if we could build an AI to perform this sort of detection? It is no simple task to train a CNN model, such as U-Net, to achieve this. But with the progress of deep learning libraries such as TensorFlow, the revolution of cloud providers such as AWS, Azure, and GCP, and deep learning platforms such as MissingLink, it's becoming increasingly feasible for startups to build an app at almost any scale--including to mimic the work of radiologists and other experts.
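The article doesn't show how a U-Net is built, but the basic operation such CNNs stack up is a 2D convolution sliding a small filter over an image. As a toy, dependency-free sketch (with a made-up "scan" and a Laplacian-like kernel, not a real ICH detector), the idea looks like this:

```python
def conv2d(image, kernel):
    """Valid-mode 2D convolution: slide the kernel over the image."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

# A toy "scan" with a bright 2x2 region; the kernel responds strongly at edges.
scan = [[0, 0, 0, 0],
        [0, 9, 9, 0],
        [0, 9, 9, 0],
        [0, 0, 0, 0]]
kernel = [[0, 1, 0],
          [1, -4, 1],
          [0, 1, 0]]
response = conv2d(scan, kernel)
```

In a real model the kernel weights are not hand-picked but learned from labeled scans, and hundreds of such filters are stacked in layers; that learning is what TensorFlow and the training platforms mentioned above automate.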


Amazon Wants You to Code the AI Brain for This Little Car

WIRED

Two years ago, Alphabet researchers made computing history when their artificial intelligence software AlphaGo defeated a world champion at the complex board game Go. Amazon now hopes to democratize the AI technique behind that milestone--with a pint-size self-driving car. The 1/18th-scale vehicle is called DeepRacer, and it can be preordered for $249; it will later cost $399. It's designed to make it easier for programmers to get started with reinforcement learning, the technique that powered AlphaGo's victory and is loosely inspired by how animals learn from feedback on their behavior. Although the approach has produced notable research stunts, such as bots that can play Go, chess, and complicated multiplayer electronic games, it isn't as widely used as the pattern-matching learning techniques used in speech recognition and image analysis.
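The learn-from-feedback loop described above can be sketched in a few lines. Below is a minimal tabular Q-learning example (not DeepRacer's or AlphaGo's actual algorithm, which use deep networks) on a toy corridor where an agent learns by trial and error to move right toward a reward:

```python
import random

random.seed(0)

# Toy corridor: states 0..4, reward only at the far right.
N_STATES, GOAL = 5, 4
ACTIONS = [-1, +1]                # move left / move right
alpha, gamma, eps = 0.5, 0.9, 0.1  # learning rate, discount, exploration rate

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

for episode in range(200):
    s = 0
    while s != GOAL:
        # Epsilon-greedy: mostly exploit the best-known action, sometimes explore.
        if random.random() < eps:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        s2 = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s2 == GOAL else 0.0
        # Q-learning update: nudge Q toward reward plus discounted future value.
        best_next = max(Q[(s2, b)] for b in ACTIONS)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s2

# After training, the greedy policy moves right from every state.
policy = [max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(GOAL)]
```

DeepRacer replaces the lookup table with a neural network and the corridor with camera images of a track, but the reward-driven update loop is the same idea.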


How To Become A Machine Learning Engineer: Learning Path

#artificialintelligence

We will walk you through all the aspects of machine learning, from simple linear regressions to the latest neural networks, and you will learn not only how to use them but also how to build them from scratch. A big part of this path is oriented toward Computer Vision (CV), because it is the fastest way to gain general knowledge, and experience from CV transfers easily to any ML area. We will use TensorFlow as our ML framework, as it is the most promising and production-ready. Learning works best if you study theoretical and practical materials at the same time, so you gain hands-on experience with the material as you learn it. Also, if you want to compete with other people on real-life problems, I would recommend registering on Kaggle; it can be a good addition to your resume as well.
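As a taste of the "build them from scratch" approach this path begins with, here is linear regression fit by gradient descent in plain Python (no framework), on made-up data generated from y = 2x + 1:

```python
# Fit y = w*x + b by gradient descent on mean squared error.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]   # generated from y = 2x + 1

w, b, lr = 0.0, 0.0, 0.05
for step in range(2000):
    # Gradients of MSE with respect to w and b, averaged over the data.
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad_w
    b -= lr * grad_b
# w converges to about 2.0 and b to about 1.0
```

The same update-weights-against-a-loss-gradient loop, scaled up to millions of parameters and run automatically by the framework, is what TensorFlow does when training a neural network.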