NVIDIA's AI Turns 2D Images Into 3D Models

#artificialintelligence

The team states that they trained their DIB-R neural network using multiple datasets, including images previously turned into 3D assets, 3D models presented from multiple angles, and more. NVIDIA wrote that it takes about two days to train the neural network to extrapolate the extra dimensions, after which it can transform a 2D photo into a 3D model in less than 100 milliseconds.
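To make the "single photo in, 3D geometry out" step concrete, here is a minimal PyTorch sketch of that kind of forward pass. It is not NVIDIA's DIB-R code: the tiny CNN encoder, the vertex count, and the random sphere-like template are placeholder assumptions standing in for the real architecture.

```python
import torch
import torch.nn as nn

# Minimal sketch of the "2D photo in, 3D vertices out" forward pass described
# above. This is NOT NVIDIA's DIB-R implementation: the small CNN encoder, the
# vertex count, and the random sphere template are placeholder assumptions.

NUM_VERTICES = 642  # placeholder; mesh-prediction models often deform a sphere

class Image2Mesh(nn.Module):
    def __init__(self, num_vertices: int = NUM_VERTICES):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Predict per-vertex (x, y, z) offsets applied to a fixed template shape.
        self.head = nn.Linear(128, num_vertices * 3)
        template = torch.randn(num_vertices, 3)
        self.register_buffer("template", template / template.norm(dim=1, keepdim=True))

    def forward(self, images: torch.Tensor) -> torch.Tensor:
        feats = self.encoder(images)                                  # (B, 128)
        offsets = self.head(feats).view(-1, self.template.shape[0], 3)
        return self.template + offsets                                # (B, V, 3)

model = Image2Mesh().eval()
with torch.no_grad():
    vertices = model(torch.rand(1, 3, 256, 256))  # a single RGB "photo"
print(vertices.shape)  # torch.Size([1, 642, 3]) -- one fast forward pass
```

In the actual DIB-R work, a predictor along these lines is paired with a differentiable rasterizer so that 2D rendering losses can supervise the 3D output during the two-day training run; that renderer is omitted from this sketch.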


Essentials

#artificialintelligence

Discover the latest AI research and find out how AI, machine learning, and advanced algorithms affect our lives, our jobs, and the economy, through expert articles that discuss the potential, limits, and consequences of AI.


Powering artificial intelligence: The explosion of new AI hardware accelerators

#artificialintelligence

AI's rapid evolution is producing an explosion in new types of hardware accelerators for machine learning and deep learning. Some people refer to this as a "Cambrian explosion," an apt metaphor for the current period of fervent innovation: the original Cambrian explosion, roughly half a billion years ago, was when most major animal lineages first appeared in the fossil record. From that point onward, these creatures, ourselves included, fanned out to occupy, exploit, and thoroughly transform every ecological niche on the planet. The range of innovative AI hardware-accelerator architectures continues to expand. Although you may think that graphics processing units (GPUs) are the dominant AI hardware architecture, that is far from the truth.
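As a small illustration of how software targets this widening range of accelerators, the sketch below shows one framework (PyTorch) choosing among back ends at run time. The availability checks are standard PyTorch calls; which of them report True depends entirely on the hardware present, and FPGAs or custom ASICs are reached through vendor-specific plugins not shown here.

```python
import torch

# Illustrative only: one framework's run-time view of the accelerator landscape.
# These availability checks are standard PyTorch APIs; accelerators beyond GPUs
# (FPGAs, custom ASICs) are exposed through vendor plugins not shown here.

def pick_device() -> torch.device:
    if torch.cuda.is_available():                  # NVIDIA (or ROCm-backed) GPUs
        return torch.device("cuda")
    mps = getattr(torch.backends, "mps", None)
    if mps is not None and mps.is_available():     # Apple-silicon GPU back end
        return torch.device("mps")
    return torch.device("cpu")                     # fallback: plain CPU

device = pick_device()
x = torch.randn(1024, 1024, device=device)
print(device, (x @ x).shape)                       # same code, different silicon
```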


Artificial intelligence is good for at least one thing – making hardware important again

#artificialintelligence

Red Hat Summit If you're cynical about artificial intelligence, here's one ray of sunshine for you: it has engineers around the globe focusing on improving number-crunching and computing performance right down to the silicon level. Rather than throwing thousands upon thousands of generic, boring servers at problems, techies are now doubling down on accelerating particular workloads, such as training neural-network models and AI inference, with faster processors, highly customized chips, FPGAs, GPUs, and similar technology. Daniel Riek, senior director of AI for Red Hat, said on Wednesday that the machine-learning software explosion has forced developers, and hardware and system designers, to go back and look at boosting per-chip throughput rather than scaling out platforms over warehouses of boxes. "Performance wasn't a key differentiator, it was scale that mattered. We traded off performance for scale," Riek said of the days before enthusiasm for AI was recently rekindled.


Deep Learning for Industrial IoT with NVIDIA

#artificialintelligence

Deploying valuable, measurable, and scalable IoT initiatives is top of mind for Fortune 500 businesses. Andrew will discuss where AI in industrial IoT is heading, how to apply the significant advances in GPU deep learning to model training, how to develop an end-to-end AI infrastructure for your training and inference needs as you deploy AI across the industrial IoT industry, and how to implement AI software applications that increase users' productivity.