GPU vs FPGA: The Battle For AI Hardware Rages On


The hardware requirements of AI and deep learning applications have grown exponentially. With the large number of computations these systems perform, there is a need for stronger, more reliable hardware to carry the load. This is where GPUs (Graphics Processing Units) and FPGAs (Field Programmable Gate Arrays) come into the picture; both have considerably sped up the development of AI and ML. FPGA and GPU vendors alike offer platforms that process raw data quickly and efficiently. While in an earlier article we compared these two AI chips for autonomous car makers, in this article we compare them for other data-intensive work such as deep learning.

Nvidia's self-driving car doesn't need road markings or even pavement


Most people will recognize Nvidia as a company that makes graphics cards for computers. But the company is also heavily involved in the auto industry, making computer chips with powerful processing capabilities that control everything from infotainment to self-driving systems. It turns out Nvidia is also developing its own self-driving system, and judging from this demonstration video, it could be among the most advanced. The key is machine learning (also referred to as deep learning), a form of artificial intelligence in which computers "learn" how to perform actions on their own, i.e. without being explicitly programmed. Nvidia says its system didn't require any programming for the object detection, mapping, path planning or control components.

Jetson Community Projects


This is a sample showing how to do real-time video analytics with the NVIDIA DeepStream SDK on an NVIDIA Jetson Nano device connected to Azure via Azure IoT Edge. DeepStream is a highly optimized video processing pipeline capable of running deep neural networks. It is a must-have tool whenever you have complex video analytics requirements, whether real-time or with cascading AI models. IoT Edge gives you the ability to run this pipeline next to your cameras, where the video data is generated, thus lowering your bandwidth costs and enabling scenarios with poor internet connectivity or privacy concerns. With this solution, you can turn cameras into sensors that know when there is an available parking spot, a missing product on a retail store shelf, an anomaly on a solar panel, a worker approaching a hazardous zone, etc.
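DeepStream pipelines are assembled from GStreamer elements. As a rough sketch of the kind of pipeline such a sample runs (not the sample's exact configuration), here is a minimal command using DeepStream's standard elements (`nvstreammux`, `nvinfer`, `nvdsosd`); the input file and the inference config path are placeholders you would replace with your own, and it requires a Jetson device with DeepStream installed:

```shell
# Decode an H.264 file, batch frames, run the primary detector,
# draw bounding boxes, and render the result on screen.
gst-launch-1.0 filesrc location=sample_720p.h264 ! h264parse ! nvv4l2decoder ! \
  m.sink_0 nvstreammux name=m batch-size=1 width=1280 height=720 ! \
  nvinfer config-file-path=config_infer_primary.txt ! \
  nvvideoconvert ! nvdsosd ! nveglglessink
```

In the Azure scenario described above, a pipeline like this runs inside an IoT Edge module on the Jetson, and detection results can be forwarded to the cloud as IoT Hub messages instead of (or in addition to) being rendered locally.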

Nvidia CEO: Software is eating the world, but AI is going to eat software MIT Tech Review


Tech companies and investors have recently been piling money into artificial intelligence, and plenty of it has been trickling down to chip maker Nvidia. The company's revenues have climbed as it has started making hardware customized for machine-learning algorithms and use cases such as autonomous cars. At its annual developer conference in San Jose, California, this week, the company's CEO Jensen Huang spoke to MIT Technology Review about how the machine-learning revolution is just getting started.

NVIDIA is officially buying Arm for $40 billion


Just as we expected, NVIDIA has announced that it's buying the semiconductor design company Arm for $40 billion. The deal will give NVIDIA an even larger presence in mobile computing, especially when it comes to bringing its AI technology to platforms like smartphones, PCs and self-driving cars. Arm, meanwhile, will get more support for its R&D efforts as well as access to NVIDIA's entire suite of products. And to cement its commitment, NVIDIA says it will build an AI supercomputer powered by Arm CPUs at Arm's Cambridge headquarters. "AI is the most powerful technology force of our time and has launched a new wave of computing," Jensen Huang, NVIDIA's CEO, said in a statement.