Hardware


[D] I have a Jetson Nano and my friend gave me their eGPU with an NVIDIA 750 TI. Is there a way I can Frankenstein them?

#artificialintelligence

It might be possible, but super rough if so. M.2 Key E PCIe riser cables exist, and ARM NVIDIA drivers exist for that GPU. Your eGPU enclosure likely has a Thunderbolt or USB-C port, so you'd need a card for that... which may not work with the x1 PCIe connection you get (if you're using the standard board, IIRC). Alternatively, you could pull the 750 Ti out of the enclosure and power it from a separate supply. I feel like it could work, but I wouldn't necessarily trust my judgement on this.


Lenovo's Google-powered Smart Clocks hit all-time low prices for Black Friday

Engadget

You might have an easier time deciding on a Google Assistant smart display this Black Friday. Lenovo's Smart Clock is on sale now at Best Buy for $35, an all-time low and less than half its original $80 asking price. You can get it for the same price at Walmart. And if you don't need a touchscreen, the Smart Clock Essential is on sale for $25 (half its usual price) at Best Buy and Walmart. Both smart screens are very focused, and that's mostly a good thing.


NVIDIA Is Building an AI Supercomputer for Healthcare Research in England

#artificialintelligence

Semiconductor designer NVIDIA (NASDAQ: NVDA) today announced that it is building the United Kingdom's most powerful supercomputer. Dubbed Cambridge-1, the system will give healthcare researchers access to impressive artificial intelligence (AI) tools. The $52 million Cambridge-1 will be installed at the university it's named after and is scheduled to come online by the end of 2020.


Nvidia's StyleGAN2: Analyzing and Improving the Image Quality of StyleGAN

#artificialintelligence

Nvidia has launched an upgraded version of StyleGAN that fixes characteristic artifacts and further improves the quality of generated images. StyleGAN, the first image generation method of its kind to produce highly realistic images, was launched last year and open-sourced in February 2019. StyleGAN2 redefines the state of the art in unconditional image modeling, both in terms of existing distribution quality metrics and perceived image quality. According to the research paper, StyleGAN2 improves several methods and characteristics, with changes to both the model architecture and the training methods.


Adafruit BrainCraft HAT: Easy AI on Raspberry Pi

#artificialintelligence

Adafruit is well known in the maker and electronics community. For 15 years the New York-based company has provided kits and boards for Arduino, BeagleBone and Raspberry Pi, and its latest board is the $39.95 BrainCraft HAT. Designed for use with the Raspberry Pi 4, this HAT is a hub of inputs and outputs, including a screen that shows image recognition results, to facilitate machine learning projects on the Raspberry Pi. If you are keen to try out machine learning projects using TensorFlow Lite, the Raspberry Pi 4 is an ideal machine for taking your first steps: it is cheap to buy, has plenty of power and is adaptable to your needs.


StradVision Joins NVIDIA Inception Program as Premier Partner

#artificialintelligence

StradVision has joined NVIDIA Inception, a virtual accelerator program designed to nurture companies that are revolutionizing industries with advancements in AI and data science. Distinguishing itself as a collaborator of choice from among other AI companies, StradVision has also been selected as one of the program's Premier Partners, an exclusive group within NVIDIA Inception's global network of over 6,000 startups. StradVision specializes in AI-based vision processing technology for Advanced Driver-Assistance Systems (ADAS) and Autonomous Vehicles (AVs) via its flagship product, SVNet: lightweight embedded software that allows vehicles to detect and identify objects on the road accurately, even in harsh weather conditions or poor lighting. Thanks to StradVision's patented Deep Neural Network-enabled technology, SVNet can be optimized for any hardware system.


Nvidia developed a radically different way to compress video calls

#artificialintelligence

Last month, Nvidia announced a new platform called Maxine that uses AI to enhance the performance and functionality of video conferencing software. The software uses a neural network to create a compact representation of a person's face. This compact representation can then be sent across the network, where a second neural network reconstructs the original image, possibly with helpful modifications. Nvidia says that its technique can reduce the bandwidth needs of video conferencing software by a factor of 10 compared to conventional compression techniques. It can also change how a person's face is displayed.
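To give a feel for why sending a compact face representation saves so much bandwidth, here is a toy numerical sketch. It is not Nvidia's actual pipeline: the frame size, the keypoint count, and the stand-in encode/decode functions are all assumptions made up for illustration, whereas a real system would use trained neural networks on both ends.

```python
import numpy as np

FRAME_SHAPE = (480, 640, 3)   # one uncompressed RGB video frame (assumed size)
NUM_KEYPOINTS = 10            # compact face representation (assumed count)

def encode(frame: np.ndarray) -> np.ndarray:
    """Stand-in for a neural encoder: reduce a frame to a few keypoints."""
    # A real model would detect facial landmarks; here we fabricate
    # NUM_KEYPOINTS (x, y) pairs just to measure their size on the wire.
    h, w, _ = frame.shape
    rng = np.random.default_rng(0)
    return rng.uniform([0, 0], [w, h], size=(NUM_KEYPOINTS, 2)).astype(np.float32)

def decode(reference: np.ndarray, keypoints: np.ndarray) -> np.ndarray:
    """Stand-in for a neural decoder: rebuild a frame from a reference + keypoints."""
    # A real model would warp/animate the reference frame using the keypoints;
    # this placeholder just returns the reference unchanged.
    return reference

frame = np.zeros(FRAME_SHAPE, dtype=np.uint8)
keypoints = encode(frame)

raw_bytes = frame.nbytes        # 480 * 640 * 3 = 921,600 bytes per raw frame
compact_bytes = keypoints.nbytes  # 10 keypoints * 2 coords * 4 bytes = 80 bytes
print(raw_bytes, compact_bytes, raw_bytes // compact_bytes)
```

Even against a conventional codec (rather than raw frames), a payload this small per frame is what makes the claimed 10x bandwidth reduction plausible.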


FPGAs could replace GPUs in many deep learning applications

#artificialintelligence

The renewed interest in artificial intelligence in the past decade has been a boon for the graphics cards industry. Companies like Nvidia and AMD have seen a huge boost to their stock prices as their GPUs have proven to be very efficient for training and running deep learning models. Nvidia, in fact, has even pivoted from a pure GPU and gaming company to a provider of cloud GPU services and a competent AI research lab. But GPUs also have inherent flaws that pose challenges in putting them to use in AI applications, according to Ludovic Larzul, CEO and co-founder of Mipsology, a company that specializes in machine learning software. The solution, Larzul says, is field-programmable gate arrays (FPGAs), an area where his company specializes. An FPGA is a type of processor whose circuitry can be reconfigured after manufacturing, which can make it more efficient than a general-purpose processor for specific workloads.


Nvidia Q3 revenue $4.73 billion, EPS $2.91 beat expectations, Q4 view higher as well

ZDNet

Shares of graphics chip titan Nvidia declined slightly in late trading after the company this afternoon reported fiscal Q3 revenue and profit that comfortably exceeded Wall Street's expectations. CEO Jensen Huang said NVIDIA is "firing on all cylinders." Demand, he said, for the company's latest video game compute cards, the GeForce RTX GPUs, "is overwhelming." Revenue in the three months ended in October rose 57%, year over year, to $4.73 billion, yielding EPS of $2.91. Analysts had been modeling $4.42 billion in revenue and $2.58 per share in earnings. Nvidia's revenue from its data center business rose 162%, year over year, to $1.9 billion, the company said. Revenue from video gaming was up 37% at $2.27 billion. For the current quarter, the company sees revenue of $4.704 billion to $4.896 billion, well ahead of consensus for $4.4 billion in revenue and $2.54 per share in earnings. Nvidia has been riding the success of its new generation of GPUs, the A100 series, introduced in May. Huang said that cloud companies are "deploying it globally," referring to the parts, and that "our customers are moving some of the world's most popular AI services into production, powered by NVIDIA technology."
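As a quick arithmetic sanity check of the figures above (the prior-year revenue is implied by the 57% growth rate, not stated in the article, and the segment-share figure is a derived observation):

```python
# All figures in billions of USD, taken from the reported results.
q3_revenue = 4.73
yoy_growth = 0.57

# Implied year-ago quarterly revenue: 4.73 / 1.57, roughly 3.01.
implied_prior_year = q3_revenue / (1 + yoy_growth)

# Data center plus gaming cover most, but not all, of total revenue.
data_center = 1.9
gaming = 2.27
segment_share = (data_center + gaming) / q3_revenue  # roughly 0.88

# Midpoint of the guidance range for the current quarter.
guidance_low, guidance_high = 4.704, 4.896
guidance_mid = (guidance_low + guidance_high) / 2    # 4.8

print(round(implied_prior_year, 2), round(segment_share, 2), guidance_mid)
```

The guidance range quoted in earnings releases is often stated as a midpoint plus or minus a percentage; here $4.8 billion plus or minus 2% reproduces the $4.704-$4.896 billion range exactly.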


Nvidia unveils new DGX Station AI workstation, joining market for AI appliances

ZDNet

GPU titan Nvidia on Monday morning unveiled what it calls AI computing on your desktop, the DGX Station A100, which will be sold by a variety of partners and is expected to be available "this quarter," Nvidia said. The announcement comes at the start of SC20, a supercomputing conference usually held in person every year and this time around held as a virtual event given the COVID-19 pandemic. Nvidia calls the DGX Station A100 an "AI appliance you can place anywhere." The box, measuring 25 inches high, 10 inches across, and 20 inches deep, comes with four GPUs: either the existing 40-gigabyte A100 GPUs or a newly unveiled 80-gigabyte version. It weighs 91 lbs in its base configuration; fully outfitted, it tops out at 127 lbs.