NVIDIA Unveils Jetson Nano 2GB: The Ultimate AI and Robotics

#artificialintelligence

NVIDIA expanded the NVIDIA Jetson AI at the Edge platform with an entry-level developer kit priced at just $59, opening the potential of AI and robotics to a new generation of students, educators and hobbyists. The Jetson Nano 2GB Developer Kit is designed for teaching and learning AI by creating hands-on projects in such areas as robotics and intelligent IoT. To support the effort, NVIDIA also announced the availability of free online training and AI-certification programs, which will supplement the many open-source projects, how-tos and videos contributed by thousands of developers in the vibrant Jetson community. "While today's students and engineers are programming computers, in the near future they'll be interacting with, and imparting AI to, robots," said Deepu Talla, vice president and general manager of Edge Computing at NVIDIA. "The new Jetson Nano is the ultimate starter AI computer that allows hands-on learning and experimentation at an incredibly affordable price."


NVIDIA Shield TV returns to all-time low of $129 before Black Friday

Engadget

If you like the Android TV ecosystem and want something a bit more powerful than a standard Chromecast, NVIDIA's Shield TV is a great streaming option. It's an even more compelling pick now that it's down to $129 in an early Black Friday sale, an all-time-low price that has come around before, but not very often. This streaming device earned a spot in our holiday gift guide this year for being one of the more powerful dongles you can get, thanks to NVIDIA's Tegra X1+ processor. The company claims this chip is 25 percent faster than the original X1, and in the Shield TV it's put to good use powering an AI-based HD-to-4K upscaler. That means lower-res footage will more closely resemble native 4K when streamed through the Shield TV.


[D] I have a Jetson Nano and my friend gave me their eGPU with an NVIDIA 750 TI. Is there a way I can Frankenstein them?

#artificialintelligence

It might be possible, but super rough if so. M.2 Key E PCIe riser cables exist, and ARM NVIDIA drivers exist for that GPU. Your eGPU likely has a Thunderbolt or USB-C port, so you'd need a card for that... which may not work with the x1 PCIe connection you get (if you're using the standard board, IIRC). You could pull the 750 Ti out and use a separate power supply for it. I feel like it could work, but I wouldn't necessarily trust my judgement on this.
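If the card does enumerate over that x1 link, a quick sanity check (a hedged sketch, assuming a Linux userspace with lspci installed and the ARM NVIDIA driver loaded) might look like:

```python
import subprocess

# Does the 750 Ti show up on the PCIe bus at all?
lspci = subprocess.run(["lspci"], capture_output=True, text=True).stdout
print([line for line in lspci.splitlines() if "NVIDIA" in line])

# If the ARM driver actually loaded, nvidia-smi should list the card.
try:
    print(subprocess.run(["nvidia-smi", "-L"], capture_output=True, text=True).stdout)
except FileNotFoundError:
    print("nvidia-smi not found; driver not installed?")
```

An empty list from the first check would mean the riser/adapter chain isn't working before drivers even enter the picture.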


Lenovo's Google-powered Smart Clocks hit all-time low prices for Black Friday

Engadget

You might have an easier time deciding on a Google Assistant smart display this Black Friday. Lenovo's Smart Clock is on sale now at Best Buy for $35, an all-time low and less than half its original $80 asking price. You can get it for the same price at Walmart. And if you don't need a touchscreen, the Smart Clock Essential is on sale for $25 (half its usual price) at Best Buy and Walmart. Both smart screens are very focused, and that's mostly a good thing.


NVIDIA Is Building an AI Supercomputer for Healthcare Research in England

#artificialintelligence

Semiconductor designer NVIDIA (NASDAQ:NVDA) today announced that it is building the United Kingdom's most powerful supercomputer. Dubbed Cambridge-1, the system will give healthcare researchers access to impressive artificial intelligence (AI) tools. The $52 million Cambridge-1 will be installed at the university it's named after, and is scheduled to come online by the end of 2020.


Nvidia's StyleGAN2: Analyzing and Improving the Image Quality of StyleGAN

#artificialintelligence

Nvidia has launched an upgraded version of StyleGAN that fixes characteristic image artifacts and further improves the quality of generated images. StyleGAN, the first image generation method of its type to produce strikingly realistic images, launched last year and was open-sourced in February 2019. StyleGAN2 redefines the state of the art in unconditional image modeling, both in terms of existing distribution quality metrics and perceived image quality. According to the research paper, StyleGAN2 improves on its predecessor with changes to both the model architecture and the training methods; notably, the AdaIN normalization in the generator is replaced with a weight demodulation step, which removes the blob-like artifacts that StyleGAN images often exhibited.
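A minimal sketch of that demodulation step, assuming PyTorch and the tensor layout used in the paper (the official implementation additionally folds this into a grouped convolution for speed):

```python
import torch

def modulate_demodulate(weight, style, eps=1e-8):
    """Per-sample StyleGAN2 weight modulation + demodulation.

    weight: [out_ch, in_ch, k, k]  shared conv kernel
    style:  [batch, in_ch]         per-sample scales from the mapping network
    """
    # Modulate: scale each input feature map's weights by its style coefficient.
    w = weight[None] * style[:, None, :, None, None]   # [batch, out_ch, in_ch, k, k]
    # Demodulate: rescale so each output channel has unit expected magnitude,
    # replacing AdaIN's explicit normalization of activations.
    demod = torch.rsqrt(w.pow(2).sum(dim=(2, 3, 4), keepdim=True) + eps)
    return w * demod

# Example: 64 output channels, 32 input channels, 3x3 kernels, batch of 4.
w = modulate_demodulate(torch.randn(64, 32, 3, 3), torch.rand(4, 32) + 0.5)
print(w.shape)  # torch.Size([4, 64, 32, 3, 3])
```

Because the statistics are baked into the weights rather than computed from activations, the generator no longer needs to sneak signal-strength information past a normalization layer, which is what the paper identifies as the cause of the artifacts.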


Adafruit BrainCraft HAT: Easy AI on Raspberry Pi

#artificialintelligence

Adafruit is very well known in the maker and electronics community. For 15 years the New York-based company has provided kits and boards for Arduino, BeagleBone and Raspberry Pi, and its latest board is the $39.95 BrainCraft HAT. Designed for use with the Raspberry Pi 4, this HAT is a hub of inputs and outputs, including a screen that shows image recognition results, built to facilitate machine learning projects on the Raspberry Pi. If you are keen to try out machine learning projects using TensorFlow Lite, the Raspberry Pi 4 is the ideal machine for taking your first steps: it is cheap to buy, has plenty of power and is adaptable to your needs.
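To give a feel for the workflow, here is a minimal TensorFlow Lite inference loop of the kind such projects run; the model filename is a placeholder, and on the Pi the input frame would come from the camera rather than random data:

```python
import numpy as np
from tflite_runtime.interpreter import Interpreter

# Load a quantized image-classification model (filename is hypothetical).
interpreter = Interpreter(model_path="mobilenet_v2_224_quant.tflite")
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Dummy frame with the shape the model expects, e.g. [1, 224, 224, 3] uint8.
frame = np.random.randint(0, 256, size=tuple(inp["shape"]), dtype=np.uint8)

interpreter.set_tensor(inp["index"], frame)
interpreter.invoke()
scores = interpreter.get_tensor(out["index"])[0]
print("top class id:", int(scores.argmax()))
```

On the BrainCraft HAT the top prediction would then be drawn to the built-in display.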


StradVision Joins NVIDIA Inception Program as Premier Partner

#artificialintelligence

StradVision has joined NVIDIA Inception, a virtual accelerator program designed to nurture companies that are revolutionizing industries with advancements in AI and data science. Distinguishing itself from other AI companies, StradVision has also been selected as one of the program's Premier Partners, an exclusive group within NVIDIA Inception's global network of over 6,000 startups. StradVision specializes in AI-based vision processing technology for Advanced Driver-Assistance Systems (ADAS) and Autonomous Vehicles (AVs) via SVNet, its flagship product: lightweight embedded software that allows vehicles to detect and identify objects on the road accurately, even in harsh weather or poor lighting. Thanks to StradVision's patented deep-neural-network technology, SVNet can be optimized for any hardware system.


Nvidia developed a radically different way to compress video calls

#artificialintelligence

Last month, Nvidia announced a new platform called Maxine that uses AI to enhance the performance and functionality of video conferencing software. The software uses a neural network to create a compact representation of a person's face. This compact representation can then be sent across the network, where a second neural network reconstructs the original image, possibly with helpful modifications. Nvidia says that its technique can reduce the bandwidth needs of video conferencing software by a factor of 10 compared to conventional compression techniques. It can also change how a person's face is displayed.
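Some back-of-the-envelope arithmetic makes the bandwidth claim concrete. Every number below is an illustrative assumption, not one of Nvidia's published figures: a conventional codec spending about 1 Mbps on a talking-head stream versus a handful of facial keypoints sent per frame.

```python
# Illustrative comparison; every constant here is an assumption.
fps = 30
codec_bps = 1_000_000                 # ~1 Mbps conventionally compressed stream

keypoints = 10                        # hypothetical facial keypoints per frame
floats_per_keypoint = 2               # x, y coordinates
bits_per_float = 32

keypoint_bps = fps * keypoints * floats_per_keypoint * bits_per_float
print(f"codec:     {codec_bps / 1e6:.2f} Mbps")
print(f"keypoints: {keypoint_bps / 1e3:.1f} kbps "
      f"({codec_bps / keypoint_bps:.0f}x smaller)")
```

Even this toy setup lands well beyond a 10x reduction; a real system also has to ship an initial reference frame and whatever residual data the reconstruction network needs, which eats into the savings.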


FPGAs could replace GPUs in many deep learning applications

#artificialintelligence

The renewed interest in artificial intelligence over the past decade has been a boon for the graphics card industry. Companies like Nvidia and AMD have seen a huge boost to their stock prices as their GPUs have proven very efficient for training and running deep learning models. Nvidia, in fact, has even pivoted from a pure GPU and gaming company to a provider of cloud GPU services and a competent AI research lab. But GPUs also have inherent flaws that pose challenges in putting them to use in AI applications, according to Ludovic Larzul, CEO and co-founder of Mipsology, a company that specializes in machine learning software. The solution, Larzul says, is the field-programmable gate array (FPGA), the area where his company specializes. An FPGA is a chip whose logic can be reconfigured after manufacturing, which can make it more efficient than general-purpose processors for a specific workload.