Chip industry is going to need a lot more software to catch Nvidia's lead in AI

ZDNet

Anil Mankar, head of product development at AI chip startup BrainChip, presented details on the company's technology Tuesday at the prestigious Linley Fall Processor conference. The semiconductor industry is in the midst of a renaissance in chip design and performance improvement, but the conference made clear it will take a lot more software to catch up with graphics chip titan Nvidia. The Linley Fall Processor conference, which is taking place as a virtual event this week and next, is one of the main meet-and-greet events each year for promising young chip companies. To kick off the show, conference host Linley Gwennap, a semiconductor analyst for two decades, offered a keynote Tuesday morning in which he argued that software remains the stumbling block for every company that wants to challenge Nvidia's lead in processing artificial intelligence.


Nvidia makes a clean sweep of MLPerf predictions benchmark for artificial intelligence

ZDNet

Graphics chip giant Nvidia wiped the floor with its competition in a benchmark set of tests released Wednesday afternoon, demonstrating better performance on a host of artificial intelligence tasks. The benchmark results, announced by the MLPerf organization, an industry consortium that administers the tests, showed Nvidia getting better speed on a variety of tasks that use neural networks, from categorizing images to recommending which products a person might like. Predictions are the part of AI where a trained neural network produces output on real data, as opposed to the training phase, when the neural network is first being refined. Benchmark results on training tasks were announced by MLPerf back in July. Many of the scores in the test results pertain to Nvidia's T4 chip, which has been on the market for some time, but even more impressive results were reported for its A100 chips, unveiled in May.


Lenovo Smart Clock Essential review: Basic doesn't mean bad

Engadget

One of our favorite gadgets from 2019 was the Google-powered Lenovo Smart Clock. It doesn't have all the bells and whistles of a typical Google smart display, but its alarm clock features, affordable price point and small form factor more than make up for it. Recently, however, the company debuted an even simpler version of the device, appropriately called the Lenovo Smart Clock Essential. With the Essential, the pretense of a smart display is gone altogether; the LCD screen has been replaced with a basic LED display. As a result, I don't quite like it as much as the original Lenovo Smart Clock, but it's also $30 cheaper (the Essential retails for $50 while the original Smart Clock is $80) and if all you really want is an alarm clock with some Google Assistant smarts, then the Essential certainly fits the bill. At its core, the Lenovo Smart Clock Essential is simply a Google-powered smart speaker with a built-in alarm clock.


Deconstructing Maxine, Nvidia's AI-powered video-conferencing technology

#artificialintelligence

This article is part of "Deconstructing artificial intelligence," a series of posts that explore the details of how AI applications work. One of the things that caught my eye at Nvidia's flagship event, the GPU Technology Conference (GTC), was Maxine, a platform that leverages artificial intelligence to improve the quality and experience of video-conferencing applications in real time. Maxine uses deep learning for resolution improvement, background noise reduction, video compression, face alignment, and real-time translation and transcription. In this post, which marks the first installment of our "Deconstructing artificial intelligence" series, we will take a look at how some of these features work and how they tie in with AI research done at Nvidia. We'll also explore the pending issues and the possible business model for Nvidia's AI-powered video-conferencing platform.


Check Out NVIDIA's Cloud-AI Video-Streaming Platform

#artificialintelligence

NVIDIA has recently announced Project Maxine, a cloud-native streaming video AI platform for applications like video calls. Using AI, the platform detects key points on a face, transmits only the changes to those points, and re-animates the face from them on the receiving end. The AI also lets you reorient your face so that you appear to be making eye contact with each person on the call individually. You can turn the tool on and become an alien or get a stylized face. What is more, Maxine allows users to remove background noise, see better in low light, replace the background, and more.


Google Coral Dev Board Mini SBC Brings Raspberry Pi-Sized AI Computing To The Edge

#artificialintelligence

Single-board computers (SBCs) are wildly popular AI development platforms and excellent tools to teach students of all ages how to code. The de facto standard in SBCs has been the Raspberry Pi family of mini computers. NVIDIA of course has its own lineup of programmable AI development platforms in its Jetson family, including the recently-announced low cost version of the Jetson Nano. There are a host of others from the likes of ASUS, Hardkernel, and Google. Google's Coral development kit was a rather pricey option at $175, but now the same power is much more affordable.


Lenovo Smart Clock Essential review: A great budget smart speaker

PCWorld

I've often wondered why Google doesn't come out with an answer to Amazon's Echo Dot with Clock. Lenovo must have been on the same wavelength, because that's just what the Lenovo Smart Clock Essential is. Actually, it's a better value than the Echo Dot with Clock, because it simultaneously displays all the information you want most frequently, not just the time, and it does so for $10 less than the 4th-gen Echo Dot with Clock. A shrunken sibling of the Lenovo Smart Clock, the Essential is a smart speaker with a 4-inch LED display that shows the current time (with an a.m./p.m. indicator, unless it's set to 24-hour mode), the day of the week (the date would be more useful), the current outdoor temperature (obtained via the internet), and an indicator for an alarm (if one is set). Four LEDs on its face light up when you say the 'Hey Google' wake word.


Nvidia leaps forward into AI and Supercomputing

#artificialintelligence

Most of you are probably familiar with chip giants like Intel and AMD, which command a bigger share of the computing processor market, but this 1993 entrant to the chip market has solidified its reputation as a big name in the arena. Although best known for its graphics processing units (GPUs), with GeForce as its primary and most popular product line, the company also provides system-on-a-chip units (SoCs) for the mobile computing and automotive markets. Since 2014, Nvidia has begun to diversify its business beyond the niche markets of gaming, automotive electronics, and mobile devices. It is now venturing into AI, providing parallel processing capabilities that allow researchers and scientists to efficiently run high-performance applications. Let's review some of these endeavors.


This $59 A.I. Kit Could Change How You Think About Smart Devices Forever

#artificialintelligence

Artificial intelligence has been heralded as the next wave of computing for years, but learning or even tinkering with it has required access to expensive hardware with powerful GPUs capable of crunching massive data sets. That's starting to change with the debut of cheap all-in-one A.I. computers from companies like Nvidia, which introduced its latest Jetson Nano A.I. developer kit this week for just $59. The Jetson is a full computer in a tiny package, similar to a Raspberry Pi, that lets you hack on projects or learn from home, making A.I. accessible to a much broader audience. The debut of the Raspberry Pi in 2012 was a watershed moment for computing because it made computers accessible in a tiny, all-in-one package for just $35. It meant that hobbyists like me could buy a full computer and hack around with ideas, like building a magic mirror or DIY smart screen.


Nvidia will power world's fastest AI supercomputer, to be located in Europe – TechCrunch

#artificialintelligence

Nvidia is going to be powering the world's fastest AI supercomputer, a new system dubbed "Leonardo" that's being built by CINECA, the Italian multi-university consortium and global supercomputing leader. The Leonardo system will offer as much as 10 exaflops of FP16 AI performance and be made up of more than 14,000 Nvidia Ampere-based GPUs once completed. Leonardo will be one of four new supercomputers supported by a cross-European effort to advance high-performance computing capabilities in the region, which will eventually offer advanced AI capabilities for processing applications across both science and industry. Nvidia will also supply its Mellanox HDR InfiniBand networks to the project to enable performance across the clusters with low-latency, high-bandwidth connections. The other computers in the effort include MeluXina in Luxembourg and Vega in Slovenia, as well as a new supercomputing system coming online in the Czech Republic.