Graphics


Boosting machine learning workflows with GPU-accelerated libraries

#artificialintelligence

Abstract: In this article, we demonstrate how to use RAPIDS libraries to accelerate CPU-based machine learning libraries such as pandas, sklearn and NetworkX. We use a recommendation case study in which the GPU-based library ran the PageRank algorithm 44x faster and Personalized PageRank 39x faster. Scikit-learn and pandas are part of most data scientists' toolboxes because of their friendly APIs and wide range of useful resources, from model implementations to data transformation methods. However, many of these libraries still rely on CPU processing, and libraries like Scikit-learn do not intend to scale up to GPU processing or scale out to cluster processing. To overcome this drawback, RAPIDS offers a suite of open source Python libraries that take these widely used data science solutions and boost them with GPU-accelerated implementations while still providing a similar API.
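As a rough illustration of how similar the APIs are, here is a minimal sketch of the GPU workflow described above, using cuDF and cuGraph; the file name, column names, and seed vertex are illustrative placeholders, not details from the article:

    # Minimal RAPIDS sketch: pandas-like loading with cuDF, GPU PageRank with cuGraph.
    # Assumes a RAPIDS install and a hypothetical edge list file "edges.csv".
    import cudf
    import cugraph

    # Read the edge list straight into GPU memory (API mirrors pandas.read_csv).
    edges = cudf.read_csv("edges.csv", names=["src", "dst"], dtype=["int32", "int32"])

    # Build the graph and run PageRank entirely on the GPU.
    G = cugraph.Graph()
    G.from_cudf_edgelist(edges, source="src", destination="dst")
    pagerank_scores = cugraph.pagerank(G)

    # Personalized PageRank: bias the ranking toward a seed vertex,
    # e.g. a user's node in a recommendation graph.
    seed = cudf.DataFrame({"vertex": [0], "values": [1.0]})
    personalized_scores = cugraph.pagerank(G, personalization=seed)

    print(pagerank_scores.nlargest(5, "pagerank"))

Swapping NetworkX's nx.pagerank for cugraph.pagerank in this way is the kind of change behind the 44x and 39x speedups the article reports.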


Nvidia's AI-powered scaling makes old games look better without a huge performance hit

#artificialintelligence

Nvidia's latest game-ready driver includes a tool that could let you improve the image quality of games that your graphics card can easily run, alongside optimizations for the new God of War PC port. The tech is called Deep Learning Dynamic Super Resolution, or DLDSR, and Nvidia says you can use it to make "most games" look sharper by running them at a higher resolution than your monitor natively supports. DLDSR builds on Nvidia's Dynamic Super Resolution tech, which has been around for years. Essentially, regular old DSR renders a game at a higher resolution than your monitor can handle and then downscales it to your monitor's native resolution. This leads to an image with better sharpness but usually comes with a dip in performance (you are asking your GPU to do more work, after all). So, for instance, if you had a graphics card capable of running a game at 4K but only had a 1440p monitor, you could use DSR to get a boost in clarity.
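For intuition, here is a toy sketch of the supersampling idea behind plain DSR: render at a higher resolution, then downscale to the display's native resolution. The box filter below is a stand-in; DLDSR's point is to replace a simple filter like this with a learned one so a smaller supersample can look comparable:

    import numpy as np

    def downscale(frame: np.ndarray, factor: int) -> np.ndarray:
        """Average each factor x factor block of a (H, W, 3) frame."""
        h, w, c = frame.shape
        h2, w2 = h // factor, w // factor
        frame = frame[: h2 * factor, : w2 * factor]  # crop to a clean multiple
        return frame.reshape(h2, factor, w2, factor, c).mean(axis=(1, 3))

    hi_res = np.random.rand(2160, 3840, 3)  # stand-in for a frame rendered at 4K
    native = downscale(hi_res, 2)           # down to the monitor's 1080p
    print(native.shape)                     # (1080, 1920, 3)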


Artificial Intelligence Roundup: Week #50

#artificialintelligence

Plainsight's Carlos Anchia and Elizabeth Spears recently sat down with Grant Larsen of ClickAI Radio to discuss how AI ethics can affect your business. Meta AI has shared a fascinating way to generate character animation from a single unique drawing. The demonstration shows how it can turn drawings from a child into fun entertainment, but you can imagine how this technique could be used to help anyone create interactive art for digital experiences, such as augmented and virtual reality. Meta used computer vision and artificial intelligence to bring these unique characters to life.


CES 2022: AMD, Intel, and Nvidia make CPUs and GPUs buddy up

#artificialintelligence

Late last year, I wrote about Apple's first M1 series-powered MacBook Pros and how the company spared no opportunity to bring out the big benchmark guns against its previous efforts as well as rivals. At CES, the empires (at least those that rule PC chips) struck back, with AMD, Intel, and Nvidia all announcing new flagship versions that address the need to deliver more performance more efficiently. Among the techniques they've adopted, all three have boosted performance and efficiency by tapping into the versatility of the Windows ecosystem to find new ways for CPUs and GPUs to work together. With AMD having the longest history of offering both CPUs and discrete GPUs, it's been no surprise to see the company embrace more intelligent power shifting between the two. The company has upgraded SmartShift, its technology for routing computational load between CPU and GPU, to SmartShift Max.



How does artificial intelligence learn? -- Design and Animation

#artificialintelligence

For this project, Champ mentioned he had a lot of fun art directing, designing and animating this piece. The finished work went beyond what he expected, and in the end Champ says, "I felt that I really pushed myself. I can say that I'm very proud of it."


Dogs seem to know the basic way objects should behave, study claims

Daily Mail - Science & tech

Dogs have a sense of the basic way objects should behave, according to scientists, who say they stare longer if a computer animation breaks the laws of physics. Humans use a process known as 'contact causality' from an early age to make sense of the physical environment, but little is known about the processes that non-primate animals use to make sense of the world and how things work. To better understand this in dogs, a team at the University of Veterinary Medicine, Vienna, adapted an eye-tracking system used on human infants. Dogs were presented with realistic 3D animations of balls that either obeyed or broke Newton's basic laws of physics, while the researchers tracked their pupil dilation and attention. The animals tracked the movements of the balls closely throughout the study, but their pupils were larger when objects in the animations broke the laws of physics.
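For readers curious what the underlying comparison looks like, here is a minimal sketch of the kind of analysis such a study implies, with entirely made-up pupil-diameter numbers; only the shape of the test is illustrative:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    plausible = rng.normal(4.0, 0.3, size=30)  # hypothetical pupil diameters (mm)
    violation = rng.normal(4.4, 0.3, size=30)  # wider pupils on physics-breaking trials

    # Compare mean pupil size on violation trials vs. physically plausible trials.
    t, p = stats.ttest_ind(violation, plausible)
    print(f"plausible mean {plausible.mean():.2f} mm, "
          f"violation mean {violation.mean():.2f} mm, t={t:.2f}, p={p:.4f}")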


Animation Using AI

#artificialintelligence

Films may one day be directed by robots and animated, as well as produced, by deep learning algorithms. With the help of big data in the media and entertainment industry, AI is prominently used in animation. There are indications that certain jobs no longer need to be performed manually, and AI automation will do them effectively. Still, there is greater demand for skilled people who can train deep learning algorithms to perform routine tasks, like making a digital character look lifelike. AI would help creative artists concentrate on more captivating effects rather than on labor-intensive, frame-by-frame editing techniques.
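As a concrete picture of the routine work being automated, here is a toy in-betweening sketch; production tools use learned models, but the placeholder linear blend below shows the frame-by-frame task they take over:

    import numpy as np

    def inbetween(key_a: np.ndarray, key_b: np.ndarray, n: int) -> list:
        """Generate n interpolated frames between two equal-shape keyframes."""
        return [(1 - t) * key_a + t * key_b
                for t in np.linspace(0.0, 1.0, n + 2)[1:-1]]

    key_a = np.zeros((64, 64, 3))           # hypothetical keyframe images
    key_b = np.ones((64, 64, 3))
    tweens = inbetween(key_a, key_b, n=3)   # three in-between frames
    print(len(tweens), tweens[1].mean())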


5 killer Radeon GPU features that level up your gaming experience

PCWorld

Today's GPUs are so capable that you might not realize that, out of the box, you could be leaving performance on the table. Indeed, if you thought you were "locked in" to the performance limits of your Radeon GPU at the time of purchase, know this: you can "unlock" more performance and even more eye candy to bring your graphics to another level. We'll show you how free, easy, and fun it is to "boost" your GPU! (To learn more about today's graphics hardware, see our roundup of the best GPUs for PC gaming.) As the name implies, Radeon Boost is a variation on the resolution-altering tools that intelligently increase GPU performance. It takes its cues from movement on the screen, as opposed to a traditional frames-per-second metric.
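Conceptually, motion-driven scaling works as in the sketch below; the thresholds and scale factors are invented for illustration and are not AMD's actual values:

    def render_scale(motion_magnitude: float) -> float:
        """Map per-frame screen motion (0..1) to a render-resolution scale."""
        if motion_magnitude > 0.5:   # fast motion: blur hides the lower detail
            return 0.5
        if motion_magnitude > 0.2:   # moderate motion: mild downscale
            return 0.75
        return 1.0                   # static scene: full native resolution

    for motion in (0.0, 0.3, 0.8):
        print(f"motion={motion:.1f} -> render at {render_scale(motion):.0%} resolution")

The idea is that sharpness loss is hard to notice while the camera is moving quickly, so resolution can be traded for frame rate exactly when the trade is least visible.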


GPU From Imagination Works With RISC-V - AI Summary

#artificialintelligence

The effort to create a full-fledged graphics processor for RISC-V chip designs, an emerging competitor to x86 and ARM, is gaining steam. Next year, special interest groups at RISC-V will expand their focus to extensions for shaders and advanced matrix operations, which are important for artificial intelligence and machine learning, Mark Himelstein, chief technology officer at RISC-V, told The Register. "There is no reason why you could not integrate C-series -- which is the part that has ray tracing -- with RISC-V," David Harold, chief marketing officer at Imagination, told The Register. Andes Technology, which creates RISC-V chip designs, has verified that Imagination's GPUs work with RISC-V, and so has RIOS Lab, which has David Patterson, vice chair of the board at RISC-V Foundation, on staff. The need for a GPU on RISC-V could become fundamental as the chip architecture gains importance, Shreyas Derashri, vice president of compute at Imagination, told The Register.