It's called the Snapdragon 8 Plus Gen 1, which just rolls off the tongue, and Qualcomm says it'll offer 10 percent faster CPU performance, 10 percent faster GPU clocks, and -- get this -- use 15 percent less power for "nearly 1 hour" of extra gameplay or, say, 50 minutes of social media browsing. Technically, Qualcomm says it's achieved "up to 30 percent" better power efficiency from both the CPU and GPU, and 20 percent better AI performance per watt, but that doesn't necessarily all translate into more battery life -- some of it's about performance, too. Qualcomm is also touting better sustained performance from the new chip -- theoretically maintaining its clock speed for longer as it heats up while gaming or tapping into 5G. Of course, that all depends on how phone manufacturers decide to cool the chip. The company's not breaking down where the extra performance and efficiency gains are coming from, but you can see some of the chip's other features in the slide above, even though many of them (like Wi-Fi, Bluetooth, 10Gbps of theoretical 5G, and 8K HDR video capture) haven't changed from the original Snapdragon 8 Gen 1. Qualcomm says it'll live alongside that older chip, so you can probably expect a price premium. Qualcomm's also announcing a new Snapdragon 7 Gen 1 today, telling journalists that it's aimed at gamers, with a 20 percent graphics performance boost over the prior generation and the trickle-down of features like its "Adreno Frame Motion Engine," which makes games look smoother by interpolating frames.
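Qualcomm hasn't published how the Adreno Frame Motion Engine works internally; real interpolators typically use motion vectors. As a rough illustration of the basic idea of frame interpolation -- synthesizing an extra frame between two rendered ones to double the perceived frame rate -- here's a deliberately naive NumPy sketch that just blends two frames. All values are invented for the example.

```python
import numpy as np

def interpolate_midframe(frame_a, frame_b):
    """Naive frame interpolation: average two consecutive frames to
    synthesize one in between. Real engines (like AFME presumably does)
    use motion estimation instead of a plain blend."""
    blended = (frame_a.astype(np.float32) + frame_b.astype(np.float32)) / 2
    return blended.astype(frame_a.dtype)

# Two toy 2x2 RGB frames: all-black, then mid-gray
frame_a = np.zeros((2, 2, 3), dtype=np.uint8)
frame_b = np.full((2, 2, 3), 100, dtype=np.uint8)
mid = interpolate_midframe(frame_a, frame_b)  # every pixel lands halfway, at 50
```

A motion-compensated interpolator would instead shift pixels along estimated motion vectors before blending, which avoids the ghosting a plain average produces on moving objects.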
You won't be able to see the long-awaited Super Mario Bros. movie in theatres for the holidays this year: Nintendo has pushed back the animated film's release date to April 2023 from December 2022. Acclaimed video game designer Shigeru Miyamoto announced the delay on Twitter, along with the film's new premiere dates of April 28th in Japan and April 7th in North America. Miyamoto didn't reveal the reason behind the delay or say if the COVID-19 pandemic had anything to do with it. He only said that he and Chris Meledandri, the CEO of the Illumination animation studio, have decided to move the film's global release date. The Nintendo exec also apologized and promised that "it will be well worth the wait."
The latest trailer for Pixar's Lightyear gives us a bit more of the legendary Space Ranger's origin story. Captain Buzz Lightyear (voiced by Chris Evans) journeys through hyperspace to a new planet, only to realize that his journey took over 60 years in real time. Lightyear and his robot cat Sox (voiced by Peter Sohn) team up with a new squad of Rangers (voiced by Keke Palmer, Taika Waititi, and Dale Soules) in order to get home. But it won't be so easy: the arrival of Emperor Zurg and his army poses a major threat to Buzz's mission... and the whole universe. Lightyear hits theaters June 17.
FacePhi and CyberLink earned passing marks in evaluations against the biometric presentation attack detection (PAD) standard from the International Organization for Standardization (ISO) and International Electrotechnical Commission (IEC), demonstrating that their face biometrics systems can detect fraud attacks known as spoofing. The trials were conducted by iBeta Quality Assurance in Colorado, which is certified by the U.S. National Institute of Standards and Technology's National Voluntary Laboratory Accreditation Program (NIST NVLAP). FacePhi received confirmation of its ISO/IEC 30107-3 Level 2 compliance to prove its facial recognition system's protection against identity fraud and impersonation with more sophisticated attack methods. The test consisted of FacePhi's digital onboarding and authentication solutions being subjected to spoofing attacks using animation software, latex and resin masks, and 3D photography. "With this letter of compliance granted by iBeta, FacePhi demonstrates that its technology is strong and resistant to attacks, both level 1 and level 2 in accordance with ISO 30107-3", says Jorge Félix, quality and systems director of FacePhi.
At its annual GTC conference for AI developers, Nvidia today announced its next-gen Hopper GPU architecture and the Hopper H100 GPU, as well as a new data center chip that combines the GPU with a high-performance CPU, which Nvidia calls the "Grace CPU Superchip" (not to be confused with the Grace Hopper Superchip). With Hopper, Nvidia is launching a number of new and updated technologies, but for AI developers, the most important one may just be the architecture's focus on transformer models, which have become the machine learning technique de rigueur for many use cases and which power models like GPT-3 and BERT. The new Transformer Engine in the H100 chip promises to speed up model training by up to six times, and because this new architecture also features Nvidia's new NVLink Switch system for connecting multiple nodes, large server clusters powered by these chips will be able to scale up to support massive networks with less overhead. AI, high-performance computing, and data analytics are growing in complexity, with some models, like large language models, reaching trillions of parameters. "The largest AI models can require months to train on today's computing platforms," Nvidia's Dave Salvator writes in today's announcement.
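For readers unfamiliar with what "transformer" refers to: the workhorse operation inside models like GPT-3 and BERT is scaled dot-product attention, where each token weighs every other token and mixes their representations accordingly. Here's a minimal NumPy sketch of that operation -- purely illustrative, and not a representation of Nvidia's Transformer Engine, whose internals involve mixed-precision tricks Nvidia hasn't detailed here.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each row of Q (a query) scores every row of K (the keys); the
    softmax of those scores weights the rows of V (the values)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                        # query/key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)         # softmax over keys
    return weights @ V, weights

# Toy example: 3 tokens with 4-dimensional embeddings
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(Q, K, V)  # out: (3, 4), w rows sum to 1
```

The appeal for hardware makers is that this reduces to large, regular matrix multiplications, exactly what a GPU (and a dedicated engine like the H100's) accelerates well.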
Depending on your point of view, the last two years have either gone by very slowly, or very quickly. While the COVID pandemic never seemed to end – and technically still hasn't – the last two years have whizzed by for the tech industry, and especially for NVIDIA. The company launched its Ampere GPU architecture just two years ago at GTC 2020, and after selling more of its chips than ever before, now in 2022 it's already time to introduce the next architecture. So without further ado, let's talk about the Hopper architecture, which will underpin the next generation of NVIDIA server GPUs. As has become a ritual now for NVIDIA, the company is using its Spring GTC event to launch its next-generation GPU architecture. Introduced just two years ago, Ampere has been NVIDIA's most successful server GPU architecture to date, with over $10B in data center sales in just the last year.
Nvidia packed about three years' worth of news into its GPU Technology Conference today. Flamboyant CEO Jensen Huang's 1-hour, 39-minute keynote covered a lot of ground, but the unifying themes across the roughly two dozen announcements were the GPU itself and Nvidia's platform approach to everything it builds. Most people know Nvidia as the world's largest manufacturer of graphics processing units, or GPUs. The GPU is a chip that was first used to accelerate graphics in gaming systems.
TL;DR: Get the ByteBoi: DIY Advanced Game Console for $109.99 instead of $119 as of March 5 -- that's 8% off. The DIY enthusiasts at CircuitMess have been cranking out awesome educational DIY kits for years now. If you loved the first product, the MAKERbuino, or any of the others, check out the new ByteBoi -- an improved, reimagined version of the 8-bit educational gaming console. Built for kids (ages 11 and up) and adults, the ByteBoi is a hands-on way to learn about electronics, coding, game graphics, game engines, character animation, and more. Funded on Kickstarter, just like most other CircuitMess DIY kits, it comes with all the parts you'll need to build the console -- a full-color TFT display, main circuit board, Li-Po battery, acrylic casing, a bag of small components, and of course, instructions -- though not the tools for assembly.
The largest proposed semiconductor acquisition in IT history – Nvidia merging with Arm – was called off today due to significant regulatory challenges, with antitrust issues being the main hurdle. The $40 billion deal was initially announced in September 2020, and there has been wide speculation that this would eventually be the outcome based on several factors that I believed were either not true or overblown. Before I get into that, it's important to understand why this deal was so important. Nvidia's core product is the graphics processing unit, or GPU, which was initially used to improve graphics capabilities on computers for uses such as gaming. It just so happens that the architecture of a GPU makes it ideal for other tasks that require accelerated computing, such as real-time graphics rendering, virtual reality, and artificial intelligence.
Among various traditional art forms, brush stroke drawing is one of the most widely used styles in modern computer graphics tools such as GIMP, Photoshop, and Painter. In this paper, we develop an AI-aided art authoring (A4) system for non-photorealistic rendering that allows users to automatically generate brush stroke paintings in a specific artist's style. Within the proposed reinforcement learning framework for brush stroke generation, our contribution in this paper is to learn artists' drawing styles from video-captured stroke data via inverse reinforcement learning. Through experiments, we demonstrate that our system can successfully learn artists' styles and render pictures with consistent and smooth brush strokes.
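The abstract doesn't specify which inverse-RL formulation the system uses, but the general idea -- recover a reward function under which the artist's demonstrated strokes look optimal, by matching feature expectations -- can be sketched in a few lines. The toy below uses a maximum-entropy-style feature-matching update; the stroke features, candidate trajectories, and numbers are all invented for illustration.

```python
import numpy as np

# Average feature vector (e.g., curvature, speed, pressure) measured from
# the artist's video-captured strokes. Hypothetical values.
expert_features = np.array([0.8, 0.2, 0.5])

# Feature summaries of candidate stroke trajectories the brush agent can produce.
candidates = np.array([
    [0.9, 0.1, 0.4],
    [0.2, 0.8, 0.6],
    [0.7, 0.3, 0.5],
])

w = np.zeros(3)  # linear reward weights to be learned
for _ in range(200):
    scores = candidates @ w
    p = np.exp(scores - scores.max())
    p /= p.sum()                                # softmax policy over candidates
    learner_features = p @ candidates           # expected features under the policy
    # Gradient of the max-entropy IRL objective: push the learner's expected
    # features toward the expert's demonstrated features.
    w += 0.5 * (expert_features - learner_features)
```

After training, the policy concentrates on strokes whose features resemble the artist's, which is the behavior the paper's forward RL renderer would then optimize against.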