Hardware


Is quantum computing worth the leaps of faith? [Status Report]

ZDNet

As an editor, I often gave my writers this admonition: Never promise something without having established at least one feasible means of delivery. A promise should never be something to which one resorts in the absence of reason; otherwise every marriage would be predicated upon perjury. As a writer, I've learned to omit the word "promise" from my commitments, because editors tend to interpret "I promise" more as a statement of desperation than of commitment. It is in that context that we can confidently assert quantum computing has an abundance of promises. If it ends up working, QC will enable the class of large, complex matrix calculations that make up the bulk of machine learning work to be executed in moments rather than days, or moments rather than years -- or, with enough data in hand, moments rather than centuries.


China five-year plan aims for supremacy in AI, quantum computing

Engadget

China's tech industry has been hit hard by US trade battles and the economic uncertainties of the pandemic, but it's eager to bounce back in the relatively near future. According to the Wall Street Journal, the country used its annual party meeting to outline a five-year plan for advancing technology that aids "national security and overall development." It will create labs, foster educational programs and otherwise boost research in fields like AI, biotech, semiconductors and quantum computing. The Chinese government added that it would increase spending on basic research (that is, studies of potential breakthroughs) by 10.6 percent in 2021, and would create a 10-year research strategy. China has a number of technological advantages, such as its 5G availability and the sheer volume of AI research it produces.


Next Raspberry Pi CPU Will Have Machine Learning Built In

#artificialintelligence

At the recent tinyML Summit 2021, Raspberry Pi co-founder Eben Upton teased the future of 'Pi Silicon', and it looks like machine learning could see a massive improvement thanks to Raspberry Pi's new in-house chip development team. It is safe to say that the Raspberry Pi Pico and its RP2040 SoC have been popular. The Pico has only been on the market for a few weeks, but has already sold 250,000 units with 750,000 on back order. There is demand for more boards powered by the RP2040, and partners such as Adafruit, Pimoroni and Sparkfun are releasing their own hardware, many with features not found on the Pico. Raspberry Pi's in-house application-specific integrated circuit (ASIC) team is working on the next iteration, which seems to be focused on lightweight accelerators for ultra-low-power machine learning applications.


AMD's $479 Radeon RX 6700 XT targets silky-smooth 1440p gaming

PCWorld

AMD's new $479 Radeon RX 6700 XT graphics card will target gamers who want to play at 1440p with max settings, but its best feature may be that it'll be available in "significantly" higher volumes than Radeon RX 6800 GPUs were at launch, the company said Wednesday morning. AMD also announced that its performance-boosting Smart Access Memory technology will be coming to Ryzen 3000 processors. But the star of the show was the Radeon RX 6700 XT, which features 40 compute units, a game clock of 2,424MHz, 96MB of AMD's radical Infinity Cache, a 230-watt power rating, and an ample 12GB of GDDR6 memory. AMD is especially touting that massive memory buffer to differentiate the Radeon RX 6700 XT from its rivals when it goes on sale on March 18. At 1440p resolution, AMD said Call of Duty: Black Ops Cold War uses about 11GB of RAM, Horizon Zero Dawn uses just over 10GB, and both Dirt 5 and Red Dead Redemption 2 push between 9GB and 10GB of memory use. AMD said these situations can give the larger 12GB frame buffer in the Radeon RX 6700 XT an advantage--but it will largely depend on the game.


Quantum sensors could soon be heading into space

ZDNet

An outer-space mission inevitably calls for next-generation tools. Quantum technologies are on track to reach new heights – quite literally: quantum company Q-CTRL plans to send ultra-sensitive quantum sensors and navigation devices to space, as part of a mission to explore the Moon for water and other resources that will support NASA astronauts in future landings. The Australian company, which applies the principles of control engineering to improve the hardware performance of quantum devices, will provide the quantum technology for uncrewed missions organized by the Seven Sisters space industry consortium, planned to start in 2023. Formed last year by space start-up Fleet Space, the consortium is working to send nanosatellites and exploration sensors to the Moon to search for resources and generate useful data for future human exploration. The information gathered will inform NASA's Artemis program, which aims to land the first woman and the next man on the Moon by 2024, creating a sustainable human presence for later crewed Martian exploration.


Renesas boosts AI with Arm Cortex-A55 on RZ/G2L microprocessors - Softei.com

#artificialintelligence

Renesas has announced the expansion of its RZ/G2 family of general-purpose 64-bit microprocessors, with improved artificial intelligence (AI) processing. The company has added three entry-level microprocessor models built around the Arm Cortex-A55 core. Renesas adds that the seven RZ/G2 microprocessors provide scalability from entry-level to high-end designs. The RZ/G2Lx microprocessors' Arm Cortex-A55 CPU core delivers approximately 20 per cent better processing performance than the previous Cortex-A53 core. It also provides approximately six times faster essential processing for AI applications, the company says.


Samsung creates RAM with integrated AI processing hardware

#artificialintelligence

A processing unit (CPU, GPU or whatever) and RAM are typically separate things built on separate chips. But what if they were part of the same chip, all mixed together? That's exactly what Samsung did to create the world's first High Bandwidth Memory (HBM) with built-in AI processing hardware, called HBM-PIM (for processing-in-memory). It took its HBM2 Aquabolt chips and added Programmable Computing Units (PCUs) between the memory banks. These are relatively simple and operate on 16-bit floating point values with a limited instruction set – they can move data around and perform multiplications and additions.
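
To make concrete what kind of arithmetic such a unit handles, here is a minimal C sketch of a per-bank multiply-and-add pass. It is illustrative only: the bank size, the function name and the use of 32-bit floats in place of the hardware's 16-bit values are assumptions made for the example, not details of Samsung's actual PCU design.

/*
 * Illustrative sketch only: emulates the kind of multiply-and-add work a
 * processing-in-memory unit performs on data held in a memory bank.
 * BANK_SIZE and pim_multiply_add are made up for this example; the real
 * PCUs operate on 16-bit floats, approximated here with 32-bit floats.
 */
#include <stdio.h>

#define BANK_SIZE 8  /* hypothetical number of elements resident in one bank */

/* acc[i] = weights[i] * inputs[i] + acc[i], computed where the data lives */
static void pim_multiply_add(const float *weights, const float *inputs,
                             float *acc, int n)
{
    for (int i = 0; i < n; ++i)
        acc[i] = weights[i] * inputs[i] + acc[i];
}

int main(void)
{
    float weights[BANK_SIZE] = {0.5f, 1.0f, 1.5f, 2.0f, 2.5f, 3.0f, 3.5f, 4.0f};
    float inputs[BANK_SIZE]  = {1.0f, 1.0f, 2.0f, 2.0f, 3.0f, 3.0f, 4.0f, 4.0f};
    float acc[BANK_SIZE]     = {0.0f};

    pim_multiply_add(weights, inputs, acc, BANK_SIZE);

    for (int i = 0; i < BANK_SIZE; ++i)
        printf("acc[%d] = %.1f\n", i, acc[i]);
    return 0;
}

The appeal of processing-in-memory is that a loop like this runs next to the bank holding the operands, so intermediate results never have to cross the memory bus to a separate CPU or GPU.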


Gaming, datacenters boost Nvidia's Q4 revenues to $5 billion

#artificialintelligence

Nvidia reported revenues of $5.0 billion for its fourth fiscal quarter ended January 31, up 61% from a year earlier. The revenues and non-GAAP earnings per share of $3.10 beat expectations as new gaming hardware and AI products generated strong demand. A year ago, Nvidia reported non-GAAP earnings per share of $1.89 on revenues of $3.1 billion. The Santa Clara, California-based company makes graphics processing units (GPUs) that can be used for games, AI, and datacenter computing. While many businesses have been hit hard by the pandemic, Nvidia has seen a boost in those areas.


Use DirectML to add machine learning to C code

#artificialintelligence

The modern GPU is more than a graphics device. Technologies such as the open-standard OpenCL and Nvidia's CUDA turn the many small processors in a GPU into a parallel computing fabric, allowing desktop PCs to complete tasks that used to be the sole purview of supercomputers. Those same GPUs are also capable of supporting many modern machine learning tasks, using GPU compute to build neural networks and to support model-building, data-parallel analytical and processing tasks. Microsoft has been investing in simplifying GPU programming for a long time now, starting with its DirectX GPU tools, initially via the Direct3D graphics tools, and extending them to GPU compute with DirectCompute. Recent developments have included tools to map OpenGL calls to Direct3D, related to work building a graphical layer onto the WSL 2 Linux virtual machine system bundled with Windows 10. Although they make it easier to work with hardware, these remain low-level programming tools, using C to access hardware features.
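
As a rough illustration of the workloads those APIs target (this is plain C, not DirectML or DirectCompute code, and the layer sizes, names and values below are invented for the example), the routine that follows computes one small fully connected layer. Each output element depends only on its own row of weights, which is exactly the kind of independent, data-parallel work a GPU compute API can spread across thousands of threads instead of the sequential outer loop used here.

/*
 * Plain-C reference for a data-parallel machine learning kernel: one fully
 * connected layer with a ReLU activation. IN_DIM, OUT_DIM and all values
 * are hypothetical. On a GPU, each iteration of the outer loop (one output
 * neuron) would typically map to its own thread or thread group.
 */
#include <stdio.h>

#define IN_DIM  4
#define OUT_DIM 3

/* out[j] = relu(b[j] + sum_i w[j][i] * in[i]) */
static void dense_relu(const float w[OUT_DIM][IN_DIM], const float b[OUT_DIM],
                       const float in[IN_DIM], float out[OUT_DIM])
{
    for (int j = 0; j < OUT_DIM; ++j) {        /* independent per output */
        float sum = b[j];
        for (int i = 0; i < IN_DIM; ++i)
            sum += w[j][i] * in[i];
        out[j] = sum > 0.0f ? sum : 0.0f;      /* ReLU activation */
    }
}

int main(void)
{
    const float w[OUT_DIM][IN_DIM] = {
        { 0.2f, -0.1f,  0.4f,  0.0f},
        {-0.3f,  0.5f,  0.1f,  0.2f},
        { 0.1f,  0.1f, -0.2f,  0.6f},
    };
    const float b[OUT_DIM] = {0.1f, -0.2f, 0.05f};
    const float in[IN_DIM] = {1.0f, 2.0f, 0.5f, 1.5f};
    float out[OUT_DIM];

    dense_relu(w, b, in, out);
    for (int j = 0; j < OUT_DIM; ++j)
        printf("out[%d] = %.3f\n", j, out[j]);
    return 0;
}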


Poly Effects Beebo review: A versatile and complex touchscreen guitar pedal

Engadget

It's not enough to have a pressure cooker, you need an Instant Pot that's also a slow cooker, and a rice cooker, and a yogurt maker. Your video game console is also now a media center and live streaming platform. And if your printer doesn't also make copies and send faxes, then what are you even doing with your life? This obsession with do-it-all gadgets has even hit the world of music gear. While there were certainly earlier examples, it really started to take off in the '90s with the emergence of the groovebox.