Hardware


Renesas boosts AI with Arm Cortex-A55 on RZ/G2L microprocessors - Softei.com

#artificialintelligence

Renesas has announced the expansion of its RZ/G2 family of general-purpose 64-bit microprocessors with improved artificial intelligence (AI) processing. The company has added three entry-level microprocessor models built around the Arm Cortex-A55 core. Renesas adds that the seven RZ/G2 microprocessors provide scalability from entry-level to high-end designs. The RZ/G2Lx microprocessors' Arm Cortex-A55 CPU core delivers approximately 20 per cent better processing performance than the previous Cortex-A53 core. It also provides approximately six times faster essential processing for AI applications, says the company.


Samsung creates RAM with integrated AI processing hardware

#artificialintelligence

A processing unit (CPU, GPU or whatever) and RAM are typically separate things built on separate chips. But what if they were part of the same chip, all mixed together? That's exactly what Samsung did to create the world's first High Bandwidth Memory (HBM) with built-in AI processing hardware called HBM-PIM (for processing-in-memory). It took its HBM2 Aquabolt chips and added Programmable Computing Units (PCU) between the memory banks. These are relatively simple and operate on 16-bit floating point values with a limited instruction set – they can move data around and perform multiplications and additions.
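
To make the idea concrete, the sketch below is a plain-C illustration, not Samsung's actual API or PCU instruction set, of the kind of multiply-and-add kernel that maps directly onto a limited, FP16-oriented processing-in-memory unit (ordinary floats stand in for the 16-bit values, and the "banks" are just host arrays; all names here are hypothetical):

    #include <stddef.h>

    /* Illustrative only: approximates the multiply-add style of work the
       article describes the PCUs doing on data sitting in the memory banks.
       Hypothetical function and names, not Samsung's interface. */
    void pim_style_multiply_add(const float *bank_a, const float *bank_b,
                                float *bank_out, size_t n, float scale)
    {
        for (size_t i = 0; i < n; ++i) {
            /* moves, multiplications and additions are all that a limited
               instruction set needs to support for a kernel like this */
            bank_out[i] = bank_a[i] * scale + bank_b[i];
        }
    }

The appeal of running this inside the memory device is that the operands never have to cross the memory bus to the host processor, which is where much of the latency and energy in data-heavy AI workloads is spent.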


Gaming, datacenters boost Nvidia's Q4 revenues to $5 billion

#artificialintelligence

Nvidia reported revenues of $5.0 billion for its fourth fiscal quarter ended January 31, up 61% from a year earlier. The revenues and non-GAAP earnings per share of $3.10 beat expectations as new gaming hardware and AI products generated strong demand. A year ago, Nvidia reported non-GAAP earnings per share of $1.89 on revenues of $3.1 billion. The Santa Clara, California-based company makes graphics processing units (GPUs) that can be used for games, AI, and datacenter computing. While many businesses have been hit hard by the pandemic, Nvidia has seen a boost in those areas.


Use DirectML to add machine learning to C code

#artificialintelligence

The modern GPU is more than a graphics device. Technologies such as the open-standard OpenCL and Nvidia's CUDA turn the many small processors in a GPU into a parallel computing fabric, allowing desktop PCs to complete tasks that used to be the sole purview of supercomputers. Those same GPUs are also capable of supporting many modern machine learning tasks, using GPU compute to build neural networks and to support model-building, data-parallel analytics, and processing tasks. Microsoft has been investing in simplifying GPU programming for a long time now, starting with its DirectX GPU tools: initially the Direct3D graphics APIs, later extended to GPU compute with DirectCompute. Recent developments have included tools to map OpenGL calls to Direct3D, related to work on building a graphics layer onto the WSL 2 Linux virtual machine system bundled with Windows 10. Although they make it easier to work with hardware, these remain low-level programming tools, using C to access hardware features.
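
For a sense of just how low level this is, here is a minimal sketch of an OpenCL host program in C that adds two vectors on the GPU. It illustrates raw GPU compute in general, not DirectML or DirectCompute specifically; it assumes an OpenCL 2.0 runtime and omits error checking for brevity:

    #include <stdio.h>
    #include <CL/cl.h>

    /* A tiny kernel: each work-item adds one pair of elements. */
    static const char *kernel_src =
        "__kernel void vec_add(__global const float *a,\n"
        "                      __global const float *b,\n"
        "                      __global float *c) {\n"
        "    size_t i = get_global_id(0);\n"
        "    c[i] = a[i] + b[i];\n"
        "}\n";

    int main(void)
    {
        enum { N = 1024 };
        float a[N], b[N], c[N];
        for (int i = 0; i < N; ++i) { a[i] = (float)i; b[i] = 2.0f * i; }

        /* Pick the first platform and the first GPU device on it. */
        cl_platform_id platform;
        cl_device_id device;
        clGetPlatformIDs(1, &platform, NULL);
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

        cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, NULL);
        cl_command_queue q =
            clCreateCommandQueueWithProperties(ctx, device, NULL, NULL);

        /* Compile the kernel source at run time for this device. */
        cl_program prog = clCreateProgramWithSource(ctx, 1, &kernel_src, NULL, NULL);
        clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
        cl_kernel k = clCreateKernel(prog, "vec_add", NULL);

        /* Copy the inputs into device buffers and bind the arguments. */
        cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                                   sizeof a, a, NULL);
        cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                                   sizeof b, b, NULL);
        cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);
        clSetKernelArg(k, 0, sizeof da, &da);
        clSetKernelArg(k, 1, sizeof db, &db);
        clSetKernelArg(k, 2, sizeof dc, &dc);

        /* Launch N work-items, then read the result back to the host. */
        size_t global = N;
        clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
        clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);
        printf("c[10] = %f\n", c[10]); /* expect 30.0 */

        clReleaseMemObject(da); clReleaseMemObject(db); clReleaseMemObject(dc);
        clReleaseKernel(k); clReleaseProgram(prog);
        clReleaseCommandQueue(q); clReleaseContext(ctx);
        return 0;
    }

Even this toy example needs explicit platform, device, context, queue, program and buffer management; hiding that boilerplate behind higher-level abstractions is exactly the job of layers such as DirectML.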


Poly Effects Beebo review: A versatile and complex touchscreen guitar pedal

Engadget

It's not enough to have a pressure cooker, you need an Instant Pot that's also a slow cooker, and a rice cooker, and a yogurt maker. Your video game console is also now a media center and live streaming platform. And if your printer doesn't also make copies and send faxes, then what are you even doing with your life? This obsession with do-it-all gadgets has even hit the world of music gear. While there were certainly earlier examples, it really started to take off in the '90s with the emergence of the groovebox.


Could quantum computers fix political polls?

ZDNet

It would be the harbinger of an entirely new medium of calculation, harnessing the powers of subatomic particles to obliterate the barriers of time in solving incalculable problems. You and I are being continually surveyed. We reveal information about ourselves with astonishingly little resistance. Social media has made many of us into veritable slot machines for our own personal data. We're fed a little token of encouragement that someone may yet like us, our arm is gently pulled, and we disgorge something we hope people will find valuable enough for commencing small talk. What personal facts, real or trivial, we do end up disclosing -- perhaps unwittingly -- immediately undergo unceasing analysis. The inferences these analyses draw about us as people are being aggregated, baselined, composited, deliberated, and profiled.


The Decline of Computers as a General Purpose Technology

Communications of the ACM

Perhaps in no other technology have there been so many decades of large year-over-year improvements as in computing. It is estimated that a third of all productivity increases in the U.S. since 1974 have come from information technology [4], making it one of the largest contributors to national prosperity. The rise of computers is due to technical successes, but also to the economic forces that financed them. Bresnahan and Trajtenberg [3] coined the term general purpose technology (GPT) for products, like computers, that have broad technical applicability and where product improvement and market growth could fuel each other for many decades. But they also predicted that GPTs could run into challenges at the end of their life cycle: as progress slows, other technologies can displace the GPT in particular niches and undermine this economically reinforcing cycle. We are observing such a transition today: as improvements in central processing units (CPUs) slow, applications move to specialized processors, for example graphics processing units (GPUs), which can do fewer things than traditional universal processors but perform those functions better. Many high-profile applications are already following this trend, including deep learning (a form of machine learning) and Bitcoin mining. With this background, we can now be more precise about our thesis: "The Decline of Computers as a General Purpose Technology." We do not mean that computers, taken together, will lose technical abilities and thus 'forget' how to do some calculations.


The best deals we found this week: $100 off the Mac mini M1 and more

Engadget

Even as Presidents' Day sales came to a close this week, plenty of other tech deals cropped up across the web. Those who have been holding out for a deal on an OLED TV can get an LG CX model at a deeply discounted price, and Apple's Mac mini M1 hit a new all-time low of $600. Here are the best tech deals from this week that you can still get today. Apple's new Mac mini with the M1 chip is down to $600, which is $100 off its normal price and a new all-time low for the desktop. It's dropped to $650 a few times since its release late last year, but this is the best deal we've seen yet.


Apple co-founder Steve Jobs job application up for auction in London

Daily Mail - Science & tech

A job application form signed by Apple co-founder Steve Jobs as a teenager back in 1973 -- one that hints at his computer skills -- is being sold at auction in London. The paperwork for the unspecified position dates from a year before Mr Jobs joined the then video game start-up Atari as a technician and worked alongside Steve Wozniak. The duo would go on to found the Apple Computer Company, releasing their first machine, the Apple-1, just two years later. Had the teenaged Jobs' application been successful, he might not have met Mr Wozniak, and the landscape of modern computing would likely have ended up very different. London auctioneer Charterfields is auctioning the item online on February 24, 2021 -- with bids set to open at £15,000 (which is around $20,950).


Apple's M1 chip-equipped Mac mini just hit a new record-low price on Amazon

Mashable

The new Mac mini with Apple's in-house M1 chip is on sale on Amazon as of Feb. 18 -- get the 256GB model for the all-time-low price of just $599.99 with an extra discount applied at checkout (normally $699), or upgrade to the 512GB version for only $849 (down $50 from $899). Apple sent the tech world into a tizzy last fall when it dropped some new hardware with its own custom-made M1 silicon chip, the company's first-ever proprietary processor. It was a huge move for a couple of reasons: One, it meant Tim Cook and pals were no longer relying on Intel to power their machines. More importantly, it meant us consumers could now get our hands on Apple devices with improved performance, better battery life, and iOS app support. If you're in need of a new computer and want to see what all the hype's about, the M1-equipped Mac mini is a great place to start: It connects to the monitor of your choosing and comes in at a fraction of the price of the M1 MacBook Air and Pro (which start at $999 and $1,299, respectively).