During modern computing's first epoch, one trend reigned supreme: Moore's Law. Actually a prediction by Intel Corp. co-founder Gordon Moore rather than any sort of physical law, Moore's Law held that the number of transistors on a chip doubles roughly every two years. It also meant that the performance of those chips--and the computers they powered--increased substantially on roughly the same timetable. This formed the industry's core, the glowing crucible from which sprang trillion-dollar technologies that upended almost every aspect of our day-to-day existence. As chip makers have reached the limits of atomic-scale circuitry and the physics of electrons, Moore's Law has slowed, and some say it's over.
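As a quick illustration of how fast that doubling compounds, here is a small sketch (the `moores_law` helper is ours, not from the article; the 2,300-transistor starting point is the Intel 4004 from 1971):

```python
# Transistor count implied by a doubling every two years.
def moores_law(initial_count, years):
    """Projected transistor count after `years`, doubling every 2 years."""
    return initial_count * 2 ** (years / 2)

# Starting from the Intel 4004's 2,300 transistors, ten doublings over
# 20 years yields roughly 2.36 million.
print(moores_law(2300, 20))  # 2355200.0
```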
With the rise of AI at the edge comes a whole host of new requirements for memory systems. Can today's memory technologies live up to the stringent demands of this challenging new application, and what do emerging memory technologies promise for edge AI in the long term? The first thing to realize is that there is no standard "edge AI" application; the edge, in its broadest interpretation, covers all AI-enabled electronic systems outside the cloud. That might include the "near edge," which generally covers enterprise data centers and on-premises servers. Further out are applications like computer vision for autonomous driving.
The next great leap for computing may be a bit closer with the help of joint efforts between the U.S. government, the private sector -- and hundreds of millions of dollars. And along the way, we might see a benefit for the financial services sector in the form of reduced false positives in fraud detection. The U.S. Department of Energy said this week that it will spend $625 million over the next five years to develop a dozen research centers devoted to artificial intelligence (AI) and quantum computing. Another $340 million will come from the private sector and academia, bringing Uncle Sam together with the likes of IBM, Amazon and Google to apply the highest of high tech to a variety of verticals and applications. In an interview with Karen Webster, Dr. Stefan Wörner, global leader for quantum finance and optimization at IBM, said we're getting closer to crossing the quantum-computing Rubicon from concept to real-world applications. The basic premise behind quantum computing is that it can tackle tasks that aren't possible with "regular" computers, and do so with blinding speed and pinpoint accuracy.
Back in 2010, Kyle Conroy wrote a blog post entitled, What if I had bought Apple stock instead?: Currently, Apple's stock is at an all-time high. A share today is worth over 40 times its value seven years ago. So, how much would you have today if you had purchased stock instead of an Apple product? See for yourself in the table below. Conroy kept the post up to date until April 1, 2012; at that point, the $1,099 I spent on my first Apple computer -- a 12″ iBook, bought October 22, 2003 -- would have been worth $57,900 in Apple stock.
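A back-of-the-envelope check of those figures (the two dollar amounts are from the post; the growth calculation is ours):

```python
# $1,099 spent on an iBook on 2003-10-22, versus $57,900 in Apple
# stock by April 1, 2012.
purchase_price = 1099.00
stock_value = 57900.00

multiple = stock_value / purchase_price            # ~52.7x
years = (2012 - 2003) + (4 - 10) / 12              # ~8.5 years elapsed
annualized = (multiple ** (1 / years) - 1) * 100   # compound annual growth rate

print(f"Multiple: {multiple:.1f}x")
print(f"Annualized return: {annualized:.0f}%")
```

In other words, the laptop's price compounded at roughly 60% a year for eight and a half years.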
Something about Novaruu looked like dollar signs from the moment she landed on Twitch. Blonde and with a radiant smile, Novaruu, then 19, had been gaming and hanging out with her growing fanbase for only a few weeks when she began receiving messages from entrepreneurial viewers offering to get her "deals"--vague promises to connect her with a capital-b Brand. One in particular stood out. He offered management tips--what game to play, how to play it--and said he'd get her free products to advertise on-stream. After a while, though, Novaruu discerned that this guy was, in her words, "suspicious." His management tips mostly consisted of telling her to play games and emulate other streamers he liked.
SoftBank Group said Monday it is selling British chip designer Arm Ltd. to U.S. chip company Nvidia for up to $40 billion, potentially creating a new giant in the industry. "We reached a final agreement with … Nvidia to sell all shares in Arm" at a value of up to $40 billion (about ¥4.2 trillion), SoftBank said in a statement. The deal is subject to approval by authorities in several jurisdictions, including Britain, China, the United States and the European Union, the statement added. If approved, it will be one of the biggest mergers and acquisitions in the world this year and promises to propel Nvidia to the forefront of the semiconductor sector. Founded in 1990 in the United Kingdom, Arm specializes in microprocessor design and dominates the global smartphone processor market.
Nvidia agreed to purchase Arm for up to $40 billion in cash and stock, the companies said Sunday night. This mammoth deal in the chip industry is expected to bolster AI and GPU powerhouse Nvidia's chip portfolio, even as it's sure to attract antitrust attention in the smartphone market. Nvidia will pay SoftBank, the company's current owner, a total of $21.5 billion in Nvidia stock and $12 billion in cash, including $2 billion payable at signing. Nvidia will also issue $1.5 billion in equity to Arm employees. It may also pay SoftBank up to $5 billion in cash or stock if Arm meets specific financial performance targets, bringing the final purchase price up to $40 billion -- the largest chip deal ever.
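The reported components do sum to the headline figure; a quick sketch (variable names are ours, all amounts in billions of dollars, as reported):

```python
# Summing the reported Nvidia-Arm deal components ($ billions).
stock_to_softbank = 21.5
cash_to_softbank = 12.0     # includes $2B payable at signing
equity_to_arm_staff = 1.5
earnout = 5.0               # contingent on Arm hitting performance targets

base_price = stock_to_softbank + cash_to_softbank + equity_to_arm_staff
max_price = base_price + earnout

print(base_price)  # 35.0
print(max_price)   # 40.0
```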
OLED TVs, as expensive as they can be, are a favorite among display aficionados. Both Sony and LG announced new OLED TVs earlier this year at CES, and now there's actually a decent sale on one of them. The Sony 55-inch A8H 4K OLED TV is $400 off right now at a bunch of retailers, which brings its price down to $1,499. OLED TVs are worth the money if you care about color quality and getting the blackest blacks possible -- each pixel emits its own light, so a black pixel is simply off. The A8H is also an Android TV, so it has built-in Google Cast and support for voice assistants including the Google Assistant and Amazon's Alexa.
Deep learning is a field with intense computational requirements, and your choice of GPU will fundamentally determine your deep learning experience. But which features are important if you want to buy a new GPU? How do you make a cost-efficient choice? This blog post delves into these questions, tackles common misconceptions, gives you an intuitive understanding of how to think about GPUs, and offers advice to help you make a choice that is right for you.

This blog post is designed to give you different levels of understanding of GPUs and the new Ampere series GPUs from NVIDIA. The choice is yours: if you are not interested in the details of how GPUs work, what makes a GPU fast, and what is unique about the new NVIDIA RTX 30 Ampere series, you can skip straight to the performance and performance-per-dollar charts and the recommendation section. You might also want to skip a section or two based on your understanding of the presented topics; I head each major section with a small summary, which might help you decide whether to read it.

This blog post is structured in the following way. First, I explain what makes a GPU fast. I discuss CPUs vs. GPUs, Tensor Cores, memory bandwidth, and the memory hierarchy of GPUs, and how these relate to deep learning performance. These explanations might help you get a more intuitive sense of what to look for in a GPU. Then I make theoretical estimates of GPU performance and align them with some marketing benchmarks from NVIDIA to get reliable, unbiased performance data. Next, I discuss the unique features of the new NVIDIA RTX 30 Ampere GPU series that are worth considering if you buy a GPU. From there, I make GPU recommendations for 1-2, 4, and 8 GPU setups, and for GPU clusters.
After that follows a Q&A section of common questions posed to me in Twitter threads; there I also address common misconceptions and some miscellaneous issues, such as cloud vs. desktop, cooling, AMD vs. NVIDIA, and others. If you use GPUs frequently, it is useful to understand how they work. This knowledge helps explain why GPUs are slow in some cases and fast in others; in turn, you might better understand why you need a GPU in the first place and how other future hardware options might be able to compete.
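The performance-per-dollar comparison the post describes boils down to a simple normalization. A minimal sketch, where the GPU names, throughput figures, and prices are illustrative placeholders rather than the post's measured data:

```python
# Rank GPUs by performance per dollar. The numbers below are made-up
# placeholders, not measured benchmark data.
gpus = {
    "GPU A": {"relative_perf": 1.00, "price_usd": 1499},
    "GPU B": {"relative_perf": 0.60, "price_usd": 699},
    "GPU C": {"relative_perf": 0.45, "price_usd": 499},
}

# Sort by (performance / price), best value first.
ranked = sorted(
    gpus.items(),
    key=lambda kv: kv[1]["relative_perf"] / kv[1]["price_usd"],
    reverse=True,
)
for name, spec in ranked:
    per_kilodollar = spec["relative_perf"] / spec["price_usd"] * 1000
    print(f"{name}: {per_kilodollar:.2f} relative perf per $1,000")
```

Note that with these placeholder numbers the cheapest card wins on value while the fastest card ranks last, which is the typical pattern such charts reveal.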
If you've just bought a new 4K TV, you might be wondering if you need to buy a new HDMI cable to go with it. The very fast answer is: probably not. Here's how to tell if the final call is a yes or a no. An HDMI cable is just a conduit between your TV and media device, be it a DVD, Blu-ray, or 4K UHD Blu-ray player; a media streamer; a video game console; or a PC. Different types of HDMI cables do exist, but their designations -- Standard, High Speed, Premium High Speed, and Ultra High Speed -- indicate how much data they can carry.
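A rough way to make that call yourself: estimate the raw bandwidth a signal needs and compare it to the commonly cited cable ratings. The sketch below ignores blanking intervals and encoding overhead (the real link carries somewhat more than the raw pixel data), so treat it as a lower-bound rule of thumb, not an official formula:

```python
# Commonly cited maximum throughput per HDMI cable category (Gbit/s).
CABLE_RATINGS = {
    "High Speed": 10.2,          # HDMI 1.4 era
    "Premium High Speed": 18.0,  # HDMI 2.0 era
    "Ultra High Speed": 48.0,    # HDMI 2.1 era
}

def video_gbps(width, height, fps, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s, ignoring blanking and overhead."""
    return width * height * fps * bits_per_pixel / 1e9

needed = video_gbps(3840, 2160, 60)  # 4K at 60 Hz, 8-bit RGB: ~11.9 Gbps
for name, rating in CABLE_RATINGS.items():
    verdict = "yes" if rating >= needed else "no"
    print(f"{name} ({rating} Gbps) handles ~{needed:.1f} Gbps raw 4K60: {verdict}")
```

By this estimate, an older High Speed cable falls short of raw 4K at 60 Hz, while anything rated Premium High Speed or better has headroom to spare -- which is why the answer for most recent cables is "probably not."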