Google's $100 Linux Coral Dev Board mini quietly launches – but sells out fast

ZDNet

Google's Coral Dev Board mini has made a tantalizingly brief appearance for pre-order for $100 on Seeed's website – but stocks are already sold out, according to the company. Google unveiled its Linux Coral Dev Board mini in January, offering developers a smaller, cheaper and lower-power version of the Coral Dev Board, which launched for $149 but now costs $129. Instead of an NXP system on chip (SoC), the Mini combines the new Coral Accelerator Module with a MediaTek 8167s SoC, which includes a quad-core Arm Cortex-A35 CPU. It also features an IMG PowerVR GE8300 GPU integrated into the SoC, while the machine-learning accelerator is Google's Edge TPU coprocessor, capable of performing four trillion operations per second (4 TOPS), or two TOPS per watt. According to Google, it can execute mobile vision models such as MobileNet v2 at almost 400 frames per second. The device runs a derivative of Debian Linux called Mendel.
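For context on how developers typically drive the Edge TPU described above, here is a minimal Python sketch of a single classification pass using Google's PyCoral library. The model, label, and image file names are placeholder assumptions; any MobileNet v2 model compiled for the Edge TPU should work the same way.

# Minimal sketch: image classification on the Coral Edge TPU via PyCoral.
# MODEL, LABELS and the input image are assumed placeholder file names.
from PIL import Image
from pycoral.utils.edgetpu import make_interpreter
from pycoral.utils.dataset import read_label_file
from pycoral.adapters import common, classify

MODEL = "mobilenet_v2_1.0_224_quant_edgetpu.tflite"  # Edge TPU-compiled model (assumed name)
LABELS = "imagenet_labels.txt"                       # label file (assumed name)

interpreter = make_interpreter(MODEL)   # loads the model onto the Edge TPU
interpreter.allocate_tensors()

# Resize the input image to the size the model expects (224x224 for MobileNet v2).
image = Image.open("input.jpg").resize(common.input_size(interpreter), Image.LANCZOS)
common.set_input(interpreter, image)

interpreter.invoke()                    # one inference pass on the accelerator
labels = read_label_file(LABELS)
for c in classify.get_classes(interpreter, top_k=3):
    print(f"{labels.get(c.id, c.id)}: {c.score:.4f}")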


IonQ Unveils World's Most Powerful Quantum Computer

#artificialintelligence

"Demonstrating the first successful quantum logic gate in 1995 was almost an accident, but doing so opened a path forward towards deploying quantum computers on previously unsolvable problems," said IonQ Co-Founder & Chief Scientist Chris Monroe. "The new system we're deploying today is able to do things no other quantum computer has been able to achieve, and even more importantly, we know how to continue making these systems much more powerful moving forward." One way is to fix errors through circuit encoding, capitalizing on a recent demonstration of quantum error correction in a nearly identical system. Monroe says "with our new IonQ system, we expect to be able to encode multiple qubits to tolerate errors, the holy grail for scaling quantum computers in the long haul." This encoding requires just 13 qubits to make a near-perfect logical qubit, while in other hardware architectures it's estimated to take more than 100,000.


TensorFlow Quantum Boosts Quantum Computer Hardware Performance

#artificialintelligence

Google recently released TensorFlow Quantum, a toolset for combining state-of-the-art machine learning techniques with quantum algorithm design. This is an essential step toward building tools for developers working on quantum applications. In parallel, the team has focused on improving quantum computing hardware performance by integrating a set of quantum firmware techniques and building a TensorFlow-based toolset that works from the hardware level up, from the bottom of the stack. The fundamental driver for this work is tackling the noise and error in quantum computers. Below is a short overview of this effort and of how the impact of noise and imperfections, the critical challenges, is suppressed in quantum hardware.
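As a rough picture of what combining machine learning with quantum circuits looks like in practice, here is a minimal, hedged sketch of the TensorFlow Quantum workflow: a parameterized Cirq circuit wrapped in a Keras layer whose single rotation angle is trained by gradient descent. The circuit, observable, and target value are illustrative choices, not taken from Google's own examples.

# Minimal sketch of the TensorFlow Quantum workflow: train the rotation angle
# of a one-qubit circuit so its <Z> expectation approaches a target value.
import cirq, sympy
import numpy as np
import tensorflow as tf
import tensorflow_quantum as tfq

qubit = cirq.GridQubit(0, 0)
theta = sympy.Symbol("theta")

model_circuit = cirq.Circuit(cirq.rx(theta)(qubit))   # trainable rotation
readout = cirq.Z(qubit)                               # observable measured by the layer

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(), dtype=tf.string),  # circuits are fed in as serialized strings
    tfq.layers.PQC(model_circuit, readout),            # parameterized quantum circuit layer
])

inputs = tfq.convert_to_tensor([cirq.Circuit()])       # empty input circuit
target = np.array([[-1.0]], dtype=np.float32)          # drive <Z> toward -1 (qubit near |1>)
model.compile(optimizer=tf.keras.optimizers.Adam(0.1), loss="mse")
model.fit(inputs, target, epochs=50, verbose=0)
print(model(inputs).numpy())                           # should be close to -1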


Is your Apple Watch battery worn and in need of replacing? Here's how to tell without taking it off your wrist

ZDNet

Potentially one of the best features to hit the iPhone in the past few years was Optimized Battery Charging. First introduced a year ago with iOS 13, Optimized Battery Charging was designed to reduce battery wear, and therefore extend battery lifespan, by limiting how long your iPhone remains fully charged: it pauses the charging process at 80 percent, then uses on-device machine learning to learn your daily charging routine and determine when to add the final 20 percent of charge so your iPhone is ready for you when you wake up. Apple then added the same feature to macOS Catalina 10.15.5. Based on testing I've carried out, this feature does indeed reduce battery wear, and the machine learning is quick to pick up on your habits, so the feature works in the background without inconveniencing you when your schedule changes. And watchOS 7, released last month, has now brought this battery-saving feature to the Apple Watch.
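Apple has not published how Optimized Battery Charging schedules that final top-up, but the arithmetic it implies is simple: once a wake-up time has been learned, the resume time is just the wake-up time minus the time needed to charge the remaining 20 percent. The Python sketch below is a toy illustration of that scheduling idea with an assumed charge rate, not Apple's implementation.

# Toy sketch of the scheduling idea behind optimized charging (not Apple's code):
# hold at 80%, then resume just early enough to reach 100% by the learned wake time.
from datetime import datetime, timedelta

def final_charge_start(wake_time: datetime,
                       percent_remaining: float = 20.0,
                       charge_rate_pct_per_hour: float = 25.0) -> datetime:
    """Return when to resume charging so the battery reaches 100% at wake_time."""
    hours_needed = percent_remaining / charge_rate_pct_per_hour
    return wake_time - timedelta(hours=hours_needed)

wake = datetime(2020, 10, 20, 7, 0)     # learned typical wake-up time (example)
print(final_charge_start(wake))          # 2020-10-20 06:12:00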


As AI chips improve, is TOPS the best way to measure their power?

#artificialintelligence

Once in a while, a young company will claim it has more experience than would be logical -- a just-opened law firm might tout 60 years of legal experience, but actually consist of three people who have each practiced law for 20 years. The number "60" catches your eye and summarizes something, yet might leave you wondering whether you'd prefer a single lawyer with 60 years of experience. There's actually no universally correct answer; your choice should be based on the type of services you're looking for. A single lawyer might be superb at certain tasks and not great at others, while three lawyers with solid experience could cover a wider collection of subjects. If you understand that example, you also understand the challenge of evaluating AI chip performance using "TOPS," a metric that means trillions of operations per second, or "tera operations per second."
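To make the analogy concrete, a TOPS figure is usually a peak number: the count of multiply-accumulate units times the clock rate, with each multiply-accumulate counted as two operations. The short calculation below uses assumed hardware parameters purely to show what the headline number summarizes, and what it leaves out.

# Worked example of what a TOPS figure summarizes (assumed parameters).
mac_units = 4096          # assumed number of parallel multiply-accumulate units
clock_hz = 1.0e9          # assumed 1 GHz clock
ops_per_mac = 2           # one multiply plus one add

peak_tops = mac_units * clock_hz * ops_per_mac / 1e12
print(f"Peak throughput: {peak_tops:.2f} TOPS")   # about 8.19 TOPS

# Like the "60 years of experience" claim, the peak number says nothing about
# how well those operations map onto a particular model, precision, or memory system.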


ESA's Φ-Week: Digital Twin Earth, Quantum Computing and AI Take Center Stage

#artificialintelligence

Digital Twin Earth will help visualize, monitor, and forecast natural and human activity on the planet. The model will be able to monitor the health of the planet, perform simulations of Earth's interconnected system with human behavior, and support the field of sustainable development, thereby reinforcing Europe's efforts for a better environment and responding to the urgent challenges and targets addressed by the Green Deal. ESA's 2020 Φ-week event kicked off this morning with a series of stimulating speeches on Digital Twin Earth, updates on Φ-sat-1, which was successfully launched into orbit earlier this month, and an exciting new initiative involving quantum computing. The third edition of the Φ-week event, which is entirely virtual, focuses on how Earth observation can contribute to the concept of Digital Twin Earth – a dynamic, digital replica of our planet which accurately mimics Earth's behavior. Constantly fed with Earth observation data, combined with in situ measurements and artificial intelligence, Digital Twin Earth provides an accurate representation of the past, present, and future changes of our world.


The Hardware Lottery

arXiv.org Artificial Intelligence

Hardware, systems and algorithms research communities have historically had different incentive structures and fluctuating motivation to engage with each other explicitly. This historical treatment is odd given that hardware and software have frequently determined which research ideas succeed (and fail). This essay introduces the term hardware lottery to describe when a research idea wins because it is suited to the available software and hardware, not because the idea is superior to alternative research directions. Examples from early computer science history illustrate how hardware lotteries can delay research progress by casting successful ideas as failures. These lessons are particularly salient given the advent of domain-specialized hardware, which makes it increasingly costly to stray off the beaten path of research ideas. This essay posits that the gains from progress in computing are likely to become even more uneven, with certain research directions moving into the fast lane while progress on others is further obstructed.


EETimes - Memory Technologies Confront Edge AI's Diverse Challenges

#artificialintelligence

With the rise of AI at the edge comes a whole host of new requirements for memory systems. Can today's memory technologies live up to the stringent demands of this challenging new application, and what do emerging memory technologies promise for edge AI in the long term? The first thing to realize is that there is no standard "edge AI" application; the edge in its broadest interpretation covers all AI-enabled electronic systems outside the cloud. That might include the "near edge," which generally covers enterprise data centers and on-premise servers. Further out are applications like computer vision for autonomous driving.


Quantifying Quantum computing's value in financial services - Fintech News

#artificialintelligence

The next great leap for computing may be a bit closer with the help of joint efforts between the U.S. government, the private sector -- and hundreds of millions of dollars. And along the way, we might see a benefit for the financial services sector in the form of reduced false positives in fraud detection. The U.S. Department of Energy said this week that it will spend $625 million over the next five years to develop a dozen research centers devoted to artificial intelligence (AI) and quantum computing. Another $340 million will come from the private sector and academia, bringing Uncle Sam together with the likes of IBM, Amazon and Google to apply the highest of high tech to a variety of verticals and applications. In an interview with Karen Webster, Dr. Stefan Wörner, global leader for quantum finance and optimization at IBM, said we're getting closer to crossing the quantum-computing Rubicon from concept to real-world applications. The basic premise behind quantum computing is that it can tackle tasks with blinding speed and pinpoint accuracy that aren't possible with "regular" computers.


Nvidia's Integration Dreams

#artificialintelligence

Back in 2010, Kyle Conroy wrote a blog post entitled "What if I had bought Apple stock instead?": "Currently, Apple's stock is at an all-time high. A share today is worth over 40 times its value seven years ago. So, how much would you have today if you purchased stock instead of an Apple product? See for yourself in the table below." Conroy kept the post up to date until April 1, 2012; at that point, my first Apple computer, a 2003 12″ iBook, which cost $1,099 on October 22, 2003, would have been worth $57,900.