Apple's new-model, top-of-the-line MacBook Pro laptop computer could set you back nearly $4,000 before taxes. But that will seem like a Black Friday steal when a 45-year-old Apple computer goes on sale this week in Monrovia, where it may fetch six figures or more, even without a 16-inch, high-definition screen and the latest microprocessors. On Tuesday, John Moran Auctioneers will auction off a functioning Apple-1 computer hand-built by Steve Wozniak, Steve Jobs and others in a Los Altos, Calif., garage in 1976. The system was the rock upon which the trillion-dollar Apple empire was built. In his 2011 biography "Steve Jobs," Walter Isaacson quotes Wozniak as saying of the Apple-1: "We were participating in the biggest revolution that had ever happened, I thought. I was so happy to be a part of it."
Samsung may have announced a bunch of new devices this week, but it was Apple and Amazon that led the week when it came to online deals. While Woot's flash sale on the Apple Watch Series 6 Product Red Edition came and went quickly, you can still get the smartwatch for $299 at Amazon. The Mac Mini M1 got a $100 discount while a number of Echo devices went on sale as well -- including the new, second-generation Echo Show 5. And through Sunday, you can save on laptops, tablets, TVs and more in Best Buy's anniversary sale. Here are the best tech deals from this week that you can still get today.
Silicon Valley adaptive computing bellwether Xilinx announced its entrance into the growing system-on-module (SOM) market today, with a portfolio of palm-sized compute modules for embedded applications that accelerate AI, machine learning and vision at the edge. Xilinx Kria will eventually expand into a family of single-board computers based on reconfigurable FPGA (Field Programmable Gate Array) technology, coupled to Arm CPU cores and a full software stack with an app store; the first module is specifically targeted at AI machine vision and inference applications. The Xilinx Kria K26 SOM employs the company's UltraScale multi-processor system-on-a-chip (MPSoC) architecture, which sports a quad-core Arm Cortex-A53 CPU, along with more than 250,000 logic cells and an H.264/H.265 video compression/decompression engine (codec). This may sound like alphabet soup, but the underlying solution is a compelling offering for developers and engineers looking to give new intelligent systems (in industries like security, smart cities, retail analytics, autonomous machines and robotics) the ability to see, infer information and adapt to their deployments in the field. Also on board the Kria K26 SOM are 4GB of DDR4 memory and 245 general-purpose I/O pins, along with support for up to 15 cameras, up to 40 Gbps of combined Ethernet throughput, and four USB 2.0/3.0-compatible ports.
This article reviews recent progress in the development of the computing framework Vector Symbolic Architectures (also known as Hyperdimensional Computing). This framework is well suited for implementation in stochastic, nanoscale hardware and it naturally expresses the types of cognitive operations required for Artificial Intelligence (AI). We demonstrate in this article that the ring-like algebraic structure of Vector Symbolic Architectures offers simple but powerful operations on high-dimensional vectors that can support all data structures and manipulations relevant in modern computing. In addition, we illustrate the distinguishing feature of Vector Symbolic Architectures, "computing in superposition," which sets it apart from conventional computing. This latter property opens the door to efficient solutions to the difficult combinatorial search problems inherent in AI applications. Vector Symbolic Architectures are Turing complete, as we show, and we see them acting as a framework for computing with distributed representations in myriad AI settings. This paper serves as a reference for computer architects by illustrating techniques and philosophy of VSAs for distributed computing and relevance to emerging computing hardware, such as neuromorphic computing.
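As a concrete flavor of the operations the abstract describes (a minimal sketch, not taken from the article itself): in one common VSA family, symbols are random high-dimensional bipolar vectors, binding is element-wise multiplication (its own inverse), and superposition is element-wise addition followed by a sign threshold. A bound role-filler pair can be recovered from a superposition by unbinding and nearest-neighbor comparison. All names below (`bind`, `bundle`, `sim`) are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # dimensionality; VSAs rely on high D for quasi-orthogonality


def rand_vec():
    """Random bipolar (+1/-1) symbol vector."""
    return rng.choice([-1, 1], size=D)


def bind(a, b):
    """Binding: element-wise multiply. Self-inverse: bind(bind(a, b), b) == a."""
    return a * b


def bundle(*vs):
    """Superposition: element-wise sum, thresholded back to {-1, 0, +1}."""
    return np.sign(np.sum(vs, axis=0))


def sim(a, b):
    """Normalized dot product; near zero for unrelated random vectors."""
    return a @ b / D


# Encode a tiny record {color: red, shape: square} as one vector in superposition
color, shape, red, square = (rand_vec() for _ in range(4))
record = bundle(bind(color, red), bind(shape, square))

# Unbind the 'color' role: the result is similar to 'red', not to 'square'
probe = bind(record, color)
print(sim(probe, red), sim(probe, square))  # high vs. near-zero similarity
```

The "computing in superposition" property shows up in the last step: `record` holds both role-filler pairs at once, yet unbinding with a role vector retrieves its filler, with the other pair contributing only pseudo-random noise.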
Google has begun building a new and larger quantum computing research center that will employ hundreds of researchers, engineers and other staff to design and build a broadly useful quantum computer by 2029. It's the latest sign that the competition to turn these radical new machines into practical tools is growing more intense as established players like IBM and Honeywell vie with quantum computing startups. The new Google Quantum AI campus is in Santa Barbara, California, where Google's first quantum computing lab already employs dozens of researchers and engineers, Google said at its annual I/O developer conference on Tuesday. A few researchers are already working there. One top job at the new center is making the fundamental data-processing elements, called qubits, more reliable, said Jeff Dean, senior vice president of Google Research and Health, who helped build some of Google's most important technologies, like search, advertising and AI.
Google has unveiled its new Quantum AI campus in Santa Barbara, California, where engineers and scientists will be working on its first commercial quantum computer – but that is probably a decade away. The new campus focuses on both software and hardware. On the hardware front, it includes Google's first quantum data center, quantum hardware research labs, and Google's own quantum processor chip fabrication facilities, said Erik Lucero, lead engineer for Google Quantum AI, in a blog post. Quantum computers offer great promise for cryptography and optimization problems.
Google developers are confident they can build a commercial-grade quantum computer by 2029. Google CEO Sundar Pichai announced the plan during today's I/O stream, and in a blog post, quantum AI lead engineer Erik Lucero further outlined the company's goal to "build a useful, error-corrected quantum computer" within the decade. Executives also revealed Google's new campus in Santa Barbara, California, which is dedicated to quantum AI. The campus has Google's first quantum data center, hardware research laboratories, and the company's very own quantum processor chip fabrication facilities. The main benefits of quantum computing come in terms of processing power, scale and accuracy, allowing researchers to run complex computations incredibly quickly.
Buluc, Aydin, Kolda, Tamara G., Wild, Stefan M., Anitescu, Mihai, DeGennaro, Anthony, Jakeman, John, Kamath, Chandrika, Kannan, Ramakrishnan, Lopes, Miles E., Martinsson, Per-Gunnar, Myers, Kary, Nelson, Jelani, Restrepo, Juan M., Seshadhri, C., Vrabie, Draguna, Wohlberg, Brendt, Wright, Stephen J., Yang, Chao, Zwart, Peter
Randomized algorithms have propelled advances in artificial intelligence and represent a foundational research area in advancing AI for Science. Future advancements in DOE Office of Science priority areas such as climate science, astrophysics, fusion, advanced materials, combustion, and quantum computing all require randomized algorithms for surmounting challenges of complexity, robustness, and scalability. This report summarizes the outcomes of the workshop "Randomized Algorithms for Scientific Computing (RASC)," held virtually across four days in December 2020 and January 2021.
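As one hedged illustration of the class of techniques the report surveys (a sketch in the style of the Halko–Martinsson–Tropp randomized range finder, not code from the report): random projection compresses a large matrix onto a small subspace that captures most of its range, so an expensive factorization can be run on a much smaller matrix. Function and variable names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)


def randomized_low_rank(A, k, oversample=10):
    """Randomized range finder: sketch A with a Gaussian test matrix,
    orthonormalize the sketch, then factor the small projected matrix."""
    m, n = A.shape
    Omega = rng.standard_normal((n, k + oversample))  # random test matrix
    Q, _ = np.linalg.qr(A @ Omega)                    # orthonormal basis for range(A)
    B = Q.T @ A                                       # small (k+p) x n matrix
    U_small, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ U_small)[:, :k], s[:k], Vt[:k]


# Exactly rank-5 test matrix
A = rng.standard_normal((500, 5)) @ rng.standard_normal((5, 400))
U, s, Vt = randomized_low_rank(A, 5)
err = np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A)
print(err)  # near machine precision for an exactly low-rank matrix
```

The scalability payoff is that the full SVD runs on the `(k+p) x n` sketch `B` rather than on `A`, and `A` is touched only through matrix-vector-style products, which suits the large, distributed problems the report targets.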
Forty years ago, the word "hacker" was little known. Its march from obscurity to newspaper headlines owes a great deal to tech journalist Steven Levy, who in 1984 defied the advice of his publisher to call his first book Hackers: Heroes of the Computer Revolution. Hackers were a subculture of computer enthusiasts for whom programming was a vocation and playing around with computers constituted a lifestyle. Hackers was published only three years after Tracy Kidder's The Soul of a New Machine, explored in my last column (January 2021, p. 32–37), but a lot had changed during the interval. Kidder's assumed readers had never seen a minicomputer, still less designed one. By 1984, in contrast, the computer geek was a prominent part of popular culture. Unlike Kidder, Levy had to make people reconsider what they thought they already knew. Computers were suddenly everywhere, but they remained unfamiliar enough to inspire a host of popular books to ponder the personal and social transformations triggered by the microchip. The short-lived home computer boom had brought computer programming into the living rooms and basements of millions of middle-class Americans, sparking warnings about the perils of computer addiction. A satirical guide, published the same year, warned of "micromania." The year before, the film WarGames suggested computer-obsessed youth might accidentally trigger nuclear war.
Nvidia reported revenues of $5.0 billion for its fourth fiscal quarter ended January 31, up 61% from a year earlier. The revenues and non-GAAP earnings per share of $3.10 beat expectations as new gaming hardware and AI products generated strong demand. A year ago, Nvidia reported non-GAAP earnings per share of $1.89 on revenues of $3.1 billion. The Santa Clara, California-based company makes graphics processing units (GPUs) that can be used for games, AI, and datacenter computing. While many businesses have been hit hard by the pandemic, Nvidia has seen a boost in those areas.