Hardware


Supercomputing, 5G, & Enterprise AI: Takeaways from Nvidia's 2021 Conference

#artificialintelligence

From GPU-based video games to groundbreaking work in data science, AI, and computing, Nvidia is helping drive the fourth industrial revolution. After years of innovation and experimentation, Nvidia launched "Grace," the company's first data center CPU, designed to tap large datasets for natural language processing, recommendation systems, and AI supercomputing. After opening its Omniverse design and collaboration platform in December, the company is now developing Omniverse Enterprise, with over 400 major companies already using it. With BlueField-3, the company is giving organizations real-time network visibility and cyberthreat detection, and it announced many other innovations, including autonomous vehicles and a new SDK for quantum circuit simulation, that have yet to make their mark on AI and technology.


What's so great about Google Tensor? The new Pixel 6 chip, explained

Mashable

To be more specific, Google is developing something called the Tensor chip for the Pixel 6 phones. There are still plenty of important details about Tensor that Google probably won't reveal until the Oct. 19 Pixel 6 launch event -- such as which companies are providing which exact components for it -- but we can use what little we have to paint a picture of what this means for the future of Pixel. In technical terms, Tensor is a new system on a chip (or SoC) that will power the Pixel 6 and Pixel 6 Pro phones. You're probably wondering what the heck an SoC is. This is actually the rare tech term that's somewhat self-explanatory, as an SoC is a group of the essential components that make up a computing system (like CPU, GPU, and RAM) packed together into a silicon chip.


📅 This Week in Quantum Machine Learning – Chippr Robotics

#artificialintelligence

There is currently a large interest in understanding the potential advantages quantum devices can offer for probabilistic modelling. In this work we investigate, within two different oracle models, the probably approximately correct (PAC) learnability of quantum circuit Born machines, i.e., the output distributions of local quantum circuits. We first show a negative result, namely, that the output distributions of super-logarithmic depth Clifford circuits are not sample-efficiently learnable in the statistical query model, i.e., when given query access to empirical expectation values of bounded functions over the sample space. This immediately implies the hardness, for both quantum and classical algorithms, of learning from statistical queries the output distributions of local quantum circuits using any gate set which includes the Clifford group. As many practical generative modelling algorithms use statistical queries -- including those for training quantum circuit Born machines -- our result is broadly applicable and strongly limits the possibility of a meaningful quantum advantage for learning the output distributions of local quantum circuits.
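The statistical query model described above can be made concrete with a short sketch: the learner never sees raw samples, only (noisy) empirical expectation values of bounded functions over the sample space. The following is a minimal, hypothetical illustration in plain Python; the uniform two-bit distribution merely stands in for a quantum circuit Born machine's output distribution and is not taken from the paper.

```python
import random

def sq_oracle(samples, phi, tau=0.01):
    """Statistical-query access: return the empirical expectation of a
    bounded function phi over the sample space, estimated from samples.
    A learner in the SQ model sees only such averages (accurate up to a
    tolerance tau), never the individual samples themselves."""
    est = sum(phi(x) for x in samples) / len(samples)
    # A real SQ oracle may perturb the answer by up to tau; modeled here.
    return est + random.uniform(-tau, tau)

# Toy "output distribution": uniform over 2-bit strings, a hypothetical
# stand-in for a Born machine's output distribution.
samples = [format(random.randrange(4), "02b") for _ in range(10_000)]

# Bounded function: indicator that the first bit is 1. The query returns
# (approximately) the probability of that event, near 0.5 here.
prob_first_bit_one = sq_oracle(samples, lambda s: 1.0 if s[0] == "1" else 0.0)
```

The paper's hardness result says that for super-logarithmic-depth Clifford circuits, no learner restricted to queries of this form can be sample-efficient, which constrains any training algorithm built on such averages.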


Add a Eufy video doorbell to your porch for under $70

Mashable

As of Oct. 15, Eufy's WiFi Video Doorbell is on sale at Amazon for $79.99, which ties its lowest price ever. Add the extra $10 Amazon coupon to snag it for $69.99, which beats that previous low. Now is a great time to try a video doorbell: the holiday season usually brings an uptick in visitors, and this is a genuinely good deal on a budget-friendly model.


It's Play Time: Video Games and Peripherals Are on Sale

WIRED

This year, holiday shipping timelines are going to be tighter than ever. WIRED will be covering all things Black Friday and Cyber Monday, but we're keeping our eyes peeled for early-bird discounts on the tried-and-true gear we love so that you can ensure your loved ones receive their gifts on time. This week, we found plenty of price drops on video games and gaming accessories. Special offer for Gear readers: Get a 1-year subscription to WIRED for $5 ($25 off). This includes unlimited access to WIRED.com and our print magazine (if you'd like).


Four MIT faculty members receive 2021 US Department of Energy early career awards

#artificialintelligence

The U.S. Department of Energy (DoE) recently announced the names of 83 scientists selected for its 2021 Early Career Research Program. The list includes four faculty members from MIT: Riccardo Comin of the Department of Physics; Netta Engelhardt of the Department of Physics and Center for Theoretical Physics; Philip Harris of the Department of Physics and Laboratory for Nuclear Science; and Mingda Li of the Department of Nuclear Science and Engineering. Each year, the DoE selects researchers for significant funding intended to strengthen the "nation's scientific workforce by providing support to exceptional researchers during crucial early career years, when many scientists do their most formative work." The quantum technologies of tomorrow, including more powerful computing, better navigation systems, and more precise imaging and magnetic sensing devices, rely on understanding the properties of quantum materials. Quantum materials exhibit unique physical characteristics and can give rise to phenomena like superconductivity.


JPMorgan's guide to quantum machine learning in finance

#artificialintelligence

We suggested in January that it might be a good idea to familiarize yourself with quantum computing if you want to maximize your future employability in financial services. A new academic paper from JPMorgan's Future Lab for Applied Research and Engineering helps explain why. Authored by Marco Pistoia, JPMorgan's head of quantum technology and head of research, plus members of his team, the paper stresses that quantum computing will impact financial services sooner than you think. Goldman Sachs and JPMorgan have both been building teams of quantum researchers and Goldman has already used quantum methods to speed up derivatives pricing by over a thousand times. The finance industry stands to benefit from quantum computing "even in the short term," says JPMorgan.


The 'Battlefield 2042' beta leaves five big questions unanswered

Washington Post - Technology News

The Post played the beta on an Alienware laptop with an Intel Core i7-9700K CPU (3.60GHz), 16 GB of RAM, and an Nvidia GeForce RTX 2070 graphics card. The game looks stunning, with rain swirling and foliage swaying in the wind. DICE emphasized in the preview briefing that the team had added a lot of fixes and polish that would not be seen in the beta, but still, there were a good number of bugs. Unseen forces would spastically yank the pixels of some dead bodies, helicopters struck by tank shells emerged unscathed, and the map's rocket centerpiece, which blew up on launch when shot by a tank, exploded, stopped, and then exploded again.


2021 Qiskit Global Summer School on Quantum Machine Learning

#artificialintelligence

Quantum computing experts and mentors share valuable insights through twenty lectures and five applied lab exercises that provide deep-dives exploring concepts in quantum computing, focused on the implementations of quantum machine learning algorithms in Qiskit.


11 Important Quantum Computing Industrial Revolutionizations

#artificialintelligence

Quantum computing is still very much in its infancy, even as conventional computing pushes the boundaries of what can be done with known manufacturing methods. Some predictions put the quantum computing business at US$5 billion, suggesting it will grow rapidly over the next few years. So how can businesses profit from this development? In which areas does quantum computing excel? Here are 11 quantum computing revolutions to look into.