
quantum


5 years until enterprise quantum, but your prep begins now

#artificialintelligence

Quantum computing technology is advancing rapidly and is on track to solve extraordinarily complex business problems through enhanced optimization, machine learning, and simulation. Make no mistake, the technology promises to be one of the most disruptive of all time. In fact, I believe quantum computing will hand a significant competitive advantage to the companies that can successfully leverage its potential to transform their business and their industries. While quantum technologies are still maturing, companies are already preparing, with spending on quantum computing projected to surge from $260 million in 2020 to $9.1 billion by 2030, according to research from Tractica. Companies are pursuing the promise of quantum aggressively, as evidenced by the recently announced combination of Honeywell Quantum Solutions and Cambridge Quantum Computing.


Williams F1 drives digital transformation in racing with AI, quantum

#artificialintelligence

"The thing that really attracted me to Formula 1 is that it's always been about data and technology," says Graeme Hackland, Williams Group IT director and chief information officer of Williams Racing. Since joining the motorsport racing team in 2014, Hackland has been putting that theory into practice. He is pursuing what he refers to as a data-led digital transformation agenda that helps the organization's designers and engineers create a potential competitive advantage for the team's drivers on race day. Hackland explains to VentureBeat how Williams F1 is looking to exploit data to make further advances up the grid and how emerging technologies, such as artificial intelligence (AI) and quantum computing, might help in that process. This interview has been edited for clarity.


Vizio M-Series Quantum 4K UHD TV review: Same accurate color, now with upgraded ports

PCWorld

It didn't take long to confirm what I suspected during my Vizio V5-Series review--the slightly more expensive M-Series Quantum offers a far better picture. It's not perfect by any means, but the color is more accurate, and the screen uniformity far outstrips that of the V-Series. If you're shopping for a mid-range Vizio, the M-Series Quantum is what you want. Skip a couple of lunches to save up the extra cash. The M-Series, including the 55-inch class model M55Q6 that I tested, are 60Hz, 3840 x 2160 (4K UHD), 10-bit TVs.


Hisense U8G-series 4K UHD TV review: Nice for the price, especially for gamers

PCWorld

It's a very nice set, although we missed the deep black performance we've seen in some competitors outfitted with mini-LED backlights. The U8G-series is available in both 55-inch ($950) and 65-inch ($1,300, reviewed here) sizes. It uses a 120Hz, 10-bit, 4K UHD (3840 x 2160) panel featuring quantum dots for extremely accurate color. The TV is a bit on the heavy side, weighing 53.4 pounds when wall-mounted (with a VESA 400mm x 400mm mount) and 56 pounds including the stand. The bezel is thin, and there's a classy look to the whole deal.


Brain-inspired computing: We need a master plan

arXiv.org Artificial Intelligence

New computing technologies inspired by the brain promise fundamentally different ways to process information with extreme energy efficiency and the ability to handle the avalanche of unstructured and noisy data that we are generating at an ever-increasing rate. To realise this promise requires a brave and coordinated plan to bring together disparate research communities and to provide them with the funding, focus and support needed. We have done this in the past with digital technologies; we are in the process of doing it with quantum technologies; can we now do it for brain-inspired computing?


Theoretical bounds on data requirements for the ray-based classification

arXiv.org Machine Learning

The problem of classifying high-dimensional shapes in real-world data grows in complexity as the dimension of the space increases. For the case of identifying convex shapes of different geometries, a new classification framework has recently been proposed in which the intersections of a set of one-dimensional representations, called rays, with the boundaries of the shape are used to identify the specific geometry. This ray-based classification (RBC) has been empirically verified using a synthetic dataset of two- and three-dimensional shapes [1] and, more recently, has also been validated experimentally [2]. Here, we establish a bound on the number of rays necessary for shape classification, defined by key angular metrics, for arbitrary convex shapes. For two dimensions, we derive a lower bound on the number of rays in terms of the shape's length, diameter, and exterior angles. For convex polytopes in R^N, we generalize this result to a similar bound given as a function of the dihedral angle and the geometrical parameters of polygonal faces. This result enables a different approach for estimating high-dimensional shapes using substantially fewer data elements than volumetric or surface-based approaches.
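To make the idea concrete, here is a minimal Python sketch (not code from [1] or [2]) of ray-based feature extraction for a 2D convex polygon: cast equally spaced rays from an interior point, record the distance at which each ray meets the boundary, and hand the resulting profile to any downstream classifier. The function name ray_boundary_distances, the toy shapes, and the choice of equally spaced angles are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def cross2(u, v):
    """Scalar 2D cross product."""
    return u[0] * v[1] - u[1] * v[0]

def ray_boundary_distances(vertices, origin, n_rays):
    """Distances from `origin` to a convex polygon's boundary along n_rays
    equally spaced ray directions. `vertices` is an (M, 2) sequence listing
    the polygon corners in order; `origin` must lie inside the polygon."""
    vertices = np.asarray(vertices, dtype=float)
    origin = np.asarray(origin, dtype=float)
    angles = np.linspace(0.0, 2.0 * np.pi, n_rays, endpoint=False)
    dists = np.full(n_rays, np.inf)
    for k, theta in enumerate(angles):
        d = np.array([np.cos(theta), np.sin(theta)])   # ray direction
        for i in range(len(vertices)):
            a, b = vertices[i], vertices[(i + 1) % len(vertices)]
            v, w = b - a, a - origin
            denom = cross2(d, v)
            if abs(denom) < 1e-12:                      # ray parallel to this edge
                continue
            t = cross2(w, v) / denom                    # distance along the ray
            s = cross2(w, d) / denom                    # position along the edge
            if t >= 0.0 and 0.0 <= s <= 1.0:
                dists[k] = min(dists[k], t)
    return dists

# Toy usage: a square and a triangle yield different distance profiles,
# which can feed any standard classifier.
square = [(1, 1), (-1, 1), (-1, -1), (1, -1)]
triangle = [(1.5, 0), (-1, 1), (-1, -1)]
print(ray_boundary_distances(square, (0, 0), 8).round(3))
print(ray_boundary_distances(triangle, (0, 0), 8).round(3))
```

Even with a handful of rays the two shapes are distinguishable; the number of rays needed for this to hold in general is exactly the quantity the paper's bounds constrain.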


Are quantum computers good at picking stocks? This project tried to find out

ZDNet

Consultancy firm KPMG, together with a team of researchers from the Technical University of Denmark (DTU) and a yet-to-be-named European bank, has been piloting the use of quantum computing to determine which stocks to buy and sell for maximum return, an age-old banking operation known as portfolio optimization. The researchers ran a model for portfolio optimization on Canadian company D-Wave's 2,000-qubit quantum annealing processor, comparing the results to those obtained with classical means. They found that the quantum annealer performed better and faster than other methods, while being capable of resolving larger problems – although the study also indicated that D-Wave's technology still comes with some issues to do with ease of programming and scalability. The smart distribution of portfolio assets is a problem that stands at the very heart of banking.
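The article does not disclose the exact model the KPMG/DTU team ran, but the usual recipe for putting portfolio selection on a quantum annealer is to rewrite it as a QUBO (quadratic unconstrained binary optimization) problem, the native input format of D-Wave's hardware. The sketch below is an illustration under that assumption, with made-up returns, covariances, and penalty weights, and solved by brute force rather than on an annealer.

```python
import itertools
import numpy as np

# Toy data (invented): expected returns and covariance for 4 assets.
mu = np.array([0.08, 0.12, 0.10, 0.07])          # expected annual returns
sigma = np.array([[0.10, 0.02, 0.01, 0.00],
                  [0.02, 0.12, 0.03, 0.01],
                  [0.01, 0.03, 0.09, 0.02],
                  [0.00, 0.01, 0.02, 0.06]])      # covariance (risk) matrix
budget = 2                                        # pick exactly 2 assets
risk_aversion, penalty = 0.5, 2.0
n = len(mu)

# QUBO: minimize  risk_aversion * x^T Sigma x - mu^T x + penalty * (sum(x) - budget)^2
# over binary x. The budget constraint is folded in as a quadratic penalty.
Q = risk_aversion * sigma - np.diag(mu)
# Expanded penalty term (the constant budget**2 offset is dropped):
Q += penalty * (np.ones((n, n)) - 2 * budget * np.eye(n))

def qubo_energy(x):
    return x @ Q @ x

# Brute-force all 2^n bit strings; on real hardware this Q matrix would be
# submitted to the annealer's sampler instead of this loop.
best = min((np.array(bits) for bits in itertools.product([0, 1], repeat=n)),
           key=qubo_energy)
print("selected assets:", np.nonzero(best)[0], "energy:", round(qubo_energy(best), 4))
```

The penalty weight has to be large enough that violating the budget is never worth the gain in return; tuning such weights is part of the "ease of programming" issue the study flags.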


Quantum sensors could soon be heading into space

ZDNet

An outer-space mission inevitably calls for next-generation tools. Quantum technologies are on track to reach new heights – quite literally: quantum company Q-CTRL plans to send ultra-sensitive quantum sensors and navigation devices to space, as part of a mission to explore the Moon for water and other resources that will support NASA astronauts in future landings. The Australian company, which applies the principles of control engineering to improve the hardware performance of quantum devices, will provide the quantum technology to assist uncrewed missions organized by the Seven Sisters space industry consortium, planned to start in 2023. Formed last year by space start-up Fleet Space, the consortium is working to send nanosatellites and exploration sensors to the Moon to search for resources and generate useful data for future human exploration. The information gathered will inform NASA's Artemis program, which aims to land the first woman and next man on the Moon by 2024 and create a sustainable human presence ahead of later crewed exploration of Mars.


Dimensions Technology Solutions .LLC

#artificialintelligence

In recent history, 2020 was an extraordinary year. In the past hundred years, humanity has not gone through a global pandemic like COVID-19. Every country, every business, and practically every person on the planet has been affected by it. Fortunately, vaccines are arriving at our doorstep, and we can at last, with great excitement and anticipation, welcome the New Year 2021. Digital transformation has been dramatically accelerated by COVID-19, and the trend will be even stronger in 2021.


Generalization in Quantum Machine Learning: a Quantum Information Perspective

arXiv.org Machine Learning

We study the machine learning problem of generalization when quantum operations are used to classify either classical data or quantum channels, where in both cases the task is to learn from data how to assign a certain class $c$ to inputs $x$ via measurements on a quantum state $\rho(x)$. A trained quantum model generalizes when it is able to predict the correct class for previously unseen data. We show that the accuracy and generalization capability of quantum classifiers depend on the (R\'enyi) mutual informations $I(C{:}Q)$ and $I_2(X{:}Q)$ between the quantum embedding $Q$ and the classical input space $X$ or class space $C$. Based on the above characterization, we then show how different properties of $Q$ affect classification accuracy and generalization, such as the dimension of the Hilbert space, the amount of noise, and the amount of neglected information via, e.g., pooling layers. Moreover, we introduce a quantum version of the Information Bottleneck principle that allows us to explore the various tradeoffs between accuracy and generalization.
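As a rough illustration of the kind of quantity involved (not the paper's code), the sketch below builds a classical-quantum state $\rho_{CQ} = \sum_c p_c |c\rangle\langle c| \otimes \rho_c$ from a toy single-qubit angle embedding and computes the mutual information $I(C{:}Q)$, which for such a state reduces to the Holevo quantity $S(\rho_Q) - \sum_c p_c S(\rho_c)$. The embedding, class probabilities, and data are invented for the example, and the R\'enyi-2 variant $I_2(X{:}Q)$ used in the paper is not computed here.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr[rho log rho] in nats, computed from eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

def angle_embedding(x):
    """Single-qubit embedding |psi(x)> = cos(x/2)|0> + sin(x/2)|1>, as a density matrix."""
    psi = np.array([np.cos(x / 2), np.sin(x / 2)])
    return np.outer(psi, psi)

# Toy dataset: class 0 inputs cluster near 0, class 1 inputs near pi/2.
inputs = {0: [0.1, 0.2, 0.3], 1: [1.4, 1.5, 1.6]}
p_c = {c: 0.5 for c in inputs}

# Class-conditional embedded states rho_c = average of rho(x) over each class.
rho_c = {c: np.mean([angle_embedding(x) for x in xs], axis=0) for c, xs in inputs.items()}
rho_q = sum(p_c[c] * rho_c[c] for c in rho_c)     # overall embedded state rho_Q

# For rho_CQ = sum_c p_c |c><c| (x) rho_c, the mutual information I(C:Q)
# equals the Holevo quantity S(rho_Q) - sum_c p_c S(rho_c).
i_cq = von_neumann_entropy(rho_q) - sum(p_c[c] * von_neumann_entropy(rho_c[c]) for c in rho_c)
print(f"I(C:Q) = {i_cq:.4f} nats  (upper-bounded by H(C) = {np.log(2):.4f} nats)")
```

Intuitively, the closer $I(C{:}Q)$ gets to $H(C)$, the more class information the embedding retains and the higher the achievable classification accuracy; the paper's results make this kind of trade-off precise, including its effect on generalization.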