We may (sadly) live in a world devoid of flying cars and personal teleportation devices, but that doesn't mean technology isn't moving forward at an incredible pace. We compared the processing power of various computers and devices from 1956 to 2015 to visualize the roughly trillion-fold increase in performance over those six decades. By comparing each processor's floating-point operations per second (FLOPS), we normalized away differences between microarchitectures. In 1965 Gordon Moore, co-founder of Intel, predicted that the number of transistors in an integrated circuit would double every two years. This prediction became known as Moore's law, and it's why we currently have pocket-sized devices that are more powerful than 1980s supercomputers that took up entire rooms.
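The doubling described by Moore's law is easy to make concrete. The sketch below projects a transistor count forward under the assumption of a doubling every two years; the base figures (the Intel 4004's roughly 2,300 transistors in 1971) are used purely as an illustration, and the `doubling_years` parameter is a knob, not part of Moore's original statement.

```python
def moore_projection(base_count, base_year, target_year, doubling_years=2.0):
    """Project a transistor count forward under Moore's-law doubling.

    base_count: transistor count in the base year
    doubling_years: assumed doubling period (Moore's 1965 revision used 2 years)
    """
    elapsed = target_year - base_year
    return base_count * 2 ** (elapsed / doubling_years)

# Illustration: the Intel 4004 (1971) had ~2,300 transistors.
# Doubling every two years for 44 years gives 2,300 * 2**22,
# i.e. on the order of ten billion transistors by 2015 --
# the same order of magnitude as real 2015-era chips.
print(f"{moore_projection(2_300, 1971, 2015):.3e}")
```

Running the projection also shows why the two-year rhythm matters: shortening `doubling_years` to 1.5 years yields a far larger figure over the same span, which is closer to the trillion-fold FLOPS growth cited above (FLOPS grew faster than transistor count alone, thanks to architectural gains).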
We have read about such things in science fiction for decades, but this new technology is shaping what is now called the "Internet of Things," or IoT (sometimes broadened to the "Internet of Everything," or IoE). It consists of web-enabled devices that collect, send, and act on data from their environments using embedded sensors, processors, and communication hardware. These devices, often described as "connected" or "smart," can talk to one another and exchange the data they gather through a process called machine-to-machine (M2M) communication. In short, IoT devices are physical objects that have been connected to the internet so that they can be controlled remotely and share information.
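The M2M pattern described above can be sketched in a few lines. This is a simulation only: the device names, the temperature threshold, and the in-process queue standing in for a network link are all invented for illustration, not part of any real IoT protocol.

```python
import json
from queue import Queue

# A shared queue stands in for the network link between two devices.
channel = Queue()

def sensor_device(channel, reading_c):
    """An embedded sensor packages a reading and transmits it (M2M send)."""
    message = json.dumps({"device_id": "thermo-01", "temp_c": reading_c})
    channel.put(message)

def actuator_device(channel, threshold_c=25.0):
    """A second device receives the reading and reacts to it (M2M receive)."""
    reading = json.loads(channel.get())
    return "fan_on" if reading["temp_c"] > threshold_c else "fan_off"

sensor_device(channel, 28.5)
print(actuator_device(channel))  # the 28.5 °C reading exceeds the threshold
```

The key idea is that neither device involves a human in the loop: one publishes structured data, the other consumes it and acts, which is the essence of machine-to-machine communication regardless of the transport (MQTT, HTTP, Bluetooth, etc.) used in practice.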
Computers smaller than a large room were once the stuff of science fiction, while these days nearly every person on the planet carries more computing power in their pocket than presidents had access to just 50 years ago. Computers have become an essential part of society. Individuals now rely on computers, including smartphones, for social interaction, their jobs, and entertainment. In more developed countries, it is almost unheard of not to have access to a computer or the internet. Electronic devices have come a long way in the past century, and have advanced exponentially in the past few decades, so it is interesting to contemplate their future.
Other major firms are following suit. Microsoft has announced dedicated silicon to accelerate deep learning in its Azure cloud. And in July, the firm also revealed that its augmented-reality headset, the HoloLens, will carry a customized chip to optimize machine-learning applications. Apple has a long track record of designing its own silicon for specialist requirements. Earlier this year Apple ended its relationship with Imagination Technologies, the firm that had been providing GPU designs for iPhones, in favor of its own GPU designs.
While AI has many uses in securing a business, it is also instrumental in improving the bottom line. The convergence of artificial intelligence (AI) and the internet of things (IoT) has created a new business ecosystem that stretches from the front lines of physical security and cybersecurity to the server rooms of enterprise organizations and smaller commercial companies. The one constant among all current and emerging applications of AI is that the data collected is refined and consumed proactively, in ways that can protect, expedite, and expand the business interests of most organizations. It seems that almost anything that plugs into an electrical socket, or represents the latest "must-have" in business or consumer technology, is being marketed as providing some form of AI. Even new gas, electric, and hybrid vehicles increasingly ship with AI-driven features.