CES opens in Las Vegas next week, and you know what that means. If it fits into a PC or your pocket; if it sits on a console or kitchen counter; if it beeps, buzzes, or talks; and if it's artificially intelligent or at least somewhat smart; it will be there, somewhere, in a great river of technology flowing through the hotels and convention centers along the Strip.
Intel is reportedly preparing to fabricate "Loihi," a self-learning "brain chip" that mimics how the human brain functions, as a foundation for further developments in artificial intelligence. The chip is named after an active undersea volcano south of the island of Hawaii. Intel said in a statement Monday that Loihi includes a total of 130,000 silicon "neurons" connected by 130 million "synapses," the junctions that connect the neurons within the human brain. The Loihi chip, which Wired reported will be manufactured next month on Intel's 14-nm process technology, will be shared with leading universities and research institutions next year in a bid to advance AI development, Intel said. Coincidentally, Microsoft said Monday that it, too, is working on alternative computing, including manufacturing actual chips and systems as well as developing software to power quantum computers. Intel said it believes the Loihi chip could be used autonomously: a Loihi-powered medical device, for example, could learn what a "normal" heart rate is and then recognize when an abnormal heart condition presents itself.
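Intel hasn't published how such a device would work, but the heart-rate idea maps onto a classic pattern: learn a baseline online, then flag deviations. A minimal Python sketch of that pattern (not Loihi's actual spiking-neuron approach) might look like this:

```python
# Illustrative sketch only -- not Intel's Loihi algorithm. A device first
# learns a "normal" heart-rate baseline online, then flags outliers.

class HeartRateMonitor:
    """Tracks a running mean/variance (Welford's method) and flags readings
    more than `k` standard deviations from the learned baseline."""

    def __init__(self, k=3.0, warmup=30):
        self.k, self.warmup = k, warmup
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def observe(self, bpm):
        # Fold the new reading into the running statistics.
        self.n += 1
        delta = bpm - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (bpm - self.mean)
        if self.n < self.warmup:
            return False  # still learning what "normal" looks like
        std = (self.m2 / (self.n - 1)) ** 0.5
        return abs(bpm - self.mean) > self.k * std

monitor = HeartRateMonitor()
readings = [72, 75, 70, 74, 71] * 10 + [140]   # steady rhythm, then a spike
alarms = [bpm for bpm in readings if monitor.observe(bpm)]   # flags the 140
```

The point of doing this on-chip, as Loihi proposes, is that the "normal" baseline never has to be defined up front or uploaded from a server; the device derives it from its own data stream.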
The Tesla V100, the first GPU based on the brand-new Volta architecture, was introduced at the company's GPU Technology Conference in San Jose, California, on Wednesday. The new GPU packs 40,960 CUDA cores, which Nvidia says equals the computing power of 800 CPUs. The Tesla V100 in the DGX-1 is five times faster than the current Pascal architecture, Nvidia CEO Jensen Huang said. Nvidia has also included cube-like Tensor Cores, which will work with the regular processing cores to improve deep learning.
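What a Tensor Core fuses into a single hardware operation is a small matrix multiply-accumulate, D = A x B + C (on Volta, 4x4 half-precision inputs accumulated in single precision). A plain-Python sketch of that math, just to show the shape of the operation:

```python
# Rough sketch of the operation a Volta Tensor Core performs in hardware:
# a fused multiply-add on 4x4 matrices, D = A @ B + C. Real tensor cores
# do this on half-precision inputs with single-precision accumulation.

def tensor_core_fma(a, b, c):
    """Compute D = A @ B + C for 4x4 matrices given as nested lists."""
    n = 4
    return [[sum(a[i][k] * b[k][j] for k in range(n)) + c[i][j]
             for j in range(n)] for i in range(n)]

identity = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
zeros = [[0.0] * 4 for _ in range(4)]
a = [[float(i * 4 + j) for j in range(4)] for i in range(4)]
d = tensor_core_fma(a, identity, zeros)   # A @ I + 0 gives back A
```

Because matrix multiply-accumulate dominates neural-network training, collapsing it into one instruction is where the deep-learning speedup comes from.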
Dell is now focused more on server, storage, cloud, networking, and internet-of-things offerings. Shipments in the shrinking PC market will be shared among Lenovo, HP, Dell, Asus, Apple, Acer, and other vendors. Dell's PC shipments grew by 6.2 percent in the first quarter of this year to 9.6 million units, according to IDC. Worldwide PC shipments during the quarter totaled 60.3 million units.
Facebook's new open-source Caffe2 deep-learning framework can add new intelligence to mobile devices like the iPhone and Android handsets, as well as low-power computers like the Raspberry Pi. Caffe2 can be used to program artificial-intelligence features into smartphones and tablets, allowing them to recognize images, video, text, and speech and be more situationally aware. It's important to note that Caffe2 is not an AI program, but a tool allowing AI to be programmed into smartphones. It takes just a few lines of code to write learning models, which can then be bundled into apps. That is what makes the release significant: it puts deep-learning development within reach of ordinary mobile apps and low-power hardware.
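Caffe2's own Python API isn't shown here, but the "few lines of code" claim is easy to illustrate with a generic learning model. This pure-Python perceptron (a stand-in, not Caffe2 code) learns a tiny classifier in roughly a dozen lines:

```python
# A stand-in illustration of a "few lines of code" learning model --
# a simple perceptron, NOT Caffe2's actual model_helper API.

def predict(w, x):
    """Binary linear classifier: 1 if the weighted sum is positive."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x + [1])) > 0 else 0

def train_perceptron(samples, labels, epochs=20, lr=1):
    """Learn integer weights (plus a bias) from labeled examples."""
    w = [0] * (len(samples[0]) + 1)
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            error = y - predict(w, x)
            for i, xi in enumerate(x + [1]):    # nudge weights toward the label
                w[i] += lr * error * xi
    return w

xs = [[0, 0], [0, 1], [1, 0], [1, 1]]
ys = [0, 0, 0, 1]                               # logical AND: a tiny separable task
w = train_perceptron(xs, ys)
preds = [predict(w, x) for x in xs]             # matches ys after training
```

A framework like Caffe2 wraps the same train-then-bundle workflow in optimized, mobile-ready operators, which is what lets the resulting model ship inside an app.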
Cloud computing is all well and good for enterprises with big-data applications and consumers with virtual assistants, but it runs into some limits in an isolated cornfield. On farms and other places far from powerful computers and network connections, there's a trend away from centralized computing even while most of the IT world is embracing it. In remote places, the internet of things requires local processing as well as data-center analysis. So-called edge computing is coming to industries including manufacturing, utilities, shipping, and oil and gas. Agriculture is getting it, too.
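The core edge-computing move described above is simple to sketch: process raw sensor data locally and send only a compact summary upstream. A hedged Python illustration (the sensor values and field names are invented for the example):

```python
# Sketch of the edge-computing pattern: a field device reduces raw sensor
# readings locally and ships only a small summary to the data center,
# instead of streaming every sample over a weak rural connection.

def summarize_locally(readings):
    """Reduce a batch of raw sensor samples to a compact summary record."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

# e.g. hypothetical soil-moisture samples from a farm with spotty connectivity
raw = [31.2, 30.8, 31.5, 90.0, 31.1]    # includes one suspicious spike
summary = summarize_locally(raw)
# Only `summary` (four numbers) crosses the network, not the raw stream;
# deeper analysis of many such summaries still happens in the data center.
```

This split, immediate local processing plus centralized analysis, is exactly the "local processing as well as data-center analysis" the paragraph describes.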
Dumping Moore's Law is perhaps the best thing that could happen to computers, as it'll hasten the move away from an aging computer architecture holding back hardware innovation. That's the view of prominent scientist R. Stanley Williams, a senior fellow at Hewlett Packard Labs. Williams played a key role in HP's creation of the memristor in 2008. Moore's Law is an observation made by Intel co-founder Gordon Moore in 1965 that has helped make devices smaller and faster. It predicts that the density of transistors will double every 18 to 24 months, while the cost of making chips goes down.
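The doubling cadence compounds quickly, which is worth seeing as arithmetic. A quick worked version of the observation:

```python
# Worked arithmetic for Moore's Law: if transistor density doubles every
# 18-24 months, density after `months` scales by 2 ** (months / period).

def density_growth(months, doubling_months=24):
    """Growth factor in transistor density after `months`."""
    return 2 ** (months / doubling_months)

# Over a decade, even at the slower 24-month cadence: 2**5 = 32x denser.
decade_factor = density_growth(120)
# At the faster 18-month cadence the same decade gives 2**(120/18) ~ 101x.
fast_decade_factor = density_growth(120, doubling_months=18)
```

That exponential compounding is why the law shaped the industry for fifty years, and why its end forces the architectural rethink Williams is arguing for.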
The U.S. Defense Advanced Research Projects Agency (DARPA) has come up with some crazy ideas in the past, and its latest idea is to create computers that are always learning and adapting, much like humans. DARPA's aptly named Lifelong Learning Machine (L2M) program has the ambitious goal to create technology for "new AI systems that learn online, in the field, and based on what they encounter -- without having to be taken offline for reprogramming or retraining for new conditions," according to a document published Thursday detailing the program. An adaptive computer that draws on experience to make decisions has been a "long-standing" goal, said Hava Siegelmann, program manager for the L2M project at DARPA. Giving machines that kind of biological adaptability will involve developing new computer architectures and new machine-learning techniques.
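The contrast L2M draws is between models frozen at training time and models that keep updating in the field. A deliberately tiny Python sketch of the online side of that contrast (illustrative only, not DARPA's design):

```python
# Illustrative-only sketch of online learning, not DARPA's L2M design:
# a predictor that keeps updating sample by sample, so it adapts when
# conditions change instead of needing offline retraining.

class OnlineMeanPredictor:
    """Predicts the next value as an exponentially weighted running mean."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha       # how fast old experience is forgotten
        self.estimate = None

    def update(self, value):
        if self.estimate is None:
            self.estimate = value
        else:
            # Blend the new observation into the running estimate.
            self.estimate += self.alpha * (value - self.estimate)
        return self.estimate

model = OnlineMeanPredictor()
for v in [10.0] * 20:      # the conditions the model first encounters
    model.update(v)
for v in [50.0] * 20:      # conditions shift in the field...
    model.update(v)
# ...and the estimate has tracked the shift with no offline retraining.
```

L2M's ambition goes far beyond tracking a drifting mean, of course; the hard part is adapting like this without forgetting previously learned skills.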
SAP has added some new capabilities to SAP Vora, its in-memory distributed computing system based on Apache Spark and Hadoop. Version 1.3 of Vora includes a number of new distributed, in-memory data-processing engines, including ones for time-series data, graph data and schema-less JSON data, that accelerate complex processing. Common uses for the graph engine might be analyzing social graphs or supply chain graphs, said Ken Tsai, SAP's head of product marketing for database and data management. One application that would benefit from the new time-series engine is looking for patterns of electricity consumption in smart metering data. "You can certainly do it without the time-series engine, but it's not as efficient," Tsai said.
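Tsai's smart-metering example is concrete enough to sketch. The query a time-series engine accelerates looks roughly like this hand-rolled version (the data and function names are invented; Vora runs the equivalent at distributed scale):

```python
# Hand-rolled sketch of the kind of query a dedicated time-series engine
# accelerates: the typical electricity-consumption pattern by hour of day
# in smart-meter readings. Field names and data here are invented.

from collections import defaultdict

def hourly_profile(readings):
    """readings: list of (hour_of_day, kwh) pairs. Returns mean kWh per hour."""
    totals, counts = defaultdict(float), defaultdict(int)
    for hour, kwh in readings:
        totals[hour] += kwh
        counts[hour] += 1
    return {h: totals[h] / counts[h] for h in totals}

meter_data = [(8, 1.2), (8, 1.4), (19, 3.0), (19, 3.4), (3, 0.4)]
profile = hourly_profile(meter_data)
peak_hour = max(profile, key=profile.get)   # the evening peak stands out
```

As Tsai notes, you can compute this without a time-series engine; the engine's value is doing it efficiently over billions of readings rather than a five-element list.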
Intel realizes there will be a post-Moore's Law era and is already investing in technologies to drive computing beyond today's PCs and servers. One way to resolve that looming crisis, which all chipmakers face, is to completely change the current computing model in PCs, smartphones, and servers. Some short-term answers can ease the bottlenecks of the von Neumann model, including Optane, Intel's new form of super-fast memory and storage. D-Wave recently released a 2,000-qubit quantum computer based on quantum annealing, while IBM has a 5-qubit quantum computer accessible via the cloud.
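The qubit counts in those two machines aren't directly comparable (D-Wave's annealer is a different kind of device from IBM's gate-model machine), but the arithmetic behind why qubit count matters is the same: an n-qubit system spans 2**n basis states.

```python
# Background arithmetic for the qubit counts above: an n-qubit register
# spans 2**n basis states, which is where quantum computing's appeal lies.
# (Note: D-Wave's annealer and IBM's gate-model machine are different
# architectures, so their qubit counts are not directly comparable.)

def state_space(qubits):
    """Number of basis states an n-qubit system spans."""
    return 2 ** qubits

ibm_states = state_space(5)        # IBM's cloud machine: 2**5 = 32 states
dwave_states = state_space(2000)   # 2**2000: astronomically large
```

That exponential scaling is the post-von Neumann payoff chipmakers are chasing: state space that grows far faster than transistor counts ever did.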