computing


This cyberwar just got real (DW, 24.05.2018)

#artificialintelligence

Cyberwar may not feel like "real" war -- the kind we've known and loathed for eons, and the very same we perversely reenact in video games. But some military and legal experts say cyberwar is as real as it gets. David Petraeus, a retired US general and (some say disgraced) former intelligence chief, says the internet has created an entirely distinct domain of warfare, one he calls "netwar." And that's the kind being waged by terrorists. Then there's another kind, and technically any hacker with enough computer skills can wage it -- whatever the motivation.


Innovations in Artificial intelligence, Machine Learning, Cloud, and Blockchain

#artificialintelligence

This edition of ITCC TOE provides a snapshot of emerging ICT-led innovations in machine learning, blockchain, cloud computing, and artificial intelligence. This issue focuses on the application of information and communication technologies to alleviating challenges faced across industry sectors in areas such as brick-and-mortar retail, e-commerce, data labelling, 5G, photo and video editing, manufacturing, and talent and business intelligence, among others. The mission of the ITCC TechVision Opportunity Engine (TOE) is to investigate emerging wireless communication and computing technology areas, including 3G, 4G, Wi-Fi, Bluetooth, Big Data, cloud computing, augmented reality, virtual reality, artificial intelligence, virtualization, and the Internet of Things and their new applications; unearth new products and service offerings; highlight trends in the wireless networking, data management, and computing spaces; provide updates on technology funding; evaluate intellectual property; follow technology transfer and solution deployment/integration; track the development of standards and software; and report on legislative and policy issues, among other topics. The Information & Communication Technology cluster provides global industry analysis, technology competitive analysis, and insights into game-changing technologies in the wireless communication and computing space. Innovations in ICT have deeply permeated various applications and markets.


AI Technology Revolution Is Just Getting Started

#artificialintelligence

That should be very good for the companies that are the arms merchants in AI technology, particularly chip companies like Micron Technology (ticker: MU) and Xilinx (XLNX). A new form of computing is emerging, and it demands new chips. The change is every bit as profound as the rise of micro-computing in the 1970s that made Intel a king of microprocessors. It makes Micron and Xilinx more important, but it will probably also lead to future chip stars that aren't public now or may not even have been founded yet. Barron's first explored the new AI in an October 2015 cover story, "Watch Out Intel, Here Comes Facebook."


Highway to The Future: Artificial Intelligence for Smart Vehicles

#artificialintelligence

John Ludwig is an electrical engineer and the president of Xevo's Artificial Intelligence (AI) Group. Xevo is a tier-one OEM software company, located in Seattle, that manages automotive software for driver assistance, engagement, and in-vehicle entertainment. Its main product is Xevo Market, a merchant-to-driver commerce platform that lets drivers make purchases and complete transactions through the vehicle's infotainment screen. Xevo Market launched at the end of 2017 and is already available in millions of vehicles. Prior to working with Xevo, Ludwig was a software manager at Microsoft, overseeing operating systems and online service projects.


How AI Impacts Memory Systems

#artificialintelligence

Throughout the 1980s and early 1990s, computer systems were bottlenecked by relatively slow CPU performance, which limited what applications could do. Driven by Moore's Law, transistor counts increased significantly over the years, improving system performance and enabling exciting new computing possibilities. Although computing capabilities have advanced significantly in recent years, the bottlenecks have shifted to other parts of the computing system. Put simply, while Moore's Law has addressed processing needs and enabled new computing paradigms, there is now a new set of challenges for the industry to address. Evolving devices and computing models: the period from 1990 to 2000 was characterized by centralized computing that revolved around desktops and workstations.
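
One common way to see why the bottleneck has moved from the processor to memory is the roofline model: a workload's attainable throughput is capped either by peak compute or by memory bandwidth multiplied by its arithmetic intensity. The short sketch below uses assumed, illustrative hardware numbers (not figures from the article) to show that low-intensity workloads are limited by the memory system, not the CPU.

```python
# Minimal roofline-model sketch (illustrative numbers, not from the article).
# Attainable FLOP/s = min(peak compute, arithmetic intensity * memory bandwidth).

PEAK_FLOPS = 10e12   # assumed peak compute: 10 TFLOP/s
PEAK_BW = 500e9      # assumed memory bandwidth: 500 GB/s

def attainable_flops(intensity_flops_per_byte: float) -> float:
    """Roofline bound for a kernel with the given arithmetic intensity."""
    return min(PEAK_FLOPS, intensity_flops_per_byte * PEAK_BW)

for name, intensity in [("streaming vector add", 0.25),
                        ("small matrix multiply", 4.0),
                        ("large matrix multiply", 64.0)]:
    bound = attainable_flops(intensity)
    limit = "memory-bound" if bound < PEAK_FLOPS else "compute-bound"
    print(f"{name:22s} intensity={intensity:6.2f} FLOP/B -> "
          f"{bound/1e12:5.2f} TFLOP/s ({limit})")
```

With these assumed numbers, the streaming workloads never come close to peak compute, which is exactly the shift the article describes.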


Qualcomm will gain more than its rivals do, as artificial intelligence grows at the 'edge'

#artificialintelligence

The artificial intelligence market, though still in its infancy, is expected to be the primary driver for many technology companies in the next decade. Shifts in how artificial intelligence (AI) is applied to computing, and in how consumers interact with it, are moving the emphasis from the server room to the devices in our pockets, a change that could benefit mobile-first players like Qualcomm (QCOM). In the current vision of AI, computing is associated with the cloud: large clusters of servers computing constantly in data centers. This model is led by Nvidia (NVDA), a company that CEO Jensen Huang has deftly maneuvered into the pole position for large-scale machine learning. There are fast followers to watch, though, including Intel (INTC), a stalwart in the data-center space, and Google (GOOG), which is building its own chips for AI processing. This need for server-based artificial intelligence won't go away, because the servers are responsible for training the complex models and data sets required before AI can be applied on consumer and commercial devices.
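
The division of labor the article describes, heavy training in the data center and lightweight inference on the device, can be sketched in a few lines. The toy example below is an assumption for illustration (it is not Qualcomm's or Nvidia's actual stack): the expensive gradient-descent loop stands in for the server-side training step, and only the learned weights are shipped to a small inference function of the kind a phone-class chip could run.

```python
import numpy as np

# --- "Data center" step: train a toy linear model with gradient descent ---
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))                 # training features
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=1000)

w = np.zeros(3)
for _ in range(500):                           # the expensive part stays on servers
    grad = 2 * X.T @ (X @ w - y) / len(y)
    w -= 0.1 * grad

# --- "Edge" step: only the learned weights are deployed to the device ---
def edge_predict(features, weights=w):
    """Cheap inference a mobile device could run locally."""
    return features @ weights

print("learned weights:", np.round(w, 3))
print("on-device prediction:", edge_predict(np.array([1.0, 2.0, -1.0])))
```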


Evolutionary computation will drive the future of creative AI

#artificialintelligence

AI is arguably the biggest tech topic of 2018. From Google Duplex's human imitations and Spotify's song recommendations to Uber's self-driving cars and the Pentagon's use of Google AI, the technology seems to offer everything to everyone. You could say AI has become synonymous with progress via computing. However, not all AI is created equal, and for AI to fulfill its many promises, it needs to be creative. Let's start by addressing what I mean by "creative."
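
For readers unfamiliar with the term, evolutionary computation searches for solutions the way natural selection does: generate variations at random, keep the fittest, and repeat. The sketch below is a deliberately small toy (the target string and fitness function are assumptions for illustration, not the systems discussed in the article) showing a (1+λ)-style loop of mutation and selection.

```python
import random

# Toy evolutionary-computation sketch: evolve a random string toward a target.
TARGET = "creative artificial intelligence"
ALPHABET = "abcdefghijklmnopqrstuvwxyz "

def fitness(candidate: str) -> int:
    """Number of characters that already match the target."""
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(candidate: str, rate: float = 0.05) -> str:
    """Randomly change each character with probability `rate`."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in candidate)

parent = "".join(random.choice(ALPHABET) for _ in TARGET)
generation = 0
while fitness(parent) < len(TARGET):
    generation += 1
    # (1+lambda) selection: keep the best of the parent and its mutated offspring.
    offspring = [mutate(parent) for _ in range(50)]
    parent = max([parent] + offspring, key=fitness)

print(f"reached '{parent}' after {generation} generations")
```

Real creative applications replace the string with designs, programs, or neural-network architectures, and the fitness function with whatever quality the system is meant to maximize, but the generate-evaluate-select loop is the same.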


Digital transformation and the CIO: Everything you need to know today

ZDNet

Although the concept of digital transformation is not new, many IT organizations still struggle with the impact these changes will have on them. ZDNet and TechRepublic look at the dramatic effect of AI, big data, cloud computing, and automation on IT jobs, and how companies can adapt. This week, I hosted a CIO Summit at the Digital Enterprise Show in Madrid. Digital transformation is a broad umbrella term that covers all these shifts, so it is worth taking a deeper dive. Oracle interviewed me on this topic, so I have edited the responses and included them here.


How is artificial intelligence changing science?

#artificialintelligence

Intel's Gadi Singer believes his most important challenge is his latest: using artificial intelligence (AI) to reshape scientific exploration. In a Q&A timed with the first Intel AI DevCon event, the Intel vice president and architecture general manager for its Artificial Intelligence Products Group discussed his role at the intersection of science--computing's most demanding customer--and AI, how scientists should approach AI, and why it is the most dynamic and exciting opportunity he has faced. How is AI changing science? Scientific exploration is going through a transition that, in the last 100 years, might only be compared to the shift toward data and large data systems in the '50s and '60s. By the '60s, the amount of data being gathered was so large that the frontrunners were not those with the finest instruments, but rather those best able to analyze the data gathered in any scientific area, whether climate, seismology, biology, pharmaceuticals, the exploration of new medicine, and so on.


Chances Are You Don't Have a Quantum Computer, but IBM Will Let You Use One

#artificialintelligence

There are many simulation and optimization problems that are difficult or impossible to solve with your existing computing resources. You do not have a quantum computer, which might be able to solve them, and you do not expect your company to get one soon. You are not alone, but don't worry: IBM will let you use its quantum computing resources to make a start on formulating your solutions. For years, quantum computing was little more than an idea that fascinated computer scientists. Now it is offering direct utility to researchers and engineers, even before the promise of a large-scale universal quantum computer is fulfilled.
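
One way to make that start locally is to describe a circuit in Qiskit, IBM's open-source quantum SDK, before ever submitting anything to real hardware. The minimal sketch below only builds and displays a two-qubit Bell-pair circuit; actually running it on IBM's cloud-hosted machines requires an IBM Quantum account and a chosen backend, and that submission API has changed across Qiskit releases, so it is deliberately left out here.

```python
from qiskit import QuantumCircuit

# A two-qubit Bell-pair circuit: the "hello world" of quantum programs.
qc = QuantumCircuit(2, 2)
qc.h(0)                      # put qubit 0 into superposition
qc.cx(0, 1)                  # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])   # read both qubits out

print(qc.draw())             # text diagram; run it on an IBM backend or a local
                             # simulator to sample correlated '00' and '11' outcomes
```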