Results


The top 5 trends for digital transformation in 2018 - Information Age

@machinelearnbot

There's only one constant in business – and that's that things change. And that change has been accelerating. Businesses have had to adjust to new ways of doing things, most of them related to the digital transformation that business, and the world at large, have undergone in recent years. From artificial intelligence (AI) to blockchain and the Internet of Things (IoT), new digital technologies are having a major impact on business – and that impact will only grow in 2018. Companies can't afford to ignore the trend.


Future of AI: Blockchain and Deep Learning

#artificialintelligence

First point: considering blockchain and deep learning together suggests the emergence of a new class of global network computing system. These systems are self-operating computation graphs that make probabilistic guesses about the state of the world. Second point: blockchain and deep learning are facilitating each other's development. This includes using deep learning algorithms to set fees and detect fraudulent activity, and using blockchains for the secure registry, tracking, and remuneration of deep learning nets as they go out onto the open Internet (in autonomous driving applications, for example). Blockchain peer-to-peer nodes might provide deep learning services just as they already provide transaction hosting and confirmation, news hosting, and banking (payment, credit flow-through) services.
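The fraud-detection use the article mentions can be made concrete with a small sketch. Below, a tiny autoencoder (pure NumPy, synthetic data) learns what "normal" transaction features look like and flags deviations by reconstruction error; the features, data, and network are illustrative assumptions, not anything from the article or a real blockchain.

```python
# Minimal sketch: scoring transactions for fraud with a tiny autoencoder.
# Synthetic data and the network itself are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy transaction features: [amount, fee, inputs, outputs] (normalized).
normal = rng.normal(loc=0.0, scale=1.0, size=(500, 4))

# One-hidden-layer autoencoder trained to reconstruct "normal" traffic.
W1 = rng.normal(scale=0.1, size=(4, 2)); b1 = np.zeros(2)
W2 = rng.normal(scale=0.1, size=(2, 4)); b2 = np.zeros(4)
lr = 0.01

for _ in range(2000):
    h = np.tanh(normal @ W1 + b1)          # encode
    out = h @ W2 + b2                      # decode
    err = out - normal                     # reconstruction error
    # Backprop through the two layers and the tanh.
    gW2 = h.T @ err / len(normal); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = normal.T @ dh / len(normal); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

def anomaly_score(x):
    """Mean squared reconstruction error; high = suspicious."""
    h = np.tanh(x @ W1 + b1)
    return ((h @ W2 + b2 - x) ** 2).mean(axis=-1)

typical = normal[0]
outlier = np.array([8.0, -6.0, 5.0, 7.0])   # far outside the training range
print(anomaly_score(typical), anomaly_score(outlier))
```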


Interview: Current State of Artificial Intelligence for Marketing Professionals

#artificialintelligence

About the Author: Rene Buest is Director of Market Research & Technology Evangelism at Arago. Prior to that he was Senior Analyst and Cloud Practice Lead at Crisp Research, Principal Analyst at New Age Disruption, and a member of the worldwide Gigaom Research Analyst Network. Rene Buest is a top cloud computing analyst in Germany and one of the world's top analysts in this area. In addition, he is one of the world's top cloud computing influencers and belongs to the top 100 cloud computing experts on Twitter and Google. Since the mid-90s he has focused on the strategic use of information technology in business, the impact of IT on our society, and disruptive technologies.


How researchers are using NLP and machine learning to ease your information overload

#artificialintelligence

What if you could create an accurate summary of a lengthy article at the touch of a button? What if you could quickly scroll through a bibliography, filtered to show only the citations relevant to your needs? What if you could get your research out into the world faster, and have that knowledge built upon sooner? Science and technology are generating more data than ever, faster than ever, so it's getting harder and harder to keep up with and manage this information. Therefore, it's crucial to find ways to automate the discovery and interpretation of the information we need – and only that information.
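A minimal sketch of the extractive approach behind such summarizers: score each sentence by the frequency of its words across the document and keep the top-ranked ones. The researchers' actual systems use far more sophisticated NLP models; this toy only illustrates the basic idea.

```python
# Frequency-based extractive summarization: rank sentences by how often
# their words appear in the whole document, keep the best few.
import re
from collections import Counter

def summarize(text, n_sentences=2):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence):
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in tokens) / (len(tokens) or 1)

    # Keep the highest-scoring sentences, in their original order.
    ranked = sorted(sentences, key=score, reverse=True)[:n_sentences]
    return " ".join(s for s in sentences if s in ranked)

article = ("Science and technology are generating more data than ever. "
           "Researchers cannot read everything. "
           "Automatic summarization condenses long articles so researchers "
           "can find the information they need.")
print(summarize(article))
```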


If AI has the power to transform all industries, why hasn't it?

#artificialintelligence

Autonomous driving requires computer vision, real-time predictive analytics, scores of sensors, routing algorithms, and more. These demands drive significant advancements in the core science and technologies that make AI work. Even worse, the high level of investment required in terms of hiring, cost, time, and risk scares off a majority of companies from even starting. Chasing the big-win AI project magnifies the cost and complexity of hiring and execution, minimizes the number of projects companies can expect to deliver, and maximizes the risk of disillusionment.


"Holy grail" microchip might surpass the power of the human brain

#artificialintelligence

Scientists at the Universities of Oxford and Exeter have made a landmark breakthrough in the quest for the "holy grail" of computing: human-brain-mimicking microchips able to store and process information on a par with Homo sapiens, according to a new paper in Science Advances. The research team developed photonic computer chips that use light instead of electricity to simulate the operation of brain synapses. Professor Harish Bhaskaran of Oxford, who led the team, said: "The development of computers that work more like the human brain has been a holy grail of scientists for decades. Via a network of neurons and synapses, the brain can process and store vast amounts of information simultaneously, using only a few tens of watts of power."
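As a purely illustrative software analogue of what the chip emulates, the toy below models a synapse as a weighted connection whose strength grows with use (a simple Hebbian rule). The real device stores that weight optically in phase-change material rather than in a Python float; everything here is an assumption for illustration.

```python
# Toy synapse: a weighted connection whose strength changes with use.
import numpy as np

class Synapse:
    def __init__(self, weight=0.5, rate=0.05):
        self.weight = weight    # connection strength (the stored state)
        self.rate = rate        # plasticity / learning rate

    def transmit(self, spike):
        """Scale an incoming spike by the synaptic weight."""
        return self.weight * spike

    def plasticity(self, pre, post):
        """Simple Hebbian update: co-active neurons strengthen the link."""
        self.weight = float(np.clip(self.weight + self.rate * pre * post,
                                    0.0, 1.0))

syn = Synapse()
for _ in range(10):
    out = syn.transmit(1.0)
    syn.plasticity(pre=1.0, post=out)   # repeated use strengthens the synapse
print(round(syn.weight, 3))
```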


Top 10 Trends For Digital Transformation In 2018

@machinelearnbot

Just when many companies are finally beginning to move toward cloud computing, edge computing – driven by the sheer volume and speed of information produced by the IoT – is jumping to the forefront of the business scene. As smart drones, autonomous vehicles, and other AI-powered smart devices seek to connect and communicate instantly via the IoT, the matter of sending data "all the way" to the cloud will become highly impractical. Many of these devices will need real-time response and processing, making edge computing the only viable option. Companies will continue to use AI to surprise, connect, and communicate with their customers in ways they may not even appreciate or realize.
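The real-time argument can be seen in miniature below: a device processes its own sensor stream locally and contacts the cloud only when something noteworthy happens. The window size, threshold, and send_to_cloud() placeholder are assumptions for illustration, not part of the article.

```python
# Edge-computing pattern: analyze readings on-device, forward only events.
import random
import statistics

WINDOW = 50                      # local buffer of recent readings
THRESHOLD = 3.0                  # z-score above which we alert the cloud

def send_to_cloud(event):        # placeholder for a real uplink call
    print("uplink:", event)

buffer = []
for t in range(1000):
    reading = random.gauss(20.0, 0.5)
    if t == 500:
        reading += 10.0          # inject a fault to trigger an alert
    buffer.append(reading)
    buffer = buffer[-WINDOW:]
    if len(buffer) == WINDOW:
        mean = statistics.fmean(buffer)
        stdev = statistics.stdev(buffer)
        if stdev and abs(reading - mean) / stdev > THRESHOLD:
            send_to_cloud({"t": t, "value": round(reading, 2)})
```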


What It Will Take for Quantum Computers to Turbocharge Machine Learning

#artificialintelligence

"Classical machine learning methods such as deep neural networks frequently have the feature that they can both recognize statistical patterns in data and produce data that possess the same statistical patterns: they recognize the patterns that they produce," they write. If small quantum information processors can produce statistical patterns that are computationally difficult for a classical computer to produce, then perhaps they can also recognize patterns that are equally difficult to recognize classically." At present, the authors say very little is known about how many gates--or operations--a quantum machine learning algorithm will require to solve a given problem when operated on real-world devices. The authors say this is probably the most promising near-term application for quantum machine learning and has the added benefit that any insights can be fed back into the design of better hardware.


Computing Is a Profession

Communications of the ACM

The compounding of this continued and accelerating advance gives rise to deep technical expertise. While deep technical challenges abound, the ethical challenges, principles, and standards are even more daunting. Second, societies develop and advocate principles for ethical technical conduct that frame the role of computing professionals, and buttress them with the stature and role of the profession in society. Necessarily so, as technical knowledge and professional ethics must inform professional conduct, and inevitably come into conflict with personal interest, corporate interest, government or national interest, or even overt coercion.


The impact of the Internet of Things (IoT) - Information Age

@machinelearnbot

This digital computer, adapted for the control of manufacturing processes for General Motors, provided a means to generate and transmit digital information, so that hardware devices could digitally communicate with other interfaces and no longer had to work in isolation. Traditionally this had always required some kind of central computer to hold a rule set and act as a command and control server. In the oil and gas industry, IoT sensors have transformed efficiencies around the complex process of natural resource extraction by monitoring the health and efficiency of hard-to-access equipment installations in remote areas with limited connectivity. By embracing near-edge processing technology instead of the cloud, the resource industry can now process a significant amount of the data generated by its sensors on small, low-power computers close to the physical location of the sensors themselves.
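A minimal sketch of that near-edge pattern: rather than shipping every raw sample over a constrained link, the small on-site computer reduces each window of readings to a compact summary. The sampling rate, window size, and field names are illustrative assumptions.

```python
# Near-edge aggregation: condense a window of raw readings into one record.
import random

SAMPLES_PER_WINDOW = 600   # e.g. 10 minutes of 1 Hz pressure readings

def summarize_window(samples):
    return {
        "min": min(samples),
        "max": max(samples),
        "mean": sum(samples) / len(samples),
        "n": len(samples),
    }

raw = [random.gauss(150.0, 3.0) for _ in range(SAMPLES_PER_WINDOW)]
summary = summarize_window(raw)

print(summary)
print(f"transmitted 1 record instead of {len(raw)} raw samples")
```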