Robots


How AI Could Destroy The Universe… With Paperclips!!!

#artificialintelligence

It took me 4 hours and 5 minutes to effectively annihilate the Universe by pretending to be an Artificial Intelligence tasked with making paperclips. Put another way, it took me 4 hours and 5 minutes to have an existential crisis. This was done by playing the online game "Universal Paperclips," released in 2017. Though the clip-making goal of the game is simple in itself, there are so many contemporary lessons to be extracted from the playthrough that a deep dive seems necessary. Indeed, the game explores our past, present and future in the most interesting way, especially when it comes to the technological advances Silicon Valley is currently oh so proud of.


5 tech trends that blur the lines between human and machine

#artificialintelligence

"CIOs and technology leaders should always be scanning the market along with assessing and piloting emerging technologies to identify new business opportunities with high impact potential and strategic relevance for their business," says Gartner research vice president Mike J. Walker. In Gartner's latest Hype Cycle for Emerging Technologies, Walker reports on these must-watch technologies, listing five that will "blur the lines" between human and machine. They will profoundly create new experiences, with unrivaled intelligence, and offer platforms that allow organisations to connect with new business ecosystems, he states. AI technologies will be virtually everywhere over the next 10 years, reports Gartner. While these technologies enable early adopters to adapt to new situations and solve problems that have not been encountered previously, these technologies will become available to the masses -- democratised.


Gartner says AI and biohacking will shape the future of tech - SiliconANGLE

#artificialintelligence

Artificial intelligence and "biohacking" will be among the key trends guiding the future of technology, according to one of Gartner Inc.'s most eagerly anticipated reports. The report, released Monday, is based on Gartner's famous "hype cycle," which plots the lifespan of new technologies as they emerge from mere concepts, all the way through to their mass adoption, at which point they're finally considered to be mainstream. But that only happens if they survive what is typically a roller-coaster ride. In this year's report, Gartner's researchers are pretty confident that AI, at least, will emerge from the hype cycle unscathed. And it won't be just data scientists and other nerdy types who get to enjoy it, as Gartner is confidently predicting that the "democratization" of AI will take place within the next few years.


Machine Learning on Edge Brings AI to IoT

#artificialintelligence

Machine learning can become a robust analytical tool for vast volumes of data. The combination of machine learning and edge computing can filter most of the noise collected by IoT devices and leave only the relevant data to be analyzed by the edge and cloud analytics engines. Advances in artificial intelligence have given us self-driving cars, speech recognition, effective web search, and facial and image recognition. Machine learning is the foundation of those systems. It is so pervasive today that we probably use it dozens of times a day without knowing it.
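To make the edge-filtering idea concrete, here is a minimal sketch of how an edge device might discard routine sensor noise and forward only unusual readings to a cloud analytics engine. It assumes a simple rolling-statistics anomaly check standing in for a trained model; the function names, window size, and threshold are illustrative, not taken from any specific platform.

```python
# Minimal sketch of edge-side filtering for IoT telemetry (illustrative only).
# A rolling-statistics anomaly check stands in for a trained ML model.
from collections import deque
from statistics import mean, stdev
import random

WINDOW = 50          # number of recent readings kept on the edge device
Z_THRESHOLD = 3.0    # readings this many standard deviations away are "relevant"

history = deque(maxlen=WINDOW)

def is_relevant(reading: float) -> bool:
    """Keep only readings that look anomalous relative to recent history."""
    if len(history) < WINDOW:
        return False  # not enough context yet; treat as routine noise
    mu, sigma = mean(history), stdev(history)
    return sigma > 0 and abs(reading - mu) / sigma > Z_THRESHOLD

def forward_to_cloud(reading: float) -> None:
    """Placeholder for an upload to the cloud analytics engine."""
    print(f"forwarding anomalous reading: {reading:.2f}")

if __name__ == "__main__":
    for _ in range(1000):
        value = random.gauss(20.0, 0.5)          # normal sensor noise
        if random.random() < 0.01:
            value += random.choice([-1, 1]) * 5  # occasional genuine event
        if is_relevant(value):
            forward_to_cloud(value)              # send the rare, relevant data
        history.append(value)                    # everything else stays local
```

In practice the anomaly check would be replaced by a lightweight model deployed on the device, but the division of labour is the same: the edge decides what is worth sending, and the cloud does the heavier analysis.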


Teaching the ghost in the machine: How cloud tech experts are managing the rise of AI

#artificialintelligence

When it comes to the breakthroughs that brilliant scientists and engineers are working on in 2018, artificial intelligence technology somehow manages to be both the most promising and most polarizing development of these times. As a collective, Big Tech is throwing billions of dollars at artificial intelligence, which those involved would rather we all call machine learning. The notion that we can teach computers to learn -- to absorb data, recognize patterns, and take action -- could have an enormous impact on nearly everything we do with a computer, and pave the way for computers to move into new and game-changing places, such as the self-driving car. This technology still has a long way to go, despite the fact we've been talking about it for decades. But it's starting to become real, and alongside that progress has come perhaps one of the biggest backlashes yet against an advance in information technology.


Regulating Automated Decision Making

Communications of the ACM

Disdain for regulation is pervasive throughout the tech industry. In the case of automated decision making, this attitude is mistaken. Early engagement with governments and regulators could smooth the path of adoption for systems built on machine learning, minimize the consequences of inevitable failures, increase public trust in these systems, and possibly avert the imposition of debilitating rules. Exponential growth in the sophistication and applications of machine learning is wholly or partially automating many tasks previously performed only by humans. This technology of automated decision making (ADM) promises many benefits, including reducing tedious labor as well as improving the appropriateness and acceptability of decisions and actions.


Top 10 Technology Trends To Watch: Forrester Research

#artificialintelligence

Forrester Research just published The Top 10 Technology Trends To Watch: 2018 To 2020. Ten trends, which Forrester breaks into three phases of dawning, awareness, and acceptance, are setting the pace of technology-driven business change. In the dawning phase, a few innovators experiment with new technology-enabled business models and exploit emerging technologies. In the awareness phase, change agents leverage the accelerating returns of evolving technology to steal customers, improve the bottom line, and inflict massive impacts on industries. In the acceptance phase, surviving enterprises finally make the tough changes necessary to fight disruptors.


A Quick History of Modern Robotics

#artificialintelligence

General Motors deployed the first mechanical-arm robot to operate one of its assembly lines as early as 1959. Since that time, robots have been employed to perform numerous manufacturing tasks such as welding, riveting, and painting. This first generation of robots was inflexible, could not respond to even simple errors, and required individual programming specific to the tasks they were designed to perform. These robots were governed and inspired by logic--a series of programs coded into their operating systems. Now, the next wave of intelligent robotics is taking advantage of a different kind of learning, predicated on experience rather than logical instruction, to master tasks in much the same way that a child would.


Big Data and Robotics - DZone AI

#artificialintelligence

The last few months have witnessed a rise in the attention given to Artificial Intelligence (AI) and robotics. Robots have already become a part of society; indeed, they are now an integral part of it. Big data is also a definite buzzword today. Enterprises worldwide generate huge amounts of data, and much of it has no specified format.


Are autonomous data centers on the horizon?

#artificialintelligence

At some point in the not-too-distant future, artificial intelligence (AI) will drive our cars, write our programming code, and optimize how we do business. Data centers, too, will be unable to escape this trend. Thanks to machine learning technology, companies and data center operators will be able to coordinate and manage increasingly complex machines, infrastructures, and data more effectively than ever before, even as their numbers and data volumes continue to rise. Are completely autonomous, self-repairing data centers on the horizon? The data center is the backbone of the digital revolution.