Information Technology

Robotics in business: Everything humans need to know


One kind of robot has endured for the last half-century: the hulking one-armed Goliaths that dominate industrial assembly lines. These industrial robots have been task-specific -- built to spot weld, say, or add threads to the end of a pipe. They aren't sexy, but in the latter half of the 20th century they transformed industrial manufacturing and, with it, the low- and medium-skilled labor landscape in much of the U.S., Asia, and Europe. You've probably been hearing a lot more about robots and robotics over the last couple of years. That's because for the first time since the 1961 debut of GM's Unimate, regarded as the first industrial robot, the field is once again transforming world economies. Only this time the impact is going to be broader.

The Industrial Era Ended, and So Will the Digital Era


In a famous scene in the 1967 movie The Graduate, a family friend takes aside Dustin Hoffman's character, Benjamin Braddock, and whispers in a conspiratorial tone, "Plastics… There's a great future in plastics." It seems quaint today, but back then plastics really were new and exciting. If the movie had been set in another age, the advice to young Braddock would have been different. He might have been counseled to go into railroads or electronics or simply to "Go West, young man!" Every age has things that seem novel and wonderful at the time, but tepid and banal to future generations. Today digital technology is all the rage because after decades of development it has become incredibly useful.

Building AI: Key Steps For Adoption And Scaling Up

Forbes Technology

And yet according to 313 executives recently surveyed by Forbes Insights--63% of whom were in the C-suite--almost all (95%) believe that AI will play an important role in their responsibilities in the near future. The majority of CEOs today are not drivers of AI adoption--that responsibility falls on C-level technology leaders, who need to build a strong business case and show results that encourage a deeper dive into change. With that firmly in mind, Forbes Insights and Intel have taken their combined experience covering and developing technology to produce this introductory guide to AI adoption, from buy-in and deployment to building a corporate culture around data. Consider the three steps below your beginner's guide to AI. It's important to see beyond the swirl of hype and expectations around AI technologies and view them for what they really are--massive accelerators of processes and insights and profound amplifiers of human capability.

Intel AIVoice: Stepping Out Of Science Fiction: A History Of Intel Powering AI

Forbes Technology

A patent awarded on April 25, 1961, recognizes Robert Noyce as the inventor of the silicon integrated circuit (IC). Integrated circuits forever changed how computers were made while adding power to a process of another kind: the growth of a then-nascent field called artificial intelligence (AI). And the potential of Noyce's invention truly took flight when he and Gordon Moore founded Intel on July 18, 1968. Fifty years later, the "eternal spring" of artificial intelligence is in full swing. To understand how we arrived here, the truth in a nutshell is this: the rise of artificial intelligence is intertwined with the history of faster, more robust microprocessors.

3 Types Of Machine Learning Systems - Coffee with CIS - Latest News & Articles


Developers know a great deal about the machine learning (ML) systems they build and manage; that is a given. But there is also a demand for non-developers to have a high-level understanding of the kinds of systems in use. Classically, the two important classes were expert systems and artificial neural networks. With advances in computing power, software capability, and algorithmic complexity, analytical algorithms can be said to have combined the two. This article is a summary of the three different types of systems.
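The classical split described above can be illustrated with a toy sketch (not from the article; the scenario and all names are illustrative): an expert system encodes a human-written rule directly, while a neural-network-style model learns its decision weights from labelled examples.

```python
# Illustrative contrast between the two classical classes of ML system.
# The yes/no decision here (a simple AND-style rule) is a made-up example.

def expert_system(temp_f, humidity):
    """Expert system: a human expert hard-codes the decision rule."""
    return temp_f > 80 and humidity > 0.7

def predict(w, b, x1, x2):
    """Threshold unit: fire (1) when the weighted sum exceeds zero."""
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Neural-network style: the decision rule is learned from examples."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(samples, labels):
            err = y - predict(w, b, x1, x2)  # 0 when correct, +/-1 otherwise
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b
```

Trained on the four input pairs of a logical AND, the perceptron converges to the same behaviour the expert system hand-codes, which is one reason modern systems blur the line between the two classes.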

Many fear Artificial Intelligence as the road to robots taking over the world -- are they right?

FOX News

Disruptive technologies such as artificial intelligence (AI) and encryption hold the promise of solving some of the world's most pressing issues. Future innovations relying on their use are endless: creating remote health care for the elderly and people with disabilities, for instance, or protecting our privacy and creating smart cities that can reduce waste and ease congestion. But some people look at these innovations with a deep sense of fear, envisioning a future where robots take over our jobs and eventually eclipse us. It's an understandable fear – and one that's long been popularized by the movies and the media. It's even become a polarizing battle within the tech industry itself, with Elon Musk warning about the possible misuse and militarization of AI, while tech execs, including Google's Eric Schmidt and Facebook's Mark Zuckerberg, call Musk's views misleading and alarmist.

Experts debate moral issues of artificial intelligence at ESOF 2018 Computing


A professor of ethics and technology has told scientists and policymakers that digital technologies like artificial intelligence "desperately" need to be regulated through an institutional framework and system of values. Van den Hoven, a member of the European Group on Ethics in Science and New Technologies (EGE), said: "It is presented to us mainly by big corporations who want to make some profit." He added: "We need to think about governance, inspection, monitoring, testing, certification, classification, standardisation, education, all of these things. We need to desperately, and very quickly, help ourselves to it." He also spoke about the need for a cross-Europe network of institutions that could provide a set of values, based on the EU's Charter of Fundamental Rights, which the technology industry could use to inform future work on AI.

French tech firm Atos launches AI suite to accelerate app development


French technology consulting firm Atos has launched an artificial intelligence (AI) software suite for businesses worldwide, which aims to make it simpler for teams to build AI-based applications using a combination of intellectual properties, a report said. Called Atos Codex AI Suite, the software package also aims to make it easier for teams of developers and data scientists to collaborate on the development and training of AI models, The Economic Times reported. The report added that, with the help of the suite, apps can be deployed and relocated across multiple environments, such as public cloud, on-premises, or edge computing. "Atos Codex AI Suite tackles new enterprise, scientific and industry challenges, such as precision medicine, advanced prescriptive maintenance and prescriptive security, with a new generation of cognitive applications," said Arnaud Bertrand, senior vice-president, strategy and innovation BDS (Big Data and cybersecurity), Atos. Atos Codex AI Suite can be purchased either as a standalone software platform or together with a server infrastructure, the company added.

Artificial Intelligence & Automation, winning combo for businesses: View - ET CIO


In the digital economy, AI and automation are creating plenty of opportunities for businesses to reimagine processes, turbocharge performance, and boost productivity like never before. Artificial intelligence (AI) is quickly finding a place at the heart of the enterprise and is set to affect 25 percent of technology spend going forward, according to Accenture. Enabling better-informed decisions by augmenting human intelligence with powerful computing and precise data analysis, and then automating the tasks that follow, AI and automation have the power to energize businesses and help them drive towards success. Their rapid emergence has been driven by three factors. The cloud has made huge amounts of computing and processing power available, on demand.

AI Vision IoT


This section configures the camera's video feed. For the width and height, 1280 x 720 worked great for me, but you can play around with the dimensions to see what fits your needs. I set the framerate to 30; the higher you set the number, the more computing power it requires. You can experiment to find the right benchmark for your setup, but 30 has worked great for me.
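The article's original code listing is not included here, so the settings it describes can only be sketched. A minimal, hypothetical configuration in Python (the names are illustrative, not from the article) might look like this:

```python
# Hypothetical camera configuration for the vision pipeline described above.
# 1280 x 720 at 30 fps matches the values the author reports working well;
# a higher framerate would demand more computing power.
CAMERA_CONFIG = {
    "width": 1280,
    "height": 720,
    "framerate": 30,
}

def validate_config(cfg):
    """Sanity-check the requested camera settings before opening the device."""
    if cfg["width"] <= 0 or cfg["height"] <= 0:
        raise ValueError("frame dimensions must be positive")
    if not 1 <= cfg["framerate"] <= 120:
        raise ValueError("framerate outside a sensible range")
    return cfg
```

With a real camera library, the same values would then be handed to the device, for example as resolution and framerate properties on the capture object.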