The fully programmable Nao robot has been used to experiment with machine ethics. In his 1942 short story 'Runaround', science-fiction writer Isaac Asimov introduced the Three Laws of Robotics -- engineering safeguards and built-in ethical principles that he would go on to use in dozens of stories and novels. They were: 1) A robot may not injure a human being or, through inaction, allow a human being to come to harm; 2) A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law; and 3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws. Fittingly, 'Runaround' is set in 2015. Real-life roboticists are citing Asimov's laws a lot these days: their creations are becoming autonomous enough to need that kind of guidance.
ERP systems need to lose their cumbersome heritage and open up to third-party applications, in order to help businesses benefit from technological innovations more quickly. Artificial intelligence (AI) will have a significant impact on companies and their business models over the next five years--85 percent of CEOs surveyed in PwC's 22nd Annual Global CEO Survey are convinced of this. But with only 33 percent having dipped their toe into AI for 'limited uses', and fewer than one in ten using it on a wide scale, the range of applications has been limited so far. However, this is soon set to change. Although the use of AI remains a distant dream for many businesses, the current maturity of intelligent technologies and the expectations placed on enterprise resource planning (ERP) systems in particular--that they support innovation--have fundamentally changed business demands.
Data-driven experiences are rich, immersive and immediate. Think pizza delivery by drone, video cameras that can record traffic accidents at an intersection, freight trucks that can identify a potential system failure. These kinds of fast-acting activities need lots of data -- quickly -- so they can't tolerate the latency incurred as data travels to and from the cloud. That to-and-fro simply takes too long.
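The round-trip argument can be made concrete with a back-of-the-envelope sketch. All latency figures and the decision deadline below are illustrative assumptions, not measurements:

```python
# Illustrative edge-vs-cloud latency budget for a real-time control loop.
# Every number here is an assumed, round-figure example.

def total_latency_ms(network_rtt_ms: float, inference_ms: float) -> float:
    """Total time from sensor reading to actuation decision."""
    return network_rtt_ms + inference_ms

# A vehicle reacting at highway speed might budget roughly 100 ms per
# decision (assumed figure for illustration).
DEADLINE_MS = 100.0

# Round trip to a distant data center, plus a fast cloud-hosted model.
cloud = total_latency_ms(network_rtt_ms=150.0, inference_ms=20.0)
# On-device model: no network hop, but a slower, smaller model.
edge = total_latency_ms(network_rtt_ms=0.0, inference_ms=35.0)

print(f"cloud: {cloud} ms, meets deadline: {cloud <= DEADLINE_MS}")
print(f"edge:  {edge} ms, meets deadline: {edge <= DEADLINE_MS}")
```

The point is structural rather than numerical: the network hop alone can consume the whole decision budget, no matter how fast the cloud-side model is.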
Over 72.5 million connected car units are estimated to be sold by 2023, enabling nearly 70% of all passenger vehicles to actively exchange data with external sources. The volume of data these smart vehicles produce will overwhelm traditional data processing solutions, and the latency involved in gathering and analyzing it can lead to potential life-or-death scenarios, according to Ramya Ravichandar from Foghorn. We speak with Ravichandar about how connected car manufacturers are implementing edge AI solutions for real-time video recognition, multi-factor authentication, and other innovative capabilities to decrease network latency and optimize data gathering, analysis, and security.
Digital Journal: What are the current trends with autonomous and connected cars?
Ramya Ravichandar: Automotive companies are looking to improve real-time functionalities and accelerate autonomous operations of passenger vehicles.
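One common edge-side tactic for optimizing data gathering is to filter the stream on-device and forward only anomalous readings upstream, so that the bulk of routine telemetry never touches the network. A minimal sketch of that idea, with invented window sizes, thresholds, and class names (this is not Foghorn's actual API):

```python
# Hypothetical edge gateway filter: keep a rolling baseline of a sensor
# stream and forward only readings that deviate sharply from it.

from collections import deque

class EdgeFilter:
    def __init__(self, window: int = 20, threshold: float = 3.0):
        self.history = deque(maxlen=window)  # recent readings only
        self.threshold = threshold           # allowed deviation from baseline

    def should_forward(self, value: float) -> bool:
        """Forward only readings far from the recent rolling mean."""
        if len(self.history) < self.history.maxlen:
            self.history.append(value)
            return False  # still warming up the baseline
        mean = sum(self.history) / len(self.history)
        is_anomaly = abs(value - mean) > self.threshold
        self.history.append(value)
        return is_anomaly

# Example: a mostly-steady stream with one spike; only the spike is sent.
f = EdgeFilter(window=5, threshold=2.0)
readings = [10.0, 10.1, 9.9, 10.0, 10.2, 10.1, 17.5, 10.0]
forwarded = [r for r in readings if f.should_forward(r)]
print(forwarded)  # only the 17.5 spike crosses the threshold
```

In this toy run, one reading out of eight is forwarded, which is the bandwidth-reduction effect edge processing aims for; a production system would add batching, retries, and periodic heartbeat uploads.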
Phantom AI has raised $22 million in a Series A funding round. Celeres Investments led the round; other participants include Ford Motor Company and KT (Korea's largest telco). The company intends to use the latest funds to speed up product development and scale its operations in Europe and Asia. Co-founder and CEO of Phantom AI, Hyunggi Cho, said: "We founded Phantom AI to fundamentally change the economics of ADAS by developing modern software-based solutions that are high performing, cost effective, and infinitely flexible and customizable. To the automakers frustrated with the lack of options in computer vision technologies--Phantom AI is here to help. We are thrilled to bring our AI-based perception technology, including computer vision, sensor fusion and control capabilities, to market, and to have the support of our new investors to help us accelerate production globally."
"The future is already being automated, and it's enabled by AI." Uber, whose AI is so central to its business model that employees "…don't even think about it anymore," is betting big on self-driving cars driving down costs. With AI as a core driver of competitiveness, it stands to reason that if artificial intelligence is smart enough to drive a car, it can surely help the shop owner who doubles as his shop's sole mechanic. Our previous entry explored how AI will impact the manufacturing and distribution of auto parts, but what about the businesses that purchase and use them on a daily basis? For service centers doing everything they can to move jobs out of the bays and customers through their doors, activities that add value or increase average ticket prices can fall by the wayside. "Advances in computing power will give machines abilities once reserved for humans--the ability to understand and organize unstructured data such as photos and speech, to recognize patterns, and to learn from past experiences how to improve future performance."
New-age digital transformation journeys hinge on AI capabilities. While most businesses automatically read AI as Artificial Intelligence, there are two more AI scenarios that you should be aware of. In this article, we will describe these three A.I. scenarios, which involve Cognitive Learning and Intelligent Automation. While digital transformation by harnessing AI and Intelligent Automation seems to be the most obvious path to sustaining a business, focusing on just one AI scenario can expose companies to unforeseeable challenges in the future. That's why you should be able to define and distinguish between these three AI scenarios. I always refer to the Venn diagram from Peter Sommer (2017) to distinguish between AI and ML.
AI has grown to become quite the avid topic, especially with the way technology companies are using this intelligence in a variety of applications. But, contrary to popular belief, AI has been around for quite a while now, dating back to the 1980s. Do you remember the KITT car from the popular David Hasselhoff starrer, Knight Rider? The in-built AI in the car was able to have multi-lingual conversations with the driver and access every single nook and crevice of the vehicle. It could drive itself and take charge in tricky situations.
Researchers from Tokyo Metropolitan University have used machine learning to analyze spin models, which are used in physics to study phase transitions. Previous work showed that an image/handwriting classification model could be applied to distinguish states in the simplest models. The team showed the approach is applicable to more complex models and found that an AI trained on one model and applied to another could reveal key similarities between distinct phases in different systems. Machine learning and artificial intelligence (AI) are revolutionizing how we live, work, play, and drive. Self-driving cars, the algorithm that beat a Go grandmaster and advances in finance are just the tip of the iceberg of a wide range of applications now having a significant impact on society.
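The phase-classification idea can be illustrated with a toy sketch: a linear classifier trained to tell "ordered" (low-temperature) spin configurations from "disordered" (high-temperature) ones. The sampling and the hand-picked magnetization feature below are crude stand-ins for the Monte Carlo data and image-classification networks used in the actual research:

```python
# Toy sketch (not the paper's models or code): classify 2D Ising-like spin
# configurations as ordered vs. disordered using logistic regression on the
# absolute magnetization |m|, which separates the two phases.

import numpy as np

rng = np.random.default_rng(0)
L = 16  # lattice side length

def ordered_sample():
    # Mostly-aligned spins with a few random flips: a crude low-temperature
    # proxy, not a true Monte Carlo sample.
    sign = rng.choice([-1, 1])
    spins = sign * np.ones((L, L))
    spins[rng.random((L, L)) < 0.05] *= -1
    return spins

def disordered_sample():
    # Independent random spins: a crude high-temperature proxy.
    return rng.choice([-1, 1], size=(L, L))

def feature(spins):
    # |magnetization|: near 1 in the ordered phase, near 0 in the disordered.
    return abs(spins.mean())

X = np.array([feature(ordered_sample()) for _ in range(100)]
             + [feature(disordered_sample()) for _ in range(100)])
y = np.array([1] * 100 + [0] * 100)  # 1 = ordered, 0 = disordered

# Plain logistic regression fit by gradient descent.
w, b = 0.0, 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(w * X + b)))   # predicted P(ordered)
    grad = p - y                              # cross-entropy gradient
    w -= 0.1 * (grad * X).mean()
    b -= 0.1 * grad.mean()

pred = (1.0 / (1.0 + np.exp(-(w * X + b))) > 0.5).astype(int)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

The two phases are linearly separable in this feature, so even this minimal model classifies them almost perfectly; the research result is that far richer classifiers, trained on raw configurations from one spin model, transfer to other models and expose shared phase structure.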