Modern Computing: A Short History, 1945-2022

#artificialintelligence

Inspired by A New History of Modern Computing by Thomas Haigh and Paul E. Ceruzzi, though the selection of key events in the journey from ENIAC to Tesla, from Data Processing to Big Data, is mine.

The Apple I was the first computer made by Apple Computer Inc., which became one of the fastest-growing companies in history, launching a number of innovative and influential computer hardware and software products. Most home computer users in the 1970s were hobbyists who designed and assembled their own machines. The Apple I, devised in a bedroom by Steve Wozniak, Steven Jobs and Ron Wayne, was a basic circuit board to which enthusiasts would add display units and keyboards.

April 1945: John von Neumann's "First Draft of a Report on the EDVAC," often called the founding document of modern computing, defines "the stored program concept."

July 1945: Vannevar Bush publishes "As We May Think," in which he envisions the "Memex," a memory extension device serving as a large personal repository of information that could be instantly retrieved through associative links.


Challenges of Artificial Intelligence -- From Machine Learning and Computer Vision to Emotional Intelligence

arXiv.org Artificial Intelligence

Artificial intelligence (AI) has become a part of everyday conversation and of our lives. It is considered the new electricity that is revolutionizing the world. AI is attracting heavy investment in both industry and academia. However, there is also a lot of hype in the current AI debate. AI based on so-called deep learning has achieved impressive results in many problems, but its limits are already visible. AI has been under research since the 1940s, and the field has seen many ups and downs driven by over-expectations and the disappointments that have followed. The purpose of this book is to give a realistic picture of AI, its history, its potential and its limitations. We believe that AI is a helper, not a ruler of humans. We begin by describing what AI is and how it has evolved over the decades. After the fundamentals, we explain the importance of massive data for the current mainstream of artificial intelligence. The most common representations, methods, and machine learning approaches used in AI are covered. In addition, the main application areas are introduced. Computer vision has been central to the development of AI. The book provides a general introduction to computer vision and includes an exposition of the results and applications of our own research. Emotions are central to human intelligence, but they have seen little use in AI. We present the basics of emotional intelligence and our own research on the topic. We discuss super-intelligence that transcends human understanding, explaining why such an achievement seems impossible on the basis of present knowledge, and how AI could be improved. Finally, we summarize the current state of AI and what should be done in the future. In the appendix, we look at the development of AI education, especially from the perspective of course contents at our own university.


OpenEI: An Open Framework for Edge Intelligence

arXiv.org Artificial Intelligence

In the last five years, edge computing has attracted tremendous attention from industry and academia due to its promise to reduce latency, save bandwidth, improve availability, and protect data privacy and security. At the same time, we have witnessed the proliferation of AI algorithms and models, which has accelerated the successful deployment of intelligence, mainly in cloud services. These two trends, combined, have created a new horizon: Edge Intelligence (EI). The development of EI requires much attention from both the computer systems research community and the AI community to meet these demands. However, existing computing techniques used in the cloud are not directly applicable to edge computing due to the diversity of computing resources and the distribution of data sources. We envision that what is missing is a framework that can be rapidly deployed on the edge and enable edge AI capabilities. To address this challenge, in this paper we first present a definition and a systematic review of EI. Then, we introduce the Open Framework for Edge Intelligence (OpenEI), a lightweight software platform that equips edge devices with intelligent processing and data-sharing capabilities. We analyze four fundamental EI techniques used to build OpenEI and identify several open problems and potential research directions. Finally, four typical application scenarios enabled by OpenEI are presented.
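
The pattern the abstract describes, running a trained model locally on an edge device instead of shipping raw data to the cloud, can be sketched in a few lines. The snippet below is a minimal illustration only, not OpenEI's actual API; the model file name, input shape, and use of ONNX Runtime are assumptions made for the sketch.

```python
# Minimal sketch of edge-side inference (hypothetical model and shapes,
# not OpenEI's interface): raw sensor data stays on the device and only
# the model's output would ever need to leave it.
import numpy as np
import onnxruntime as ort  # lightweight runtime commonly used on edge hardware

# Load a pre-trained, compressed model assumed to ship with the device.
session = ort.InferenceSession("tiny_classifier.onnx")  # hypothetical file
input_name = session.get_inputs()[0].name

def classify_locally(sensor_frame: np.ndarray) -> int:
    """Run inference on-device; no raw data is sent to the cloud."""
    batch = sensor_frame.astype(np.float32)[np.newaxis, ...]
    logits = session.run(None, {input_name: batch})[0]
    return int(np.argmax(logits, axis=1)[0])

if __name__ == "__main__":
    frame = np.random.rand(3, 224, 224)  # stand-in for a camera frame
    print("local prediction:", classify_locally(frame))
```

Keeping inference on-device in this way speaks to the latency, bandwidth, and privacy concerns the abstract lists, while the cloud remains responsible for training and distributing updated models.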


2018 AI Trends: Cloud Models, AI Hardware

#artificialintelligence

Nvidia's greatest growth in chips in 2017 was in the AI and cloud-based sectors, and that should increase in 2018. This year tech companies will begin moving AI more to the "edge" of access, leveraging trained machine learning software together with cloud-based computing, according to a VentureBeat.com article. The authors, Daniel Li, Principal, and S. Somasegar, Managing Director, predicted four new trends for 2018: machine learning models will operate outside of data centers, on phones and personal-assistant devices such as Alexa and Siri, to reduce power and bandwidth consumption, reduce latency, and ensure privacy; specialized chips for AI will perform better than all-purpose chips, and computers built to optimize AI are already being designed; and text, voice, gestures, and vision will all be used more widely to communicate with systems.


Raspberry Pi's latest competitor RockPro64 brings more power plus AI processor (ZDNet)

#artificialintelligence

Pine64 has released a budget-friendly single-board computer with the high-powered Rockchip RK3399 system on chip (SoC). Available from around $60, the RockPro64 board comes in two flavors, either with the hexa-core RK3399 SoC or the RK3399Pro, Rockchip's first "artificial-intelligence processor". Unveiled at CES 2018, it combines a CPU, GPU, and neural-network processing unit (NPU). As noted by CNX-Software, a number of RK3399-based boards have been released in the past week but, priced at around $200 each, they've been aimed at business customers rather than home developers. The RockPro64 with 2GB of RAM will cost between $59 and $65 and will be available from March, while the RockPro64-AI will cost $99 but won't be available until August 1, according to Pine64.


Apple's Macs, iPhones and Siri will get new AI brain power

#artificialintelligence

Apple's Craig Federighi touts new machine learning and AI features coming to iPhones at WWDC. Your Apple hardware is about to get a notch smarter as the company builds new artificial intelligence abilities into Macs and iPhones -- and lets other programmers tap into that power. AI technology will mean Siri better understands what you want and speaks with a computer voice that Apple says sounds natural. Craig Federighi, senior vice president in charge of Mac and iPhone software, announced the AI technology Monday at the company's annual WWDC event for developers in San Jose, California. On Macs, the machine learning technology will monitor your web browsing behavior to block advertising companies from tracking some of what you do online.



Top 7 Technology Trends in 2017 That Are Moving Faster Than Ever

#artificialintelligence

Over the past year, technology diversified the ways in which we can communicate and retrieve information from pocket-sized devices. Technologies such as IoT, automation, and cognitive computing moved beyond the conceptual stage in 2016. As the new year gets under way, companies throughout the world are developing their business strategies, and to stay ahead of the competition they are turning toward major investments in technology. The world's biggest consumer technology convention, CES, is one of the best places to find a handful of key technologies. CES 2017 closed another spectacular year with pioneering technology trends ranging from smart homes to self-driving cars. This year is expected to bring transformative technology trends for us to explore and invest in. AI, or artificial intelligence, has been studied for decades, and the vision of endowing inanimate objects with intelligence is gradually becoming a reality. AI-based innovations are now making their way into the market and becoming part of our daily lives thanks to their quick adaptability. Artificial intelligence assists humans and handles tasks smoothly, without disturbing your comfort. Whether it is setting an alarm, reminding you of something important, playing your favorite music, reading out the news, or finding your phone, AI can make the task more convenient and smart. Sit back and relax while you command your device to do things for you.


Microsoft squeezed AI onto a Raspberry Pi

#artificialintelligence

Artificial Intelligence and Machine Learning usually work best with a lot of horsepower behind them to crunch the data, compute possibilities, and instantly come up with better solutions. That's why most AI systems rely on local sensors to gather input while more powerful hardware in the cloud does the heavy lifting of computing the output. It's how Apple's Siri and Amazon Alexa work, and how IBM Watson can tackle virtually any major task. It is, though, a limiting approach when it comes to making the Internet of Things smarter and applying intelligence where there isn't Internet connectivity. "The dominant paradigm is that these [sensor] devices are dumb," said Manik Varma, a senior researcher with Microsoft Research India.
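
The shift the article describes, from cloud-only inference to models small enough to run on a device like a Raspberry Pi, can be illustrated with a deliberately tiny model. The sketch below is not Microsoft's code or library; the dataset, model choice, and size measurement are assumptions made purely for illustration.

```python
# Illustrative sketch (not Microsoft's actual edge-AI code): train a compact
# model offline, then show that the parameters the device needs fit in a few
# kilobytes, so inference can run locally without any cloud round trip.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression

# "Cloud-side" training on a small example dataset (a hypothetical stand-in
# for real sensor data).
X, y = load_digits(return_X_y=True)
clf = LogisticRegression(max_iter=2000).fit(X, y)

# The only artifacts the edge device needs: a weight matrix and a bias vector.
W = clf.coef_.astype(np.float32)       # shape (10, 64)
b = clf.intercept_.astype(np.float32)  # shape (10,)
print(f"on-device footprint: {(W.nbytes + b.nbytes) / 1024:.1f} KiB")

def predict_on_device(x: np.ndarray) -> int:
    """Pure-numpy inference, small enough for a microcontroller-class CPU."""
    return int(np.argmax(W @ x.astype(np.float32) + b))

print("local prediction:", predict_on_device(X[0]), "true label:", y[0])
```

Even a model this simple makes the trade-off concrete: a couple of kilobytes of parameters on the device replace a network round trip to the cloud for every prediction.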


Apple Is Following Google Into Making A Custom AI Chip

#artificialintelligence

Artificial intelligence has begun seeping into every tech product and service. Now, companies are changing the underlying hardware to accommodate this shift. Apple is the latest company creating a dedicated AI processing chip to speed up AI algorithms and save battery life on its devices, according to Bloomberg. The Bloomberg report said the chip is internally known as the Apple Neural Engine and will be used to assist devices with facial and speech recognition tasks. The latest iPhone 7 runs some of its AI tasks (mostly related to photography) using the image signal processor and the graphics processing unit integrated on its A10 Fusion chip.