6 Ways AI Improves Daily Living

#artificialintelligence

Through AI, technology is making daily life easier. Machine learning is used to learn human behavior so that apps can predict what you might want and when you might want it. In this way, activities such as ordering groceries, watching movies and listening to music are handled for you by their respective apps. Here is how AI improves daily living. If you are looking for top technology in home security, be sure to go for alarm and camera systems that are AI-integrated.


Predicting fruit harvest with drones and artificial intelligence 7wData

#artificialintelligence

Outfield Technologies is a Cambridge-based agri-tech start-up which uses drones and artificial intelligence to help fruit growers maximise their harvest from orchard crops. Outfield Technologies' founders Jim McDougall and Oli Hilbourne have been working with Ph.D. student Tom Roddick from the Department's Machine Intelligence Laboratory to develop the technology needed to count the blossoms and apples on a tree from drones surveying enormous apple orchards. "An accurate assessment of the blossom or estimation of the harvest allows growers to be more productive, sustainable and environmentally friendly", explains Outfield's commercial director Jim McDougall. "Our aerial imagery analysis focuses on yield estimation and is really sought after internationally. One of the biggest problems we're facing in the fruit sector is accurate yield forecasting. This system has been developed with growers to plan labour, logistics and storage. It's needed throughout the industry, to plan marketing and distribution, and to ensure that there are always apples on the shelves. Estimates are currently made by growers, and they do an amazing job, but orchards are incredibly variable and estimates are often wrong by up to 20%. This results in lost income and inefficient operations, and can result in a substantial amount of wastage in unsold crop."
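The underlying approach, counting fruit per tree from aerial imagery and scaling the counts up to the whole orchard, can be illustrated with a toy calculation. The sketch below is illustrative only; the function name, recall correction and weights are assumptions, not Outfield Technologies' actual pipeline.

```python
# Hypothetical sketch: scaling per-tree fruit counts from drone imagery into an
# orchard-level yield estimate. All names and numbers are illustrative.

def estimate_orchard_yield_kg(per_tree_counts, total_trees,
                              detection_recall=0.9, mean_fruit_weight_kg=0.15):
    """Scale detector counts from a sample of surveyed trees to the whole orchard."""
    if not per_tree_counts:
        raise ValueError("need at least one surveyed tree")
    mean_count = sum(per_tree_counts) / len(per_tree_counts)
    corrected = mean_count / detection_recall        # compensate for fruit the detector misses
    estimated_fruit = corrected * total_trees        # extrapolate to every tree
    return estimated_fruit * mean_fruit_weight_kg

counts = [212, 187, 240, 198, 225]                   # detections on five sampled trees
print(f"{estimate_orchard_yield_kg(counts, total_trees=4000) / 1000:.1f} tonnes")
```

Even this crude extrapolation shows why detection accuracy matters: a systematic per-tree counting bias propagates directly into the orchard-level forecast.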


Machine Learning at the Network Edge: A Survey

arXiv.org Machine Learning

Devices comprising the Internet of Things, such as sensors and small cameras, usually have small memories and limited computational power. The proliferation of such resource-constrained devices in recent years has led to the generation of large quantities of data. These data-producing devices are appealing targets for machine learning applications but struggle to run machine learning algorithms due to their limited computing capability. They typically offload input data to external computing systems (such as cloud servers) for further processing. The results of the machine learning computations are communicated back to the resource-scarce devices, but this worsens latency, leads to increased communication costs, and adds to privacy concerns. Therefore, efforts have been made to place additional computing devices at the edge of the network, i.e. close to the IoT devices where the data is generated. Deploying machine learning systems on such edge devices alleviates the above issues by allowing computations to be performed close to the data sources. This survey describes major research efforts where machine learning has been deployed at the edge of computer networks.
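One recurring pattern in this space can be sketched simply: run a lightweight model on the device and offload only the inputs it is unsure about. The code below is a generic illustration with hypothetical stand-ins (`tiny_model`, `cloud_predict`), not an API taken from the survey.

```python
# Illustrative edge/cloud split: handle confident predictions locally, offload the rest.
def classify_at_edge(sample, tiny_model, cloud_predict, confidence_threshold=0.8):
    """Run the lightweight on-device model; fall back to the cloud only when unsure."""
    label, confidence = tiny_model(sample)      # cheap, local inference
    if confidence >= confidence_threshold:
        return label, "edge"                    # no data leaves the device
    return cloud_predict(sample), "cloud"       # pay the latency and bandwidth cost

# Dummy stand-ins to show the control flow:
tiny = lambda x: ("cat", 0.91) if x == "easy" else ("cat", 0.55)
cloud = lambda x: "dog"
print(classify_at_edge("easy", tiny, cloud))    # ('cat', 'edge')
print(classify_at_edge("hard", tiny, cloud))    # ('dog', 'cloud')
```

The trade-off mirrors the one described above: every offloaded sample costs latency, bandwidth and a degree of privacy, so the more work the edge model keeps local, the better.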


How Will Artificial Intelligence Affect Our Lives - CITI IO

#artificialintelligence

Artificial Intelligence or, in short, AI, is slowly becoming a part of our everyday lives. While we don't yet have a perfect AI system that can think and make decisions for itself, there are advanced artificial neural networks that can collaborate on solving difficult problems. Furthermore, modern AI systems are capable of observing and learning based on the information provided. For instance, the voice assistants most of us love to use (Siri, Alexa, Google) use AI algorithms that process the information you provide (searches, discussions, browsing history) and learn about your specific behavior. This way, they deepen their relationship with us and create a more human-like experience.


Israeli Deep-Learning Startup Hailo Raises $21 Million

U.S. News

Beyond its strategic focus - the automotive market - Hailo's processor technology can serve other markets including surveillance, smart home, Internet of Things and industrial applications, robotics, augmented reality and wearable devices.


Intel's 'neural network on a stick' brings AI training to you

Engadget

Ahead of its first AI developers conference in Beijing, Intel has announced it's making the process of imparting intelligence into smart home gadgets and other network edge devices faster and easier thanks to the company's latest invention: the Neural Compute Stick 2. Edge devices are generally defined as any piece of hardware that controls the flow of data between the boundaries of two networks. These include not just routers, switches and gateways but also a range of IoT gadgets like Ring doorbell cameras, industrial robots, smart medical devices or self-guided camera drones. Intel's NCS2 is essentially a self-contained neural network on a thumb drive and should make developing those sorts of devices faster and easier by offloading much of the processing power required to train them to its onboard Movidius Myriad X vision processing unit (VPU). "We can do some incredible things with artificial intelligence, and one of the fastest growing data types, as we all know, is video data," Steen Graham, General Manager of Channels and Ecosystem in Intel's Internet of Things group, told reporters during a press call on Friday. "The camera is the ultimate sensor."
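For a sense of what targeting the stick looks like in practice, here is a minimal sketch assuming OpenVINO's (2021-era) IECore Python API; the model files and image path are placeholders, and exact class and method names vary across toolkit versions.

```python
# Hedged sketch: running a pre-trained vision model on the NCS2 ("MYRIAD" device)
# via OpenVINO's legacy IECore Python API. Paths are placeholders.
import cv2
import numpy as np
from openvino.inference_engine import IECore

ie = IECore()
net = ie.read_network(model="model.xml", weights="model.bin")
exec_net = ie.load_network(network=net, device_name="MYRIAD")    # target the Myriad X VPU

input_name = next(iter(net.input_info))                          # first (and only) input blob
n, c, h, w = net.input_info[input_name].input_data.shape

frame = cv2.imread("frame.jpg")                                  # placeholder image
blob = cv2.resize(frame, (w, h)).transpose(2, 0, 1)[np.newaxis]  # HWC -> NCHW
result = exec_net.infer(inputs={input_name: blob})
print({name: out.shape for name, out in result.items()})
```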


Creepy AI that can predict your moves in advance could lead to next-level Big Brother surveillance

Daily Mail - Science & tech

An intelligent machine capable of anticipating your next move minutes in advance sounds like the stuff of nightmares – but is now a reality. Researchers have taught an AI to recognise patterns in people's actions, allowing it to accurately predict the next move in a sequence minutes in advance. The software, which was built by a team at the University of Bonn in Germany, was taught to anticipate actions by watching hours of cooking videos. Dr Jürgen Gall believes the intelligent software will eventually be able to predict your actions 'hours before they happen'. If the team manages to fine-tune the algorithm to anticipate actions that far in advance, it's possible to imagine a slew of real-world applications, from home automation gadgets to Big Brother-esque surveillance.
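The idea of learning which action tends to follow which can be illustrated with a drastically simplified model: count action-to-action transitions in training sequences and predict the most frequent follow-up. This is only a toy stand-in; the Bonn system learns from raw video with deep networks, not from label bigrams.

```python
# Toy next-action predictor: count which action follows which, then predict the
# most common successor. A simplification for illustration, not the Bonn model.
from collections import Counter, defaultdict

def fit_transitions(sequences):
    """Count how often each action is followed by each other action."""
    transitions = defaultdict(Counter)
    for seq in sequences:
        for current, following in zip(seq, seq[1:]):
            transitions[current][following] += 1
    return transitions

def predict_next(transitions, current_action):
    """Return the most frequently observed follow-up action, if any."""
    followers = transitions.get(current_action)
    return followers.most_common(1)[0][0] if followers else None

cooking_sequences = [
    ["crack_egg", "whisk", "pour_into_pan", "stir", "serve"],
    ["crack_egg", "whisk", "add_salt", "pour_into_pan", "stir", "serve"],
]
model = fit_transitions(cooking_sequences)
print(predict_next(model, "whisk"))   # e.g. 'pour_into_pan'
```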


Designing for Democratization: Introducing Novices to Artificial Intelligence Via Maker Kits

arXiv.org Artificial Intelligence

Existing research highlights the myriad benefits realized when technology is sufficiently democratized and made accessible to non-technical or novice users. However, democratizing complex technologies such as artificial intelligence (AI) remains hard. In this work, we draw on theoretical underpinnings from the democratization of innovation in exploring the design of maker kits that help introduce novice users to complex technologies. We report on our work designing TJBot: an open source cardboard robot that can be programmed using pre-built AI services. We highlight principles we adopted in this process (approachable design, simplicity, extensibility and accessibility), insights we learned from showing the kit at workshops (66 participants) and how users interacted with the project on GitHub over a 12-month period (Nov 2016 - Nov 2017). We find that the project succeeds in attracting novice users (40% of users who forked the project are new to GitHub) and that a variety of demographics are interested in prototyping use cases such as home automation, task delegation, teaching and learning.


Artificial intelligence (AI) and cognitive computing: what, why and where

#artificialintelligence

Although artificial intelligence (as a set of technologies, not in the sense of mimicking human intelligence) has been around for a long time in many forms and ways, it's a term that quite a few people, certainly IT vendors, don't like to use that much anymore – but artificial intelligence is real, for your business too. Instead of talking about artificial intelligence (AI), many describe the current wave of AI innovation and acceleration with – admittedly somewhat differently positioned – terms and concepts such as cognitive computing, or focus on several real-life applications of artificial intelligence that often start with words such as "smart" (omni-present in anything related to the IoT as well), "intelligent", "predictive" and, indeed, "cognitive", depending on the exact application – and vendor. Despite the terminology issues, artificial intelligence is essential for and in, among others, information management, healthcare, life sciences, data analysis, digital transformation, security (cybersecurity and others), various consumer applications, next-gen smart building technologies, FinTech, predictive maintenance, robotics and so much more. On top of that, AI is added to several other technologies, including IoT and big as well as small data analytics. There are many reasons why several vendors hesitate to use the term artificial intelligence for AI solutions/innovations and often package them in another term (trust us, we've been there). Artificial intelligence (AI) is a term that carries somewhat of a negative connotation in general perception, but also in the perception of technology leaders and firms.


Turning AI, deep learning and robots from children into responsible citizens

#artificialintelligence

If there is one thing about artificial intelligence (AI) most people agree on, it's the fact that AI and the way in which its many 'forms' are leveraged, whether it's in a context of cobots, sentiment analysis applications, autonomous decision-making in smart buildings, intelligence at the edge of IoT (Internet of Things) or any other solution, should serve human, business and societal goals one way or the other.