

When it comes to controls, should we keep AI on a leash?

#artificialintelligence

In a previous article for DCD I suggested that achieving true data center thermal optimization requires a proven, safe process based on thousands of real-time sensors and expert spatial models. Inevitably, this involves collecting huge volumes of information, so being able to rely absolutely on the data you gather is critical if you're to remove much of the uncertainty from data center cooling. That's particularly the case when you're considering the kind of control models needed to monitor critical cooling duty performance across your data center estate's multiple CRAC/AHU units. Of course, you'll want to track your data center cooling loads in real time using standard temperature and current measurement sensors for both chilled water and direct expansion cooling systems. However, it's also important to continuously monitor air inlet and outlet temperatures, along with variables such as fan performance, filter quality, and alerts for potential CRAC/AHU blockages.
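The article names the quantities to track rather than any particular model, so the following is only a rough sketch of the kind of check such monitoring implies: it computes a chilled-water cooling duty from flow and supply/return temperatures and flags a few conditions a control model might alert on. All sensor values, thresholds, and function names here are hypothetical assumptions, not details from the article.

```python
# Minimal sketch of a cooling-duty check of the kind described above.
# All sensor readings, thresholds, and the rated duty are illustrative
# assumptions; only the Q = m_dot * cp * delta-T relation is standard.

WATER_SPECIFIC_HEAT_KJ_PER_KG_K = 4.186  # approximate cp of chilled water

def cooling_duty_kw(flow_kg_per_s: float, supply_temp_c: float, return_temp_c: float) -> float:
    """Estimate CRAC/AHU cooling duty as Q = m_dot * cp * (T_return - T_supply)."""
    return flow_kg_per_s * WATER_SPECIFIC_HEAT_KJ_PER_KG_K * (return_temp_c - supply_temp_c)

def check_unit(inlet_temp_c: float, outlet_temp_c: float, duty_kw: float,
               rated_duty_kw: float, max_inlet_c: float = 27.0) -> list[str]:
    """Return alert strings for a few conditions a monitoring model might flag."""
    alerts = []
    if inlet_temp_c > max_inlet_c:
        alerts.append(f"air inlet {inlet_temp_c:.1f} C above {max_inlet_c:.1f} C limit")
    if duty_kw > rated_duty_kw:
        alerts.append(f"duty {duty_kw:.0f} kW exceeds rated {rated_duty_kw:.0f} kW")
    if (inlet_temp_c - outlet_temp_c) < 2.0 and duty_kw > 0.5 * rated_duty_kw:
        alerts.append("low air-side delta-T at high duty: possible blockage or fan fault")
    return alerts

if __name__ == "__main__":
    duty = cooling_duty_kw(flow_kg_per_s=12.0, supply_temp_c=7.0, return_temp_c=13.0)
    print(f"estimated duty: {duty:.0f} kW")
    print(check_unit(inlet_temp_c=28.4, outlet_temp_c=27.1, duty_kw=duty, rated_duty_kw=350.0))
```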


How AI Builds A Better Manufacturing Process

Forbes Technology

But as for how many humans it takes to construct those 5,000 banana-colored robots a month, don't bother counting: The robots build themselves, test themselves and inspect themselves. A generation ago, many people--including manufacturing executives themselves--would have considered such a facility either science fiction or something centuries away. To be sure, FANUC's complex of 22 sub-factories is one of a kind. It's one of the world's first "lights-out factories"--where 24/7 operation is a reality and intelligent robots create computerized offspring capable, just like them, of machine learning and computer vision. But it proves just how far artificial intelligence (AI) has come in the manufacturing process.


Plant Engineering

#artificialintelligence

Machine learning is advancing the capabilities of collaborative and industrial robots. Without 3-D sensors or neural networks, robots are effectively blind and one-dimensional: they're restricted to a single repetitive, preprogrammed task, with no ability to account for variables in their environment. This limits a robot's productivity potential. Now, with vision sensors and machine learning capabilities, collaborative and industrial robots can achieve far more than they ever could on their own.


The way people walk can be used for ID and health checks

#artificialintelligence

Listen carefully to the footsteps in the family home, especially if it has wooden floors unmuffled by carpets, and you can probably work out who is walking about. The features most commonly used to identify people are faces, voices, fingerprints and retinal scans. But their "behavioural biometrics", such as the way they walk, are also giveaways. Researchers have, for several years, used video cameras and computers to analyse people's gaits, and are now quite good at it. But translating such knowledge into a practical identification system can be tricky--especially if that system is supposed to be covert.
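The piece stays at the level of description, so any code here is speculative. One minimal sketch, assuming a gait signal has already been extracted (for example, the vertical position of an ankle tracked across video frames), is to estimate the stride period by autocorrelation; the synthetic signal, frame rate, and minimum-lag threshold below are illustrative assumptions, not details from the article.

```python
import numpy as np

def stride_period_seconds(signal: np.ndarray, fps: float, min_lag_s: float = 0.4) -> float:
    """Estimate the dominant stride period of a periodic gait signal via autocorrelation.

    `signal` is assumed to be a 1-D trace such as the vertical position of one
    ankle across video frames; `fps` is the frame rate.
    """
    x = signal - signal.mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]  # keep non-negative lags
    acf /= acf[0]                                       # normalise so lag 0 == 1
    min_lag = int(min_lag_s * fps)                      # ignore implausibly short lags
    peak_lag = min_lag + int(np.argmax(acf[min_lag:]))  # strongest repeat beyond min_lag
    return peak_lag / fps

if __name__ == "__main__":
    fps = 30.0
    t = np.arange(0, 10, 1 / fps)
    # Synthetic "ankle height" trace: a 1.1-second stride plus noise.
    trace = np.sin(2 * np.pi * t / 1.1) + 0.1 * np.random.randn(t.size)
    print(f"estimated stride period: {stride_period_seconds(trace, fps):.2f} s")
```

In a real system, features like this period, step asymmetry, and limb angles would feed a classifier trained per person, but that pipeline is beyond what the article describes.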


Next-gen telehealth: AI, chatbots, genomics and sensors that advance population health

#artificialintelligence

While the use of telemedicine systems has been expanding in recent years, especially as more payers have begun reimbursing for some telehealth services, the industry is on the verge of more widespread virtual care. But what will that ultimately look like? The next generation of tools will feature enhancements ranging from chatbots, machine learning and genomics to remote diagnostic tools and better sensors. Here's a look at what to expect in the near future. Both machine learning and automation are trying to solve an inherent issue in virtual healthcare: scalability, said Roeen Roashan, senior analyst of digital health at consulting firm IHS.


DARPA digs into the details of practical quantum computing -- GCN

#artificialintelligence

Quantum computing promises enough computational power to solve problems far beyond the capabilities of the fastest digital computers, so the Defense Advanced Research Projects Agency is laying the groundwork for applying the technology to real-world problems. In a request for information, DARPA is asking how quantum computing can enable new capabilities for science and technology problems such as understanding complex physical systems, optimizing artificial intelligence and machine learning, and enhancing distributed sensing. Noting that it is not interested in solving cryptology issues, the agency is asking the research community to help address challenges of scale, environmental interactions, connectivity and memory, and to suggest "hard" science and technology problems the technology could be leveraged to solve. Among the goals it lists are establishing the fundamental limits of quantum computing in terms of how problems should be framed, when a model's scale requires a quantum-based solution, how to manage connectivity and errors, the size of potential speed gains, and the ability to break large problems into smaller pieces that can map to several quantum platforms; and improving machine learning by leveraging a hybrid quantum/classical computing approach to decrease the time required to train machine learning models.
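The RFI itself gives no algorithmic detail. As a hedged illustration of what a hybrid quantum/classical training loop usually means in practice, the toy below has a classical gradient-descent optimizer tune the single parameter of a simulated one-qubit circuit using the parameter-shift rule; the circuit, cost function, and all numbers are assumptions made for the example, not anything DARPA specifies.

```python
import numpy as np

# Toy hybrid quantum/classical loop: a classical optimizer tunes the parameter
# of a one-qubit circuit that is simulated here with plain linear algebra.

def ry(theta: float) -> np.ndarray:
    """Standard single-qubit rotation about the Y axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def expectation_z(theta: float) -> float:
    """<Z> for the state Ry(theta)|0>; analytically this equals cos(theta)."""
    state = ry(theta) @ np.array([1.0, 0.0])
    z = np.diag([1.0, -1.0])
    return float(state @ z @ state)

def train(target: float = -1.0, lr: float = 0.2, steps: int = 300) -> float:
    """Classically minimise (<Z> - target)^2 over the circuit parameter theta."""
    theta = 0.1
    for _ in range(steps):
        # Parameter-shift rule: d<Z>/dtheta = (<Z>(theta + pi/2) - <Z>(theta - pi/2)) / 2
        grad_z = (expectation_z(theta + np.pi / 2) - expectation_z(theta - np.pi / 2)) / 2
        grad = 2 * (expectation_z(theta) - target) * grad_z
        theta -= lr * grad
    return theta

if __name__ == "__main__":
    theta = train()
    # <Z> approaches -1 as theta approaches pi.
    print(f"theta = {theta:.3f}, <Z> = {expectation_z(theta):.3f}")
```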


The five vectors of IoT progress

ZDNet

How can companies best take advantage of the internet of things (IoT)? That's likely to become an increasingly common question as more enterprises deploy arrays of sensors and connect a variety of products in hopes of gathering huge volumes of useful data. Consulting firm Deloitte recently presented five "vectors of progress" that will help drive the adoption of IoT and may help organizations leverage related technologies more successfully. Each vector addresses an important challenge of IoT adoption, and the firm notes that the vectors are not all applicable to every industry or application of IoT. Businesses are seeing IoT projects stall or fail outright despite increased investment in these initiatives, the firm said in a recent report.


Decentralized Clustering on Compressed Data without Prior Knowledge of the Number of Clusters

arXiv.org Machine Learning

In sensor networks, it is not always practical to set up a fusion center, so there is a need for fully decentralized clustering algorithms. Decentralized clustering algorithms should minimize the amount of data exchanged between sensors in order to reduce sensor energy consumption. In this respect, we propose one centralized and one decentralized clustering algorithm that work on compressed data without prior knowledge of the number of clusters. In the standard K-means clustering algorithm, the number of clusters is estimated by repeating the algorithm several times, which dramatically increases the amount of exchanged data, whereas our algorithm can estimate this number in a single run. The proposed clustering algorithms derive from a theoretical framework establishing that, under asymptotic conditions, the cluster centroids are the only fixed points of a cost function we introduce. This cost function depends on a weight function, which we choose as the p-value of a Wald hypothesis test. This p-value measures the plausibility that a given measurement vector belongs to a given cluster. Experimental results show that our two algorithms are competitive in terms of clustering performance with K-means and DBSCAN, while reducing the amount of data exchanged between sensors by a factor of at least two.
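The abstract describes the idea without giving the update rule, so the following is only a rough sketch of a centralized, p-value-weighted fixed-point iteration in its spirit, not the authors' actual algorithm: each point's contribution to a centroid is weighted by the p-value of a chi-squared (Wald-type) test that the point belongs to that cluster. The isotropic-noise model, `noise_var`, and the crude initialization are assumptions of this sketch.

```python
import numpy as np
from scipy.stats import chi2

def pvalue_weights(X: np.ndarray, centroids: np.ndarray, noise_var: float) -> np.ndarray:
    """Weight of each point for each cluster: the p-value of a Wald-type test that
    the point was drawn from an isotropic Gaussian centred on that centroid."""
    sq_dist = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)  # shape (n, k)
    wald_stat = sq_dist / noise_var
    return chi2.sf(wald_stat, df=X.shape[1])  # survival function = p-value

def fixed_point_clustering(X: np.ndarray, k: int, noise_var: float = 0.5,
                           iters: int = 50) -> np.ndarray:
    """Iterate 'centroid = p-value-weighted mean of the data' toward a fixed point."""
    # Crude initialization for this sketch: k points spread across the data.
    centroids = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()
    for _ in range(iters):
        w = pvalue_weights(X, centroids, noise_var)      # (n, k) weights
        w_sum = w.sum(axis=0, keepdims=True) + 1e-12     # avoid division by zero
        centroids = (w.T @ X) / w_sum.T                  # p-value-weighted means
    return centroids

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(loc, 0.3, size=(100, 2)) for loc in ([0, 0], [3, 3])])
    print(fixed_point_clustering(X, k=2))  # expect centroids near (0, 0) and (3, 3)
```

The paper's contribution also covers compressed data, decentralized message exchange, and estimating the number of clusters in one run, none of which this toy attempts.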


AI Flood Drives Chips to the Edge

#artificialintelligence

Editor's Note: Welcome to Aspencore's Special Project on Embedded Artificial Intelligence (AI). This article, along with the articles listed on the last page, forms an in-depth look, from a variety of angles, at the business and technology of imbuing embedded systems with localized AI. SAN JOSE -- It's easy to list semiconductor companies working on some form of artificial intelligence -- pretty much all of them are. The broad potential for machine learning is drawing nearly every chip vendor to explore the still-emerging technology, especially for inference processing at the edge of the network. "It seems like every week, I run into a new company in this space, sometimes someone in China that I've never heard of," said David Kanter, a microprocessor analyst at Real World Technologies.


Big Data and Robotics - DZone AI

#artificialintelligence

The last few months have witnessed a rise in the attention given to Artificial Intelligence (AI) and robotics. Robots have already become part of society; in fact, they are now an integral part of it. Big data is also a buzzword today: enterprises worldwide generate huge amounts of data, and that data doesn't come in any specified format.