sensor


GE's research scientists are learning to meld AI with machines

#artificialintelligence

So far, nearly 400 employees from across the company have completed GE's certification program for data analytics, and about 50 scientists have moved into digital analytics jobs of the kind Nichols has taken on. They enable GE to track wear and tear on its aircraft engines, locomotives, gas turbines, and wind turbines using sensor data rather than assumptions or estimates, making it easier to predict when the machines will need maintenance. What's more, if data is corrupted or missing, the company fills in the gaps with the aid of machine learning, a type of AI that lets computers learn without being explicitly programmed, says Colin Parris, GE Global Research's vice president for software research. Parris says GE pairs computer vision with deep learning, a type of AI particularly adept at recognizing patterns, and with reinforcement learning, a more recent advance that lets machines optimize their operations, so that cameras can find minute cracks on metal turbine blades even when the blades are dirty and dusty.
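The article does not spell out how GE fills those gaps; as a minimal sketch of model-based imputation under assumed sensor names and data shapes (not GE's actual pipeline), one could train a regressor on the intact readings and let it predict the missing ones:

```python
# Minimal sketch: model-based imputation of missing sensor readings.
# Sensor names, sizes, and the choice of model are illustrative assumptions,
# not GE's actual pipeline.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Toy turbine telemetry: one reading per minute, some values missing.
rng = np.random.default_rng(0)
n = 1_000
df = pd.DataFrame({
    "rotor_speed": rng.normal(1_500, 50, n),
    "oil_temp": rng.normal(80, 5, n),
    "vibration": rng.normal(0.3, 0.05, n),
})
df.loc[rng.choice(n, 100, replace=False), "vibration"] = np.nan  # corrupt 10%

# Learn vibration from the other channels on rows where it is present...
known = df["vibration"].notna()
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(df.loc[known, ["rotor_speed", "oil_temp"]], df.loc[known, "vibration"])

# ...and fill the gaps with the model's predictions.
df.loc[~known, "vibration"] = model.predict(
    df.loc[~known, ["rotor_speed", "oil_temp"]]
)
print(df["vibration"].isna().sum())  # 0 remaining gaps
```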


Challenges of #ArtificialIntelligence

#artificialintelligence

As AI is a vast domain, listing all of its challenges is impossible, but a few generic challenges of Artificial Intelligence are listed below: the situated approach of AI in the real world; learning processes with human intervention; access to other disciplines; multitasking; and validation and certification of AI systems. Artificial Intelligence systems must operate in and interact with the real world and their environment: receiving sensor data, determining the environment in which they operate, and acting on that environment are examples. Certification of AI systems, or their validation by appropriate means, is an essential element of critical AI systems and a real challenge, especially if those systems are to meet the expectations mentioned above (adaptation, multitasking, learning processes with human intervention). The privacy requirement is particularly important for AI systems confronted with personal data, such as intelligent assistants / companions or data mining systems.


UC Berkeley Releases Massive Dex-Net 2.0 Dataset

IEEE Spectrum Robotics Channel

The dataset consists of 6.7 million object point clouds, accompanying parallel-jaw gripper poses, and a robustness estimate of how likely it is that each grasp will be able to lift and carry the object, and now you can use it to train your own grasping system. Instead, Dex-Net 2.0 relies on "a probabilistic model to generate synthetic point clouds, grasps, and grasp robustness labels from datasets of 3D object meshes using physics-based models of grasping, image rendering, and camera noise." In other words, Dex-Net 2.0 leverages cloud computing to rapidly generate a large training set for a CNN, in "a hybrid of well-established analytic methods from robotics and Deep Learning," as Goldberg explains: "The key to Dex-Net 2.0 is a hybrid approach to machine learning Jeff Mahler and I developed that combines physics with Deep Learning." Mahler: "With the release we hope that other roboticists can replicate our training results to facilitate development of new architectures for predicting grasp robustness from point clouds, and to encourage benchmarking of new methods."
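Dex-Net 2.0's published Grasp Quality CNN is described in the paper and accompanying code; the toy network below only illustrates the general shape of such a model, a small CNN that maps a depth-image crop plus a gripper pose feature to a grasp-robustness probability. The input size, layer widths, and pose feature are assumptions, not the published architecture.

```python
# Illustrative only: a toy CNN that scores grasp robustness from a depth-image
# crop and a gripper depth feature. Sizes and layers are assumptions, not the
# published Dex-Net 2.0 architecture.
import torch
import torch.nn as nn

class GraspRobustnessNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Depth-image branch: 32x32 single-channel crop centered on the grasp.
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 16 x 16 x 16
            nn.Conv2d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 32 x 8 x 8
        )
        # Pose branch: e.g. gripper depth relative to the camera.
        self.pose = nn.Sequential(nn.Linear(1, 16), nn.ReLU())
        self.head = nn.Sequential(
            nn.Linear(32 * 8 * 8 + 16, 128), nn.ReLU(),
            nn.Linear(128, 1),                            # logit of "grasp succeeds"
        )

    def forward(self, depth_crop, gripper_depth):
        x = self.conv(depth_crop).flatten(1)
        z = self.pose(gripper_depth)
        return torch.sigmoid(self.head(torch.cat([x, z], dim=1)))

# One training step on a synthetic batch of labeled grasps.
net = GraspRobustnessNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
depth = torch.rand(64, 1, 32, 32)             # synthetic depth crops
pose = torch.rand(64, 1)                      # synthetic gripper depths
label = torch.randint(0, 2, (64, 1)).float()  # robustness labels
loss = nn.functional.binary_cross_entropy(net(depth, pose), label)
opt.zero_grad(); loss.backward(); opt.step()
```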


Getting Started with Predictive Maintenance Models - Silicon Valley Data Science

@machinelearnbot

We are also provided with a training set of full run-to-failure data for a number of engines and a test set with truncated engine data and their corresponding RUL values. One way of addressing this is to look at the distribution of sensor values in "healthy" engines and compare it to a similar set of measurements when the engines are close to failure. The figure above shows the distribution of the values of a particular sensor (sensor 2) for each engine in the training set, where healthy values (in blue) are those taken from the first 20 cycles of the engine's lifetime and failing values are from the last 20 cycles. In another figure, the values of this sensor (sensor 2) are plotted in blue against the true RUL value at each time cycle for the engines in the training set.
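A minimal sketch of that healthy-vs-failing comparison, assuming a run-to-failure table with unit, cycle, and sensor columns (the column names and file path are illustrative, not taken from the post):

```python
# Sketch of the healthy-vs-failing comparison described above.
# Assumes a run-to-failure table with columns: unit (engine id), cycle
# (time step), and s2 (sensor 2) -- an assumed layout, not a quote from the post.
import pandas as pd
import matplotlib.pyplot as plt

train = pd.read_csv("train.csv")  # hypothetical path to the training set

# First 20 cycles of each engine's life = "healthy" readings.
healthy = train.groupby("unit").head(20)["s2"]

# Last 20 cycles before failure = "failing" readings.
failing = train.groupby("unit").tail(20)["s2"]

# Overlay the two distributions for sensor 2.
plt.hist(healthy, bins=50, alpha=0.5, label="healthy (first 20 cycles)")
plt.hist(failing, bins=50, alpha=0.5, label="failing (last 20 cycles)")
plt.xlabel("sensor 2 value")
plt.ylabel("count")
plt.legend()
plt.show()
```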


Verdigris Uses AI to Wring Energy Savings from Buildings NVIDIA Blog

#artificialintelligence

While most buildings get occasional walk-through energy usage audits, Verdigris' digital system uploads electricity consumption data to the cloud 24/7. It can even integrate the data with building management systems to automate electricity usage controls. Chung estimates that training on GPUs helps Verdigris train models 20 times as fast as on CPUs. Eventually, Chung said, he'd like Verdigris to expand beyond smart building optimization and into enabling smart cities.


The inextricable link between IoT and machine learning

#artificialintelligence

Train a machine learning model on images of cats and non-cats, and the trained model will recognize cats with accuracy in the high-90-percent range. Installing a fraction of the sensors in some of the fields would provide the ground-truth drone images for training the model to recognize optimally watered crops. As in many medical imaging use cases, such as diabetic retinopathy, which a machine learning model can diagnose with the same or better accuracy than an ophthalmologist, the NDVI images could be classified by a highly accurate model. In addition to saving the time and cost of deploying IoT devices and the networks to interconnect them, machine learning could serve as a separate path to confirm an IoT system is working.
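A minimal sketch of the cat / not-cat claim, using transfer learning from a pretrained backbone; the data layout, model choice, and hyperparameters are illustrative assumptions, not the article's recipe:

```python
# Minimal sketch of a cat / not-cat classifier via transfer learning.
# The data directory layout and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Expects data/train/cat/*.jpg and data/train/not_cat/*.jpg (hypothetical paths).
tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_ds = datasets.ImageFolder("data/train", transform=tfm)
loader = DataLoader(train_ds, batch_size=32, shuffle=True)

# Reuse a pretrained backbone; only the final layer is trained here.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)  # cat vs. not-cat

opt = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for images, labels in loader:  # one pass over the data
    opt.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    opt.step()
```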


MK's network of AI sensors to monitor travel in real time

#artificialintelligence

Artificial Intelligence (AI) sensors are to be installed at every major junction and council car park in Milton Keynes to monitor movement of vehicles, cyclists and pedestrians in real time. "This will provide real-time congestion information that can help with strategic planning and journey planning," said Geoff Snelson, director of strategy & futures at Milton Keynes Council. The 'VivaMK' project, part of Innovate UK's Smart Cities initiative, will monitor 13,000 spaces in the council's car parks. Milton Keynes Council is also installing more induction charging plates for bus services.


PSA Group semi-autonomous cars arrive this year; full autonomy in 2020

#artificialintelligence

The first PSA car with a 'hands-off' Level 2 autonomous system will be the DS 7 Crossback SUV, which is due for launch early in 2018. Equipped with a Level 2 system called Connected Pilot, the 7 Crossback will be capable of maintaining its lane and positioning itself to the left or right within the lane to allow cycles or motorcycles to pass. PSA will phase in Level 3 'eyes-off' autonomous cars from 2020, with two systems called Traffic Jam Chauffeur and Highway Chauffeur. The company plans to phase in fully autonomous Level 4 'mind-off' cars from 2025, with Level 5, completely driverless cars beyond 2030.


Feature Engineering in IoT Age - How to deal with IoT data and create features for machine learning?

#artificialintelligence

Given the fast pace of change in connected devices, and from our data science perspective, we think that data science professionals need to understand and explore feature engineering of IoT or sensor data. Prior to creating features from IoT or sensor data, it is important to consider the level of aggregation (across time) of the continuous streaming data. In some cases both the atomic level and the aggregated level are used for generating features, but in most cases the aggregated-level features prove more productive. Once the aggregation window has been chosen, the next step is to aggregate the sensor data over these time windows to create a set of new variables / features from the atomic ones.
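A minimal sketch of that aggregation step, where the sensor names, the 5-minute window, and the chosen statistics are illustrative choices rather than recommendations from the article:

```python
# Sketch of window-based aggregation of atomic sensor readings into features.
# Sensor names, the 5-minute window, and the statistics are illustrative.
import numpy as np
import pandas as pd

# Atomic readings: one temperature/vibration sample per second.
rng = np.random.default_rng(1)
idx = pd.date_range("2024-01-01", periods=3_600, freq="s")
raw = pd.DataFrame({
    "temperature": rng.normal(60, 2, len(idx)),
    "vibration": rng.normal(0.2, 0.03, len(idx)),
}, index=idx)

# Aggregate over 5-minute windows to create features from the atomic data.
features = raw.resample("5min").agg(["mean", "std", "min", "max"])
features.columns = ["_".join(col) for col in features.columns]  # flatten names
print(features.head())
# e.g. temperature_mean, temperature_std, ..., vibration_max per 5-minute window
```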


Explosion of #IoT Data @ThingsExpo @Tibco #AI #IIoT #M2M #DX #SmartCities #BigData

#artificialintelligence

As noted in Forrester's 2016 Internet of Things Heat Map, "Because of the enormous range of sensors, customer scenarios, and business cases, the technologies for IoT sensor devices, radios, network protocols, software protocols, and data formats are very diverse." The cloud's role then becomes synergistic as opposed to directive: its economy of scale can be directed to build and refine machine learning models on massive data sets, thereby further augmenting intelligence, which can then be acted on at the edge. With major technology companies and startups seriously embracing Cloud strategies, now is the perfect time to attend 21st Cloud Expo, October 31 - November 2, 2017, at the Santa Clara Convention Center, CA, and June 12-14, 2018, at the Javits Center in New York City, NY, to learn what is going on, contribute to the discussions, and ensure that your enterprise is on the right path to Digital Transformation.