Machine Learning


Is AI-powered video search becoming inevitable to security? - asmag.com

#artificialintelligence

Given the increasing affordability of equipment and growing awareness of security requirements, more and more cameras are being installed across the globe every day. While this is a good thing, the sheer volume of footage that comes in makes it difficult for operators to find specific objects or people when needed. This is one area where artificial intelligence (AI) is set to play a key role: several security companies are already working to make searching through videos as simple as using Google.


Galactic Evolution: AI Could Soon Reveal Early Galaxies And Their Hidden Features

International Business Times

Artificial intelligence (AI) is being applied to a number of fields, but just recently, a group of researchers managed to train a deep learning algorithm -- a branch of AI -- to analyze images of distant galaxies and reveal how they formed and evolved over time. Understanding galactic evolution is one of the key puzzles in gaining more insight into the formation of our universe. We have a number of ground- and space-based telescopes that can peer through the cosmos and capture images of these galaxies, but understanding every stage of evolution for an individual galactic candidate hasn't entirely been possible. This is because galaxies change over several billion years, and our telescopes can only show how a galaxy looked at one particular period of time. However, as light from distant objects takes millions to billions of years to reach us, astronomers can always peer deeper into the cosmos and look back in time at other, younger galaxies.


Hatching Plans with the Chicken-and-Egg of Machine Learning - Velocity

#artificialintelligence

In order for a machine learning system to operate at its peak capacity and offer the best insights, it needs premium raw data drawn directly from the client base. However, that data often remains inaccessible until the system itself is up and running. So which comes first: the algorithms inside a machine learning platform that analyse, automate and provide predictions, or the invaluable data that drives the learning curve? Although confusing at first glance, the answer may be simpler than imagined.


Cambridge Consultants unveils smart car park

#artificialintelligence

The self-taught, low-cost car park system, created by Cambridge Consultants, recognises cars and how those cars appear in parking spaces. The system, aptly named Goldeneye, can do this both day and night and in a variety of lighting and weather conditions, including the recent severe snow in the UK, all without expensive physical infrastructure. Goldeneye uses a machine vision and deep learning solution developed entirely at Cambridge Consultants, along with the existing security camera and networking infrastructure on-site, to consistently monitor the availability of parking bays. Goldeneye uses 12 cameras to oversee 430 parking spaces and, with digital signs at the entrance to the site, alerts a 500-strong workforce and visitors to where they can quickly find a parking space. Traditional parking monitoring solutions use a sensor for each individual parking space, which can be expensive to maintain, and often the business case to justify a large investment in bay sensors does not exist.
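The aggregation step behind those entrance signs is simple to picture. Here is a minimal sketch (my own illustration, not Cambridge Consultants' code) of how per-bay occupancy flags from the vision system could be reduced to a free-space count; the bay names and camera count are assumptions for the example:

```python
# Hypothetical per-bay occupancy map; the real system derives these flags
# from deep-learning inference on the 12 on-site camera feeds.
bays = {f"bay-{i}": False for i in range(430)}  # False = free

def update_bay(bay_id, occupied):
    """Record the latest vision-system verdict for one parking bay."""
    bays[bay_id] = occupied

def free_spaces():
    """Count free bays for display on the entrance signs."""
    return sum(1 for occupied in bays.values() if not occupied)

update_bay("bay-0", True)
print(free_spaces())  # 429
```

The per-bay flags are where the deep learning does its work; once they exist, the signage logic is trivial bookkeeping.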


Machine Learning on Google Cloud Platform

#artificialintelligence

This learning path will introduce you to neural networks, TensorFlow, and Google Cloud Machine Learning Engine. Even if you don't have any previous experience with machine learning, that's okay, because these courses cover the basic concepts. The first course explains the fundamentals of neural networks and how to implement them using TensorFlow. Then it shows you how to train and deploy a model using Cloud ML Engine. The second course explains how to build convolutional neural networks, which are very effective at performing object detection in images, among other tasks.
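As a taste of what the first course covers, here is a minimal sketch of a neural network's forward pass, written in plain numpy rather than TensorFlow so it stands alone; the layer sizes and random weights are made up for illustration:

```python
import numpy as np

def relu(x):
    """Nonlinearity applied between layers."""
    return np.maximum(0.0, x)

# A tiny two-layer network: 3 inputs -> 4 hidden units -> 2 outputs.
rng = np.random.default_rng(42)
W1, b1 = rng.normal(size=(3, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 2)), np.zeros(2)

def forward(x):
    """One forward pass: affine transform, nonlinearity, affine transform."""
    h = relu(x @ W1 + b1)
    return h @ W2 + b2

y = forward(np.array([1.0, 0.5, -0.5]))
print(y.shape)  # (2,)
```

Training, which the courses cover with TensorFlow and Cloud ML Engine, then amounts to adjusting W1, b1, W2 and b2 to reduce a loss on example data.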


Face recognition for galaxies: Artificial intelligence brings new tools to astronomy

#artificialintelligence

A machine learning method called "deep learning," which has been widely used in face recognition and other image- and speech-recognition applications, has shown promise in helping astronomers analyze images of galaxies and understand how they form and evolve. In a new study, accepted for publication in Astrophysical Journal and available online, researchers used computer simulations of galaxy formation to train a deep learning algorithm, which then proved surprisingly good at analyzing images of galaxies from the Hubble Space Telescope. The researchers used output from the simulations to generate mock images of simulated galaxies as they would look in observations by the Hubble Space Telescope. The mock images were used to train the deep learning system to recognize three key phases of galaxy evolution previously identified in the simulations. The researchers then gave the system a large set of actual Hubble images to classify.
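The train-on-simulations, classify-observations workflow can be sketched in miniature. In this toy version (my illustration, not the study's code), random feature vectors stand in for mock galaxy images and a nearest-centroid classifier stands in for the deep network; only the overall shape of the pipeline matches the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for mock images of simulated galaxies: each "image" is a flat
# feature vector, and each of the 3 classes is a phase of evolution.
def make_mock(phase, n=50):
    return rng.normal(loc=float(phase), scale=0.5, size=(n, 16))

train_X = np.vstack([make_mock(p) for p in range(3)])
train_y = np.repeat(np.arange(3), 50)

# "Training": compute class centroids (the study trains a deep CNN here).
centroids = np.array([train_X[train_y == p].mean(axis=0) for p in range(3)])

def classify(image):
    """Label a real observation with the nearest learned phase."""
    return int(np.argmin(np.linalg.norm(centroids - image, axis=1)))
```

The key idea the sketch preserves is that the labels come for free from the simulation, so real, unlabeled telescope images can then be classified.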


Splunk updates flagship suites with machine learning, AI advances

ZDNet

Splunk on Tuesday outlined the latest updates to its flagship analytic suites with a focus on machine learning and AI advances. Specifically, the company updated Splunk Enterprise, Splunk Cloud, Splunk IT Service Intelligence (ITSI), Splunk User Behavior Analytics (UBA), and a new Experiment Management Interface for its Machine Learning Toolkit (MLTK). Splunk said the new MLTK interface makes it easier to view, control, evaluate and monitor the status of machine learning experiments. The toolkit also includes new algorithms for identifying patterns and determining the best predictors for training machine learning models. Splunk's algorithms are focused on investigations for security incidents, alerting, predictive tools for operations and maintenance, business optimization for demand, inventory, and analysis of historical data.


7 Tracks, 5 Events & a Deep Discount - PAW Las Vegas

#artificialintelligence

PAW Business is the leading cross-vendor conference covering the commercial deployment of machine learning and predictive analytics. PAW Financial covers the deployment of machine learning and predictive analytics for financial services. The PAW Healthcare program will feature sessions and case studies across healthcare business operations and clinical applications, so you can witness how predictive analytics is employed at leading enterprises, resulting in improved outcomes, lower costs, and higher patient satisfaction. PAW Manufacturing focuses on real-world examples of deployed predictive analytics. Attend and hear how some of the world's largest and most forward-thinking manufacturers are tapping the power of predictive modeling to improve business outcomes.


The Most Talked About Technologies In 2018

#artificialintelligence

Technology is omnipresent: whether in medicine or education, every field is shaped and developed by it. Although there have been major reforms and developments that have made our lives more convenient in the past few years, artificial intelligence and medical research have seen some of the biggest changes. Beyond that, almost every sector has experienced a significant rise in the development of its own technology. So which technologies were the most talked about and popular? Today, we look at those that emerged in 2018 to make our lives easier, starting with deep learning, a method of teaching computers to do what comes naturally to humans.


Neural Networks Building Blocks – Eugenio Culurciello – Medium

#artificialintelligence

Neural networks are made of smaller modules or building blocks, much as matter is made of atoms and digital circuits of logic gates. Once you know what the blocks are, you can combine them to solve a variety of problems. How can we use them? Weights implement the product of an input i with a weight value w to produce an output o. It seems easy, but additions and multiplications are at the heart of neural networks.
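That weight block, combined with addition and a nonlinearity, is already enough to build a neuron. A minimal sketch (the ReLU activation and the example values are my choices for illustration, not from the article):

```python
def weight(i, w):
    """The basic block: output o is the product of input i and weight w."""
    return i * w

def neuron(inputs, weights, bias):
    """Combine weight blocks with addition, then apply a nonlinearity."""
    o = sum(weight(i, w) for i, w in zip(inputs, weights)) + bias
    return max(0.0, o)  # ReLU activation

out = neuron([1.0, 2.0], [0.5, -0.25], 0.1)
print(out)  # 1.0*0.5 + 2.0*(-0.25) + 0.1 = 0.1
```

Every layer of a neural network, however deep, is built by repeating and stacking this same multiply-add-activate pattern.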