

When Artificial Intelligence Meets Big Data

#artificialintelligence

"Gone are the days of data engineers manually copying data around again and again, delivering datasets weeks after a data scientist requests it"-these are Steven Mih's words about the revolution that artificial intelligence is bringing about, in the scary world of big data. By the time the term "big data" was coined, data had already accumulated massively with no means of handling it properly. In 1880, the US Census Bureau estimated that it would take eight years to process the data it received in that year's census. The government body also predicted that it would take more than 10 years to process the data it would receive in the following decade. Fortunately, in 1881, Herman Hollerith created the Hollerith Tabulating Machine, inspired by a train conductor's punch card.


Addressing Drawbacks Of AutoML With AutoML-Zero

#artificialintelligence

Automated machine learning – or AutoML – is an approach that cuts down the time spent on the iterative tasks of model development. AutoML tools help developers build scalable models with great ease and minimal domain expertise, which has made AutoML one of the most actively researched areas in the ML community. To keep the problem tractable, AutoML studies have typically constrained their search spaces to isolated algorithmic aspects, such as the learning rule used during backpropagation, the gating structure of an LSTM, or the data augmentation policy.
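To make the core idea concrete, here is a minimal sketch of AutoML-style search: random search over a small, hand-constrained hyperparameter space. The search space, model, and dataset below are illustrative assumptions, not AutoML-Zero's actual setup.

```python
# A toy AutoML loop: randomly sample configurations from a constrained
# search space and keep the one with the best cross-validated score.
import random

from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)

# Hand-constrained search space (illustrative values only).
search_space = {
    "hidden_layer_sizes": [(32,), (64,), (128,)],
    "learning_rate_init": [1e-3, 1e-2, 1e-1],
    "alpha": [1e-5, 1e-4, 1e-3],
}

best_score, best_config = -1.0, None
for _ in range(10):  # try 10 random configurations
    config = {k: random.choice(v) for k, v in search_space.items()}
    model = MLPClassifier(max_iter=300, random_state=0, **config)
    score = cross_val_score(model, X, y, cv=3).mean()
    if score > best_score:
        best_score, best_config = score, config

print(f"best config: {best_config} (accuracy {best_score:.3f})")
```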


Machine Learning Framework Algorithm to recognise handwriting

#artificialintelligence

Manually transcribing large amounts of handwritten data is an arduous process that's bound to be fraught with errors. Automated handwriting recognition can drastically cut down on the time required to transcribe large volumes of text, and also serve as a framework for developing future applications of machine learning. Handwritten character recognition is an ongoing field of research encompassing artificial intelligence, computer vision, and pattern recognition. An algorithm that performs handwriting recognition can acquire and detect characters from sources such as paper documents, photographs, and touch-screen devices, and convert them into a machine-readable form. There are two basic types of handwriting recognition systems – online and offline.
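As a small illustration of the offline case, here is a sketch of a handwritten-digit recognizer built on scikit-learn's bundled 8x8 digits dataset. This is a toy example under simplifying assumptions; a production system would also need preprocessing steps such as binarization, segmentation, and normalization.

```python
# Offline handwritten-digit recognition on scikit-learn's digits dataset.
from sklearn.datasets import load_digits
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_digits(return_X_y=True)  # 8x8 images flattened to 64 features
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# A simple nearest-neighbour classifier as the recognition model.
clf = KNeighborsClassifier(n_neighbors=3)
clf.fit(X_train, y_train)

pred = clf.predict(X_test)
print(f"test accuracy: {accuracy_score(y_test, pred):.3f}")
```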


A neural network unpicks the knots

#artificialintelligence

Machine learning can tell different types of knot apart just by 'looking' at them. For decades, mathematicians have had algorithms that calculate whether any two knots are of the same type -- that is, whether the knots can be converted into each other without cutting the string. But these algorithms are slow: the number of steps they require grows exponentially with the complexity of the knots. Liang Dai at the City University of Hong Kong and his collaborators created geometric models of the five simplest knots and fed those models into neural networks, which are computing systems modelled after the brain's networks of neurons. After training on hundreds of thousands of such models, the networks had learnt to classify knots with 99% accuracy or better.


AI Trained On Moon Craters Is Helping Find Unexploded Bombs From The Vietnam War

#artificialintelligence

There's still no completely safe and surefire method for locating unexploded ordnance after a war is over, but researchers at Ohio State University have found a way to harness machine-learning-powered image processing algorithms to study satellite imagery and locate hot spots where UXO are likely to be found. The researchers focused their efforts on a 100-square-kilometre area near Kampong Trabaek, Cambodia, which was the target of carpet-bombing missions carried out by the United States Air Force during the Vietnam War. The team was given access to declassified military data revealing that 3,205 bombs had been dropped in the area between 1970 and 1973. Determining exactly how many of those bombs didn't explode has become harder and harder as, five decades later, nature has slowly reclaimed the country's hardest-hit areas, hiding and obscuring the craters that are counted and used to make accurate estimates. The OSU study used a two-step process to come up with a more accurate estimate of how many bombs were still left in the area.
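This is not the OSU team's actual pipeline, but a toy sketch of the same two-stage idea: classical image processing proposes crater-like candidates, and a learned classifier then filters them. The synthetic image, candidate features, and placeholder labels below are all illustrative assumptions.

```python
# Toy two-stage crater-detection sketch (illustrative, not the OSU method).
import numpy as np
from skimage.draw import disk
from skimage.feature import blob_log
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic "satellite image" with a few bright circular spots as stand-ins
# for craters (real work would use actual imagery).
image = rng.normal(0.1, 0.02, size=(256, 256))
for center in [(60, 80), (150, 40), (200, 180)]:
    rr, cc = disk(center, radius=8, shape=image.shape)
    image[rr, cc] += 0.8

# Stage 1: Laplacian-of-Gaussian blob detection proposes candidates.
candidates = blob_log(image, min_sigma=3, max_sigma=12, threshold=0.2)
print(f"{len(candidates)} candidate craters found")

# Stage 2 (sketch): a classifier scores each candidate. The features and
# labels here are random placeholders; a real system would train on
# patches around verified crater locations.
features = rng.normal(size=(100, 16))
labels = rng.integers(0, 2, size=100)
clf = RandomForestClassifier(random_state=0).fit(features, labels)
```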


A Top Machine Learning Algorithm Explained: Support Vector Machines (SVM) - KDnuggets

#artificialintelligence

By Clare Liu, a data scientist in the fintech industry, based in Hong Kong. One of the most prevalent and exciting supervised learning models, with associated learning algorithms that analyse data and recognise patterns, is the Support Vector Machine (SVM). It can be used to solve both regression and classification problems, though it is mostly used for classification. SVMs were first introduced by B.E. Boser et al. in 1992 and became popular thanks to their success in handwritten digit recognition in 1994.
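For readers who want to see an SVM in action, here is a minimal classification example using scikit-learn's SVC on the iris dataset. It is a generic illustration, not code from the article itself.

```python
# Minimal SVM classification with scikit-learn.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

# RBF-kernel SVM; C controls the trade-off between a wide margin
# and misclassified training points.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X_train, y_train)

print(f"test accuracy: {clf.score(X_test, y_test):.3f}")
```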


Deep Learning: What You Need To Know

#artificialintelligence

During the past decade, deep learning has seen groundbreaking developments in the field of AI (Artificial Intelligence). But what is this technology? And why is it so important? Well, let's first get a definition of deep learning. Here's how Kalyan Kumar, who is the Corporate Vice President & Chief Technology Officer of IT Services at HCL Technologies, describes it: "Have you ever wondered how our brain can recognize the face of a friend whom you had met years ago or can recognize the voice of your mother among so many other voices in a crowded marketplace or how our brain can learn, plan and execute complex day-to-day activities? The human brain has around 100 billion cells called neurons. These build massively parallel and distributed networks, through which we learn and carry out complex activities. Inspired from these biological neural networks, scientists started building artificial neural networks so that computers could eventually learn and exhibit intelligence like humans."
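The biological analogy in the quote maps onto very simple arithmetic. Below is a toy "neuron layer" in NumPy: weighted inputs, a bias, and a nonlinear activation. The sizes and values are arbitrary assumptions for illustration only.

```python
# A toy artificial neuron layer: each output neuron computes a weighted
# sum of its inputs plus a bias, then applies a nonlinearity.
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

rng = np.random.default_rng(0)
x = rng.normal(size=4)        # input signals ("dendrites")
W = rng.normal(size=(3, 4))   # connection strengths ("synapses")
b = np.zeros(3)               # per-neuron bias

activations = relu(W @ x + b) # a neuron "fires" when its weighted sum is positive
print(activations)
```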


Inside the lab where Waymo is building the brains for its driverless cars

#artificialintelligence

Right now, a minivan with no one behind the steering wheel is driving through a suburb of Phoenix, Arizona. And while that may seem alarming, the company that built the "brain" powering the car's autonomy wants to assure you that it's totally safe. Waymo, the self-driving unit of Alphabet, is the only company in the world to have fully driverless vehicles on public roads today. That was made possible by a sophisticated set of neural networks powered by machine learning, about which very little is known -- until now. For the first time, Waymo is lifting the curtain on what is arguably the most important (and most difficult-to-understand) piece of its technology stack. The company, which is ahead in the self-driving car race by most metrics, confidently asserts that its cars have the most advanced brains on the road today. Anyone can buy a bunch of cameras and LIDAR sensors, slap them on a car, and call it autonomous. But training a self-driving car to behave like a human driver, or, more importantly, to drive better than a human, is on the bleeding edge of artificial intelligence research.


Why faces don't always tell the truth about feelings

#artificialintelligence

Human faces pop up on a screen, hundreds of them, one after another. Some have their eyes stretched wide, others show lips clenched. Some have eyes squeezed shut, cheeks lifted and mouths agape. For each one, you must answer this simple question: is this the face of someone having an orgasm or experiencing sudden pain? Psychologist Rachael Jack and her colleagues recruited 80 people to take this test as part of a 2018 study.


Machine learning with Python: An introduction

#artificialintelligence

Machine learning is one of our most important technologies for the future. Self-driving cars, voice-controlled speakers, and face detection software are all built on machine learning technologies and frameworks. As a software developer, you may wonder how this will impact your daily work, including the tools and frameworks you should learn. If you're reading this article, my guess is you've already decided to learn more about machine learning. In my previous article, "Machine Learning for Java developers," I introduced Java developers to setting up a machine learning algorithm and developing a simple prediction function in Java.
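In the same spirit, here is what a simple prediction function can look like in Python: fit a line to toy (area, price) data and predict prices for new houses. The numbers are made-up illustration data, not taken from the article.

```python
# A simple prediction function in Python: linear regression on toy data.
from sklearn.linear_model import LinearRegression

areas = [[90.0], [101.0], [103.0], [130.0], [150.0]]  # square metres
prices = [249.0, 338.0, 304.0, 407.0, 461.0]          # thousands

model = LinearRegression().fit(areas, prices)

def predict_price(area_sqm: float) -> float:
    """Predict a price (in thousands) for a house of the given area."""
    return float(model.predict([[area_sqm]])[0])

print(f"predicted price for 120 sqm: {predict_price(120.0):.1f}k")
```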