Why you should consider a machine learning data catalog

#artificialintelligence

The phrase "data is an asset" is something of a corporate cliché. However, it is increasingly true as companies in industry after industry undergo programs to digitize their businesses. It is obvious that Netflix is a highly digital business with data at its heart, but what about more down-to-earth enterprises like manufacturing or energy? Even in the oil industry, the talk these days is of the digital oilfield, where vast amounts of sensor data about the operation of an oil platform are captured and analyzed so field production can be tweaked and tuned in real time. In order to extract value from your data, though, you first have to know what you have and where it is, and this seemingly obvious starting point is a major hurdle in a large corporation.


Deploying a Text Classification Model in Python

#artificialintelligence

This article is the last of a series in which I cover the whole process of developing a machine learning project. If you have not read the previous two articles, I strongly encourage you to do so here and here. The project involves the creation of a real-time web application that gathers data from several newspapers and shows a summary of the different topics being discussed in the news articles. This is achieved with a supervised machine learning classification model that is able to predict the category of a given news article, a web scraping method that gets the latest news from the newspapers, and an interactive web application that shows the obtained results to the user. As I explained in the first post of this series, the reason I'm writing these articles is that I've noticed that most of the time, the content published on the internet, in books, or in the data science literature focuses on the following: we have a labeled dataset, and we train models to obtain a performance metric.
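To make the classification step concrete, here is a minimal sketch of the kind of supervised pipeline the article describes: TF-IDF features feeding a linear classifier that predicts a news category. The training texts, labels, and category names below are invented placeholders, not the project's actual dataset or code.

```python
# Hedged sketch: a tiny text-classification pipeline in the spirit of
# the article (TF-IDF + linear SVM). Data and labels are placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

train_texts = [
    "stocks rallied after the quarterly earnings report",
    "the team clinched the championship in the final minutes",
]
train_labels = ["business", "sports"]  # hypothetical categories

model = make_pipeline(TfidfVectorizer(), LinearSVC())
model.fit(train_texts, train_labels)

# Predict the category of a freshly scraped headline.
print(model.predict(["shares fell sharply in early trading"]))
```

In a deployment like the one described, the same fitted pipeline would typically be serialized (for example with pickle) and loaded by the web application to classify incoming scraped articles.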


HDTree: A Customizable and Interactable Decision Tree Written in Python

#artificialintelligence

This story will introduce yet another implementation of Decision Trees, which I wrote as part of my thesis. Firstly, I will try to motivate why I decided to take the time to come up with my own implementation of Decision Trees; I will list some of its features but also the disadvantages of the current implementation. Secondly, I will guide you through the basic usage of HDTree using code snippets, explaining some details along the way. Lastly, there will be some hints on how to customize and extend HDTree with your own chunks of ideas. However, this article will not guide you through all of the basics of Decision Trees; there are plenty of resources out there [1][2][3][16].
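For orientation, here is the standard scikit-learn decision tree workflow that custom implementations such as HDTree are typically compared against. This is not HDTree's own API (the full article covers that), just the familiar baseline interface:

```python
# Baseline for comparison: scikit-learn's decision tree, not HDTree.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

clf = DecisionTreeClassifier(max_depth=3, random_state=0)
clf.fit(X, y)

# Predict classes for the first few samples.
print(clf.predict(X[:5]))
```

A customizable implementation like HDTree trades some of this out-of-the-box convenience for the ability to plug in your own split rules and inspect the tree interactively.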


Getting Postmark's Lighthouse Performance Score to 100

#artificialintelligence

Earlier this year, our design team spent a few weeks analyzing and improving the performance of the Postmark product website. Our app is known for lightning-fast email delivery, and we wanted to provide a similar experience to the visitors of the website. Conveniently, what's good for people is also good for robots -- search engines increasingly use performance and user experience metrics as a ranking factor in search results. Once completed, this project made the Postmark site significantly faster and increased our Lighthouse Performance score from 68 to a perfect 100. We do our best to keep regressions in check, but the nature of releasing something new often works against us.


Startup launches new platform to accelerate AI in healthcare

#artificialintelligence

A startup that specialises in healthcare analytics and AI has launched a new platform that aims to help healthcare providers accelerate their digital journeys and realise a return on investment (ROI). KenSci's new AI platform, built on Microsoft Azure, aims to help healthcare organisations accelerate their data transformation, enabling AI and analytics workloads at scale. The startup, which is headquartered in Seattle, has launched its re-imagined platform with the latest predictive analytics technology to enable health organisations to develop business intelligence (BI) and AI-based workloads in an easy, agile way. The platform has been built specifically for the healthcare sector and is designed to provide AI-based insights on operational and clinical workflows. Users can customise it to suit their needs or use the solutions already built into the software.


Deep Learning and the End of Social Science

#artificialintelligence

To date, the scientific method might well be the most effective and useful algorithm -- or family of algorithms -- that humanity has ever invented. Ever-improving methods for erecting models of how the world works and then testing those models against evidence make it possible to distinguish good ideas from bad. Step by step, humanity's understanding of the universe, the world, and itself has grown. The Artificial Intelligence revolution, however, could well overturn how good ideas are sifted from bad and subvert science's ultimate goal of understanding. The claim that AI could undermine scientific understanding, or even make it obsolete, is far from new.


AI chipmaker Hailo accelerates deep learning at the edge - SiliconANGLE

#artificialintelligence

Artificial intelligence chip company Hailo Technologies Ltd. said today it's launching two new acceleration modules that will boost the processing capabilities of edge devices that run its specialist hardware. Hailo burst onto the AI scene in 2019 with a customized processor for running deep learning workloads at the edge of the network. The company, which is primarily focused on the automotive sector, said at the time that its Hailo-8 Deep Learning chip enables devices such as autonomous vehicles, smart cameras, drones and AR/VR platforms to run sophisticated deep learning applications at the edge that could previously be hosted only in cloud data centers. The Hailo-8 processor, which is smaller than a penny, was built from the ground up with completely redesigned memory, control and compute architecture components that enable "higher performance, lower power and minimal latency." Hailo also provides a software development kit for developers to build apps customized for the hardware.


AI-Led Manufacturing in Industry 4.0

#artificialintelligence

The Internet of Things (IoT) brought in an unprecedented tsunami of structured and unstructured data. This information, in conjunction with readily available, inexpensive computing systems, paved the way for the next level of manufacturing, and AI, as the flagbearer of that transition, plays a vital role. A village cobbler performs all shoe manufacturing-related tasks on her own – from design, manufacturing, and repair to reuse. Ameri and Dutta cited this crisp model as an ideal example of an integrated and competence-centered enterprise.


Surfing Gravity's Waves: HPC+AI Hang a Cosmic Ten

#artificialintelligence

Eliu Huerta is harnessing AI and high performance computing (HPC) to observe the cosmos more clearly. For several years, the astrophysics researcher has been chipping away at a grand challenge, using data to detect signals produced by collisions of black holes and neutron stars. If his next big design for a neural network is successful, astrophysicists will use it to find more black holes and study them in more detail than ever. Such insights could help answer fundamental questions about the universe. They may even add a few new pages to the physics textbook.
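As a rough illustration of the approach (not Huerta's actual architecture), a signal-detection network of this kind can be sketched as a small 1D convolutional classifier that labels a strain time series as signal or noise. All layer sizes and input lengths below are invented placeholders:

```python
# Illustrative sketch only: a tiny 1D CNN that classifies whether a
# strain time series contains a gravitational-wave signal. This is a
# placeholder architecture, not the researcher's production model.
import torch
import torch.nn as nn

class SignalNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=16), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=8), nn.ReLU(), nn.MaxPool1d(4),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(32, 2),  # two classes: signal vs. noise
        )

    def forward(self, x):
        return self.net(x)

model = SignalNet()
batch = torch.randn(4, 1, 4096)  # 4 random stand-in strain segments
print(model(batch).shape)        # torch.Size([4, 2])
```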


Why is Object Detection so Messy?

#artificialintelligence

To create outputs that vary in size, two approaches dominate the literature: the "one size fits all" approach, an output so broad that it suffices for all applications, and the "look-ahead" idea, in which we first search for regions of interest and then classify them. I just made up those terms. In practice, they are known as the "one-stage" and "two-stage" approaches, which is a tad less self-explanatory. If we can't have variable-sized outputs, we shall return an output so large that it will always be larger than what we need; then we can prune the excess. The whole idea is to take the greedy route. The original YOLO detector can detect up to 98 bounding boxes for a 448x448 image (a 7x7 grid with two boxes per cell).
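A toy numeric sketch of that fixed-output-then-prune idea, using random stand-in values rather than real detector output:

```python
# Toy illustration: a YOLO-style detector always emits a fixed 98 boxes
# per image; we keep only the boxes above a confidence threshold.
import numpy as np

rng = np.random.default_rng(0)
boxes = rng.uniform(0, 448, size=(98, 4))  # fixed output: (x, y, w, h)
scores = rng.uniform(0, 1, size=98)        # one confidence per box

keep = scores > 0.9                        # prune the excess
detections = boxes[keep]
print(f"kept {keep.sum()} of 98 candidate boxes")
```

Real detectors follow this thresholding with non-maximum suppression to remove overlapping duplicates, but the fixed-then-pruned output is the core trick the paragraph describes.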