EM Algorithm

#artificialintelligence

The EM (Expectation-Maximisation) Algorithm is the go-to algorithm whenever we have to do parameter estimation with hidden variables, such as in hidden Markov models. For some reason, it is often poorly explained, and students end up confused about what exactly we are maximising in the E-step and the M-step. Here is my attempt at a (hopefully) clear, step-by-step explanation of exactly how the EM Algorithm works.
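To make the two steps concrete, here is a minimal sketch (not tied to any particular library) of EM for a two-component, one-dimensional Gaussian mixture; the initial values and variable names are illustrative assumptions.

```python
import numpy as np

def em_two_gaussians(x, n_iter=50):
    # Illustrative initial guesses for the mixing weight, means, and variances
    w, mu1, mu2, var1, var2 = 0.5, x.min(), x.max(), x.var(), x.var()
    for _ in range(n_iter):
        # E-step: compute responsibilities, i.e. the posterior probability that
        # each point was generated by component 1 under the current parameters
        p1 = w * np.exp(-(x - mu1) ** 2 / (2 * var1)) / np.sqrt(2 * np.pi * var1)
        p2 = (1 - w) * np.exp(-(x - mu2) ** 2 / (2 * var2)) / np.sqrt(2 * np.pi * var2)
        r = p1 / (p1 + p2)
        # M-step: maximise the expected complete-data log-likelihood, which
        # gives closed-form updates weighted by the responsibilities
        w = r.mean()
        mu1 = (r * x).sum() / r.sum()
        mu2 = ((1 - r) * x).sum() / (1 - r).sum()
        var1 = (r * (x - mu1) ** 2).sum() / r.sum()
        var2 = ((1 - r) * (x - mu2) ** 2).sum() / (1 - r).sum()
    return w, mu1, mu2, var1, var2

# Example: recover the parameters of two well-separated clusters
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 0.5, 500)])
print(em_two_gaussians(x))
```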


Introduction to MLOps for Data Science

#artificialintelligence

If we break down the term itself, it is a combination of two words: machine learning and operations, where machine learning stands for model development (or any kind of code development) and operations means the production deployment of that code. A more technical definition of MLOps is a set of principles and practices to standardize and streamline machine learning lifecycle management. It is not a new technology or tool but rather a culture: a set of principles and guidelines defined in the machine learning world to seamlessly integrate and automate the development phase with the operational phase. It is an iterative, incremental process in which data scientists, data engineers, and operations teams collaborate to build, automate, test, and monitor machine learning pipelines, much like a DevOps project.
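As a toy illustration of that build/test/deploy loop (a minimal sketch, not any specific MLOps product), the quality gate below blocks deployment of a model that does not clear an assumed accuracy threshold; the dataset, threshold, and output file name are all illustrative.

```python
import joblib
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def build_and_test(min_accuracy=0.9):
    # "Build" stage: train a model from (here, toy) data
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    accuracy = model.score(X_test, y_test)
    # "Test" stage: a quality gate that stops a bad model from shipping
    if accuracy < min_accuracy:
        raise RuntimeError(f"Model rejected: accuracy {accuracy:.2f} below gate")
    # "Deploy" stage placeholder: persist the approved artifact for serving
    joblib.dump(model, "model.joblib")
    return accuracy

if __name__ == "__main__":
    print(f"Deployed model with accuracy {build_and_test():.2f}")
```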


Bringing TrackMate in the era of machine-learning and deep-learning

#artificialintelligence

TrackMate is an automated tracking software used to analyze bioimages and is distributed as a Fiji plugin. Here we introduce a new version of TrackMate, rewritten to improve performance and usability, and integrating several popular machine learning and deep learning algorithms to improve versatility. We illustrate how these new components can be used to efficiently track objects from brightfield and fluorescence microscopy images across a wide range of bio-imaging experiments. Object tracking is an essential image analysis technique used across the biosciences to quantify dynamic processes. In the life sciences, tracking is used, for instance, to follow single particles, sub-cellular organelles, bacteria, cells, and whole animals.
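For readers unfamiliar with what "tracking" involves, here is a minimal sketch of frame-to-frame nearest-neighbour linking, one of the simplest linking strategies; TrackMate's own trackers (e.g. LAP-based) are considerably more sophisticated, so this is purely illustrative.

```python
import numpy as np

def link_frames(detections_a, detections_b, max_distance=10.0):
    """Link each detection in frame A to its nearest neighbour in frame B.

    detections_a, detections_b: arrays of shape (n, 2) with (x, y) positions.
    Returns a list of (index_in_a, index_in_b) pairs within max_distance.
    """
    links = []
    for i, point in enumerate(detections_a):
        distances = np.linalg.norm(detections_b - point, axis=1)
        j = int(np.argmin(distances))
        if distances[j] <= max_distance:
            links.append((i, j))
    return links

# Example: two objects moving slightly between consecutive frames
frame_a = np.array([[10.0, 10.0], [50.0, 40.0]])
frame_b = np.array([[11.0, 10.5], [49.0, 41.0]])
print(link_frames(frame_a, frame_b))  # [(0, 0), (1, 1)]
```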


Don't forget the human factor when working with AI and data analytics

#artificialintelligence

After years of resisting "pretend football," I finally joined a neighborhood fantasy football league. I'm a very casual football fan and probably couldn't name 10 active players without several minutes of thought, but in the interest of participating in some neighborly fun and learning a bit more about the game, I created my first team. I frankly still don't fully understand fantasy football scoring and all the nuances, but for the unfamiliar, you select a virtual team from a pool of available players during a draft process, and each player's activities on the field that week contribute to your overall team score. For example, if my defense blocks a touchdown, I might get 10 points, while if a running back on my team rushes for a few yards in a different game, I get a fraction of a point. Theoretically, this creates interest in more teams by giving the fan more players to follow, but at this point, it's mainly creating confusion as my extremely limited "football brain" attempts to follow a half dozen simultaneous games.


DRM Driver Posted For AI Processing Unit - Initially Focused On Mediatek SoCs

#artificialintelligence

BayLibre developer Alexandre Bailon has posted a "request for comments" for a new open-source Direct Rendering Manager (DRM) driver for AI Processing Unit (APU) functionality. Initially the driver caters to Mediatek SoCs with an AI co-processor, but this DRM "APU" driver could be adapted to other hardware too. Alexandre Bailon sums up this DRM AI Processing Unit driver as "a DRM driver that implements communication between the CPU and an APU. This uses VirtIO buffer to exchange messages. For the data, we allocate a GEM object and map it using IOMMU to make it available to the APU. The driver is relatively generic, and should work with any SoC implementing hardware accelerator for AI if they support remoteproc and VirtIO."


Significance of Data Annotation for ADAS applications

#artificialintelligence

Vehicle safety is one of the major areas in which automakers are making considerable investments. Over the years, automobile manufacturers have created a number of technologies that can aid in the prevention of traffic accidents. Advanced Driver Assistance Systems (ADAS) are technologies that automate, facilitate, and improve vehicle systems to assist drivers in driving more safely. These technological safety measures help drivers prevent on-road incidents by alerting them to potential risks, allowing the driver to quickly regain control of the vehicle and boosting their capacity to react to road hazards.
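To make "data annotation" concrete, here is a hypothetical sketch of what a single annotated training frame for an ADAS perception model might look like; the field names and label set are illustrative assumptions, not any standard annotation format.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class BoundingBox:
    label: str        # e.g. "pedestrian", "vehicle", "traffic_sign"
    x_min: float      # box corners in pixel coordinates
    y_min: float
    x_max: float
    y_max: float

@dataclass
class AnnotatedFrame:
    image_path: str
    boxes: List[BoundingBox]

# One labelled camera frame as it might be fed to a training pipeline
sample = AnnotatedFrame(
    image_path="frames/000123.png",
    boxes=[BoundingBox("pedestrian", 412.0, 188.0, 466.0, 340.0)],
)
print(sample)
```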


Train a Custom Image Segmentation Model Using TensorFlow Object Detection API Mask R-CNN

#artificialintelligence

The previous article introduced Object Detection. This article will introduce the concept of Image Segmentation and explain how to train a custom image segmentation model using the TensorFlow Object Detection API through a worked example, covering data set collection and processing, TensorFlow Object Detection API installation, and model training. Mask R-CNN, the model used in this article, is an Instance Segmentation model. Installation of the TensorFlow Object Detection API is the same as for Object Detection, so please refer to the previous article; I won't repeat it here. Note: from here on, please make sure to execute everything inside the conda environment od.
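As a preview of where the training process ends up, the sketch below loads an exported detection SavedModel and runs it on one image; the model and image paths are placeholders for whatever your own training run produces.

```python
import numpy as np
import tensorflow as tf
from PIL import Image

# Load the SavedModel produced by exporting the trained Mask R-CNN
detect_fn = tf.saved_model.load("exported_model/saved_model")

# Read one test image and add a batch dimension
image = np.array(Image.open("test_image.jpg"))
input_tensor = tf.convert_to_tensor(image[np.newaxis, ...], dtype=tf.uint8)

detections = detect_fn(input_tensor)
# Exported detection models return batched tensors such as detection_boxes,
# detection_scores, detection_classes, and (for Mask R-CNN) detection_masks
print(detections["detection_scores"][0][:5].numpy())
```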


Combating Software System Complexity: Entities Should Not Be Multiplied Unnecessarily

#artificialintelligence

We are often faced with the problem of how to evaluate the quality of a large software system. The primary evaluation metric is certainly functionality: whether the software meets the main requirements (doing the right things). If there are multiple technical paths to achieve the same functionality, people tend to choose the simpler approach. Occam's Razor, "Entities should not be multiplied unnecessarily," sums up very well this preference for simplicity, which exists to counter the challenge of complexity. The underlying logic of the preference is: "simplicity does things right."

In the 1960s, the term Software Crisis (Software crisis -- Wikipedia) was coined because software development could not keep up with the development of hardware and the growing complexity of real-world problems, and projects could not be delivered on time. Fred Brooks, a Turing Award winner who led the development of System/360 and OS/360 at IBM, described the plight of a giant beast dying in a tar pit in the bible of software engineering, "The Mythical Man-Month," as an analogy for software developers who are mired in software complexity and cannot get out. He also introduced the famous Brooks' Law: "Adding manpower to a late software project makes it later." In his paper "No Silver Bullet -- Essence and Accidents of Software Engineering," he further divides the difficulties of software development into essential and accidental ones and identifies several major causes of the essential difficulties: complexity, invisibility, conformity, and changeability, with complexity leading the way.

In 2006, a paper entitled "Out of the Tar Pit" echoed Brooks. It argues that complexity is the single major difficulty preventing successful large-scale software development, and that several of the other causes Brooks suggests are secondary disasters resulting from unmanageable complexity; complexity is the root cause. That paper, too, cites several Turing Award winners for their excellent discussions of complexity: "…we have to keep it crisp, disentangled, and simple if we refuse to be crushed by the complexities of our own making…"; "The general problem with ambitious systems is complexity."; "…it is important to emphasize the value of simplicity and elegance, for complexity has a way of compounding difficulties"; "there is a desperate need for a powerful methodology to help us think about programs."


Pycaret: A Faster Way to Build Machine Learning Models

#artificialintelligence

Building a machine learning model requires a series of steps, from data preparation, data cleaning, and feature engineering to model building and model deployment. It can therefore take a data scientist a lot of time to create a solution that solves a business problem. To help speed up the process, you can use Pycaret, an open-source library. Pycaret can help you perform the entire end-to-end ML process faster, with only a few lines of code. Pycaret is an open-source, low-code library in Python that aims to automate the development of machine learning models.
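Here is a minimal sketch of that low-code workflow for a classification task, using one of the demo datasets that ships with the library; the dataset, target column, and saved-pipeline name are just an example.

```python
from pycaret.datasets import get_data
from pycaret.classification import setup, compare_models, predict_model, save_model

data = get_data("juice")                              # load a built-in demo dataset
setup(data=data, target="Purchase", session_id=123)   # preprocessing in one call
best = compare_models()                               # train and rank many candidate models
predictions = predict_model(best)                     # score on the held-out split
save_model(best, "best_pipeline")                     # persist the whole pipeline for reuse
```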


Artificial Intelligence in Finance: Opportunities and Challenges

#artificialintelligence

Artificial intelligence (AI) is no longer the new kid on the block, and the field is developing at an ever-increasing pace. Pretty much every day there is some kind of new development, be it a research paper announcing a new or improved machine learning algorithm, a new library for one of the most popular programming languages (Python/R/Julia), etc. In the past, many of those advances did not make it into mainstream media, but that is also changing rapidly. Some recent examples include AlphaGo beating the 18-time world champion at Go [1], Deep Learning being used to generate realistic faces of humans that never existed [2], and the spread of Deep Fakes -- images or videos placing people in situations that never actually happened.