Machine Learning


Antimicrobial resistance with Artificial Intelligence

#artificialintelligence

Minh-Hoang Tran,1 Ngoc Quy Nguyen,2 Hong Tham Pham1,3 1Department of Pharmacy, Nhan Dan Gia Dinh Hospital, Ho Chi Minh City, Vietnam; 2Institute of Environmental Technology and Sustainable Development, Nguyen Tat Thanh University, Ho Chi Minh City, Vietnam; 3Department of Pharmacy, Nguyen Tat Thanh University, Ho Chi Minh City, Vietnam Correspondence: Hong Tham Pham, Department of Pharmacy, Nguyen Tat Thanh University, Ho Chi Minh City, Vietnam, Tel 84 919 559 085, Email [email protected] Abstract: Recent years have witnessed the rise of artificial intelligence (AI) in antimicrobial resistance (AMR) management, a positive signal in the fight against antibiotic-resistant microbes. The impact of AI starts with the data collection and preparation needed to deploy AI-driven systems, which can lay the foundation for effective infection control strategies. Primary applications of AI include identifying potential antimicrobial molecules, rapidly testing antimicrobial susceptibility, and optimizing antibiotic combinations. Aside from their outstanding effectiveness, these applications also show strong potential for narrowing the AMR burden gap across different settings around the world. Despite these benefits, the interpretability of AI-based systems and models remains limited.


The Art Of Machine Learning

#artificialintelligence

This book presents a serious, practical look at machine learning, preparing you to draw valuable insights from your own data. This course serves as an introduction to machine learning and artificial intelligence for the creative learner. It aims to help students gain an understanding of one of the leading fields in technology. In the digital age, automation, coding and digitization are an artist's new medium. The course combines science and art and explores how creativity and technology work together.


Scaling assistive healthcare technology with 5G

#artificialintelligence

With recent advances in communication networks and machine learning (ML), healthcare is one of the key application domains that stands to benefit, with opportunities including remote global healthcare, hospital services in the cloud, and remote diagnosis or surgery, among others. One of those advances is network slicing, which makes it possible to provide high-bandwidth, low-latency and personalized healthcare services for individual users. This is important for patients using healthcare monitoring devices that capture various biological signals (biosignals), such as those from the heart (ECG), muscles (EMG), or brain (EEG), or activity from other parts of the body. In this blog, we discuss the challenges of building a scalable delivery platform for such connected healthcare services, and how technological advances can help to transform this landscape significantly for the benefit of both users and healthcare service providers. Our specific focus is on assistive technology devices, which are increasingly used by many individuals.


15 Most Common Data Science Interview Questions

#artificialintelligence

Some interviewers ask hard questions while others ask relatively easy ones. As an interviewee, it is up to you to go in prepared. And when it comes to a domain like Machine Learning, preparation might still fall short: you have to be ready for everything. While preparing, you might get stuck at a point where you wonder what more you should read. Well, based on the roughly 15-17 data science interviews I have attended, here are 15 commonly asked and important Data Science and Machine Learning questions that came up in almost all of them, and I recommend you study them thoroughly.


Adding Explainability to Clustering - Analytics Vidhya

#artificialintelligence

Clustering is an unsupervised learning technique used to determine the intrinsic groups present in unlabelled data. For instance, a B2C business might be interested in finding segments in its customer base. Clustering is hence commonly used for use cases like customer segmentation, market segmentation, pattern recognition, and search-result grouping. Some standard clustering techniques are K-means, DBSCAN, and hierarchical clustering, among other methods. Clusters created using techniques like K-means are often not easy to decipher, because it is difficult to determine why a particular row of data was assigned to a particular bucket.
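For concreteness, here is a minimal sketch of one common way to add explainability to clustering: fit K-means and then train a shallow decision tree as a surrogate model whose rules describe the clusters. This is an illustration under assumed data and parameters, not necessarily the approach used in the article.

```python
# Minimal sketch: explain K-means clusters with a shallow surrogate decision tree.
# The synthetic data, feature names, and hyperparameters are illustrative only.
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier, export_text

# Synthetic customer-like data with three latent groups
X, _ = make_blobs(n_samples=500, centers=3, n_features=4, random_state=42)

# Step 1: cluster the unlabelled data
kmeans = KMeans(n_clusters=3, n_init=10, random_state=42)
labels = kmeans.fit_predict(X)

# Step 2: fit a shallow tree that predicts the cluster label from the features
surrogate = DecisionTreeClassifier(max_depth=3, random_state=42)
surrogate.fit(X, labels)

# The tree's rules give a human-readable description of each cluster
print(export_text(surrogate, feature_names=[f"feature_{i}" for i in range(4)]))
```

The printed rules (for example, thresholds on individual features) offer a simple, interpretable account of why a row lands in a given cluster.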


Top 6 Python Libraries You Must Know

#artificialintelligence

TensorFlow is the first one, because it is a backbone library of data science. It is optimized for speed and makes use of techniques like XLA for fast linear algebra operations. Theano is a computational-framework machine learning library in Python for computing with multidimensional arrays, with the ability to use NumPy arrays directly in Theano-compiled functions. Pandas makes the entire process of manipulating data easier.
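As a small illustration of the kind of data manipulation pandas streamlines, here is a hedged sketch; the DataFrame contents and column names are invented for the example and are not from the article.

```python
# Illustrative sketch of routine pandas data manipulation.
# The DataFrame contents and column names are invented for this example.
import pandas as pd

df = pd.DataFrame({
    "city": ["Hanoi", "Hue", "Hanoi", "Da Nang"],
    "sales": [120, 85, 150, 95],
})

# Filter rows, add a derived column, and aggregate by group in a few lines
high = df[df["sales"] > 90].assign(sales_k=lambda d: d["sales"] / 1000)
summary = high.groupby("city", as_index=False)["sales"].sum()
print(summary)
```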


Could machine learning and operations research lift each other up?

#artificialintelligence

Is deep learning really going to be able to do everything? Opinions on deep learning's true potential vary. Geoffrey Hinton, awarded for pioneering deep learning, is not entirely unbiased, but others, including Hinton's deep learning collaborator Yoshua Bengio, are looking to infuse deep learning with elements of a domain still under the radar: operations research, or an analytical method of problem-solving and decision-making used in the management of organizations. Machine learning and its deep learning variety are practically household names now. There is a lot of hype around deep learning, as well as a growing number of applications using it.


My First Experience Deploying an ML Model to Production

#artificialintelligence

I have been working on Machine Learning since my third year in college. But during this time, the process always involved taking a dataset from Kaggle or some other open-source website. Also, these models and algorithms lived in a Jupyter Notebook or a Python script and were never deployed to a production website; it was always localhost. While interning at HackerRank, and after starting as a Software Engineer here as part of the HackerRank Labs team working on a new product, I got the chance to deploy three different ML models to production, working end-to-end on them. In this blog, I will be sharing my learnings and experience from one of the deployed models.


Graph machine learning with missing node features

#artificialintelligence

Graph Neural Network (GNN) models typically assume a full feature vector for each node. The two inputs to this model are the (normalised) adjacency matrix A, which encodes the graph structure, and the feature matrix X, which contains the node feature vectors as rows; the output is the matrix of node embeddings Z. Each layer of GCN performs a node-wise feature transformation (parametrised by the learnable matrices W₁ and W₂) and then propagates the transformed feature vectors to the neighbouring nodes. Importantly, GCN assumes that all the entries in X are observed. In real-world scenarios, we often see situations where some node features are missing (Fig 1).
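To make the two-layer computation concrete, here is a minimal NumPy sketch of a GCN forward pass of the form Z = Â·ReLU(Â·X·W₁)·W₂, where Â is the normalised adjacency matrix with self-loops. The random graph, feature dimensions, and weights are placeholders, and the sketch assumes a fully observed X (the case without missing features).

```python
# Minimal NumPy sketch of a two-layer GCN forward pass: Z = A_norm @ ReLU(A_norm @ X @ W1) @ W2.
# The graph, feature dimensions, and weight matrices are random placeholders.
import numpy as np

rng = np.random.default_rng(0)
n, d_in, d_hid, d_out = 5, 8, 16, 4          # nodes, input/hidden/output dims

A = (rng.random((n, n)) < 0.4).astype(float)
A = np.maximum(A, A.T)                        # make the graph undirected
A_hat = A + np.eye(n)                         # add self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt      # symmetric normalisation

X = rng.standard_normal((n, d_in))            # assumes a fully observed feature matrix
W1 = rng.standard_normal((d_in, d_hid))
W2 = rng.standard_normal((d_hid, d_out))

H = np.maximum(A_norm @ X @ W1, 0)            # transform + propagate, then ReLU
Z = A_norm @ H @ W2                           # node embeddings
print(Z.shape)                                # (5, 4)
```

When entries of X are missing, this forward pass cannot be applied directly, which is the problem the article goes on to address.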


New improvement - spxbot blog

#artificialintelligence

In recent days, I've added and tested a new improvement to the model that generates the forecasts published by spxbot.com. As you might know, the input to a neural network is usually preprocessed, for many reasons, mainly to eliminate extremes in the raw data and to create a more uniform analysis environment. Even if it may seem bizarre, a lot of documents available on the web agree that adding noise to the input produces better pattern recognition. In simple terms, this process enhances the ability of the neural network to generalize, to extract meaning from the inputs, or simply to "see" better. But what exactly is noise?
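As a rough illustration of input noise injection on preprocessed data, here is a hedged sketch; the blog does not specify its exact scheme, and the Gaussian noise scale below is an arbitrary placeholder rather than the value used by the spxbot model.

```python
# Rough sketch of input noise injection during training: add small zero-mean
# Gaussian noise to each (already preprocessed) input batch. The noise scale
# is an arbitrary placeholder, not the value used by the spxbot model.
import numpy as np

rng = np.random.default_rng(7)

def add_input_noise(batch, noise_std=0.01):
    """Return a copy of the batch with zero-mean Gaussian noise added."""
    return batch + rng.normal(0.0, noise_std, size=batch.shape)

X_batch = rng.standard_normal((32, 20))   # preprocessed inputs (placeholder)
X_noisy = add_input_noise(X_batch)        # fed to the network instead of X_batch
```

Training on such perturbed inputs acts as a mild regularizer, which is one common explanation for why noise can improve generalization.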