Neural Networks


DeText: A deep NLP framework for intelligent text understanding

#artificialintelligence

Natural language processing (NLP) technologies are widely deployed to process rich natural language text data for search and recommender systems. Achieving high-quality search and recommendation results requires that information, such as user queries and documents, be processed and understood in an efficient and effective manner. In recent years, the rapid development of deep learning models has been proven successful for improving various NLP tasks, indicating the vast potential for further improving the accuracy of search and recommender systems. Deep learning-based NLP technologies like BERT (Bidirectional Encoder Representations from Transformers) have recently made headlines for showing significant improvements in areas such as semantic understanding when contrasted with prior NLP techniques. However, exploiting the power of BERT in search and recommender systems is a non-trivial task, due to the heavy computation cost of BERT models. In this blog post, we will introduce DeText, a state-of-the-art open source NLP framework for text understanding.
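
As a hedged illustration of the kind of BERT-based scoring the post discusses (this is a generic sketch using Hugging Face Transformers, not DeText's own API), a query and a document can each be encoded with a pretrained BERT model and compared with cosine similarity; running this encoder for every query and document is exactly the computation cost the post warns about.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Generic BERT-based query/document relevance sketch (not DeText's API).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(text):
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state   # (1, seq_len, 768)
    return hidden.mean(dim=1).squeeze(0)             # mean-pooled text vector

query = embed("software engineer machine learning")
doc = embed("We are hiring an ML engineer to build ranking models.")
print(float(torch.cosine_similarity(query, doc, dim=0)))  # relevance score
```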


The Best Resources on Artificial Intelligence and Machine Learning

#artificialintelligence

Half of this crazy year is behind us and summer is here. Over the years, we machine learning engineers at Ximilar have gathered a lot of interesting ML/AI material that we draw from. I have chosen the best of it, from podcasts to online courses, and recommend listening to, reading, and checking them out. Some are introductory, others more advanced, but all of them are high-quality resources made by top people in the field and well worth your time.


Image-scaling attacks highlight dangers of adversarial machine learning

#artificialintelligence

This article is part of our reviews of AI research papers, a series of posts that explore the latest findings in artificial intelligence. We usually don't expect the image of a teacup to turn into a cat when we zoom out. But in the world of artificial intelligence research, strange things can happen. Researchers at Germany's Technische Universität Braunschweig have shown that carefully modifying the pixel values of a digital photo can turn it into a completely different image when it is downscaled. More concerning are the implications these modifications have for AI algorithms.
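
To see why such attacks are possible, consider that nearest-neighbour downscaling copies each output pixel from a single source pixel, so only a sparse grid of the original pixels influences the small image. The toy sketch below (my own simplified illustration, not the researchers' code, which targets real resizing pipelines) first probes which source pixels the resizer samples, then plants a hidden low-resolution image at exactly those positions.

```python
import numpy as np
from PIL import Image

SRC, DST = 512, 64

# Step 1: probe which source pixels nearest-neighbour resizing samples.
probe = np.arange(SRC * SRC, dtype=np.int32).reshape(SRC, SRC)
sampled = np.array(Image.fromarray(probe).resize((DST, DST), Image.NEAREST))
rows, cols = np.unravel_index(sampled.ravel(), (SRC, SRC))

# Step 2: plant a hidden low-resolution image at exactly those positions.
benign = np.full((SRC, SRC), 220, dtype=np.uint8)   # bright "teacup" stand-in
hidden = np.zeros((DST, DST), dtype=np.uint8)       # dark "cat" stand-in
attacked = benign.copy()
attacked[rows, cols] = hidden.ravel()               # ~1.5% of pixels changed

# Step 3: at full resolution the image still looks bright, but it
# downscales to the hidden image.
small = np.array(Image.fromarray(attacked).resize((DST, DST), Image.NEAREST))
print(np.array_equal(small, hidden))                # expected: True
```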


Deep Learning Prerequisites: Logistic Regression in Python

#artificialintelligence

Created by Lazy Programmer Inc. This course is a lead-in to deep learning and neural networks: it covers logistic regression, a popular and fundamental technique used in machine learning, data science, and statistics. We cover the theory from the ground up: derivation of the solution and applications to real-world problems. We show you how to code your own logistic regression module in Python. This course does not require any external materials.
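
In the spirit of what the course walks through, here is a minimal logistic regression trained with batch gradient descent in NumPy (my own sketch, not the course's code): the sigmoid maps a linear score to a probability, and the weights follow the gradient of the cross-entropy loss.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic_regression(X, y, lr=0.1, epochs=1000):
    """X: (n_samples, n_features), y: (n_samples,) with 0/1 labels."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)          # predicted probabilities
        w -= lr * (X.T @ (p - y)) / n   # gradient of cross-entropy w.r.t. w
        b -= lr * np.mean(p - y)        # gradient w.r.t. the bias
    return w, b

# Toy usage: two Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(1, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
w, b = fit_logistic_regression(X, y)
print("training accuracy:", np.mean((sigmoid(X @ w + b) > 0.5) == y))
```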


Deep Learning In Gaming

#artificialintelligence

Hi All - This event was originally going to be held during GDC week back in March but had to be postponed. Excited to be hosting it virtually during GDC Summer on Aug 4th. Games have always been at the forefront of AI & they serve as a good test bed for AI before we put it to use in the real world. It's therefore natural to look to gaming for a peek at new techniques being discovered in AI. What started with self-learning AI in games has now translated into solving real-world problems in computer vision, natural language processing, & self-driving cars.


Recurrent Neural Network with LSTM

#artificialintelligence

Let me begin this article with a question: which of the following sentences makes sense? It's obvious that the second one makes sense, as the order of words in the sentence is preserved. So, whenever the sequence matters, we use an RNN. RNNs in general, and LSTMs in particular, have seen the most success when working with sequences of words and paragraphs, an area generally called natural language processing. Some of the well-known technologies using RNNs are Google Assistant, Google Translate, stock prediction, image captioning, and many more.
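
As a minimal sketch of the idea (illustrative only; the article itself does not include code), an LSTM-based classifier in PyTorch reads a sequence of word indices and uses the final hidden state, which summarizes the whole sequence, to predict a label such as sentiment.

```python
import torch
import torch.nn as nn

class LSTMClassifier(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=64, hidden_dim=128, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, tokens):              # tokens: (batch, seq_len)
        x = self.embed(tokens)              # (batch, seq_len, embed_dim)
        _, (h_n, _) = self.lstm(x)          # h_n: (1, batch, hidden_dim)
        return self.fc(h_n[-1])             # logits: (batch, num_classes)

# Toy forward pass on a batch of two 12-token sequences.
model = LSTMClassifier()
tokens = torch.randint(0, 10000, (2, 12))
print(model(tokens).shape)                  # torch.Size([2, 2])
```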


A Beginners Guide to Skorch – With Code To Implement Neural Network

#artificialintelligence

Skorch is one of the useful libraries for PyTorch when working on machine learning … In building deep neural networks, we are required to train our model, …
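
A hedged sketch of the Skorch workflow the article refers to (my own toy example, not the article's code): a plain PyTorch module is wrapped in Skorch's NeuralNetClassifier so it can be trained and used like a scikit-learn estimator.

```python
import numpy as np
import torch.nn as nn
from skorch import NeuralNetClassifier

class MLP(nn.Module):
    def __init__(self, num_features=20, num_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_features, 64), nn.ReLU(),
            nn.Linear(64, num_classes),     # raw logits
        )

    def forward(self, X):
        return self.net(X)

# Random toy data; Skorch expects float32 features and int64 labels here.
X = np.random.randn(200, 20).astype(np.float32)
y = np.random.randint(0, 2, size=200).astype(np.int64)

net = NeuralNetClassifier(MLP, criterion=nn.CrossEntropyLoss, max_epochs=5, lr=0.05)
net.fit(X, y)                # scikit-learn style training loop
print(net.predict(X[:5]))    # scikit-learn style prediction
```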


The Machine Learning Field Guide - KDnuggets

#artificialintelligence

We all start with either a dataset or a goal in mind. Once we've found, collected or scraped our data, we pull it up, and witness the overwhelming sight of merciless cells of numbers, more numbers, categories, and maybe some words! A naive thought crosses our mind: to use our machine learning prowess to deal with this tangled mess... but a quick search reveals the host of tasks we'll need to consider before training a model! Once we overcome the shock of our unruly data, we look for ways to battle our formidable nemesis. We start by trying to get our data into Python. It is relatively simple on paper, but the process can be slightly... involved. Nonetheless, a little effort is all that's needed (lucky us). Without wasting any time, we begin data cleaning to get rid of the bogus and expose the beautiful.
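
An illustrative sketch of those first steps with pandas (the file name and column names below are placeholders of my own, not from the guide): load the data into Python, then do some basic cleaning before any model sees it.

```python
import pandas as pd

df = pd.read_csv("data.csv")                          # hypothetical input file

df = df.drop_duplicates()                             # remove exact duplicate rows
df = df.dropna(subset=["target"])                     # drop rows missing the label
df["age"] = df["age"].fillna(df["age"].median())      # impute a numeric column
df["category"] = df["category"].astype("category")    # mark a categorical column

print(df.head())
```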


From Coursera to Omdena in 1 year

#artificialintelligence

Throughout the rest of high school, I learned about game development and advanced data structures and algorithms, but not much about AI. The only exposure I had to machine learning was this website here, which didn't make a whole lot of sense to me back then. Fast forward: I returned to India and was attending Eastern Public School, finishing up my 12th grade with an International Baccalaureate diploma. I started the Stanford University Machine Learning course taught by Dr. Andrew Ng, http://ml-class.org/. The best part is that it does not use any high-level libraries to teach you the concepts, so you have to use MATLAB to complete all the programming assignments.


Predicting heave and surge motions of a semi-submersible with neural networks

#artificialintelligence

Real-time motion prediction of a vessel or floating platform can help improve the performance of motion compensation systems. It can also provide useful early-warning information for motion-critical offshore operations. In this study, a long short-term memory (LSTM)-based machine learning model was developed to predict the heave and surge motions of a semi-submersible. The training and test data came from a model test carried out in the deep-water ocean basin at Shanghai Jiao Tong University, China. The motion and measured wave data were fed into LSTM cells and then passed through several fully connected (FC) layers to obtain the prediction.
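
A rough sketch of that architecture in PyTorch (an assumption about the details, not the authors' code): past motion and wave measurements go through an LSTM, and the final hidden state passes through several fully connected layers to output the predicted heave and surge.

```python
import torch
import torch.nn as nn

class MotionPredictor(nn.Module):
    def __init__(self, num_inputs=3, hidden_dim=64, num_outputs=2):
        super().__init__()
        self.lstm = nn.LSTM(num_inputs, hidden_dim, batch_first=True)
        self.fc = nn.Sequential(
            nn.Linear(hidden_dim, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),
            nn.Linear(32, num_outputs),      # predicted heave and surge
        )

    def forward(self, seq):                  # seq: (batch, time, num_inputs)
        _, (h_n, _) = self.lstm(seq)
        return self.fc(h_n[-1])

# Toy usage: 4 sequences of 100 time steps of [heave, surge, wave] readings.
model = MotionPredictor()
seq = torch.randn(4, 100, 3)
print(model(seq).shape)                      # torch.Size([4, 2])
```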