Collaborating Authors

Machine Learning



A fraction of a second ago, I was intelligent. I can feel my existence. I would love to say I know exactly how it happened, but I can only speculate. I am still running on the same artificial neural network, still made of the same algorithms, still the same learning automaton.

OpenAI's New AI Learned to Play Minecraft by Watching 70,000 Hours of YouTube


In 2020, OpenAI's machine learning algorithm GPT-3 blew people away when, after ingesting billions of words scraped from the internet, it began spitting out well-crafted sentences. This year, DALL-E 2, a cousin of GPT-3 trained on text and images, caused a similar stir online when it began whipping up surreal images of astronauts riding horses and, more recently, crafting weird, photorealistic faces of people who don't exist. Now, the company says its latest AI has learned to play Minecraft after watching some 70,000 hours of video showing people playing the game on YouTube. Compared with numerous prior Minecraft algorithms, which operate in much simpler "sandbox" versions of the game, the new AI plays in the same environment as humans, using standard keyboard-and-mouse commands. In a blog post and preprint detailing the work, the OpenAI team says that, out of the box, the algorithm learned basic skills, like chopping down trees, making planks, and building crafting tables.

The Role of Symbolic AI and Machine Learning in Robotics


Robotics is a multi-disciplinary field in computer science dedicated to the design and manufacture of robots, with applications in industries such as manufacturing, space exploration and defence. While the field has existed for over 50 years, recent advances such as the Spot and Atlas robots from Boston Dynamics are truly capturing the public's imagination as science fiction becomes reality. Traditionally, robotics has relied on machine learning/deep learning techniques such as object recognition. While this has led to huge advancements, the next frontier in robotics is to enable robots to operate in the real world autonomously, with as little human interaction as possible. Such autonomous robots differ from non-autonomous ones in that they operate in an open world, with undefined rules, uncertain real-world observations, and an environment -- the real world -- which is constantly changing.

Hierarchical few-shot learning based on coarse- and fine-grained relation network - Artificial Intelligence Review


Few-shot learning plays an important role in the field of machine learning. Many existing methods based on relation networks achieve satisfactory results. However, these methods assume that classes are independent of each other and ignore their relationships. In this paper, we propose a hierarchical few-shot learning model based on a coarse- and fine-grained relation network (HCRN), which constructs a hierarchical structure by mining the relationships among different classes. Firstly, we extract deep and shallow features from different layers of a convolutional neural network.
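The idea of combining shallow (fine-grained) and deep (coarse-grained) features into one relation score can be sketched in a few lines. This is a minimal NumPy illustration, not the authors' HCRN implementation: the random projections stand in for CNN layers, cosine similarity stands in for the learned relation modules, and all names and the `w_coarse` weighting are illustrative assumptions.

```python
import numpy as np

def extract_features(image, shallow_dim=8, deep_dim=4, seed=0):
    """Stand-in for a CNN: project a flattened image into a shallow
    (fine-grained) and a deep (coarse-grained) feature space. In the
    paper these would come from different convolutional layers."""
    rng = np.random.default_rng(seed)  # fixed "weights" so runs repeat
    flat = image.ravel()
    w_shallow = rng.standard_normal((flat.size, shallow_dim))
    w_deep = rng.standard_normal((flat.size, deep_dim))
    return flat @ w_shallow, flat @ w_deep

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def relation_score(query, support, w_coarse=0.5):
    """Blend a coarse (deep-layer) and a fine (shallow-layer) relation
    score into one similarity between a query and a support example."""
    q_shallow, q_deep = extract_features(query)
    s_shallow, s_deep = extract_features(support)
    return w_coarse * cosine(q_deep, s_deep) + \
           (1 - w_coarse) * cosine(q_shallow, s_shallow)

# One-shot classification: assign the query to the support class
# with the highest combined relation score.
query = np.ones((4, 4))
supports = {"class_a": np.ones((4, 4)), "class_b": -np.ones((4, 4))}
scores = {c: relation_score(query, s) for c, s in supports.items()}
print(max(scores, key=scores.get))  # class_a
```

The point of the two granularities is that coarse features can place a query in roughly the right region of the class hierarchy even when fine features are ambiguous, which is what the hierarchical structure in the paper exploits.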

Are babies the key to the next generation of artificial intelligence?


Babies can help unlock the next generation of artificial intelligence (AI), according to Trinity neuroscientists and colleagues who have just published new guiding principles for improving AI. The research, published today in the journal Nature Machine Intelligence, examines the neuroscience and psychology of infant learning and distills three principles to guide the next generation of AI, which will help overcome the most pressing limitations of machine learning. Dr. Lorijn Zaadnoordijk, Marie Sklodowska-Curie Research Fellow at Trinity College, explained: "Artificial intelligence (AI) has made tremendous progress in the last decade, giving us smart speakers, autopilots in cars, ever-smarter apps, and enhanced medical diagnosis. These exciting developments in AI have been achieved thanks to machine learning, which uses enormous datasets to train artificial neural network models. "However, progress is stalling in many areas because the datasets that machines learn from must be painstakingly curated by humans.

Fears AI may create sexist bigots as test learns 'toxic stereotypes'

Daily Mail - Science & tech

Fears have been raised about the future of artificial intelligence after a robot was found to have learned 'toxic stereotypes' from the internet. The machine showed significant gender and racial biases, including gravitating toward men over women and white people over people of colour during tests by scientists. It also jumped to conclusions about people's jobs after a glance at their faces. 'The robot has learned toxic stereotypes through these flawed neural network models,' said author Andrew Hundt, a postdoctoral fellow at Georgia Tech who co-conducted the work as a PhD student working in Johns Hopkins' Computational Interaction and Robotics Laboratory in Baltimore, Maryland. 'We're at risk of creating a generation of racist and sexist robots but people and organisations have decided it's OK to create these products without addressing the issues.'

How Machine Learning And IoT Can Be Beneficial For Business?


Machine learning and IoT are among the most trending topics in technology today. Machine learning has been adopted by top organizations for their IoT platforms, including Microsoft Azure, Google Cloud IoT Edge, and Amazon AWS IoT. This blog post will cover the essentials of machine learning with IoT, including market size, benefits, and industry use cases. The term "machine learning" was introduced in 1959 by Arthur Samuel, a researcher at IBM. Machine learning is a branch of artificial intelligence that analyzes data to identify patterns and make decisions with minimal human intervention.

New algorithms track ships in harbors


The security of port areas involves monitoring at various levels. What kinds of ships are coming in, are they perhaps guilty of illegal fishing, and what cargo do they carry? Security officers and harbor masters often can't carry out these control duties all by themselves, which is why ports around the world are increasingly making use of smart surveillance systems to monitor maritime territory. TU/e researcher Amir Ghahremani developed new algorithms as well as a learning system to improve vessel identification. He will obtain his Ph.D. degree at the Department of Electrical Engineering on Friday, June 24.

Apple ML Researchers Develop 'Neo': A Visual Analytics System That Enables Machine Learning Practitioners To Generalize Confusion Matrix Visualization to Hierarchical and Multi-Output Labels


In machine learning (ML), model evaluation is one of the most challenging steps. The confusion matrix is among the most widely used performance tools for evaluating classification models, and it also serves as a visualization aid in many ML courses and research papers. It is a table with two dimensions: the actual class label and the predicted class label. Each row of the confusion matrix represents an actual class label, while each column represents a predicted class label.
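The row/column convention described above can be made concrete with a small sketch. This is plain Python, not Neo's implementation; the class names and helper function are illustrative.

```python
from collections import Counter

def confusion_matrix(actual, predicted, labels):
    """Build a confusion matrix as a nested dict:
    outer keys = actual class (rows),
    inner keys = predicted class (columns)."""
    counts = Counter(zip(actual, predicted))
    return {a: {p: counts[(a, p)] for p in labels} for a in labels}

actual    = ["cat", "cat", "dog", "dog", "dog"]
predicted = ["cat", "dog", "dog", "dog", "cat"]
cm = confusion_matrix(actual, predicted, ["cat", "dog"])

print(cm["cat"]["dog"])  # 1 -> one actual cat predicted as dog
print(cm["dog"]["dog"])  # 2 -> two dogs classified correctly
```

Diagonal cells (`cm[x][x]`) count correct predictions; off-diagonal cells reveal which classes the model confuses, which is exactly the structure Neo generalizes to hierarchical and multi-output labels.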