
Machine Learning


Opinion: Regulations and common sense must pace machine learning

#artificialintelligence

The first Industrial Revolution used steam and water to mechanize production. The second, the Technological Revolution, offered standardization and industrialization. The third capitalized on electronics and information technology to automate production. Now a fourth Industrial Revolution, our modern Digital Age, is building on the third; expanding exponentially, it is disrupting and transforming our lives, while evolving too fast for governance, ethics and management to keep pace. Most high school graduates have been exposed to information technology through personal computers, word processing software and their phones. Nonetheless, the digital divide separates the tech savvy from the tech illiterate, driven by disparities in access to technology for pre-K to 12 students based on where they live and socioeconomic realities.


How Transport for NSW is tapping machine learning

#artificialintelligence

At the peak of the Covid-19 pandemic in 2020, Australian transport agency Transport for New South Wales (NSW) had to restore public confidence in the state's transportation network and curb the spread of the disease. One of the ways it did that was to analyse the travel history recorded by Opal transit cards – with an individual's permission – and inform commuters whether the regular bus and train services they had been taking were Covid-safe. Chris Bennetts, executive director for digital product delivery at Transport for NSW, said those insights were derived using a machine learning model that predicts how full a bus or train carriage will be at a given time. Based on the predictions, commuters would be advised whether they could continue using their regular services or should switch to a different service or mode of transport. "That was interesting for us because it was our first foray into personalisation to offer more choices for customers," said Bennetts.
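
Transport for NSW has not published the details of its model, but a minimal sketch of the idea – predicting carriage occupancy from aggregated historical tap-on data – might look like the following. The feature set, file name, model choice, and crowding threshold are all illustrative assumptions, not the agency's actual design.

```python
# Hypothetical sketch of an occupancy predictor in the spirit of the
# Transport for NSW system described above. Feature names, the data file,
# and the model choice are illustrative assumptions, not the agency's design.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Assume historical Opal tap-on counts aggregated per service departure:
# route_id, hour_of_day, day_of_week, is_school_holiday, occupancy
df = pd.read_csv("opal_taps_aggregated.csv")  # hypothetical file

X = pd.get_dummies(
    df[["route_id", "hour_of_day", "day_of_week", "is_school_holiday"]],
    columns=["route_id"],
)
y = df["occupancy"]  # passengers per carriage for that departure

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Advise a commuter: flag services predicted to exceed a Covid-safe load.
predicted = model.predict(X_test)
crowded = predicted > 22  # assumed distancing cap per carriage
```

A production system would add real-time feeds and per-service calibration, but the shape of the problem – tabular features in, an occupancy estimate out – is the same.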


Understanding the differences between biological and computer vision

#artificialintelligence

Welcome to AI book reviews, a series of posts that explore the latest literature on artificial intelligence. Since the early years of artificial intelligence, scientists have dreamed of creating computers that can "see" the world. As vision plays a key role in many things we do every day, cracking the code of computer vision seemed to be one of the major steps toward developing artificial general intelligence. But like many other goals in AI, computer vision has proven to be easier said than done. In 1966, scientists at MIT launched "The Summer Vision Project," a two-month effort to create a computer system that could identify objects and background areas in images.


Machine Learning & Deep Learning in Python & R

#artificialintelligence

Machine Learning & Deep Learning in Python & R is a Udemy course created by Start-Tech Academy (English, auto-generated captions). It covers regression, decision trees, SVMs, neural networks, CNNs, time series forecasting and more, using both Python and R.


Artificial Intelligence at Johnson & Johnson - Current Investments

#artificialintelligence

We see evidence dating back to 2017 that Johnson & Johnson has been regularly publishing about its investments and initiatives related to artificial intelligence. At present, Johnson & Johnson does not seem to boast any mature, deployed applications within the firm itself, but its AI-related investment initiatives indicate its aspirations. According to an analysis by FiercePharma, Johnson & Johnson (J&J) is the largest pharmaceutical firm by revenue, bringing in $82.1 billion in 2019. However, its pharma group has seen the lion's share of J&J's success, outperforming its other units with notable sales expansion in oncology and immunology. J&J claims to be investing in data science competency throughout the firm.


Building a Rock Paper Scissors AI

#artificialintelligence

In this article, I'll walk you through my process of building a full-stack Python Flask artificial intelligence project that beats the human user over 60% of the time. It uses a custom scoring system to ensemble six models (naïve logic-based, decision tree, and neural network) trained on both game-level data and historical data stored in an AWS RDS cloud SQL database. Rock Paper Scissors caught my attention for an AI project because, on the surface, it seems impossible to get an edge in the game. These days, it is easy to assume that a computer can beat you in chess, because it can harness all of its computing power to explore possible outcomes and choose the ones that benefit it. Rock Paper Scissors, on the other hand, is commonly used in place of a coin toss to settle disputes because the winner seems random. My theory, though, was that humans can't actually make random decisions, and that if an AI could learn how humans make their choices over the course of a series of matches, even when they are trying to behave randomly, then it could significantly exceed the 33% accuracy of random guessing at predicting the player's moves.
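
The article's exact six models and scoring rules aren't reproduced here, but a minimal sketch of how a score-weighted ensemble of move predictors could work is shown below. The two toy models, the recency-decayed scoring rule, and the vote floor are my assumptions for illustration, not the author's design.

```python
# Hypothetical sketch of a score-weighted ensemble for Rock Paper Scissors.
# The model interfaces and the recency-weighted scoring rule are assumptions;
# the article's actual six models and scoring system are not public here.
import random
from collections import Counter

MOVES = ["rock", "paper", "scissors"]
BEATS = {"rock": "paper", "paper": "scissors", "scissors": "rock"}  # move -> counter

class FrequencyModel:
    """Naive logic-based model: predict the player's most common move."""
    def predict(self, history):
        if not history:
            return random.choice(MOVES)
        return Counter(history).most_common(1)[0][0]

class LastMoveModel:
    """Assume the player repeats their previous move."""
    def predict(self, history):
        return history[-1] if history else random.choice(MOVES)

def ensemble_move(models, scores, history):
    """Weight each model's predicted player move by its running score."""
    votes = Counter()
    for model, score in zip(models, scores):
        votes[model.predict(history)] += max(score, 0.1)  # floor so no model is silenced
    predicted_player_move = votes.most_common(1)[0][0]
    return BEATS[predicted_player_move]  # play the counter to the prediction

def update_scores(models, scores, history, actual_move, decay=0.9):
    """Reward models that predicted correctly; decay older performance."""
    for i, model in enumerate(models):
        hit = 1.0 if model.predict(history) == actual_move else 0.0
        scores[i] = decay * scores[i] + hit
```

Each round the AI plays `ensemble_move(models, scores, history)`, observes the player's actual move, calls `update_scores` before appending that move to `history`, and so models that read the current opponent well gain influence as the match progresses.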


IBM's CodeNet dataset can teach AI to translate computer languages

Engadget

AI and machine learning systems have become increasingly competent in recent years, capable of not just understanding the written word but writing it as well. But while these artificial intelligences have nearly mastered the English language, they have yet to become fluent in the language of computers -- that is, until now. IBM announced during its Think 2021 conference on Monday that its researchers have crafted a Rosetta Stone for programming code. Over the past decade, advancements in AI have mainly been "driven by deep neural networks, and even that, it was driven by three major factors: data with the availability of large data sets for training, innovations in new algorithms, and the massive acceleration of faster and faster compute hardware driven by GPUs," Ruchir Puri, IBM Fellow and Chief Scientist at IBM Research, said during his Think 2021 presentation, likening the new data set to the venerated ImageNet, which has spawned the recent computer vision land rush. "Software is eating the world," Marc Andreessen wrote in 2011.


Microsoft Releases Open-Source Tool To Test The Security Of AI Systems

#artificialintelligence

Artificial intelligence systems take inputs in the form of images, audio, text, and so on. As a result, filtering, handling, and detecting malicious inputs and behaviours has become more complicated. Cybersecurity is one of the top priorities of companies worldwide. The increase in the number of AI security papers from just 617 in 2018 to over 1,500 in 2020 (a rise of almost 143%, as per an Adversa report) is a testament to the growing importance of the field. Microsoft has recently announced the release of Counterfit – a tool to test the security of AI systems – as an open-source project.
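
Counterfit wraps existing attack frameworks behind a command-line tool; as a generic illustration of the kind of malicious input such tools generate (this is a plain fast gradient sign method example, not Counterfit's own API), a minimal PyTorch sketch follows.

```python
# Generic illustration of an adversarial (malicious) input via the fast
# gradient sign method (FGSM). This is not Counterfit's API, only a minimal
# sketch of the kind of attack such security-testing tools automate.
import torch
import torch.nn.functional as F

def fgsm_attack(model, image, label, epsilon=0.03):
    """Perturb `image` slightly so `model` is more likely to misclassify it."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    # Step in the direction that increases the loss, bounded by epsilon.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0, 1).detach()
```

A security test then checks whether the model's prediction flips on the perturbed input even though the change is imperceptible to a human.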


EETimes - Will Machines Ever Fully Understand What They Are Seeing?

#artificialintelligence

Embedded vision technologies are giving machines the power of sight, but today's systems still fall short of understanding all the nuances of an image. An approach used for natural language processing could address that. Attention-based neural networks, particularly transformer networks, have revolutionized natural language processing (NLP), giving machines a better understanding of language than ever before. This technique, designed to mimic cognitive processes by giving an artificial neural network a sense of history or context, has produced much more sophisticated AI agents than older memory-based approaches such as long short-term memory (LSTM) networks and other recurrent neural networks (RNNs). NLP models now understand the questions or prompts they are fed at a deeper level and can generate long passages of text in response that are often indistinguishable from what a human might write.
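
As a concrete anchor for what "attention" computes, here is a minimal single-head scaled dot-product self-attention in NumPy. Real transformers stack multi-head versions of this with residual connections and normalization; the toy dimensions below are arbitrary.

```python
# Minimal single-head scaled dot-product self-attention, the core building
# block of the transformer networks described above.
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """X: (seq_len, d_model). Each position attends to every other position."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])           # pairwise relevance
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over the context
    return weights @ V                                # context-weighted values

# Toy usage: 4 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)  # (4, 8): each token mixed with its context
```

The same computation, applied to image patches instead of word tokens, is what lets vision transformers weigh the relevance of one part of an image to another.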


datamining_2021-05-09_23-30-38.xlsx

#artificialintelligence

The graph represents a network of 3,439 Twitter users whose tweets in the requested range contained "datamining", or who were replied to or mentioned in those tweets. The network was obtained from the NodeXL Graph Server on Monday, 10 May 2021 at 06:40 UTC. The requested start date was Monday, 10 May 2021 at 00:01 UTC and the maximum number of days (going backward) was 14. The maximum number of tweets collected was 7,500. The tweets in the network were tweeted over the 13-day, 7-hour, 19-minute period from Monday, 26 April 2021 at 16:40 UTC to Monday, 10 May 2021 at 00:00 UTC.
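
NodeXL performs this construction automatically, but the underlying graph-building step is straightforward. The sketch below shows the idea with networkx and a made-up tweet schema; the field names are assumptions, not NodeXL's actual data format.

```python
# Rough sketch of how a reply/mention network like the one above is built.
# The tweet dictionary schema is a made-up stand-in for the collected data.
import networkx as nx

tweets = [
    {"author": "alice", "mentions": ["bob"], "reply_to": None},
    {"author": "bob",   "mentions": [],      "reply_to": "alice"},
    {"author": "carol", "mentions": ["alice", "bob"], "reply_to": None},
]

G = nx.DiGraph()
for t in tweets:
    G.add_node(t["author"])
    for user in t["mentions"]:
        G.add_edge(t["author"], user, kind="mention")
    if t["reply_to"]:
        G.add_edge(t["author"], t["reply_to"], kind="reply")

print(G.number_of_nodes(), G.number_of_edges())  # 3 nodes, 4 edges
```

Run over thousands of collected tweets, the same loop yields a network like the 3,439-user "datamining" graph described above, ready for clustering and centrality analysis.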