Opinion: Regulations and common sense must pace machine learning

#artificialintelligence

The first Industrial Revolution used steam and water to mechanize production. The second, the Technological Revolution, offered standardization and industrialization. The third capitalized on electronics and information technology to automate production. Now a fourth Industrial Revolution, our modern Digital Age, is building on the third; expanding exponentially, it is disrupting and transforming our lives while evolving too fast for governance, ethics and management to keep pace. Most high school graduates have been exposed to information technology through personal computers, word processing software and their phones. Nonetheless, the digital divide separates the tech-savvy from the tech-illiterate, driven by disparities in access to technology for pre-K to 12 students that track where they live and their socioeconomic circumstances.


How Transport for NSW is tapping machine learning

#artificialintelligence

At the peak of the Covid-19 pandemic in 2020, Australian transport agency Transport for New South Wales (NSW) had to restore public confidence in the state's transportation network while curbing the spread of the disease. One of the ways it did that was to analyse the travel history recorded by Opal transit cards – with an individual's permission – and inform commuters whether the regular bus and train services they had been taking were Covid-safe. Chris Bennetts, executive director for digital product delivery at Transport for NSW, said those insights were derived using a machine learning model that predicted how full a bus or train carriage would be at a given time. Based on the predictions, commuters would be advised whether they could continue using their regular services or should switch to a different service or mode of transport. "That was interesting for us because it was our first foray into personalisation to offer more choices for customers," said Bennetts.
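The article doesn't describe the model itself, so the following is a minimal sketch of how such an occupancy predictor might be trained and queried. The features (route, day of week, hour), the synthetic data and the gradient-boosting choice are all assumptions for illustration, not details from Transport for NSW.

```python
# Minimal sketch of an occupancy-prediction model, assuming features such as
# route, day of week and hour of day; Transport for NSW's actual model and
# data pipeline are not described in the article.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Hypothetical training data: one row per (route, weekday, hour) observation.
n = 5000
X = np.column_stack([
    rng.integers(0, 50, n),   # route id
    rng.integers(0, 7, n),    # day of week (0 = Monday)
    rng.integers(5, 23, n),   # hour of day
])
# Hypothetical target: fraction of capacity occupied (0.0 to 1.0).
y = rng.uniform(0.0, 1.0, n)

model = GradientBoostingRegressor().fit(X, y)

# Predict how full route 12 will be at 8am on a Monday, then advise.
predicted = model.predict([[12, 0, 8]])[0]
advice = ("continue with your regular service" if predicted < 0.5
          else "consider a different service or mode of transport")
print(f"predicted occupancy: {predicted:.0%} -> {advice}")
```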


Understanding the differences between biological and computer vision

#artificialintelligence

Welcome to AI book reviews, a series of posts that explore the latest literature on artificial intelligence. Since the early years of artificial intelligence, scientists have dreamed of creating computers that can "see" the world. As vision plays a key role in many things we do every day, cracking the code of computer vision seemed to be one of the major steps toward developing artificial general intelligence. But like many other goals in AI, computer vision has proven to be easier said than done. In 1966, scientists at MIT launched "The Summer Vision Project," a two-month effort to create a computer system that could identify objects and background areas in images.


Machine Learning & Deep Learning in Python & R

#artificialintelligence

Machine Learning & Deep Learning in Python & R is a Udemy course created by Start-Tech Academy. It covers regression, decision trees, SVM, neural networks, CNNs, time series forecasting and more, using both Python and R.


IBM launches AutoSQL, Watson Orchestrate, CodeNet enterprise AI tools at Think

ZDNet

IBM launched a series of tools aimed at accelerating enterprise AI adoption, including one called Watson Orchestrate that may be set up to act as a knowledge worker's digital twin. CEO Arvind Krishna's overarching message at IBM's Think conference was that the company is all in on AI and hybrid cloud. The company showed traction in those categories in the first quarter and is structuring itself around those two areas. Indeed, the build-up to Think has been busy. "In the same way that we have electrified factories and machines in the past century, we will infuse AI into software and systems in the 21st century," said Krishna.


Samsung develops first CXL interface based DRAM

ZDNet

Samsung Electronics has developed a new DRAM module based on the Compute Express Link (CXL) interface, which the company has touted as an industry first. The CXL-based DDR5 memory module comes in the EDSFF form factor and will enable server systems to scale their memory capacity and bandwidth significantly, the South Korean tech giant claimed. The new module can scale memory capacity to terabyte levels, reduce system latency caused by memory caching, and allow server systems to accelerate their AI, machine learning and high-performance computing workloads, Samsung added. The CXL interface was designed to enable high-speed, low-latency communication between a host processor and other devices such as accelerators, memory buffers and smart I/O, while expanding memory capacity and bandwidth. The interface was created by the CXL Consortium, which was formed in 2019 to address the memory capacity and bandwidth needs of systems that use many processors to handle massive volumes of data for applications such as AI. Its members include Intel, Google, Samsung and other global server and chip companies.


Why your big data dreams can't come true without AI - Information Age

#artificialintelligence

During the Covid-19 pandemic, the volume of data generated by online activity increased by 35%. This isn't surprising: over the same period, the pace of digital transformation – including cloud adoption – at the average enterprise accelerated by the equivalent of seven years. With more consumers online than ever before, companies are rushing to better integrate and process data so they can more accurately anticipate changing preferences and patterns of consumption. At the same time, however, companies are also searching for ways to keep this big data – much of it personal and sensitive – safe. Now that millions more people are working from home, data is being transferred through the cloud and across public and private networks in unprecedented volumes.


Artificial Intelligence at Johnson & Johnson - Current Investments

#artificialintelligence

We see evidence dating back to 2017 that Johnson & Johnson has been regularly publishing about its investments and initiatives related to artificial intelligence. At present, Johnson & Johnson does not seem to have any mature, deployed AI applications within the firm itself, but its AI-related investment initiatives indicate its aspirations. According to an analysis by FiercePharma, Johnson & Johnson (J&J) is the largest pharmaceutical firm by revenue, bringing in $82.1 billion in 2019. Its pharma group has seen the lion's share of J&J's success, outperforming the company's other units with notable sales expansion in oncology and immunology. J&J claims to be investing in data science competency throughout the firm.


Hybrid GA and SA dynamic set-up planning optimization

#artificialintelligence

Set-up planning determines the set-up of a workpiece with a certain orientation and fixturing on a worktable, as well as the number and sequence of set-ups and the operations performed in each set-up. This paper presents a concurrent constraint planning methodology and a hybrid genetic algorithm (GA) and simulated annealing (SA) approach for set-up planning and re-set-up planning in a dynamic workshop environment. The proposed methodology analyses the precedence relationships among features to generate a precedence relationship matrix (PRM). Based on the PRM and query results from a dynamic workshop resource database, the hybrid GA and SA approach, which adopts a feature-based representation, optimizes the set-up plan using six cost indices. Case studies show that the hybrid GA and SA approach can generate optimal results as well as carry out re-set-up planning when workshop resources change.
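The abstract doesn't spell out the algorithmic details, but the core idea of hybridizing a GA with SA-style acceptance can be sketched briefly. In the toy example below, the feature-based representation is a permutation of feature indices, and cost() is a stand-in for the paper's six cost indices; everything here is illustrative, not the authors' implementation.

```python
# Toy hybrid GA + SA optimizer over a feature sequence. The real paper's
# PRM, resource database and six cost indices are replaced by a stand-in
# inversion-count cost for illustration only.
import math
import random

random.seed(42)

N_FEATURES = 8  # hypothetical number of machining features

def cost(seq):
    # Stand-in cost: count out-of-order pairs as a toy precedence penalty.
    return sum(1 for i in range(len(seq)) for j in range(i + 1, len(seq))
               if seq[i] > seq[j])

def crossover(a, b):
    # Order crossover (OX): keep a slice of parent a, fill the rest from b.
    i, j = sorted(random.sample(range(len(a)), 2))
    child = [None] * len(a)
    child[i:j] = a[i:j]
    rest = [g for g in b if g not in child]
    for k in range(len(a)):
        if child[k] is None:
            child[k] = rest.pop(0)
    return child

def mutate(seq):
    # Swap two positions in the sequence.
    s = seq[:]
    i, j = random.sample(range(len(s)), 2)
    s[i], s[j] = s[j], s[i]
    return s

pop = [random.sample(range(N_FEATURES), N_FEATURES) for _ in range(30)]
temp = 10.0
for gen in range(200):
    pop.sort(key=cost)
    parents = pop[:10]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(20)]
    # SA-style acceptance: a worse child may still replace a parent,
    # with probability decreasing as the temperature cools.
    next_pop = parents[:]
    for child in children:
        rival = random.choice(parents)
        delta = cost(child) - cost(rival)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            next_pop.append(child)
        else:
            next_pop.append(rival[:])
    pop = next_pop
    temp *= 0.98  # geometric cooling schedule

best = min(pop, key=cost)
print("best sequence:", best, "cost:", cost(best))
```

The SA acceptance step is what distinguishes this from a plain GA: early on, high temperature lets worse offspring survive and keeps the search diverse; as the temperature falls, selection becomes increasingly greedy.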


Building a Rock Paper Scissors AI

#artificialintelligence

In this article, I'll walk you through my process of building a full-stack Python Flask artificial intelligence project capable of beating the human player over 60% of the time. It uses a custom scoring system to ensemble six models (naïve logic-based, decision tree and neural network) trained on both game-level data and historical data stored in an AWS RDS cloud SQL database. Rock Paper Scissors caught my attention for an AI project because, on the surface, it seems impossible to get an edge in the game. These days, it is easy to assume that a computer can beat you at chess, because it can harness all of its computing power to see all possible outcomes and choose the ones that benefit it. Rock Paper Scissors, on the other hand, is commonly used in place of a coin toss to settle disputes because the winner seems random. My theory, though, was that humans can't actually make random decisions, and that if an AI could learn to understand the ways in which humans make their choices over the course of a series of matches, even if the human was trying to behave randomly, then the AI would be able to significantly exceed 33% accuracy in guessing the player's decisions.
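The excerpt doesn't include the author's code, so here is a minimal sketch of the underlying idea: predict the human's next throw from their move history and play the counter, with a simple score-weighted ensemble standing in for the author's six models. The two predictors, the scoring rule and all names below are illustrative assumptions; the Flask app and AWS RDS storage are omitted.

```python
# Minimal sketch: predict the human's next throw and play the counter.
# The author's actual six models, custom scoring system, Flask app and
# AWS RDS storage are not reproduced here.
import random
from collections import Counter, defaultdict

MOVES = ["rock", "paper", "scissors"]
COUNTER = {"rock": "paper", "paper": "scissors", "scissors": "rock"}

def freq_predict(history):
    # Predict the human's most frequent move so far.
    return Counter(history).most_common(1)[0][0] if history else random.choice(MOVES)

def markov_predict(history, table):
    # Predict from a first-order transition table (last move -> next move).
    if history and table[history[-1]]:
        return table[history[-1]].most_common(1)[0][0]
    return random.choice(MOVES)

history, table = [], defaultdict(Counter)
scores = {"freq": 1.0, "markov": 1.0}  # running accuracy-style weights

def ai_move():
    preds = {"freq": freq_predict(history),
             "markov": markov_predict(history, table)}
    # Ensemble: trust the predictor with the better running score.
    best = max(scores, key=scores.get)
    return COUNTER[preds[best]], preds

def update(human, preds):
    # Reward predictors that guessed the human's move correctly.
    for name, p in preds.items():
        scores[name] += 1.0 if p == human else -0.5
    if history:
        table[history[-1]][human] += 1
    history.append(human)

# Example round: the human plays rock.
move, preds = ai_move()
update("rock", preds)
print("AI played:", move)
```

Even this two-predictor version captures the article's thesis: because human play is not truly random, a predictor scored on its recent hits can beat the 33% random-guessing baseline over a long series of matches.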