RoboCup and its role in the history and future of AI

AIHub

As I write this blog post, we're a few days away from the opening of the 2021 RoboCup Competitions and Symposium. Running from June 22nd-28th, this event brings together AI and robotics researchers and learners from around the world, for the first (and ideally last!) time in a fully remote format. The first official international RoboCup event occurred 25 years ago, at the IROS 1996 conference in Osaka, Japan. It was called "pre-RoboCup" because the first full RoboCup was slated to launch the following year at the 1997 IJCAI conference in Nagoya; the CMUnited team, created by my Ph.D. advisor Manuela Veloso and me, was the only non-Japanese entry in the simulation competition, which was the only event that year. While RoboCup has indisputably played a huge role in the last quarter-century of AI research, it has also played a leading role in my own personal story.


Humongous flying taxi drone set to debut at the 2024 Olympics

Mashable

The VoloCity eVTOL is an 18-rotor passenger drone developed by Volocopter. The two-seater completed its latest test flight in France and is set to be used as an air taxi at the 2024 Paris Olympic Games.


3 Low-Code Machine Learning Libraries that You Should Know About

#artificialintelligence

Some of my most popular blogs on Medium are about libraries that I believe you should try. In this blog, I will focus on low-code machine learning libraries. The truth is that many data scientists believe that low-code libraries are shortcuts and should be avoided. I'm afraid I have to disagree! I think that low-code libraries should be included in our pipeline to help us make important decisions without wasting time.
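
The excerpt does not name the three libraries, so as a purely illustrative sketch of the kind of low-code workflow the author describes, the following uses PyCaret (one well-known low-code ML library, and an assumption on my part rather than something the article confirms):

```python
# Hypothetical low-code workflow with PyCaret; the dataset and target column
# are illustrative and not taken from the article.
from pycaret.classification import setup, compare_models, predict_model
from sklearn.datasets import load_breast_cancer

# Load a small public dataset into a DataFrame; the label column is "target".
data = load_breast_cancer(as_frame=True).frame

# One call configures preprocessing, the train/test split and cross-validation.
setup(data=data, target="target", session_id=42)

# Train a suite of candidate classifiers and rank them by cross-validated score.
best_model = compare_models()

# Score the hold-out split with the best-performing model.
predictions = predict_model(best_model)
```

A handful of lines like these replace what would otherwise be a long stretch of boilerplate model-selection code, which is the trade-off the author is arguing for.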


An ally for alloys: AI helps design high-performance steels

#artificialintelligence

Machine learning techniques have contributed to progress in science and technology fields ranging from health care to high-energy physics. Now, machine learning is poised to help accelerate the development of stronger alloys, particularly stainless steels, for America's thermal power generation fleet. Stronger materials are key to producing energy efficiently, resulting in economic and decarbonization benefits. "The use of ultra-high-strength steels in power plants dates back to the 1950s and has benefited from gradual improvements in the materials over time," says Osman Mamun, a postdoctoral research associate at Pacific Northwest National Laboratory (PNNL). "If we can find ways to speed up improvements or create new materials, we could see enhanced efficiency in plants that also reduces the amount of carbon emitted into the atmosphere."


How Machine Learning Can Improve Your Debt Collection Process -- Lateral

#artificialintelligence

Developments in machine learning (ML) and artificial intelligence (AI) are having a great impact on the debt collection industry. At its core, machine learning generates predictive models using algorithms that learn from data: if we can feed in enough useful and reliable data, we can build models that make predictions on our behalf. Machine learning can aid and improve the debt collection process in a number of ways. Reduce workloads: collections departments place calls, send countless emails, and try to work out payment plans, and very frequently none of these activities translate into the successful recovery of debt. With ML this changes: since tasks are automated, users see higher productivity and less time spent on labour-intensive work. Protect your business reputation: because ML can automate communication, you know that all your business correspondence will be professional, methodical and unambiguous. LATERAL's debt collections software provides its users with a non-intrusive, customer-driven point of engagement, which is proven to be highly successful.
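
The excerpt describes ML here as building predictive models from historical data. As a minimal sketch of what such a model might look like, entirely hypothetical feature names and data (this is not Lateral's product, and the article names no specific model):

```python
# Hypothetical payment-recovery model for a collections team, using scikit-learn.
# The columns and values are invented for illustration only.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Historical accounts: amount owed, days overdue, prior contact attempts,
# and whether the debt was eventually recovered (the label to predict).
accounts = pd.DataFrame({
    "amount_owed":    [120.0, 950.0, 40.0, 3000.0, 410.0, 75.0],
    "days_overdue":   [15, 120, 5, 240, 60, 30],
    "prior_contacts": [1, 6, 0, 9, 3, 2],
    "recovered":      [1, 0, 1, 0, 1, 1],
})

X = accounts.drop(columns="recovered")
y = accounts["recovered"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, stratify=y, random_state=0
)

# Fit a simple classifier and estimate each account's probability of recovery,
# which could be used to prioritise which debtors to contact first.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(model.predict_proba(X_test)[:, 1])
```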


What government CIOs need for AI to succeed - FedScoop

#artificialintelligence

Kirke Everson is a principal in KPMG's Federal Advisory practice, focusing on technology enablement, intelligent automation, program management, process improvement, cyber security, risk management, and financial management. He currently serves as the government lead for intelligent automation for KPMG in the U.S. Federal and state government leaders are witnessing the expansion of artificial intelligence all around them. From back-office automation that can help reduce backlogged work, to cognitive platforms that can identify and respond to natural language requests to better serve the public, AI and automation have become a driving force in addressing mission and business objectives. Based on the use cases they described, it's clear that agencies are making significant headway in putting AI to work. At the same time, there are a variety of issues where government CIOs need broader support. The issues they and their executive teams face are, in many ways, not that different from previous technology breakthroughs that tended to upend familiar work processes.


Council Post: Carrot And Stick: How Deep Reinforcement Learning Trains AI Differently

#artificialintelligence

Founder and CEO of PLANERGY, with decades of international experience in Procurement, Spend Management and Technology. From its earliest days, artificial intelligence (AI) has captivated and enticed the business world with its potential ability to learn not only to imitate humans but to supersede our capabilities. As the importance of digital transformation grows, so too has the number of organizations implementing AI technologies to optimize and automate their business processes. Process automation and data analytics powered by machine learning are well-established uses for artificial intelligence in today's marketplace. While these technologies certainly create value and cut costs for companies large and small, we have not yet reached the pinnacle of AI's potential benefits.
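
The "carrot and stick" in the title refers to reward-driven learning. As a generic illustration (not code from the article, and tabular rather than deep), a tiny Q-learning loop shows how an agent's behaviour is shaped purely by rewards and penalties:

```python
# Toy 5-state corridor: the agent starts at state 0 and the goal is state 4.
# Reaching the goal pays +1 (the carrot); every other step costs -0.01 (the stick).
# Illustrative tabular Q-learning sketch, not from the article.
import random

N_STATES, N_ACTIONS, GOAL = 5, 2, 4          # actions: 0 = move left, 1 = move right
alpha, gamma, epsilon = 0.5, 0.9, 0.1        # learning rate, discount, exploration rate
Q = [[0.0] * N_ACTIONS for _ in range(N_STATES)]

for _ in range(500):                          # training episodes
    state = 0
    while state != GOAL:
        # Epsilon-greedy: usually exploit the best-known action, occasionally explore.
        if random.random() < epsilon:
            action = random.randrange(N_ACTIONS)
        else:
            action = max(range(N_ACTIONS), key=lambda act: Q[state][act])
        next_state = max(0, state - 1) if action == 0 else min(GOAL, state + 1)
        reward = 1.0 if next_state == GOAL else -0.01
        # Q-learning update: nudge Q(s, a) toward reward + discounted best future value.
        Q[state][action] += alpha * (reward + gamma * max(Q[next_state]) - Q[state][action])
        state = next_state

print(Q)  # "move right" should end up with the higher value in every state
```

Deep reinforcement learning replaces the Q table with a neural network, but the underlying carrot-and-stick feedback loop is the same.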


Scalable Machine Learning with Spark

#artificialintelligence

Since the early 2000s, the amount of data collected has increased enormously due to the advent of internet giants such as Google, Netflix, YouTube, Amazon, and Facebook. Around 2010, another "data wave" arrived as mobile phones became hugely popular, and in the 2020s we anticipate yet another exponential rise in data as IoT devices become all-pervasive. Against this backdrop, building scalable systems becomes a sine qua non for machine learning solutions. Pre-2005, parallel processing libraries like MPI and PVM were popular for compute-heavy tasks, and TensorFlow's design later drew on those ideas. Relational database design, for its part, aimed to reduce data redundancy by dividing larger tables into smaller tables and linking them using relationships (normalization).
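
As a rough sketch of what "scalable machine learning with Spark" looks like in practice (the excerpt shows no code), the following uses PySpark's MLlib; the file name, column names, and label column are assumptions for illustration:

```python
# Minimal PySpark MLlib sketch: distributed training of a logistic regression
# model. "records.csv" and its columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("scalable-ml-sketch").getOrCreate()

# Spark reads and partitions the data across the cluster, so the same code
# scales from a laptop to many machines.
df = spark.read.csv("records.csv", header=True, inferSchema=True)

# MLlib estimators expect the features collected into a single vector column.
assembler = VectorAssembler(
    inputCols=["feature_a", "feature_b", "feature_c"], outputCol="features"
)
train = assembler.transform(df)

model = LogisticRegression(featuresCol="features", labelCol="label").fit(train)
print(model.coefficients)

spark.stop()
```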


Forrester: The new automation fabric is where digital business happens

#artificialintelligence

Automation is changing the old paradigm in which development was limited to app development and delivery professionals with specialized skills, a new report from Forrester finds. Today, with low-code tools and robotic process automation builders, "business users and non-coders can now build bespoke workflows and customized functionality," according to the Automation is the New Fabric for Digital Business report. But a piecemeal approach to automation technology has created as many problems as it has solved, according to the report. One issue is that tactical automation undersells its transformative potential. While automation is the enabler supporting transformation at multiple levels, "tactical, cost-focused automation disconnected from digital transformation goals can inhibit this broader vision."


Google Introduces New Cloud TPU VMs for Artificial Intelligence Workloads

#artificialintelligence

Recently, Google announced new Cloud TPU Virtual Machines (VMs), which provide direct access to TPU host machines. With these VMs, the company offers a new and improved user experience for developing and deploying TensorFlow, PyTorch, and JAX on Cloud TPUs. Customers could already set up virtual instances in Google Cloud with TPU chipsets, but those instances did not run in the same server environment as the TPUs: the accelerators were reached remotely over a network connection, which reduced processing speed because applications had to send data across the network to a TPU and then wait for the processed results to come back.
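
A quick way to see the difference on a Cloud TPU VM is that the TPU cores appear as local devices of the host process. A minimal sketch, assuming a Cloud TPU VM with JAX installed:

```python
# On a Cloud TPU VM with JAX installed, the TPU cores are local devices of the
# host rather than accelerators reached over the network.
import jax
import jax.numpy as jnp

print(jax.devices())          # expected: a list of TPU device entries

# A trivial computation dispatched directly to the local TPU devices.
x = jnp.arange(8.0)
print(jnp.dot(x, x))
```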