Information Technology: AI-Alerts

The Drones of War


North American professional drone maker Draganfly has sent the first of nearly a dozen humanitarian drones to the nonprofit organization Revived Soldiers Ukraine (RSU) in Europe, to be used to deliver insulin to hard-to-reach hospitals in the war-torn country. RSU has ordered 200 medical response drones from Draganfly, each costing $30,000 and equipped with a temperature-managed payload box that can transport up to 35 pounds of blood, pharmaceuticals, insulin, vaccines, and wound care kits, the drone maker said. Because insulin is temperature-sensitive, quick and safe transportation is a top priority. There are roughly 2.3 million people living with diabetes in Ukraine, according to the International Diabetes Federation, many of whom have Type 1 diabetes and require multiple daily injections of insulin to survive. For those living in high-conflict areas of the country, access to life-saving insulin is limited or non-existent.

Powering the next generation of AI


Arun Subramaniyan joined Intel to lead the Cloud & AI Strategy team. He came to Intel from AWS, where he led the global solutions team for machine learning, quantum computing, high-performance computing (HPC), autonomous vehicles, and autonomous computing. His team was responsible for developing solutions across all areas of HPC, quantum computing, and large-scale machine learning applications, spanning a $1.5 billion portfolio. Arun founded and grew the global go-to-market and solutions teams for autonomous computing and quantum computing at AWS, growing those businesses two- to three-fold. His primary areas of research focus are Bayesian methods, global optimization, probabilistic deep learning for large-scale applications, and distributed computing.

This robot lives with an Antarctica penguin colony, monitoring their every move

USATODAY - News Top Stories

Thousands of emperor penguins waddling around Antarctica have a stalker: a yellow rover tracking their every move. ECHO is a remote-controlled ground robot that silently spies on the emperor penguin colony in Atka Bay. The robot is monitored by the Single Penguin Observation and Tracking (SPOT) observatory. Both the SPOT observatory, which is also remote-operated through a satellite link, and the ECHO robot capture photographs and videos of the animal population in the Antarctic. The research is part of the Marine Animal Remote Sensing Lab (MARE) and is designed to measure the health of the Antarctic marine ecosystem.

Small Drones Are Giving Ukraine an Unprecedented Edge


In the snowy streets of the northern Ukrainian town of Trostyanets, a Russian missile system fires rockets every second. Tanks and military vehicles are parked on either side of the blasting artillery system, positioned among houses and near the town's railway system. The weapon is not working alone, though. Hovering tens of meters above it and recording the assault is a Ukrainian drone. The drone isn't a sophisticated military system, but a small, commercial machine that anyone can buy.

Watch a swarm of drones navigate a forest without crashing

New Scientist

A new navigation system enables a swarm of 10 lightweight drones to fly together without crashing into one another or obstacles, even in challenging places such as forests. Drones can compute their location and find a path to follow using a panoply of sensors, which can be expensive and unwieldy. Shrinking down a drone often involves getting rid of key components, impacting its ability to travel safely. Xin Zhou at Zhejiang University in China and his colleagues have developed a new method that reduces the size and hardware requirements of a drone while keeping its computing nous. The palm-sized, 300-gram drone uses off-the-shelf computer components powered by a 100-gram battery that can keep it aloft for up to 11 minutes. The drone has a camera that feeds real-time footage to its processing unit.
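The coordination problem the Zhejiang swarm solves – each drone planning a path that respects both obstacles and its neighbours – can be illustrated with a toy potential-field planner. This is not Zhou's method (which relies on trajectory optimization over real sensor data); it is a minimal sketch, with illustrative names and constants, in which each agent is pulled toward its goal and pushed away from any neighbour that comes too close:

```python
import math

def step(positions, goals, dt=0.1, avoid_radius=1.0, k_goal=1.0, k_avoid=2.0):
    """One planning step for a toy 2-D swarm: each agent is attracted to
    its goal and repelled by neighbours closer than avoid_radius."""
    new_positions = []
    for i, (x, y) in enumerate(positions):
        gx, gy = goals[i]
        # attraction toward this agent's goal
        vx, vy = k_goal * (gx - x), k_goal * (gy - y)
        # repulsion from any neighbour inside the avoidance radius
        for j, (ox, oy) in enumerate(positions):
            if i == j:
                continue
            dx, dy = x - ox, y - oy
            d = math.hypot(dx, dy)
            if 1e-9 < d < avoid_radius:
                push = k_avoid * (avoid_radius - d) / d
                vx += push * dx
                vy += push * dy
        new_positions.append((x + vx * dt, y + vy * dt))
    return new_positions

# Two agents fly toward each other's start points; the slight y offset
# lets them slide past one another instead of deadlocking head-on.
pos = [(0.0, 0.0), (5.0, 0.1)]
goals = [(5.0, 0.0), (0.0, 0.0)]
for _ in range(200):
    pos = step(pos, goals)
```

Real swarm planners replace the hand-tuned forces with optimized, dynamically feasible trajectories, but the trade-off is the same: goal progress versus separation.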

AI must be developed responsibly to improve mental health outcomes

Fast Company

The motivation to integrate AI into mental health services has grown during the pandemic. The Kaiser Family Foundation reported an increase in adults experiencing symptoms of anxiety and depression, from 1 in 10 adults pre-pandemic to 4 in 10 adults in early 2021. Coupled with a national shortage of mental health professionals and limited opportunities for in-person support, AI-powered tools could serve as an entry point to care by automatically and remotely measuring mental health symptoms and intervening to reduce them. Many mental health startups are integrating AI within their product offerings. Woebot Health developed a chatbot that delivers on-demand therapy to users through natural language processing (NLP).

How Language-Generation AIs Could Transform Science

Scientific American: Technology

Machine-learning algorithms that generate fluent language from vast amounts of text could change how science is done -- but not necessarily for the better, says Shobita Parthasarathy, a specialist in the governance of emerging technologies at the University of Michigan in Ann Arbor. In a report published on 27 April, Parthasarathy and other researchers try to anticipate societal impacts of emerging artificial-intelligence (AI) technologies called large language models (LLMs). These can churn out astonishingly convincing prose, translate between languages, answer questions and even produce code. The corporations building them -- including Google, Facebook and Microsoft -- aim to use them in chatbots and search engines, and to summarize documents. But the models also sometimes parrot errors or problematic stereotypes found in the millions or billions of documents they're trained on.
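At heart, an LLM is a statistical next-token predictor trained on text, which is also why it reproduces whatever its training data contains. A word-level bigram model – a vastly simplified stand-in for a real LLM, with all names and the corpus below purely illustrative – shows both properties in a few lines:

```python
import random
from collections import defaultdict

def train_bigrams(text):
    """Record, for each word, which words follow it in the training text."""
    model = defaultdict(list)
    words = text.split()
    for a, b in zip(words, words[1:]):
        model[a].append(b)
    return model

def generate(model, start, length=8, seed=0):
    """Sample a continuation: every generated transition was seen in training."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        followers = model.get(out[-1])
        if not followers:
            break  # dead end: the last word never had a successor in training
        out.append(rng.choice(followers))
    return " ".join(out)

corpus = "the model learns the statistics of the text it is trained on"
model = train_bigrams(corpus)
print(generate(model, "the"))
```

Every adjacent word pair the model emits occurs verbatim in its corpus – the toy equivalent of an LLM parroting its training documents, stereotypes and errors included.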

R2-D-Chew: robot chef imitates human eating process to create tastier food

The Guardian

The culinary robots are here – not only to distinguish between food that tastes good and food that doesn't, but also to become better cooks. A robot chef designed by researchers at Cambridge University has been trained to taste a dish's saltiness and its myriad ingredients at different stages of chewing – a process imitating that of humans. It is a step above current electronic testing, which only provides a snapshot of a food's salinity. Replicating the human process, researchers say, should result in a tastier end product. "If robots are to be used for certain aspects of food preparation, it's important that they are able to 'taste' what they're cooking," said Grzegorz Sochacki, one of the researchers, from Cambridge's department of engineering.

Meta wants to improve its AI by studying human brains


If artificial intelligence is intended to resemble a brain, with networks of artificial neurons substituting for real cells, then what would happen if you compared the activity in deep learning algorithms to the activity in a human brain? Last week, researchers from Meta AI announced that they would be partnering with the neuroimaging center Neurospin (CEA) and INRIA to try to do just that. Through this collaboration, they plan to analyze human brain activity and deep learning algorithms trained on language or speech tasks in response to the same written or spoken texts. In theory, this could decode how both human brains and artificial networks find meaning in language. By comparing scans of human brains while a person is actively reading, speaking, or listening with deep learning algorithms given the same set of words and sentences to decipher, researchers hope to find similarities as well as key structural and behavioral differences between brain biology and artificial networks.
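The collaboration's exact analysis pipeline isn't described here, but a standard way to compare a brain and a model responding to the same stimuli is representational similarity analysis: compute each system's matrix of pairwise distances between its responses, then correlate the two matrices. A minimal sketch, with entirely made-up response vectors standing in for brain scans and model activations:

```python
import math

def rdm(responses):
    """Representational dissimilarity matrix: pairwise Euclidean distances
    between one system's responses to each stimulus."""
    n = len(responses)
    return [[math.dist(responses[i], responses[j]) for j in range(n)]
            for i in range(n)]

def upper(m):
    """Flatten the strict upper triangle (the matrix is symmetric)."""
    return [m[i][j] for i in range(len(m)) for j in range(i + 1, len(m))]

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sx = math.sqrt(sum((a - mx) ** 2 for a in xs))
    sy = math.sqrt(sum((b - my) ** 2 for b in ys))
    return cov / (sx * sy)

# Hypothetical responses of a "brain" and a "model" to four sentences:
# the two systems use different units, but their geometry is similar.
brain = [(1.0, 0.2), (0.9, 0.3), (0.1, 1.0), (0.2, 0.8)]
model = [(2.0, 0.1), (1.8, 0.2), (0.0, 2.1), (0.3, 1.9)]
similarity = pearson(upper(rdm(brain)), upper(rdm(model)))
```

Because only the *pattern* of distances is compared, this kind of analysis sidesteps the fact that fMRI voxels and network activations live in completely different spaces – which is precisely what makes brain-to-model comparisons tractable.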

The Hyperscalers Point The Way To Integrated AI Stacks


Enterprises know they want to do machine learning, but they also know they can't afford to think too long or too hard about it. They need to act, and they have specific business problems that they want to solve. And they know, instinctively and anecdotally from the experience of the hyperscalers and the HPC centers of the world, that machine learning techniques can be utterly transformative in augmenting existing applications, replacing hand-coded applications, or creating whole new classes of applications that were not possible before. They also have to decide whether they want to run their AI workloads on premises or on any one of a number of clouds where a lot of the software for creating and training models is available as a service. And let's acknowledge that a lot of those models were created by the public cloud giants for internal workloads long before they were peddled as a service.