The next year will be pivotal for the Air Force's effort to acquire a new class of autonomous drones, as industry teams compete for a chance to build a fleet of robotic wingmen that will soon undergo operational experimentation. The "Skyborg" program is one of the service's top science-and-technology priorities under the "Vanguard" initiative to deliver game-changing capabilities to its warfighters. The aim is to acquire relatively inexpensive, attritable unmanned aircraft that can leverage artificial intelligence and accompany manned fighter jets into battle. "I expect that we will do sorties where a set number are expected to fly with the manned systems, and we'll have crazy new [concepts of operation] for how they'll be used," Assistant Secretary of the Air Force for Acquisition, Technology and Logistics Will Roper said during an online event hosted by the Mitchell Institute for Aerospace Studies. The platforms might even be called upon to conduct kamikaze missions.
The Radiological Society of North America (RSNA) has launched its fourth annual artificial intelligence (AI) challenge, a competition among researchers to create applications that perform a clearly defined clinical task according to specified performance measures. The challenge for competitors this year is to create machine-learning algorithms to detect and characterize instances of pulmonary embolism. RSNA collaborated with the Society of Thoracic Radiology (STR) to create a massive dataset for the challenge. The RSNA-STR Pulmonary Embolism CT (RSPECT) dataset comprises more than 12,000 CT scans collected from five international research centers. The dataset was labeled with detailed clinical annotations by a group of more than 80 expert thoracic radiologists.
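Challenges like this typically score entries against expert labels using measures such as sensitivity and specificity. As a hedged illustration (the toy labels and function below are assumptions for demonstration, not the RSNA's actual scoring code), here is how those two measures are computed for a binary detector:

```python
# Illustrative only: scoring a hypothetical pulmonary-embolism classifier
# against expert labels with sensitivity and specificity, two common
# performance measures in medical-imaging challenges.

def sensitivity_specificity(y_true, y_pred):
    """Return (sensitivity, specificity) for binary labels (1 = PE present)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    sens = tp / (tp + fn) if (tp + fn) else 0.0  # true-positive rate
    spec = tn / (tn + fp) if (tn + fp) else 0.0  # true-negative rate
    return sens, spec

# Toy labels: 1 = radiologist annotated PE on the scan, 0 = no PE.
labels      = [1, 1, 1, 0, 0, 0, 0, 1]
predictions = [1, 1, 0, 0, 0, 1, 0, 1]
sens, spec = sensitivity_specificity(labels, predictions)
print(sens, spec)  # 0.75 0.75
```

In practice, challenge organizers usually combine several such measures (e.g., a weighted log loss per study and per image), but the sensitivity/specificity trade-off is the core idea behind "specified performance measures."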
The U.S. military recently conducted a live-fire full combat replication with unmanned-to-unmanned teaming guiding attacks, small reconnaissance drones, satellites sending target coordinates to ground artillery and high-speed, AI-enabled "networked" warfare. This exercise was a part of the Army's Project Convergence 2020, a weapons and platform combat experiment which, service leaders say, represents a massive transformation helping the service pivot its weapons use, tactics and maneuver strategies into a new era. Taking place at Yuma Proving Grounds, Arizona, Project Convergence involved live-fire war experiments aligned in three distinct phases, intended to help the Army cultivate its emerging modern Combined Arms Maneuver strategy.
As the holiday draws closer, people across the country are still asking what Halloween will look like this year. Some areas have discussed canceling events like trick-or-treating, while it appears that others are looking to invent new ways to keep the tradition alive. Luke Keyes still plans on giving out candy to trick-or-treaters this year, even if he has to go high-tech for his solutions, KVUE reports.
September 23, 2020 – An artificial intelligence algorithm can detect subtle signs of osteoarthritis in MRI scans, years before symptoms of the condition even begin. Researchers at University of Pittsburgh School of Medicine and Carnegie Mellon University College of Engineering noted that right now, the primary treatment for osteoarthritis is joint replacement. The condition is so prevalent that knee replacement is the most common surgery in the US for people over the age of 45. "The gold standard for diagnosing arthritis is x-ray. As the cartilage deteriorates, the space between the bones decreases," said study co-author Kenneth Urish, MD, PhD, associate professor of orthopaedic surgery at Pitt and associate medical director of the bone and joint center at UPMC Magee-Womens Hospital. "The problem is, when you see arthritis on x-rays, the damage has already been done. It's much easier to prevent cartilage from falling apart than trying to get it to grow again."
The housing market continues to defy gravity. Sales of existing homes rose more than 10% last month compared to a year ago, hitting their highest level since December 2006, according to the National Association of Realtors. And now, more than ever, people are relying on online platforms to search for -- and even buy -- houses. And that opens the door for artificial intelligence to play a bigger role, like using computer vision to create real estate listings based on photos. I spoke with Christopher Geczy, a professor at the Wharton School of the University of Pennsylvania who teaches about real estate and insurance technology.
Dr. Dina Demner-Fushman is a Staff Scientist at the Lister Hill National Center for Biomedical Communications, NLM. Demner-Fushman is a lead investigator in several NLM projects in the areas of Information Extraction for Clinical Decision Support, EMR Database Research and Development, and Image and Text Indexing for Clinical Decision Support and Education. The outgrowths of these projects are the evidence-based decision support system in use at the NIH Clinical Center since 2009, an image retrieval engine, Open-i, launched in 2012, and an automatic question answering service. Dr. Demner-Fushman earned her doctor of medicine degree from Kazan State Medical Institute in 1980, and a clinical research doctorate (PhD) in medical science from Moscow Medical and Stomatological Institute in 1989. She earned her MS and PhD in Computer Science from the University of Maryland, College Park in 2003 and 2006, respectively.
Machine learning, a branch of artificial intelligence (AI), can have a considerable carbon footprint. Deep learning is inherently costly, as it requires massive computational and energy resources. Now researchers in the U.K. have discovered how to create an energy-efficient artificial neural network without sacrificing accuracy and published the findings in Nature Communications on August 26, 2020. The biological brain is the inspiration for neuromorphic computing, an interdisciplinary approach that draws upon neuroscience, physics, artificial intelligence, computer science, and electrical engineering to create artificial neural systems that mimic biological functions and systems. The human brain is a complex system of roughly 86 billion neurons and hundreds of trillions of synapses.
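One reason spiking, brain-inspired networks can be more energy-efficient is that a neuron only produces output (a spike) when its accumulated input crosses a threshold, staying silent otherwise. A minimal sketch of this idea is the leaky integrate-and-fire (LIF) neuron; the parameter values below are illustrative assumptions, not taken from the study described above:

```python
# A minimal leaky integrate-and-fire (LIF) neuron, the basic unit of many
# spiking/neuromorphic systems. Leak and threshold values are illustrative.

def lif_neuron(inputs, leak=0.9, threshold=1.0):
    """Simulate one LIF neuron over a sequence of input currents.

    Returns the spike train: 1 where the membrane potential crossed the
    threshold (neuron fires and resets), 0 otherwise.
    """
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # integrate input, with leak
        if potential >= threshold:              # fire and reset
            spikes.append(1)
            potential = 0.0
        else:
            spikes.append(0)                    # stay silent: no energy spent
    return spikes

print(lif_neuron([0.4, 0.4, 0.4, 0.0, 0.8, 0.5]))  # [0, 0, 1, 0, 0, 1]
```

Because computation happens only at spikes, hardware built around this model can idle between events instead of performing dense multiply-accumulate operations on every input, which is where much of the energy saving comes from.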
Researchers from several American universities are collaborating to develop artificial intelligence based software to help people on the autism spectrum find and hold meaningful employment. The project is a collaboration between experts at Vanderbilt, Yale, Cornell and the Georgia Institute of Technology. It consists of developing multiple pieces of technology, each one aimed at a different aspect of supporting people with Autism Spectrum Disorder (ASD) in the workplace, according to Nilanjan Sarkar, professor of engineering at Vanderbilt University and the leader of the project. "We realized together that there are some support systems for children with autism in this society, but as soon as they become 18 years old and more, there is a support cliff and the social services are not as much," Sarkar said. The project began a year ago with preliminary funding from the National Science Foundation. The NSF initially invested in around 40 projects, but only four -- including this one -- were chosen to be funded for a longer term of two years.
While getting to grips with open banking regulation, skyrocketing transaction volumes and expanding customer expectations, banks have been rolling out major transformations of data infrastructure and partnering with Silicon Valley's most innovative tech companies to rebuild the banking business around a central nervous system. This can also be labelled as event stream processing (ESP), which connects everything happening within the business, including applications and data systems, in real time. ESP allows banks to respond to individual data points (events) produced by a system that continuously creates data (the stream), and to leverage that data through aggregation, analytics, transformation, enrichment and ingestion. Further, ESP is instrumental where batch processing falls short and when action needs to be taken in real time, rather than on static data or data at rest. However, handling a flow of continuously created data requires a special set of technologies.
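The contrast with batch processing can be sketched in a few lines: instead of collecting transactions and aggregating them later, an ESP-style consumer updates its aggregates the moment each event arrives. The event shapes and names below are assumptions for illustration (a real deployment would consume from a platform such as Kafka rather than a Python list):

```python
# Toy event-stream-processing sketch: per-account balances are updated as
# each transaction event arrives, rather than in a later batch job.

from collections import defaultdict

def transaction_stream():
    """Stand-in for a live event source, e.g. a message-queue consumer."""
    events = [
        {"account": "A", "amount": 120.0},
        {"account": "B", "amount": 40.0},
        {"account": "A", "amount": -30.0},
        {"account": "B", "amount": 15.0},
    ]
    yield from events

def running_balances(stream):
    """Consume events one at a time, yielding an aggregate after each."""
    balances = defaultdict(float)
    for event in stream:
        balances[event["account"]] += event["amount"]  # aggregation step
        yield dict(balances)  # real-time snapshot, usable immediately

final = None
for snapshot in running_balances(transaction_stream()):
    final = snapshot  # downstream systems could act on every snapshot
print(final)  # {'A': 90.0, 'B': 55.0}
```

The key design point is that each intermediate snapshot is available for action (fraud checks, alerts, enrichment) as soon as its event is processed, whereas a batch job would only expose the final state after the run completes.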