More than 18 million new cancer cases are diagnosed globally each year, and radiotherapy is one of the most common cancer treatments--used to treat over half of cancers in the United States. But planning a course of radiotherapy treatment is often a time-consuming, manual process for clinicians. The most labor-intensive step in planning is a technique called "contouring," which involves segmenting both the cancerous areas and the nearby healthy tissues that are susceptible to radiation damage during treatment. Clinicians have to painstakingly draw lines around sensitive organs on scans--a process that can take up to seven hours for a single patient. Technology has the potential to augment the work of doctors and other care providers, like the specialists who plan radiotherapy treatment.
On the other end of the spectrum, the machine might take actions necessary to ensure the survival of a human team member when the human is incapable of doing so. For example, an AI teammate could make the "ejection decision" for a fighter pilot who has lost consciousness or the physical ability to eject. Pietrucha, a retired U.S. Air Force colonel with many flight hours as a fighter/attack aviator, sees promise in such a system--one that "goes beyond the mere analysis of flight parameters and includes analysis of the cognitive state of the aircrew, intervening only when the aircrew can't or won't."
The 2020 presidential election in the United States is just around the corner. This year, the election has been particularly controversial, in part because of the ongoing COVID-19 pandemic and the restrictions the virus has placed on in-person gatherings. In a world in which connected devices and IoT (Internet of Things) technologies have enabled everything from autonomous vehicles to robotic surgery, it seems like there should be other options for casting votes besides sending paper ballots in by mail or turning them in by hand. However, concerns (both legitimate and overblown) about election-outcome accuracy and voter privacy have in many ways held the election process back from the digital revolution that has permeated almost everything else. Will 2020 be a pivotal year in changing how the American people and "the powers that be" feel about modernizing the voting process?
The quality of work life has been an emerging priority for companies over the last several years, and now it's gaining more traction than ever. Today's workforce has become accustomed to using various types of automation and artificial intelligence (AI). These technologies have quickly become staples of company operations, which means certain jobs and tasks no longer have to be done by humans. In fact, according to a study from Stanford and Arizona State Universities, "cities with greater increases in AI-related job postings exhibited greater economic growth." "This relationship was dependent on a city's ability to leverage its inherent capabilities in industry and education to create AI-based employment opportunities," the study read.
In early 2018, officials at University College London were shocked to learn that meetings organized by "race scientists" and neo-Nazis, called the London Conference on Intelligence, had been held at the college for the previous four years. The existence of the conference was surprising, but the choice of location was not. UCL was an epicenter of the early 20th-century eugenics movement--a precursor to Nazi "racial hygiene" programs--due to its ties to Francis Galton, the father of eugenics, and his intellectual descendants and fellow eugenicists Karl Pearson and Ronald Fisher.

In response to protests over the conference, UCL announced this June that it had stripped Galton's and Pearson's names from its buildings and classrooms. After similar outcries about eugenics, the Committee of Presidents of Statistical Societies renamed its annual Fisher Lecture, and the Society for the Study of Evolution did the same for its Fisher Prize. In science, these are the equivalents of toppling a Confederate statue and hurling it into the sea.

But unlike tearing down monuments to white supremacy in the American South, purging statistics of the ghosts of its eugenicist past is not a straightforward proposition. In this version of history, it's as if Stonewall Jackson had developed quantum physics. What we now understand as statistics comes largely from the work of Galton, Pearson, and Fisher, whose names appear in bread-and-butter terms like "Pearson correlation coefficient" and "Fisher information." In particular, the beleaguered concept of "statistical significance," for decades the measure of whether empirical research is publication-worthy, can be traced directly to the trio. Ideally, statisticians would like to divorce these tools from the lives and times of the people who created them. It would be convenient if statistics existed outside of history, but that's not the case.
Sony on Wednesday announced it has cut the FY20 profit outlook for its sensor business by 38% to ¥81 billion after the United States banned it from supplying chips to Huawei. In August, the United States issued sanctions barring Huawei from procuring chips made with US technology by foreign manufacturers such as Sony. Due to this impact, CFO Hiroki Totoki anticipates the sensor business will not make a full recovery in profitability until the fiscal year ending March 2023. "We expect that it will take a long time for other customers to follow the trend to higher-functionality and larger die-sized smartphone cameras that the Chinese customer was leading. Thus, we expect the substantial recovery of profitability driven by these high value-added products to take place in the fiscal year ending March 31, 2023," Totoki said during the results presentation. Despite the hit to its sensor business, the company has raised its annual profit outlook, as it expects its gaming business to grow following the launch of the PlayStation 5 (PS5) next month.
In the winter of 2011, Daniel Yamins, a postdoctoral researcher in computational neuroscience at the Massachusetts Institute of Technology, would at times toil past midnight on his machine vision project. He was painstakingly designing a system that could recognize objects in pictures, regardless of variations in size, position and other properties -- something that humans do with ease. The system was a deep neural network, a type of computational device inspired by the neurological wiring of living brains. "I remember very distinctly the time when we found a neural network that actually solved the task," he said. It was 2 a.m., a tad too early to wake up his adviser, James DiCarlo, or other colleagues, so an excited Yamins took a walk in the cold Cambridge air. "I was really pumped," he said. The accomplishment would have been noteworthy in artificial intelligence alone, one of many that would make neural networks the darlings of AI technology over the next few years.