US Northern Command, the military command designated to protect North America from attack, has lobbied Congress for $29.8 million to expand its IT infrastructure to better support machine-learning technologies. The request is part of the command's unfunded priorities list for fiscal year 2023, a wish list of the gear and tech US NORTHCOM and the North American Aerospace Defense Command (NORAD) reckon they need for building and testing new weapons or operating monitoring systems, totaling some $135 million. The requested IT funding will, we're told, be used to procure cloud computing infrastructure to run AI workloads from US NORTHCOM and NORAD's joint operations center, according to Defense News this week. The goal is to build smart cloud-hosted systems that can process incoming data, generate insights and decision options, and make this intelligence available across the Department of Defense for leaders to consider. "Maintaining our strategic advantage begins with improving domain awareness globally, including in the approaches to North America," General Glen VanHerck, commander of US NORTHCOM and NORAD, said in testimony before the Senate Armed Services Committee in late March.
The information technology field offers incredible opportunities for skilled professionals, and a computer science master's degree puts graduates in a position to capitalize. The U.S. Bureau of Labor Statistics (BLS) projects the addition of more than 660,000 new jobs in computer occupations between 2020 and 2030. An advanced computer science degree can lead to some of the most in-demand positions among them. Master's graduates are equipped to work in cybersecurity, big data, cloud computing, and software and application development, some of the fastest-growing and most integral IT fields. Here, we rank the best computer science master's programs in the country. We also examine the computer science discipline and degree levels more closely.
Unless the intelligence community changes the way it defines intelligence and adopts cloud computing, it will wind up behind adversaries, private interests, and even the public in knowing what might happen, according to a new report from the Center for Strategic and International Studies. Intelligence collection to predict broad geopolitical and military events has historically been the job of well-funded and expertly staffed government agencies like the CIA or the NSA. But, the report argues, the same institutional elements that allowed the government to create those agencies are now slowing them down in an era of large, publicly available datasets and enterprise cloud capabilities. The report, scheduled to be released Wednesday, looks at a hypothetical "open-source, cloud-based, AI-enabled reporting," or OSCAR, tool for the intelligence community, one that could help it much more rapidly detect and act on clues about major geopolitical or security events. The report lists the procedural, bureaucratic, and cultural barriers within the intelligence community that block such a tool's development and use by U.S. spy agencies.
Oracle's gargantuan $28.3 billion acquisition of health care data company Cerner, the largest deal in its 44-year history, is not just about electronic patient records. From algorithmic systems that predict the likelihood a patient will contract sepsis to tech that tracks hospital bed capacity, Cerner will bring an array of cloud-based data analytics and AI technologies to Oracle as it competes with Amazon Web Services, Google, IBM and others to serve the health care industry's data and AI needs. In fact, the deal is poised to shift some business away from AWS, which Cerner named as its preferred cloud partner in 2019. Oracle's acquisition of Cerner, a company that got its start in health care IT in 1979, is expected to close in 2022. The all-cash deal is also expected to improve Oracle's bottom line in its first year, the company said in a press release.
The three-year contract will be split across multiple bidders. It replaces the 10-year, $10 billion JEDI cloud-computing contract terminated in July, which was planned to consolidate the Pentagon's patchwork of data systems to give defense personnel better access to real-time information and artificial-intelligence capabilities. The Pentagon said the contract was canceled because of its evolving needs. The project was mired in years of squabbling between Microsoft Corp., which won the bidding, and Amazon.com Inc., which contended the process was politically motivated under the Trump administration.
The ubiquitous availability of computing devices and the widespread use of the internet continuously generate vast amounts of data. The amount of information available on any given topic therefore far exceeds humans' capacity to process it, causing what is known as information overload. To cope efficiently with large amounts of information and generate content of significant value to users, we need to identify, merge and summarise information. Data summaries can gather related information into a shorter format that enables answering complicated questions, gaining new insight and discovering conceptual boundaries. This thesis focuses on three main challenges in alleviating information overload using novel summarisation techniques. It further aims to facilitate the analysis of documents to support personalised information extraction. The thesis separates the research issues into four areas, covering (i) feature engineering in document summarisation, (ii) traditional static and inflexible summaries, (iii) traditional generic summarisation approaches, and (iv) the need for reference summaries. We propose novel approaches to tackle these challenges by: (i) enabling automatic intelligent feature engineering, (ii) enabling flexible and interactive summarisation, and (iii) utilising intelligent and personalised summarisation approaches. The experimental results demonstrate the efficiency of the proposed approaches compared with other state-of-the-art models. We further propose solutions to the information overload problem in different domains through summarisation, covering network traffic data, health data and business process data.
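To make the extractive end of this space concrete, here is a minimal frequency-based summariser sketch: it scores each sentence by the corpus frequency of its words and keeps the top few in document order. This is a generic baseline for illustration only, not one of the thesis's proposed methods, and the function name is an assumption.

```python
import re
from collections import Counter

def extractive_summary(text, n_sentences=2):
    """Pick the n highest-scoring sentences, preserving document order."""
    # Naive sentence split on terminal punctuation followed by whitespace.
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    # Word frequencies over the whole document serve as importance weights.
    freq = Counter(re.findall(r'\w+', text.lower()))
    # Score each sentence as the sum of its word frequencies.
    scored = sorted(
        enumerate(sentences),
        key=lambda item: -sum(freq[w] for w in re.findall(r'\w+', item[1].lower())),
    )
    # Keep the top-n sentence indices, then re-sort into reading order.
    top = sorted(idx for idx, _ in scored[:n_sentences])
    return ' '.join(sentences[i] for i in top)
```

Frequency scoring is exactly the kind of hand-tuned feature engineering that challenge (i) above seeks to automate: the weights here are fixed heuristics rather than learned or personalised signals.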
The Pentagon yesterday announced it was scuttling its long-doomed "Project JEDI," a cloud-services AI contract that was awarded to Microsoft in 2019. Up front: Project JEDI is a big deal. The US military needs a reliable cloud-service platform from which to operate its massive AI infrastructure. Unfortunately the project was mishandled from the very beginning. Today, the Department of Defense (DoD) canceled the Joint Enterprise Defense Infrastructure (JEDI) Cloud solicitation and initiated contract termination procedures.
In 2019, the U.S. Postal Service needed to identify and track items in its torrent of more than 100 million pieces of daily mail. A USPS AI architect had an idea. Ryan Simpson wanted to expand an image analysis system a postal team was developing into something much broader that could tackle this needle-in-a-haystack problem. With edge AI servers strategically located at its processing centers, he believed USPS could analyze the billions of images each center generated. The resulting insights, expressed in a few key data points, could be shared quickly over the network.
"If your dad would just wear a space suit, I could monitor him." It's not often that a random joke leads to the creation of a company, but that's exactly what happened with Ejenta, a digital health startup. Maarten Sierhuis, a NASA alum, had made the comment to Rachna Dhamija, a tech veteran and his future cofounder. Both were dealing with aging parents who had health issues. Sierhuis had spent 12 years as a senior research scientist at NASA, where he used sensors and artificial intelligence (AI) to monitor astronauts in space.
In this paper, we propose a framework for predicting frame errors in the collaborative, spectrally congested wireless environments of the DARPA Spectrum Collaboration Challenge (SC2), using a recently collected dataset. We employ distributed deep edge learning shared among edge nodes and a central cloud. Using this close-to-practice dataset, we find that widely used federated learning approaches, especially those that are privacy preserving, perform worse than local training across a wide range of settings. We therefore apply the synthetic minority oversampling technique to preserve privacy by avoiding the transfer of local data to the cloud, and use knowledge distillation to benefit from the cloud's high computing and storage capabilities. The proposed framework achieves better overall performance than both local and federated training, while remaining robust against catastrophic failures as well as challenging channel conditions that result in high frame error rates.
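The knowledge-distillation step can be sketched as a temperature-softened KL divergence between the cloud model's (teacher) predictions and an edge model's (student) predictions. This is the standard generic distillation loss, not necessarily the paper's exact formulation, and all names below are illustrative.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T yields softer distributions."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on softened distributions, scaled by T^2.

    The T^2 factor keeps gradient magnitudes comparable across temperatures,
    as in standard knowledge distillation.
    """
    p = softmax(teacher_logits, T)  # soft targets from the cloud teacher
    q = softmax(student_logits, T)  # edge student's softened predictions
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1)
    return float(np.mean(kl) * T ** 2)
```

In an edge-cloud setup like the one described, each edge node would minimise this term (plus its ordinary supervised loss) against teacher logits computed in the cloud, so no raw local frames need to leave the node.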