Transportation Secretary nominee Mayor Pete Buttigieg takes questions at his confirmation hearing. WASHINGTON (AP) – A Senate panel on Wednesday easily advanced President Joe Biden's nomination of Pete Buttigieg to be transportation secretary, setting up a final confirmation vote for a key role in Biden's push to rebuild the nation's infrastructure and confront climate change. The Commerce Committee approved the nomination of Buttigieg, a 39-year-old former mayor of South Bend, Indiana, on a 21-3 vote. His nomination now heads to the full Senate, where a vote could happen as early as this week. He would be the first openly gay person, and one of the youngest, confirmed by the Senate to a Cabinet post.
Astronauts on Wednesday will conduct the first spacewalk of 2021 from the International Space Station (ISS). It will be the first of a pair of spacewalks, with the second set for Feb. 1. Both walks are planned to last 6.5 hours, according to NASA.
This paper introduces BioScript, a domain-specific language (DSL) for programmable biochemistry that executes on emerging microfluidic platforms. The goal of this research is to provide a simple, intuitive, and type-safe DSL that is accessible to life science practitioners. The novel feature of the language is its syntax, which aims to optimize human readability; the technical contribution of the paper is the BioScript type system. The type system ensures that certain classes of errors specific to biochemistry, such as interactions between chemicals that may be unsafe, do not occur. Results are obtained using a custom-built compiler that implements the BioScript language and type system.

The last two decades have witnessed the emergence of software-programmable laboratory-on-a-chip (pLoC) technology, enabled by advances in microfabrication coupled with scientific understanding of microfluidics, the fundamental science of fluid behavior at the micro- to nanoliter scale. The net result of these collective advancements is that many experimental laboratory procedures have been miniaturized, accelerated, and automated, similar in principle to how the world's earliest computers automated tedious mathematical calculations that were previously performed by hand. Although the vast majority of microfluidic devices are effectively application-specific integrated circuits (ASICs), a variety of programmable LoCs have been demonstrated.16 With a handful of exceptions, research on programming languages and compiler design for programmable LoCs has lagged behind that of their silicon counterparts. To address this gap, this paper presents a DSL and type system for a specific class of pLoC that manipulates discrete droplets of liquid on a two-dimensional grid. The basic principles of the language and type system readily generalize to programmable LoCs realized across a wide variety of microfluidic technologies.
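As a rough illustration of the idea behind such a type system, consider the following hypothetical Python sketch. This is not BioScript's actual syntax or implementation; the reactive groups and the incompatibility table are invented for illustration. The point is that unsafe chemical combinations can be rejected statically, before any droplets move on the chip.

```python
# Hypothetical sketch of a safety-typing check for fluid mixing.
# The groups and UNSAFE_PAIRS table are illustrative, not from BioScript.

UNSAFE_PAIRS = {
    frozenset({"acid", "base"}),
    frozenset({"oxidizer", "organic-solvent"}),
}

class Fluid:
    """A fluid annotated with the reactive groups it belongs to."""
    def __init__(self, name: str, groups: set):
        self.name = name
        self.groups = groups

def check_mix(a: Fluid, b: Fluid) -> None:
    """Static check performed by the 'compiler': raise a TypeError if
    the two fluids belong to mutually reactive groups."""
    for ga in a.groups:
        for gb in b.groups:
            if frozenset({ga, gb}) in UNSAFE_PAIRS:
                raise TypeError(
                    f"unsafe mix: {a.name} ({ga}) with {b.name} ({gb})")

# A safe mix passes the check silently.
check_mix(Fluid("buffer", {"aqueous"}), Fluid("sample", {"aqueous"}))
```

An assay that attempted to combine, say, an acid with a base would be rejected with a type error at compile time rather than causing an unsafe reaction on the device.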
On Feb 15, 2019, John Abowd, chief scientist at the U.S. Census Bureau, announced the results of a reconstruction attack that the bureau proactively launched using data released under the 2010 Decennial Census.19 The decennial census released billions of statistics about individuals, such as "how many people of the age 10-20 live in New York City" or "how many people live in four-person households." Using only the data publicly released in 2010, an internal team was able to correctly reconstruct records of address (by census block), age, gender, race, and ethnicity for 142 million people (about 46% of the U.S. population), and correctly match these data to commercial datasets circa 2010 to associate personal-identifying information such as names for 52 million people (17% of the population). This is not specific to the U.S. Census Bureau--such attacks can occur in any setting where statistical information in the form of deidentified data, statistics, or even machine learning models is released. That such attacks are possible was predicted over 15 years ago by a seminal paper by Irit Dinur and Kobbi Nissim12--releasing a sufficiently large number of aggregate statistics with sufficiently high accuracy provides enough information to reconstruct the underlying database with high accuracy. The practicality of such a large-scale reconstruction by the U.S. Census Bureau underscores the grand challenge that public organizations, industry, and scientific research face: How can we safely disseminate results of data analysis on sensitive databases? An emerging answer is differential privacy. An algorithm satisfies differential privacy (DP) if its output is insensitive to adding, removing, or changing one record in its input database. DP is considered the "gold standard" for privacy for a number of reasons.
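The DP definition can be made concrete with randomized response, the simplest differentially private algorithm (a minimal sketch; the helper names are ours, not from any system described here). Each respondent reports their true bit with probability p and flips it otherwise; the check below verifies that no output is more than a factor of e^epsilon likelier under one input than under a neighboring one.

```python
import math

def randomized_response_probs(true_answer: bool, p: float = 0.75):
    """Output distribution of randomized response: with probability p
    report the true answer, otherwise report its negation."""
    return {True: p if true_answer else 1 - p,
            False: (1 - p) if true_answer else p}

p = 0.75
# Randomized response with p = 0.75 satisfies epsilon-DP for
# epsilon = ln(p / (1 - p)) = ln(3).
epsilon = math.log(p / (1 - p))

# Neighboring inputs: the single record is True vs. False.
dist_true = randomized_response_probs(True, p)
dist_false = randomized_response_probs(False, p)

# The DP inequality: every output is at most e^epsilon more likely
# under one input than under the other.
for output in (True, False):
    assert dist_true[output] <= math.exp(epsilon) * dist_false[output] + 1e-12
    assert dist_false[output] <= math.exp(epsilon) * dist_true[output] + 1e-12
```

Because the bound holds for every output, an observer cannot confidently infer any single individual's true answer, no matter what side information they hold.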
It provides a persuasive mathematical guarantee of privacy to individuals, with several rigorous interpretations.25,26 The DP guarantee is composable: repeated invocations of differentially private algorithms lead to a graceful degradation of privacy.
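Composability can be illustrated with the textbook Laplace mechanism (a hedged sketch in Python; the function names and even budget-splitting strategy are illustrative, not a specific production library). Under sequential composition, answering k queries with budget epsilon/k each keeps the total privacy loss within epsilon.

```python
import math
import random

def laplace_mechanism(true_value: float, epsilon: float,
                      sensitivity: float = 1.0) -> float:
    """Release a noisy answer to a numeric query whose value changes by
    at most `sensitivity` when one record is added, removed, or changed.
    Adding Laplace(sensitivity / epsilon) noise yields epsilon-DP."""
    scale = sensitivity / epsilon
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    # Inverse-CDF sampling of the Laplace distribution.
    noise = -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)
    return true_value + noise

def answer_queries(true_counts, total_epsilon):
    """Sequential composition: split the privacy budget evenly across
    the queries so the released answers jointly satisfy total_epsilon-DP."""
    per_query_epsilon = total_epsilon / len(true_counts)
    return [laplace_mechanism(c, per_query_epsilon) for c in true_counts]

random.seed(0)  # seeded only to make this demo reproducible
noisy = answer_queries([120, 85, 240], total_epsilon=1.0)
```

The "graceful degradation" is visible in the budget split: each additional query either consumes more total budget or forces noisier individual answers, which is exactly the accuracy-versus-statistics trade-off the Dinur-Nissim result implies.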
Over the past decade, calls for better measures to protect sensitive, personally identifiable information have blossomed into what politicians like to call a "hot-button issue." Certainly, privacy violations have become rampant and people have grown keenly aware of just how vulnerable they are. When it comes to potential remedies, however, proposals have varied widely, leading to bitter, politically charged arguments. To date, what has chiefly come of that has been bureaucratic policies that satisfy almost no one--and infuriate many. Now, into this muddled picture comes differential privacy. First formalized in 2006, it is an approach based on a mathematically rigorous definition of privacy that allows the guarantees a system offers against re-identification to be formalized and proved. While differential privacy has been accepted by theorists for some time, its implementation has turned out to be subtle and tricky, with practical applications only now starting to become available. To date, differential privacy has been adopted by the U.S. Census Bureau, along with a number of technology companies, but what this means and how these organizations have implemented their systems remains a mystery to many. It's also unlikely that the emergence of differential privacy signals an end to all the difficult decisions and trade-offs, but it does signify that there now are measures of privacy that can be quantified and reasoned about--and then used to apply suitable privacy protections. A milestone in the effort to make this capability generally available came in September 2019, when Google released an open source version of the differential privacy library that the company has used with many of its core products.
In the exchange that follows, two of the people at Google who were central to the effort to release the library as open source--Damien Desfontaines, privacy software engineer; and Miguel Guevara, who leads Google's differential privacy product development effort--reflect on the engineering challenges that lie ahead, as well as what remains to be done to achieve their ultimate goal of providing privacy protection by default.
Saving the Los Angeles school year has become a race against the clock -- as campuses are unlikely to reopen until teachers are vaccinated against COVID-19 and infection rates decline at least three-fold, officials said Monday. The urgency to salvage the semester in L.A. and throughout the state was underscored by new research showing the depth of student learning loss and by frustrated parents who organized statewide to pressure officials to bring back in-person instruction. A rapid series of developments Monday -- involving the governor, L.A. Unified School District, the teachers union and the county health department -- foreshadowed the uncertainties that will play out in the high-stakes weeks ahead for millions of California students. "We're never going to get back if teachers can't get vaccinated," said Assemblyman Patrick O'Donnell (D-Long Beach), who chairs the state's Assembly Education Committee and has two high schoolers learning from home. He expressed frustration that educators are not being prioritized by the L.A. County Health Department even as teachers in Long Beach are scheduled for vaccines this week. Although Long Beach is part of L.A. County, it operates its own independent health agency.
Innovations in artificial intelligence (AI) have fundamentally changed the email security landscape in recent years, but it can often be hard to determine what makes one system different from the next. In reality, that umbrella term conceals significant differences in approach that may determine whether the technology provides genuine protection or merely a perceived notion of defense.

The Rise of Fearware

When the global pandemic hit and governments began enforcing travel bans and imposing stringent restrictions, there was undoubtedly a collective sense of fear and uncertainty. As explained in this blog, cybercriminals were quick to capitalize, taking advantage of people's desire for information to send out topical emails related to COVID-19 containing malware or credential-grabbing links. These emails often spoofed the Centers for Disease Control and Prevention (CDC) and, later on, as the economic impact of the pandemic took hold, the Small Business Administration (SBA).
The U.S. government is tasked with protecting classified data and combating potential threats, an area of growing concern given the increasing use of web-based applications required for remote work. Because of this high demand, the teams charged with safeguarding data need a new way--or new capabilities--to scale cybersecurity efforts, especially as many government agencies also face limited resources alongside massively growing data sets and feeds. Pushed by the pandemic, governments are accelerating digital transformation efforts to apply artificial intelligence to cybersecurity, as it brings capabilities beyond what manual human surveillance can provide. In fact, the Defense Department's investment in AI has increased from $600 million in fiscal 2016 to $2.5 billion in fiscal 2021. The security operations center is the "mothership" of security within government agencies.
In each of its annual budget requests, the Trump administration proposed deep cuts to federal research spending, which Congress consistently refused to enact. The administration's 2021 proposal, however, did seek to promote AI and quantum computing research: it asked to double funding for those fields across the National Science Foundation, the National Institutes of Health, the Department of Energy, Darpa, and the Joint AI Center, to $2 billion annually. Though decried as wholly inadequate given the field's rate of technical advance, that funding bump would also have come at the expense of other basic sciences at those same agencies, as well as an overall reduction in research and development spending of 9 percent from 2020, to $142.2 billion. "I find it disappointing and concerning that funding for basic research is down," Martijn Rasser, a senior fellow at the Center for a New American Security, told Wired in 2020.