DARPA digs into the details of practical quantum computing -- GCN

#artificialintelligence

Quantum computing promises enough computational power to solve problems far beyond the capabilities of the fastest digital computers, so the Defense Advanced Research Projects Agency is laying the groundwork for applying the technology to real-world problems. In a request for information, DARPA is asking how quantum computing can enable new capabilities when it comes to solving science and technology problems, such as understanding complex physical systems, optimizing artificial intelligence and machine learning, and enhancing distributed sensing. Noting that it is not interested in solving cryptology issues, DARPA is asking the research community to help solve challenges of scale, environmental interactions, connectivity and memory, and to suggest "hard" science and technology problems the technology could be leveraged to solve. Areas of interest include establishing the fundamental limits of quantum computing: how problems should be framed, when a model's scale requires a quantum-based solution, how to manage connectivity and errors, the size of potential speed gains and the ability to break large problems into smaller pieces that can map to several quantum platforms. DARPA also wants to improve machine learning by leveraging a hybrid quantum/classical computing approach to decrease the time required to train machine learning models.


National lab cracks big data security problem -- GCN

#artificialintelligence

Lawrence Livermore National Laboratory is looking for a partner to help further develop and commercialize its method for securely processing protected data in high-performance computing clusters. The lab saw the need for a way to secure data in high-performance computing centers and in cloud environments so that it could meet regulatory and privacy requirements. Traditional HPC systems run their simulation and analysis tasks across hundreds or thousands of compute nodes that work together. Many users' jobs can run simultaneously, and the user need not be present when the job is launched on the cluster. Basic cybersecurity, on the other hand, requires user authentication, access control, encryption of data at rest and in transit, audits of sensitive data and secure management of encryption keys and logs.


How machine learning can improve COVID testing -- GCN

#artificialintelligence

On June 18, the Food and Drug Administration authorized the use of pooled testing for identifying COVID-19 infections. The method allows up to four swabs to be tested at once – a strategy that is expected to greatly expand frequent testing to larger sections of the population. The idea is that if a bundled sample comes back positive, then all the individuals in that sample will need to be tested separately. If a bundled sample comes back clean, however, that's four people who don't need to be tested further, saving public health officials time and money. The FDA said it expects pooling will allow virus identification with fewer tests, which means more tests could be run at once, fewer testing supplies would be consumed and patients could likely receive results more quickly.
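The savings described above follow the classic Dorfman pooling scheme: one test screens the whole pool, and only a positive pool triggers individual retests. A minimal sketch of the expected test count (the pool size of four matches the FDA authorization; the prevalence value is purely illustrative):

```python
def expected_tests_per_person(prevalence, pool_size):
    """Expected tests per person under Dorfman pooled testing.

    One test screens the entire pool; if the pool is positive
    (at least one infected member), every member is retested
    individually. Assumes independent infections and a perfect test.
    """
    # Probability that a pool of this size contains at least one positive.
    p_pool_positive = 1 - (1 - prevalence) ** pool_size
    # One screening test, plus pool_size follow-up tests when positive.
    expected_tests_per_pool = 1 + pool_size * p_pool_positive
    return expected_tests_per_pool / pool_size

# At 1% prevalence, pools of four need roughly 0.29 tests per person,
# versus 1.0 for individual testing.
print(expected_tests_per_person(0.01, 4))
```

At high prevalence the advantage disappears: when most pools come back positive, nearly every pool incurs the screening test plus all the individual retests, which is why pooling is pitched as a strategy for broad surveillance of mostly healthy populations.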


Getting started with intelligent automation -- GCN

#artificialintelligence

As agencies look to automation to lower costs, improve efficiency and achieve higher customer satisfaction, many wonder where to start. To start them on their journey, ACT-IAC created the Intelligent Automation Primer. The document defines IA as the marriage of automation with artificial intelligence, but not all automation need be so sophisticated. It can range from desktop automation tools that leverage scripts and macros, to robotic process automation that uses simple rules to process structured data, to enhanced RPA, which addresses more-complex tasks using unstructured data from multiple sources. Full-blown IA applications, which can "sense and synthesize vast amounts of information and can automate entire processes or workflows, learning and adapting as they go," range from making decisions about text-based information to guiding autonomous vehicles, according to the primer.


DARPA's plan to speed machine learning development -- GCN

#artificialintelligence

The Defense Advanced Research Projects Agency wants to make the process of training machine-learning models more efficient. Currently, training an ML model requires that the test data be labeled, which means humans must identify the images or specific phrases in text that the algorithm should learn to recognize. The more labeled data the system can review, the more complete its training and the better the eventual results. Amassing enough clean, consistently labeled data to train an algorithm is expensive and time-consuming. If, for example, a company wants to analyze user reviews of its products, it will need "at least 90,000 reviews to build a model that performs adequately," according to AltexSoft, a software R&D engineering firm.