
Government taps AI and machine learning to analyse Test and Trace data

#artificialintelligence

The Joint Biosecurity Centre has enlisted The Alan Turing Institute and Royal Statistical Society to lend their statistical-modelling and machine-learning expertise to help predict coronavirus outbreaks and inform the response. Under a partnership announced last week with the JBC – set up by the government in May to support coronavirus decision-making – the RSS and The Alan Turing Institute, the UK's national institute for data science and artificial intelligence, will set up a virtual statistical-modelling and machine-learning laboratory to provide independent analysis of NHS Test and Trace data. The lab will give the JBC a "deeper understanding of how the virus is spreading across the country and the epidemiological consequences", the Department of Health and Social Care said in an announcement. "Statistical modelling helps data scientists to predict what the virus might do next, based on what is understood about it already," it added. The JBC, which advises on Covid-19 alert levels to inform the pandemic response, has been working with Public Health England to provide real-time analysis about infection outbreaks to national and local government bodies.


Data Science as a Service Industry Overview and Growth Outlook

#artificialintelligence

Data science is a discipline used for analyzing large amounts of data. It examines where information comes from, what it contains, and how it can be transformed into a valuable asset for creating business opportunities and information technology strategies. Data science includes data discovery, which relies on data inference and exploration techniques. It also uses mathematical and algorithmic methods to solve complex business problems and uncover hidden information. Data science helps companies operate intelligently and develop strategies derived from evidence-based analytics.


Billions of Rows, Milliseconds of Time- Pyspark Starter Guide

#artificialintelligence

Data is the new oil, with one major difference: unlike oil, data keeps growing day by day. The growth in data size is outpacing the speed and cost of RAM upgrades, which makes smart data handling with multiple cores, parallel processing, chunking, and similar techniques a necessity. PySpark is a Python API for Spark, a parallel and distributed engine for running big data applications. This article is an attempt to help you get up and running on PySpark in no time! It compares the performance of various techniques for reading and aggregating a CSV file (33 million rows, 5.7 GB). The major speed advantage of Dask and PySpark comes from using all of the machine's cores in a master-worker node setup.
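As a rough illustration of the workflow the article describes, here is a minimal PySpark sketch that reads a CSV and runs a grouped aggregation using every local core. The file path and the column names ("category", "amount") are assumptions made for the example, not values from the article.

```python
# Minimal PySpark sketch: read a large CSV and aggregate it using all local cores.
# The file path and column names ("category", "amount") are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .master("local[*]")          # use every core on the machine
    .appName("csv-aggregation")
    .getOrCreate()
)

df = spark.read.csv("large_file.csv", header=True, inferSchema=True)

# Lazy transformation; Spark only executes it when an action (show/collect) is called.
agg = df.groupBy("category").agg(
    F.count("*").alias("rows"),
    F.sum("amount").alias("total_amount"),
)

agg.show()
spark.stop()
```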


A Tour of Dependable Computing Research in Latin America

Communications of the ACM

Computing technology has become pervasive and with it the expectation for its ready availability when needed, thus basically at all times. Dependability is the set of techniques to build, configure, operate, and manage computer systems to ensure that they are reliable, available, safe, and secure [1]. But alas, faults seem to be inherent to computer systems. Components can simply crash or produce incorrect output due to hardware or software bugs or can be invaded by impostors that orchestrate their behavior. Fault tolerance is the ability to enable a system as a whole to continue operating correctly and with acceptable performance, even if some of its components are faulty [3]. Fault tolerance is not new; von Neumann himself designed techniques for computers to survive faults [4].


Factories of The Future Are Using Machine Learning Analytics to Optimize Assets

#artificialintelligence

From food to cars to complex manufacturing machinery, quality is a top concern of manufacturers. Factors such as safety, efficiency, and reliability affect product quality and ultimately influence customer satisfaction. Sourcing, design, testing, and inspection all play a crucial role in ensuring products meet the bar when it comes to quality. Product inspections at early stages in the production cycle help reduce risks and cost. While inspections can be conducted at any point throughout the production process, the goal is to identify, contain and resolve issues as quickly as possible.


Openness a Big Advantage as Edge Grows, IBM Says

#artificialintelligence

A large computational build-up is predicted to occur on the edge in the coming years, as organizations look to capture and act upon data as close to the moment it is generated as possible, when it has the highest value. Today, there are few standards and protocols defined for how all this is going to work. But in the meantime, hardware and software providers, including IBM, are espousing the benefits of an open ecosystem approach. The edge, which includes server rooms, cell towers, and smaller data centers deployed in the field, is set to proliferate over the next five years, according to IDC. By 2025, 50% of new on-premises infrastructure will be deployed in edge locations, up from 10% today, the company says.


Emerging AI and Data Driven Supervisory Technology for Regulatory Compliance

#artificialintelligence

Data Push: Push-based strategies are the default model. Delivery is automated against a pre-determined specification: a forwarder is installed close to the data source, or built into the data generator/collector, and pushes events to an indexer. Data Pull: This approach provides significant flexibility by letting you create reports from multiple data sources and multiple data sets, and by letting you store and manage reports with an enterprise reporting server. Pull-based collection, however, is not reliable for real-time reports and information. Because a pull-based system tolerates latency, its lack of real-time information makes it a poor fit for supervising financial institutions, which demand real-time reporting with deeper insight into the financial health of FIs. Supervisors can use machine learning tools to create a "risk score" for supervised entities. FINTRAC, the Financial Transactions and Reports Analysis Centre of Canada, has created one such score, evaluating risk factors related to an institution's profile, compliance history, reporting behavior, and more.
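To make the "risk score" idea concrete, here is a hedged sketch of how a supervisor might train such a score with scikit-learn. The features, synthetic data, and model choice are assumptions for illustration only and do not reflect FINTRAC's actual methodology.

```python
# Illustrative sketch only: training a "risk score" for supervised institutions.
# Features and labels are synthetic; this is not FINTRAC's model.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical features per institution: profile size, past compliance findings,
# late-report rate, transaction-report volume (all scaled to 0-1).
rng = np.random.default_rng(0)
X = rng.random((500, 4))
y = (X[:, 1] + X[:, 2] > 1.0).astype(int)  # synthetic "high risk" label

model = GradientBoostingClassifier().fit(X, y)

# The predicted probability of the "high risk" class can serve as a 0-1 risk score.
new_institution = np.array([[0.3, 0.8, 0.6, 0.2]])
risk_score = model.predict_proba(new_institution)[0, 1]
print(f"risk score: {risk_score:.2f}")
```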


The role of data in industry 4.0 - Connected Technology Solutions

#artificialintelligence

The challenges encountered by manufacturing companies when it comes to handling data are well reported, but what can they do to ensure that data is an asset rather than a problem? Data has long been treated in the manufacturing industry as the orphan nephew living in the cupboard under the stairs. While operational and service industries have leapt on the benefits of data as the catalyst of business growth and efficiency gains, the manufacturing sector has been slow to adopt the culture of becoming a data-driven business. According to Accenture, only 13 per cent of manufacturing companies have seen through a digital transformation of their processes. "In many ways the core approach to manufacturing has remained unchanged for the past 50 years despite the industry experimenting with offshoring and integrated manufacturing in mega factories," Tim Hall, VP products, InfluxData, says.


SCRN-VRC/Pix2Pix-in-a-Fragment-Shader

#artificialintelligence

A simplified version of the pix2pix generator inside a fragment shader for Unity. This implementation is only 1/4 the size of the original pix2pix model to help with real-time performance in VR. If you wish to run the Python code, here's what you need. If you wish to run the C code.
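To illustrate what shrinking a pix2pix generator to a quarter of its width can look like, here is a hedged PyTorch sketch of a tiny encoder-decoder with a skip connection, using 16 base filters instead of the original 64. This is only an illustration of the idea; it is not the repository's shader code or its actual architecture.

```python
# Hedged sketch of a reduced-width pix2pix-style generator (illustration only).
import torch
import torch.nn as nn

def down(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 4, stride=2, padding=1),
        nn.LeakyReLU(0.2, inplace=True),
    )

def up(in_ch, out_ch):
    return nn.Sequential(
        nn.ConvTranspose2d(in_ch, out_ch, 4, stride=2, padding=1),
        nn.ReLU(inplace=True),
    )

class TinyPix2PixGenerator(nn.Module):
    def __init__(self, base=16):           # the full-size pix2pix generator uses 64
        super().__init__()
        self.e1 = down(3, base)
        self.e2 = down(base, base * 2)
        self.d1 = up(base * 2, base)
        self.d2 = nn.ConvTranspose2d(base * 2, 3, 4, stride=2, padding=1)

    def forward(self, x):
        s1 = self.e1(x)
        s2 = self.e2(s1)
        u1 = self.d1(s2)
        u1 = torch.cat([u1, s1], dim=1)    # skip connection, as in U-Net/pix2pix
        return torch.tanh(self.d2(u1))

img = torch.randn(1, 3, 256, 256)
out = TinyPix2PixGenerator()(img)
print(out.shape)  # torch.Size([1, 3, 256, 256])
```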


Splunk takes aim at multicloud, machine learning and observability - SiliconANGLE

#artificialintelligence

Splunk's Data-to-Everything Platform is an all-encompassing suite of analytics tools that helps enterprises search, correlate, analyze, monitor and report on data in real time, available through its Splunk Cloud and Splunk Enterprise products. Today's slew of updates at the virtual event are all about expanding customers' multicloud capabilities, giving them new ways to set the right data strategy and improve access to the information their businesses generate, Splunk said. For example, the Splunk Data Stream Processor, an event streaming platform, is being updated with new capabilities that enable it to access, process and route real-time data from multiple cloud services, including Google LLC's Cloud Platform and Microsoft Corp.'s Azure Event Hub. In addition, event data can now be enriched with lookups and machine learning functionality, which helps to minimize compute loads and improve accuracy when searching through this data. Moreover, the Data-to-Everything Platform is getting a new Splunk Machine Learning Environment that will make it easier for companies to build and operationalize machine learning models by bringing data from multiple sources into a single platform.
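For readers unfamiliar with lookup enrichment, here is a generic Python illustration of the concept: attaching metadata from a lookup table to streaming events before they are indexed. This is plain Python, not Splunk's Data Stream Processor API, and the event fields and lookup table are assumptions made for the example.

```python
# Generic illustration of lookup-based event enrichment (not Splunk's API).
asset_lookup = {
    "10.0.0.5": {"owner": "payments-team", "environment": "prod"},
    "10.0.0.9": {"owner": "analytics-team", "environment": "staging"},
}

def enrich(event: dict) -> dict:
    """Attach ownership metadata to an event based on its source IP."""
    extra = asset_lookup.get(event.get("src_ip"),
                             {"owner": "unknown", "environment": "unknown"})
    return {**event, **extra}

events = [
    {"src_ip": "10.0.0.5", "action": "login", "latency_ms": 42},
    {"src_ip": "10.0.0.9", "action": "query", "latency_ms": 310},
]

for e in events:
    print(enrich(e))
```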