
Global Big Data Conference

#artificialintelligence

According to the AI Council, the biggest barrier to AI deployment is skills, and it starts as early as school. With artificial intelligence estimated to be capable of delivering as much as a 10% increase in the UK's GDP by 2030, the challenge remains to unlock the technology's potential, and to do so a panel of AI experts recommends placing a bet on young brains. A new report from the AI Council, an independent committee that advises the UK government on all algorithmic matters, finds that for artificial intelligence to flourish across the country, steps need to be taken from the very start of children's education. The goal for the next ten years should be no less ambitious than to ensure that every child leaves school with a basic sense of how AI works. This is not only about understanding the basics of coding and ethics, but about knowing enough to be a confident user of AI products, to look out for potential risks and to engage with the opportunities the technology presents.


Machine learning and big data are unlocking Europe's archives

#artificialintelligence

From wars to weddings, Europe's history is stored in billions of archival pages across the continent. While many archives try to make their documents public, finding information in them remains a low-tech affair. Simple page scans do not offer the metadata, such as dates, names and locations, that often interests researchers. Copying this information for later use is also time-consuming. These issues are well known in Amsterdam, which is working to open up its entire archives.
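The kind of metadata researchers want from scanned pages, such as dates, can in simple cases be pulled from OCR'd text automatically. A minimal illustrative sketch (not from the article; the sample text, function name and date format are assumptions for the example):

```python
import re

# Hypothetical OCR output from a scanned archive page.
page_text = (
    "Marriage registered in Amsterdam on 12 March 1887 "
    "between Jan de Vries and Anna Bakker."
)

# A very simple date pattern: day, English month name, four-digit year.
DATE_RE = re.compile(
    r"\b(\d{1,2})\s+"
    r"(January|February|March|April|May|June|July|"
    r"August|September|October|November|December)\s+"
    r"(\d{4})\b"
)

def extract_dates(text):
    """Return (day, month, year) tuples found in OCR text."""
    return [m.groups() for m in DATE_RE.finditer(text)]

print(extract_dates(page_text))  # [('12', 'March', '1887')]
```

Real archival pipelines would use trained named-entity models and handle multilingual, handwritten and damaged sources, but the principle of turning raw scans into searchable structured fields is the same.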


A Survey on Data Pricing: from Economics to Data Science

arXiv.org Artificial Intelligence

How can we assess the value of data objectively, systematically and quantitatively? Pricing data, or information goods in general, has been studied and practiced in dispersed areas, drawing on principles from economics, marketing, electronic commerce, data management, data mining and machine learning. In this article, we present a unified, interdisciplinary and comprehensive overview of this important direction. We examine various motivations behind data pricing, explain the economics of data pricing and review the development and evolution of pricing models according to a series of fundamental principles. We discuss both digital products and data products. We also consider a series of challenges and directions for future work.


7 big data goals for 2021: AI, DevOps, hybrid cloud, and more

#artificialintelligence

At IBM Research Switzerland, artificial intelligence (AI) and machine learning assisted researchers in plowing through reams of scientific papers and …


True-data Testbed for 5G/B5G Intelligent Network

arXiv.org Artificial Intelligence

Future beyond-fifth-generation (B5G) and sixth-generation (6G) mobile communications will shift from facilitating interpersonal communications to supporting the Internet of Everything (IoE), where intelligent communications with full integration of big data and artificial intelligence (AI) will play an important role in improving network efficiency and providing high-quality service. As a rapidly evolving paradigm, AI-empowered mobile communications demand large amounts of data acquired from real network environments for systematic test and verification. Hence, we build the world's first true-data testbed for 5G/B5G intelligent network (TTIN), which comprises 5G/B5G on-site experimental networks, data acquisition & data warehouse, and AI engine & network optimization. In the TTIN, true network data acquisition, storage, standardization, and analysis are available, enabling system-level online verification of B5G/6G-oriented key technologies and supporting data-driven network optimization through a closed-loop control mechanism. This paper elaborates on the system architecture and module design of TTIN. Detailed technical specifications and some of the established use cases are also showcased.


Improved Confidence Bounds for the Linear Logistic Model and Applications to Linear Bandits

arXiv.org Machine Learning

We propose improved fixed-design confidence bounds for the linear logistic model. Our bounds significantly improve upon the state-of-the-art bounds of Li et al. (2017) by leveraging the self-concordance of the logistic loss, inspired by Faury et al. (2020). Specifically, our confidence width does not scale with the problem-dependent parameter $1/\kappa$, where $\kappa$ is the worst-case variance of an arm reward. At worst, $\kappa$ scales exponentially with the norm of the unknown linear parameter $\theta^*$. Instead, our bound scales directly with the local variance induced by $\theta^*$. We present two applications of our novel bounds to logistic bandit problems: regret minimization and pure exploration. Our analysis shows that the new confidence bounds improve upon previous state-of-the-art performance guarantees.
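For readers unfamiliar with the notation, the objects behind $\kappa$ can be sketched with the standard definitions of the linear logistic model (these are textbook formulas, not taken verbatim from the paper):

```latex
% Bernoulli reward model for an arm with feature vector x:
%   P(r = 1 \mid x) = \mu(x^\top \theta^*), with the sigmoid link
\mu(z) = \frac{1}{1 + e^{-z}}
% The conditional variance of the reward for arm x is
\dot{\mu}(x^\top \theta^*) = \mu(x^\top \theta^*)\bigl(1 - \mu(x^\top \theta^*)\bigr)
% When |x^\top \theta^*| is large, this variance is exponentially small,
% which is why worst-case factors tied to it can blow up with \|\theta^*\|,
% and why bounds depending only on the local variance are preferable.
```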


Global Big Data and Machine Learning in Telecom Market Expected To Reach Highest CAGR by 2026 : Allot, Argyle data, Ericsson, Guavus, HUAWEI, etc. – The Daily Philadelphian

#artificialintelligence

This research report on the global Big Data and Machine Learning in Telecom market draws together recent developments across the industry to support forward-looking growth planning. Assessing developments in the market, it serves as a 360-degree reference guide, covering the competitive landscape and market size in both value and volume terms, information intended to support decision making in the global Big Data and Machine Learning in Telecom market. Understanding Big Data and Machine Learning in Telecom market segments: an overview: the report aims to improve readers' decision-making, with emphasis on growth planning and resource use, and additionally highlights government initiatives, the regulatory framework and growth policies. Beyond the revenue-generation potential of each segment, the report also notes the vendor initiatives toward segment improvement that play a crucial role in enabling growth.


Key Sessions for AWS Customers at Data + AI Summit Europe 2020 - The Databricks Blog

#artificialintelligence

Databricks and Summit Gold Sponsor AWS present on a wide variety of topics at this year's premier data and AI event. Amazon Web Services (AWS) is sponsoring Data + AI Summit Europe 2020, and our work with AWS continues to make Databricks better integrated with other AWS services, making it easier for our customers to drive major analytics outcomes. As part of Data + AI Summit, we want to highlight some of the top sessions of interest for AWS customers. The sessions below are relevant to customers interested in or using Databricks on the AWS cloud platform, and demonstrate key service integrations. If you have questions about your AWS platform or service integrations, visit the AWS booth at Data + AI Summit.


3 things to know about AWS Glue DataBrew

#artificialintelligence

Amazon Web Services' new visual data preparation tool for AWS Glue allows users to clean and normalize data through an interactive point-and-click visual interface, without writing custom code. AWS Glue DataBrew helps data scientists and data analysts get data ready for analytics and machine learning (ML) 80 percent faster than traditional data preparation approaches, according to the cloud provider, which made the tool generally available on Wednesday. The new offering builds on AWS Glue, which AWS made generally available in April 2017. AWS Glue is a serverless, fully managed extract, transform and load (ETL) service for categorizing, cleaning, enriching and moving data between various data stores. It has a central data repository called the AWS Glue Data Catalog, an ETL engine that generates Python code automatically and a flexible scheduler that handles dependency resolution, job monitoring and retries.
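DataBrew's point-and-click transformations correspond to ordinary clean-and-normalize steps. A rough, hand-written Python equivalent of one such step (purely illustrative; the column names, sample rows and rules here are invented for the example and are not DataBrew's generated code):

```python
# Illustrative only: a hand-rolled version of a typical "clean and
# normalize" step of the kind AWS Glue DataBrew lets users apply visually.

rows = [
    {"customer": "  Alice ", "spend": "1,200.50"},
    {"customer": "BOB",      "spend": "300"},
    {"customer": None,       "spend": "75.25"},
]

def clean_row(row):
    """Trim and title-case the name, fill missing values, parse spend."""
    name = (row["customer"] or "unknown").strip().title()
    spend = float(row["spend"].replace(",", ""))
    return {"customer": name, "spend": spend}

cleaned = [clean_row(r) for r in rows]
print(cleaned)
```

The point of a tool like DataBrew is that analysts compose steps like these interactively and the service applies them at scale, rather than anyone maintaining ad hoc scripts of this kind.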


Global Big Data Conference

#artificialintelligence

My recent claim that fashion needs more imagination when it comes to using artificial intelligence has been unexpectedly answered by a project combining e-commerce data and artisanship. Not an obvious pairing, but the brainchild of passionate 'dataphile' Federico Marchetti, Chairman and CEO of YOOX NET-A-PORTER GROUP, and HRH The Prince of Wales, whose appreciation and support of artisanal craftsmanship (and dedication to safeguarding its future) is decades long. Marchetti and the YOOX NET-A-PORTER team worked with The Prince's Foundation to create a unique year-long apprenticeship to cultivate the next generation of luxury fashion artisans, informed and guided by customer shopping data and AI analysis of millions of images of historically successful products. The aim is to breathe life into artisanship as a viable and attractive career option, underpinned by data that empowers it to deliver the right product, for the right customer, on the right sales platform, while crucially sustaining the artisans' craft methods and their livelihood. The Modern Artisan project brought together six designers from Milan's Politecnico di Milano Fashion in Process (FiP) research laboratory and four apprentices undergoing certified training in small-batch production and hand-craft skills at The Prince's Foundation, Dumfries House, Scotland.