
Knowledge Graphs: Powerful Structures Making Sense Of Data - AI Summary


And in both cases, the end goal of their knowledge graphs is similar: to add value to the vast amount of data out there so that it can be utilised more meaningfully and intelligently in a real-world context, ultimately producing much smarter user experiences. "The need to fit products into tabular structures limits their ability to flex to real-world needs," Capco noted in its June 2020 publication "Knowledge Graphs: Building Smarter Financial Services". Moreover, by enabling linkages between data items that would otherwise remain disparate and siloed off from one another, knowledge graphs could represent crucial technology for helping to solve some of the world's most pressing and complex data-related challenges. The singular, centralised nature of such control can also raise serious privacy concerns for users, as was the case with Facebook and its notorious data-harvesting activities with Cambridge Analytica prior to the 2016 US presidential election. The knowledge graph also allows supply-chain entities to "granularly define who has access to what data--i.e., data can be made fully public, shared with specific supply chain partners, or completely private".

The Role of Knowledge Graphs in Artificial Intelligence


Representing knowledge, and the reasoning behind the conclusions drawn from it, has remained a cornerstone of artificial intelligence (AI) for decades. A knowledge graph (KG) is a powerful data structure that represents information in graph form. DBpedia, an open-source knowledge graph, defines a knowledge graph as "a special kind of database which stores knowledge in a machine-readable form and provides a means for information to be collected, organised, shared, searched and utilised." Formally, a KG is a directed labeled graph whose edges represent relations between data points; each node of the KG represents a data point.
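The directed-labeled-graph definition above can be sketched in a few lines of Python, with each fact stored as a (subject, relation, object) triple. The entities, relation names, and `objects` helper here are illustrative assumptions, not taken from DBpedia or any particular system.

```python
# A minimal sketch of a KG as a directed labeled graph:
# each edge is a (subject, relation, object) triple, and
# each node is a data point (entity).
kg = [
    ("Berlin", "capital_of", "Germany"),
    ("Germany", "member_of", "European Union"),
    ("Berlin", "population", "3.7 million"),
]

def objects(kg, subject, relation):
    """Follow the labeled edge `relation` out of node `subject`."""
    return [o for s, r, o in kg if s == subject and r == relation]

print(objects(kg, "Berlin", "capital_of"))  # ['Germany']
```

Real knowledge graphs store billions of such triples and index them for fast lookup, but the underlying structure is the same.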

Reasoning with Language Models and Knowledge Graphs for Question Answering


From search engines to personal assistants, we use question-answering systems every day. When we ask a question ("Where was the painter of the Mona Lisa born?"), the system needs to gather background knowledge ("The Mona Lisa was painted by Leonardo da Vinci", "Leonardo da Vinci was born in Italy") and reason over it to produce the answer ("Italy"). In recent AI research, such background knowledge is commonly available in the form of knowledge graphs (KGs) and language models (LMs) pre-trained on a large set of documents. In KGs, entities are represented as nodes and relations between them as edges. Examples of KGs include Freebase (general-purpose facts) [1], ConceptNet (commonsense) [2], and UMLS (biomedical facts) [3].

Why knowledge graphs are key to working with data efficiently, powerfully


This post is by Dr. Mukta Paliwal, senior data scientist at Persistent Systems. As many as 50% of Gartner client inquiries on the topic of artificial intelligence involve discussion of graph technology, the market research firm said in its Top 10 Data and Analytics Trends for 2021. Every large enterprise wants to exploit its available data to gain more insights for doing business at scale.

Querying in the Age of Graph Databases and Knowledge Graphs Artificial Intelligence

Graphs have become the best way we know of representing knowledge. The computing community has investigated and developed support for managing graphs by means of digital technology, and graph databases and knowledge graphs have emerged as the most successful solutions to this effort. This tutorial will provide a conceptual map of the data management tasks underlying these developments, paying particular attention to data models and query languages for graphs.

JointGT: Graph-Text Joint Representation Learning for Text Generation from Knowledge Graphs Artificial Intelligence

Existing pre-trained models for knowledge-graph-to-text (KG-to-text) generation simply fine-tune text-to-text pre-trained models such as BART or T5 on KG-to-text datasets, largely ignoring the graph structure during encoding and lacking elaborate pre-training tasks to explicitly model graph-text alignments. To tackle these problems, we propose a graph-text joint representation learning model called JointGT. During encoding, we devise a structure-aware semantic aggregation module which is plugged into each Transformer layer to preserve the graph structure. Furthermore, we propose three new pre-training tasks to explicitly enhance graph-text alignment, including text and graph reconstruction, and graph-text alignment in the embedding space via Optimal Transport. Experiments show that JointGT obtains new state-of-the-art performance on various KG-to-text datasets.

Knowledge Graphs and Machine Learning in biased C4I applications Artificial Intelligence

This paper introduces our position on the critical issue of bias that recently appeared in AI applications. Specifically, we discuss the combination of current technologies used in AI applications i.e., Machine Learning and Knowledge Graphs, and point to their involvement in (de)biased applications of the C4I domain. Although this is a wider problem that currently emerges from different application domains, bias appears more critical in C4I than in others due to its security-related nature. While proposing certain actions to be taken towards debiasing C4I applications, we acknowledge the immature aspect of this topic within the Knowledge Graph and Semantic Web communities.

Learning Knowledge Graph-based World Models of Textual Environments Artificial Intelligence

World models improve a learning agent's ability to efficiently operate in interactive and situated environments. This work focuses on the task of building world models of text-based game environments. Text-based games, or interactive narratives, are reinforcement learning environments in which agents perceive and interact with the world using textual natural language. These environments contain long, multi-step puzzles or quests woven through a world that is filled with hundreds of characters, locations, and objects. Our world model learns to simultaneously: (1) predict changes in the world caused by an agent's actions when representing the world as a knowledge graph; and (2) generate the set of contextually relevant natural language actions required to operate in the world. We frame this task as a Set of Sequences generation problem by exploiting the inherent structure of knowledge graphs and actions, and introduce both a transformer-based multi-task architecture and a loss function to train it. A zero-shot ablation study on never-before-seen textual worlds shows that our methodology significantly outperforms existing textual world modeling techniques and demonstrates the importance of each of our contributions.
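As a rough illustration of point (1), a textual-world state can be held as a knowledge graph of triples that an action mutates. This hand-written update rule for a "take" action is a hypothetical stand-in: the paper's model learns to predict such graph changes rather than applying fixed rules.

```python
# Hypothetical sketch: world state as a KG of triples, mutated by an action.
state = {
    ("player", "at", "kitchen"),
    ("apple", "in", "kitchen"),
}

def apply_action(state, action):
    """Hand-written rule for a 'take <object>' action: move the object
    from wherever it is into the player's inventory."""
    verb, obj = action.split(" ", 1)
    if verb == "take":
        old = {(s, r, o) for s, r, o in state if s == obj and r == "in"}
        state = (state - old) | {(obj, "in", "inventory")}
    return state

new_state = apply_action(state, "take apple")
print(("apple", "in", "inventory") in new_state)  # True
```

The learned world model replaces this rule with a transformer that, given the current graph and an action string, generates the set of triples to add and remove.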

Query Embedding on Hyper-relational Knowledge Graphs Artificial Intelligence

Multi-hop logical reasoning is an established problem in the field of representation learning on knowledge graphs (KGs). It subsumes both one-hop link prediction and other, more complex types of logical queries. Existing algorithms operate only on classical, triple-based graphs, whereas modern KGs often employ a hyper-relational modeling paradigm. In this paradigm, typed edges may have several key-value pairs known as qualifiers that provide fine-grained context for facts. In queries, this context modifies the meaning of relations and usually reduces the answer set. Hyper-relational queries are often observed in real-world KG applications, and existing approaches for approximate query answering cannot make use of qualifier pairs. In this work, we bridge this gap and extend the multi-hop reasoning problem to hyper-relational KGs, allowing us to tackle this new type of complex query. Building upon recent advancements in Graph Neural Networks and query embedding techniques, we study how to embed and answer hyper-relational conjunctive queries. Besides that, we propose a method to answer such queries and demonstrate in our experiments that qualifiers improve query answering on a diverse set of query patterns.
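The qualifier mechanism can be illustrated with a small sketch: each fact carries a dictionary of key-value qualifiers, and supplying qualifiers in a query narrows the answer set, as the abstract describes. The facts and the `answer` helper below are illustrative assumptions, not the paper's method, which embeds such queries with graph neural networks rather than matching them symbolically.

```python
# Hyper-relational facts: a triple plus key-value qualifier pairs.
facts = [
    ("Einstein", "educated_at", "ETH Zurich", {"degree": "BSc"}),
    ("Einstein", "educated_at", "University of Zurich", {"degree": "PhD"}),
]

def answer(facts, subject, relation, qualifiers=None):
    """One-hop query; any supplied qualifiers must match, so they
    narrow (never widen) the answer set."""
    qualifiers = qualifiers or {}
    return [
        o for s, r, o, q in facts
        if s == subject and r == relation
        and all(q.get(k) == v for k, v in qualifiers.items())
    ]

print(answer(facts, "Einstein", "educated_at"))
print(answer(facts, "Einstein", "educated_at", {"degree": "PhD"}))
```

Without qualifiers the query returns both universities; adding `{"degree": "PhD"}` cuts the answer set down to one, which is exactly the behavior a hyper-relational query embedding must capture.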

An Intelligent Question Answering System based on Power Knowledge Graph Artificial Intelligence

The intelligent question answering (IQA) system can accurately capture users' search intention by understanding natural language questions, searching relevant content efficiently from a massive knowledge base, and returning the answer directly to the user. Since the IQA system can save inestimable time and workforce in data search and reasoning, it has received more and more attention in data science and artificial intelligence. This article introduces a domain knowledge graph built from massive heterogeneous electric-power data using graph database and graph computing technologies. It then proposes an IQA system based on the electrical power knowledge graph that extracts the intent and constraints of natural-language questions using natural language processing (NLP) methods, constructs graph-data query statements via knowledge reasoning, and completes accurate knowledge search and analysis to provide users with an intuitive visualisation. This method thoroughly combines the characteristics of knowledge graphs and graph computing, realising high-speed multi-hop knowledge-correlation reasoning over a massive knowledge base. The proposed work can also provide a basis for context-aware intelligent question answering.