Ontologies: Instructional Materials


Python for Machine Learning - Classes and Objects

#artificialintelligence

This Python for Machine Learning tutorial will help you learn the Python programming language from scratch. You'll learn about Classes and Objects in Python. Everything in this course is explained with a relevant example, so you will know how to implement the topics you learn.
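
As a rough illustration of the topic the tutorial covers, here is a minimal, hypothetical example of a Python class and two objects created from it (the Rectangle class is an assumption for illustration, not material taken from the course):

```python
# Hypothetical example: a class bundles data (attributes) with behaviour
# (methods); each object created from it is an independent instance.
class Rectangle:
    """A simple shape with a width and a height."""

    def __init__(self, width, height):
        self.width = width    # instance attribute
        self.height = height  # instance attribute

    def area(self):
        """Return the area of this rectangle."""
        return self.width * self.height


# Each call to the class creates a separate object with its own state.
small = Rectangle(2, 3)
large = Rectangle(10, 4)
print(small.area())  # 6
print(large.area())  # 40
```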


Ontologies for the Virtual Materials Marketplace

arXiv.org Artificial Intelligence

The Virtual Materials Marketplace (VIMMP) project, which develops an open platform for providing and accessing services related to materials modelling, is presented with a focus on its ontology development and data technology aspects. Within VIMMP, a system of marketplace-level ontologies is developed to characterize services, models, and interactions between users; the European Materials and Modelling Ontology (EMMO), which is based on mereotopology following Varzi and semiotics following Peirce, is employed as a top-level ontology. The ontologies are used to annotate data that are stored in the ZONTAL Space component of VIMMP and to support the ingest and retrieval of data and metadata at the VIMMP marketplace frontend.


A Heuristically Modified FP-Tree for Ontology Learning with Applications in Education

arXiv.org Machine Learning

We propose a heuristically modified FP-Tree for ontology learning from text. Unlike previous research, for concept extraction we use a regular expression parser approach widely adopted in compiler construction, i.e., deterministic finite automata (DFA). Thus, the concepts are extracted from unstructured documents. For ontology learning, we use a frequent pattern mining approach and employ a rule mining heuristic function to enhance its quality. This process does not rely on predefined lexico-syntactic patterns and is therefore applicable across different subjects. We employ the ontology in a question-answering system for students' content-related questions. For validation, we used textbook questions/answers and questions from online course forums. Subject experts rated the quality of the system's answers on a subset of questions, and their ratings were used to identify the most appropriate automatic semantic text similarity metric to use as a validation metric for all answers. Latent Semantic Analysis was identified as the closest to the experts' ratings. We compared the use of our ontology with the use of Text2Onto for the question-answering system and found that with our ontology 80% of the questions were answered, while with Text2Onto only 28.4% were answered, thanks to the finer-grained hierarchy our approach is able to produce.
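
As a rough, hypothetical sketch of the two ingredients the abstract mentions (pattern-based concept extraction and frequent-pattern mining), the snippet below extracts candidate concepts with a simple regular expression and keeps concept pairs that co-occur in enough sentences. The keyword list, toy corpus, and support threshold are illustrative assumptions; Python's re module and brute-force pair counting stand in for the DFA-based parser and FP-Tree mining described in the paper.

```python
import re
from collections import Counter
from itertools import combinations

# Toy corpus standing in for unstructured course documents (an assumption).
corpus = [
    "An ontology defines concepts and relations between concepts.",
    "Concept extraction finds candidate concepts in unstructured text.",
    "Frequent patterns over concepts suggest relations for the ontology.",
]

# Step 1: extract candidate concepts with a fixed keyword pattern; the paper
# uses a DFA-based regular expression parser over richer patterns instead.
concept_pattern = re.compile(
    r"\b(ontology|concepts?|relations?|patterns?|extraction|text)\b", re.I
)
transactions = [
    {match.lower().rstrip("s") for match in concept_pattern.findall(sentence)}
    for sentence in corpus
]

# Step 2: keep concept pairs that co-occur in at least `min_support` sentences,
# a brute-force stand-in for FP-Tree frequent pattern mining.
min_support = 2
pair_counts = Counter(
    pair for items in transactions for pair in combinations(sorted(items), 2)
)
frequent_pairs = {pair: n for pair, n in pair_counts.items() if n >= min_support}
print(frequent_pairs)  # e.g. {('concept', 'ontology'): 2, ...}
```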


A 20-Year Community Roadmap for Artificial Intelligence Research in the US

arXiv.org Artificial Intelligence

Decades of research in artificial intelligence (AI) have produced formidable technologies that are providing immense benefit to industry, government, and society. AI systems can now translate across multiple languages, identify objects in images and video, streamline manufacturing processes, and control cars. The deployment of AI systems has not only created a trillion-dollar industry that is projected to quadruple in three years, but has also exposed the need to make AI systems fair, explainable, trustworthy, and secure. Future AI systems will rightfully be expected to reason effectively about the world in which they (and people) operate, handling complex tasks and responsibilities effectively and ethically, engaging in meaningful communication, and improving their awareness through experience. Achieving the full potential of AI technologies poses research challenges that require a radical transformation of the AI research enterprise, facilitated by significant and sustained investment. These are the major recommendations of a recent community effort coordinated by the Computing Community Consortium and the Association for the Advancement of Artificial Intelligence to formulate a Roadmap for AI research and development over the next two decades.


Artificial Intelligence: from Research to Application; the Upper-Rhine Artificial Intelligence Symposium (UR-AI 2019)

arXiv.org Artificial Intelligence

The TriRhenaTech alliance universities and their partners presented their competences in the field of artificial intelligence and their cross-border cooperation with industry at the tri-national conference 'Artificial Intelligence: from Research to Application' on March 13th, 2019 in Offenburg. The TriRhenaTech alliance is a network of universities in the Upper Rhine Trinational Metropolitan Region comprising the German universities of applied sciences in Furtwangen, Kaiserslautern, Karlsruhe, and Offenburg, the Baden-Wuerttemberg Cooperative State University Loerrach, the French university network Alsace Tech (comprising 14 'grandes écoles' in the fields of engineering, architecture and management) and the University of Applied Sciences and Arts Northwestern Switzerland. The alliance's common goal is to reinforce the transfer of knowledge, research, and technology, as well as the cross-border mobility of students.



Ontologies for Business Analysis - Udemy

#artificialintelligence

The practice of Business Analysis revolves around the formation, transformation and finalisation of requirements to recommend suitable solutions to support enterprise change programmes. Practitioners working in the field of business analysis apply a wide range of modelling tools to capture the various perspectives of the enterprise, for example, business process perspective, data flow perspective, functional perspective, static structure perspective, and more. These tools aid in decision support and are especially useful in the effort towards the transformation of a business into the "intelligent enterprise", in other words, one which is to some extent "self-describing" and able to adapt to organisational change. However, a fundamental piece remains missing from the puzzle. Achieving this capability requires us to think beyond the idea of simply using the current mainstream modelling tools.


Introduction to Formal Concept Analysis and Its Applications in Information Retrieval and Related Fields

arXiv.org Machine Learning

This paper is a tutorial on Formal Concept Analysis (FCA) and its applications. FCA is an applied branch of Lattice Theory, a mathematical discipline which enables the formalisation of concepts as basic units of human thinking and the analysis of data in object-attribute form. Originating in the early 1980s, it has become over the last three decades a popular human-centred tool for knowledge representation and data analysis with numerous applications. Since the tutorial was specially prepared for RuSSIR 2014, the covered FCA topics include Information Retrieval with a focus on visualisation aspects, Machine Learning, Data Mining and Knowledge Discovery, Text Mining and several others.
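
As a minimal sketch of FCA's central construction, the snippet below enumerates all formal concepts of a tiny object-attribute context by brute force; the toy context is an assumption made for illustration, and real FCA tools use far more efficient algorithms than subset enumeration.

```python
from itertools import chain, combinations

# Toy context: which documents (objects) mention which topics (attributes).
context = {
    "doc1": {"ontology", "retrieval"},
    "doc2": {"ontology", "learning"},
    "doc3": {"ontology", "retrieval", "learning"},
}
objects = list(context)
attributes = set().union(*context.values())

def intent(objs):
    """Attributes shared by all objects in objs (all attributes if objs is empty)."""
    return attributes.copy() if not objs else set.intersection(*(context[o] for o in objs))

def extent(attrs):
    """Objects having all attributes in attrs."""
    return {o for o in objects if attrs <= context[o]}

# Brute force: close every subset of objects and keep the distinct
# (extent, intent) pairs, i.e. the formal concepts of the context.
concepts = set()
for objs in chain.from_iterable(combinations(objects, r) for r in range(len(objects) + 1)):
    b = intent(set(objs))
    a = extent(b)
    concepts.add((frozenset(a), frozenset(b)))

for a, b in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(a), sorted(b))
```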


DynaLearn – An Intelligent Learning Environment for Learning Conceptual Knowledge

AI Magazine

Articulating thought in computer-based media is a powerful means for humans to develop their understanding of phenomena. We have created DynaLearn, an Intelligent Learning Environment that allows learners to acquire conceptual knowledge by constructing and simulating qualitative models of how systems behave. DynaLearn uses diagrammatic representations for learners to express their ideas. The environment is equipped with semantic technology components capable of generating knowledge-based feedback and with virtual characters that enhance the interaction with learners. Teachers have created course material, and successful evaluation studies have been performed. This article presents an overview of the DynaLearn system.


Publishing Math Lecture Notes as Linked Data

arXiv.org Artificial Intelligence

We mark up a corpus of LaTeX lecture notes semantically and expose them as Linked Data in XHTML+MathML+RDFa. Our application makes the resulting documents interactively browsable for students. Our ontology helps to answer queries from students and lecturers, and paves the way towards an integration of our corpus with external sites.
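
As a rough sketch, assuming the rdflib library, of the kind of student query such an ontology can answer; the namespace, resources, and properties below are hypothetical placeholders, not the paper's actual ontology or corpus.

```python
from rdflib import Graph, Namespace, Literal

# Hypothetical namespace for the lecture-note metadata.
EX = Namespace("http://example.org/lecture-notes#")

g = Graph()
g.bind("ex", EX)

# Two hypothetical lecture notes annotated with the topics they cover.
g.add((EX.note1, EX.covers, EX.GroupTheory))
g.add((EX.note1, EX.title, Literal("Lecture 1: Groups")))
g.add((EX.note2, EX.covers, EX.LinearAlgebra))
g.add((EX.note2, EX.title, Literal("Lecture 2: Vector Spaces")))

# A student's query: which notes cover group theory?
results = g.query(
    """
    SELECT ?title WHERE {
        ?note ex:covers ex:GroupTheory ;
              ex:title ?title .
    }
    """,
    initNs={"ex": EX},
)
for row in results:
    print(row.title)  # Lecture 1: Groups
```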