Ontologies


Semantic Development and Integration of Standards for Adoption and Interoperability

IEEE Computer

Semantic applications can help commercial applications perform quickly and reliably by improving ecosystem interoperability. Converting and integrating current standards specifications to OWL models could support the adoption of semantic models, as well as machine-processable standards compliance and data interoperability.


Onboarding to Enterprise Knowledge Graphs - DATAVERSITY

@machinelearnbot

Enterprise Knowledge Graph vendors are working hard to find their place in the heart of businesses, helping them do more with and get more out of their mountains of data. Recently, for example, Stardog has adapted its leading Knowledge Graph platform to be "FIBO-aware," mapping to the Financial Industry Business Ontology (FIBO) semantic standards out-of-the-box. GraphPath launched what it says is the first Knowledge-Graph-as-a-Service (KGaaS) platform. And Maana, with its Knowledge Graph-centered Knowledge Platform, has been talking up its partnerships with clients like Shell to drive digital transformation efforts. As part of these efforts, work is underway to make it easier for businesses to adopt these solutions: for experts like data engineers who will manage the graphs, of course, but also for the business users who will consume data from them via the different applications that developers create.


a16z Podcast: The Taxonomy of Collective Knowledge – Andreessen Horowitz

@machinelearnbot

What do disease diagnostics, language learning, and image recognition have in common? All depend on the organization of collective intelligence: data ontologies. In this episode of the a16z Podcast, guests Luis von Ahn, founder of reCaptcha and Duolingo, Jay Komarneni, founder of HumanDX, a16z General Partner Vijay Pande, and a16z Partner Malinka Walaliyadde break down what data ontologies are, starting from the philosophical (Wittgenstein and Wikipedia!). It is data ontologies, in fact, that enable not only human computation, but that allow us to map out, structure, and scale knowledge creation online, providing order to how we organize massive amounts of information so that humans and machines can coordinate in a way that both understand.


Internet of Humans - GS Lab

#artificialintelligence

The apps silently capture user data, which is then used for ad revenue. Similarly, online retailers started profiling users based on products searched and purchased. While massive data sets of user information are captured by cookies, apps, social media, retailing sites and bots/wearables, the information is still in a silo and resides with an individual player. Equipped with this, you can potentially consume data from any structured or unstructured data source, create RDF, ontologies and RIFs, and then search for an answer to any random question.


Semantic technology underpins conversational AI, other big data uses

@machinelearnbot

And it isn't just AI -- semantic methodologies also support a variety of other applications in big data environments. The buzz: Like AI, semantic technology has hovered on the fringe of mainstream IT consciousness for years. It first came to life in 2001 under the banner of the Semantic Web, a concept based on the Resource Description Framework (RDF), which structures data in graph form. Uses include natural language processing, social networking, customer and healthcare analytics, and AI undertakings from Amazon's Alexa to IBM's Watson.
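The RDF graph structure mentioned above can be sketched without any special tooling: an RDF graph is simply a set of (subject, predicate, object) triples, and queries are pattern matches over those triples. A minimal illustration in Python, with invented resource names (the `ex:` identifiers are placeholders, not real vocabulary terms):

```python
# Minimal sketch of RDF-style data: a graph as a set of
# (subject, predicate, object) triples. All names are illustrative.
triples = {
    ("ex:Alexa", "rdf:type", "ex:VoiceAssistant"),
    ("ex:Alexa", "ex:developedBy", "ex:Amazon"),
    ("ex:Watson", "rdf:type", "ex:AISystem"),
    ("ex:Watson", "ex:developedBy", "ex:IBM"),
}

def match(graph, s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [
        (ts, tp, to) for (ts, tp, to) in graph
        if s in (None, ts) and p in (None, tp) and o in (None, to)
    ]

# Who developed Watson?
print(match(triples, s="ex:Watson", p="ex:developedBy"))
# [('ex:Watson', 'ex:developedBy', 'ex:IBM')]
```

In practice this pattern matching is what SPARQL queries do over an RDF store; the point here is only that the underlying data model is a graph of triples rather than tables.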


A Standard to build Knowledge Graphs: 12 Facts about SKOS

@machinelearnbot

The usage of open standards for data and knowledge models eliminates proprietary vendor lock-in. This builds the basis for a wide range of applications, ranging from semantic search and text mining to data integration and data analytics. In this way, knowledge models become actionable and can help to find answers in unstructured content, trigger alerts, or make better decisions. SKOS is relatively easy to learn and can provide valuable input to make machine learning tasks more precise.
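The core of a SKOS knowledge model is a concept hierarchy built from `skos:broader`/`skos:narrower` relations plus preferred labels. A minimal sketch of how such a hierarchy makes search "actionable" (the concepts and labels are invented for the example):

```python
# Sketch of a SKOS-style concept hierarchy. Each concept points to its
# skos:broader concept; pref_label holds its skos:prefLabel. Invented data.
broader = {
    "ex:EspressoMachine": "ex:CoffeeEquipment",
    "ex:CoffeeEquipment": "ex:KitchenAppliance",
}

pref_label = {
    "ex:EspressoMachine": "espresso machine",
    "ex:CoffeeEquipment": "coffee equipment",
    "ex:KitchenAppliance": "kitchen appliance",
}

def ancestors(concept):
    """Follow skos:broader links to collect all broader concepts."""
    result = []
    while concept in broader:
        concept = broader[concept]
        result.append(concept)
    return result

# A search for "kitchen appliance" can now also surface content
# tagged with the narrower concept "espresso machine".
print([pref_label[c] for c in ancestors("ex:EspressoMachine")])
# ['coffee equipment', 'kitchen appliance']
```

This query-expansion-by-hierarchy pattern is one of the simplest ways a SKOS vocabulary improves semantic search and can feed features into machine learning pipelines.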


Why Cognitive Systems should combine Machine Learning with Semantic Technologies

@machinelearnbot

Many machine learning algorithms were developed to address a well-known problem in AI called the 'Knowledge Acquisition Bottleneck'. It deals with the question of how subject matter experts (SMEs) can be enabled to work together with data scientists on knowledge models in an efficient and sustainable way (see also: Taxonomies and Ontologies – The Yin and Yang of Knowledge Modelling). Machine learning algorithms learn from data, and because of that, successful implementations are strongly related to data quality and the approaches taken to encode the semantics (meaning) of data. Facing the 'Knowledge Acquisition Bottleneck' also means that experts' knowledge is recognized as an essential asset of any organization.


Machine Learning Ontology

@machinelearnbot

Instead of seeing each Machine Learning (ML) method as a "shiny new object", here is an attempt to create a unified picture. There is no consensus when it comes to an ontology for ML methods; organizational principles are simply ways to get our arms around knowledge so that we are not swamped by too many unconnected notions. In chapter 4 ("Modern" ML Method) of my upcoming book, "SYSTEMS Analytics", we develop the basic theory and algorithms for some key blocks in the diagram above. In ML practice, these ML methods are "wrapped" by "bootstrap" and "consensus" methods.
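The "wrapping" of base ML methods by "bootstrap" and "consensus" mentioned above is the idea behind bagging: train the same base learner on many bootstrap resamples of the data, then combine the models by majority vote. A stdlib-only sketch, with a deliberately trivial threshold classifier as the base learner (the learner and data are invented for illustration):

```python
import random

# Sketch of bootstrap + consensus wrapping (bagging). The base learner
# is a trivial 1-D threshold classifier, purely illustrative.

def fit_threshold(points):
    """Base learner: split labeled 1-D points at the mean of the inputs."""
    xs = [x for x, _ in points]
    return sum(xs) / len(xs)

def bagged_predict(data, x, n_rounds=25, seed=0):
    rng = random.Random(seed)
    votes = 0
    for _ in range(n_rounds):
        # Bootstrap: resample the training data with replacement.
        sample = [rng.choice(data) for _ in data]
        threshold = fit_threshold(sample)
        votes += 1 if x > threshold else 0
    # Consensus: majority vote across the bootstrap models.
    return 1 if votes > n_rounds / 2 else 0

data = [(0.1, 0), (0.2, 0), (0.3, 0), (0.8, 1), (0.9, 1), (1.0, 1)]
print(bagged_predict(data, 0.95))  # 1
print(bagged_predict(data, 0.15))  # 0
```

The same wrapping applies unchanged to any base learner; that separability of "method" from "wrapper" is part of what an ML ontology tries to capture.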




Ontologies: Practical Applications

@machinelearnbot

NASA had an analogous problem, and they solved it with the practical application of data management best practices, which included the use of domain specific ontologies[3]. However, any enterprise information architecture intended to enable horizontal communication between disparate data sources, with related and/or potentially different domains (e.g., banking and insurance), must identify a methodology for rapidly merging, and extracting Key Data Elements (KDE) necessary for answering essential competency questions[5]. Whether it is an engine overheating or gases reaching a dangerous level as identified by sensor data, network intrusion detection identified by real time network log monitoring, or social and news media feeds indicating a need for risk reduction procedures to be implemented, the organization that can quickly identify risk and/or opportunity will have a distinct advantage over its competitors. As described above, the practical application of ontologies ranges from NASA integrating data from multiple disparate systems to enable the rapid identification of system failures, to environmental monitoring for oil and gas operations through the Semantic Sensor Network (SSN)[1], to market volatility and risk management in the financial industry.
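The sensor-monitoring use case above hinges on keeping domain knowledge (which readings indicate a hazard) separate from the code that processes the data stream, so the knowledge model can be extended without touching the pipeline. A minimal sketch of that separation, with invented reading types and limits:

```python
# Sketch of ontology-driven risk detection: the domain model describes
# reading types and their safe limits; the processing code only
# consults the model. All classes, units, and limits are invented.
HAZARD_MODEL = {
    "EngineTemperature": {"unit": "C", "max_safe": 110.0},
    "GasConcentration": {"unit": "ppm", "max_safe": 50.0},
}

def assess(reading_type, value):
    """Return an alert when a reading exceeds its modeled safe limit."""
    model = HAZARD_MODEL.get(reading_type)
    if model is None:
        return "unknown reading type"
    if value > model["max_safe"]:
        return f"ALERT: {reading_type} {value}{model['unit']} exceeds safe limit"
    return "ok"

print(assess("EngineTemperature", 128.5))
print(assess("GasConcentration", 12.0))  # ok
```

A real deployment would express the domain model in OWL or the Semantic Sensor Network ontology rather than a Python dict, but the architectural point is the same: new hazard classes are added to the knowledge model, not to the monitoring code.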