Ontology Neural Networks for Topologically Conditioned Constraint Satisfaction

Oh, Jaehong

arXiv.org Machine Learning 

Abstract--Neuro-symbolic reasoning systems face fundamental challenges in maintaining semantic coherence while satisfying physical and logical constraints. Building upon our previous work on Ontology Neural Networks, we present an enhanced framework that integrates topological conditioning with gradient stabilization mechanisms. The approach employs Forman-Ricci curvature to capture graph topology, Deep Delta Learning for stable rank-one perturbations during constraint projection, and Covariance Matrix Adaptation Evolution Strategy for parameter optimization. Experimental evaluation across multiple problem sizes demonstrates that the method achieves mean energy reduction to 1.15 compared to baseline values of 11.68, with a 95 percent success rate in constraint satisfaction tasks. The framework exhibits seed-independent convergence and graceful scaling behavior up to twenty-node problems, suggesting that topological structure can inform gradient-based optimization without sacrificing interpretability or computational efficiency.

Integrating symbolic reasoning with neural learning remains a central challenge in artificial intelligence. While neural networks excel at pattern recognition and gradient-based optimization, they often struggle to maintain explicit constraints or provide interpretable intermediate representations. The opacity of deep neural representations makes it difficult to verify whether learned policies respect domain knowledge or physical laws. Conversely, symbolic systems offer logical transparency and formal guarantees but lack the flexibility to learn from noisy, incomplete data or adapt to distributional shifts.
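The topological signal named above, Forman-Ricci curvature, has a particularly simple combinatorial form on unweighted, undirected graphs: the curvature of an edge (u, v) is 4 - deg(u) - deg(v). As a hedged sketch (this is the standard simplified formula, not necessarily the exact variant used in the paper), it can be computed as:

```python
def forman_ricci(edges):
    """Simplified combinatorial Forman-Ricci curvature for an
    unweighted, undirected graph given as a list of edges:
    F(u, v) = 4 - deg(u) - deg(v)."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    return {(u, v): 4 - deg[u] - deg[v] for u, v in edges}

# Triangle graph: every node has degree 2, so each edge has curvature 0.
print(forman_ricci([(0, 1), (1, 2), (0, 2)]))
# Path graph 0-1-2: the end nodes have degree 1, so each edge has curvature 1.
print(forman_ricci([(0, 1), (1, 2)]))
```

Negative curvature concentrates on edges joining high-degree nodes, which is what lets this quantity flag topologically "stressed" regions of a constraint graph cheaply, in time linear in the number of edges.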
