Induction of Non-Monotonic Rules From Statistical Learning Models Using High-Utility Itemset Mining
Farhad Shakerin, Gopal Gupta
We present a fast and scalable algorithm to induce non-monotonic logic programs from statistical learning models. We reduce the problem of searching for the best clauses to instances of the High-Utility Itemset Mining (HUIM) problem. In the HUIM formulation, feature values and their importance weights are treated as transactions and utilities, respectively. We make use of TreeExplainer, a fast and scalable implementation of the Explainable AI tool SHAP, to extract locally important features and their weights from ensemble tree models. Our experiments on standard UCI benchmarks suggest a significant improvement in classification evaluation metrics and training time compared to ALEPH, a state-of-the-art Inductive Logic Programming (ILP) system.
May-28-2019
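As a minimal illustrative sketch (not the authors' code), the pipeline described in the abstract can be approximated as follows: TreeExplainer computes per-example SHAP values for a tree ensemble, and each example's top locally important feature values are turned into a transaction whose item utilities are the absolute SHAP weights, ready for a HUIM-style search. The helper `to_transaction`, the `top_k` thresholding, and the choice of XGBoost are assumptions for illustration.

```python
# Hypothetical sketch of the SHAP-to-transactions step; details are assumed.
import numpy as np
import shap
import xgboost as xgb
from sklearn.datasets import load_breast_cancer

# Train any tree ensemble; XGBoost is used here purely for illustration.
X, y = load_breast_cancer(return_X_y=True, as_frame=True)
model = xgb.XGBClassifier(n_estimators=100, max_depth=4).fit(X, y)

# TreeExplainer computes SHAP values for tree ensembles efficiently,
# which is what makes the feature-weight extraction step scalable.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)  # shape: (n_samples, n_features)

def to_transaction(row, shap_row, top_k=5):
    """Turn one example into a transaction: the top-k locally important
    (feature, value) pairs are the items, and their absolute SHAP weights
    are the items' utilities (an assumed construction)."""
    order = np.argsort(-np.abs(shap_row))[:top_k]
    return [((X.columns[j], row.iloc[j]), float(abs(shap_row[j]))) for j in order]

# One transaction per training example; a HUIM algorithm would then mine
# high-utility itemsets from this database to serve as candidate clause bodies.
transactions = [to_transaction(X.iloc[i], shap_values[i]) for i in range(len(X))]
print(transactions[0])
```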