TITLE:
Heavy-Head Sampling Strategy of Graph Convolutional Neural Networks for q-Consistent Summary-Explanations with Application to Credit Evaluation Systems
AUTHORS:
Xinrui Dou
KEYWORDS:
Summary-Explanation, q-Consistent, Branch-and-Bound, Heavy-Head Sampling Strategy
JOURNAL NAME:
Open Access Library Journal,
Vol.10 No.9,
September 15, 2023
ABSTRACT: Machine learning systems are widely used as auxiliary tools in domains that require critical decision-making, such as healthcare and criminal justice. The interpretability of these systems' decisions is paramount for instilling trust among users. Recent work has developed the globally-consistent rule-based summary-explanation and its max-support problem (MSGC), which provide explanations for specific decisions together with relevant dataset statistics. However, globally-consistent summary-explanations with limited complexity tend to have small supports, if any. In this study, we propose a more lenient variant, the q-consistent summary-explanation, which strives for greater support at the expense of slightly reduced consistency. The challenge is that the max-support problem of the q-consistent summary-explanation (MSqC) is significantly more intricate than the original MSGC problem, leading to extended solution times with standard branch-and-bound (B&B) solvers. We improve the B&B solving process by replacing time-consuming heuristics with machine learning (ML) models, and we apply a heavy-head sampling strategy for imitation learning on MSqC problems by exploiting the heavy-head maximum-depth distribution of B&B solution trees. Experimental results show that strategies trained with the heavy-head sampling strategy achieve significantly better final evaluation results on MSqC problems than strategies trained with the uniform sampling used in previous studies.
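The abstract's core idea can be illustrated with a minimal sketch. The code below is a hypothetical example (not the paper's implementation): given the depths of nodes collected from B&B solution trees, it draws imitation-learning training samples with weights that decay with depth, so that the sample distribution is concentrated at the "head" (shallow depths) where, per the abstract, most of the mass of the maximum-depth distribution lies. The function name, the weighting scheme `(max_depth - d + 1) ** alpha`, and the parameter `alpha` are all illustrative assumptions.

```python
import random
from collections import Counter

def heavy_head_sample(node_depths, n_samples, alpha=2.0, seed=0):
    """Sample node indices for imitation learning, biased toward
    shallow B&B tree depths (the 'heavy head' of the distribution).

    Illustrative sketch only: the polynomial decay weight below is
    an assumed stand-in for the paper's actual sampling strategy.
    """
    rng = random.Random(seed)
    max_depth = max(node_depths)
    # Weight decays with depth, so shallow nodes dominate the sample.
    weights = [(max_depth - d + 1) ** alpha for d in node_depths]
    return rng.choices(range(len(node_depths)), weights=weights, k=n_samples)

# Toy depth data from a small hypothetical B&B tree.
depths = [0, 1, 1, 2, 2, 2, 3, 3, 4, 5]
indices = heavy_head_sample(depths, n_samples=1000)
sampled_depths = Counter(depths[i] for i in indices)
```

Under uniform sampling every node would be equally likely, so the single depth-5 node and the single depth-0 node would appear about equally often; with the depth-decaying weights, the root-level node is sampled far more frequently.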