
Kernel Density Decision Trees

March 2022 Conference Paper
Jack H. Good (Carnegie Mellon University, Robotics Institute), Kyle Miller (Carnegie Mellon University, Robotics Institute), Artur Dubrawski (Carnegie Mellon University, Robotics Institute)

This paper was presented at the 2022 AAAI Spring Symposium on AI Engineering.


Software Engineering Institute


We propose kernel density decision trees (KDDTs), a novel fuzzy decision tree (FDT) formalism based on kernel density estimation that improves the generalization and robustness of decision trees and offers additional utility. FDTs mitigate the sensitivity and overfitting tendency of decision trees by representing uncertainty through fuzzy partitions. However, compared to conventional, crisp decision trees, FDTs are generally complex to apply, sensitive to design choices, slow to fit and make predictions with, and difficult to interpret. Moreover, finding the optimal threshold for a given fuzzy split is challenging, leading to methods that discretize data, settle for near-optimal thresholds, or fuzzify crisp trees after the fact. Our KDDTs address these shortcomings by representing uncertainty intuitively through kernels and by using a novel, scalable generalization of the CART algorithm that finds optimal partitions for FDTs with piecewise-linear splitting functions, or equivalently KDDTs with piecewise-constant fitting kernels. We demonstrate prediction performance against conventional decision trees and tree ensembles on 12 publicly available datasets.
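To make the core idea concrete, the sketch below contrasts a crisp split with a fuzzy one. It is an illustrative assumption, not the authors' implementation: a box (uniform) kernel of half-width `bandwidth` is integrated over one side of the threshold, which yields exactly a piecewise-linear membership (splitting) function of the kind the abstract associates with piecewise-constant fitting kernels. The function names and the choice of kernel are hypothetical.

```python
import numpy as np

def crisp_membership(x, threshold):
    # Conventional decision tree: each point is assigned entirely (0 or 1)
    # to the right child depending on which side of the threshold it falls.
    return (x >= threshold).astype(float)

def fuzzy_membership(x, threshold, bandwidth):
    # Fuzzy split via a box kernel of half-width `bandwidth`: the fraction
    # of the kernel centered at x that lies to the right of the threshold.
    # This is a piecewise-linear splitting function: 0 well left of the
    # threshold, 1 well right of it, and linear in between.
    return np.clip((x - threshold) / (2.0 * bandwidth) + 0.5, 0.0, 1.0)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
crisp = crisp_membership(x, threshold=0.0)
fuzzy = fuzzy_membership(x, threshold=0.0, bandwidth=1.0)
# crisp jumps abruptly from 0 to 1 at the threshold, while fuzzy assigns
# partial membership near it, e.g. 0.5 exactly at the threshold.
```

Points far from the threshold behave as in a crisp tree, while points near it contribute fractionally to both children; this smoothing is what reduces sensitivity to small perturbations of the input.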