TY - JOUR
T1 - The Best Way to Select Features? Comparing MDA, LIME, and SHAP
JF - The Journal of Financial Data Science
SP - 127
LP - 139
DO - 10.3905/jfds.2020.1.047
VL - 3
IS - 1
AU - Man, Xin
AU - Chan, Ernest P.
Y1 - 2021/01/31
UR - https://pm-research.com/content/3/1/127.abstract
N2 - Feature selection in machine learning is subject to the intrinsic randomness of the feature selection algorithms (e.g., random permutations during MDA). Stability of the selected features with respect to such randomness is essential to the human interpretability of a machine learning algorithm. The authors propose a rank-based stability metric, called the instability index, to compare the stabilities of three feature selection algorithms (MDA, LIME, and SHAP) as applied to random forests. Typically, features are selected by averaging many random iterations of a selection algorithm. Although the variability of the selected features does decrease as the number of iterations increases, it does not go to zero, and the features selected by the three algorithms do not necessarily converge to the same set. LIME and SHAP are found to be more stable than MDA, and LIME is at least as stable as SHAP for the top-ranked features. Hence, overall, LIME is best suited for human interpretability. However, the selected sets of features from all three algorithms significantly improve various predictive metrics out of sample, and their predictive performances do not differ significantly. Experiments were conducted on synthetic datasets, two public benchmark datasets, an S&P 500 dataset, and proprietary data from an active investment strategy. Key Findings: (1) The authors propose a novel rank-based instability index to measure the stability of the feature selection algorithms MDA, LIME, and SHAP. (2) LIME is more stable than MDA and SHAP on the features with high importance scores. (3) LIME improves the exemplified trading strategy in both Sharpe ratio and cumulative return.
KW - Big data/machine learning
KW - Simulations
KW - Statistical methods
ER -