Deep Neural Networks for High Dimension, Low Sample Size Data

Bo Liu, Ying Wei, Yu Zhang, Qiang Yang

Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 2287-2293. https://doi.org/10.24963/ijcai.2017/318

Deep neural networks (DNNs) have achieved breakthroughs in applications with large sample sizes. However, when facing high dimension, low sample size (HDLSS) data, such as the phenotype prediction problem using genetic data in bioinformatics, DNNs suffer from overfitting and high-variance gradients. In this paper, we propose a DNN model tailored to HDLSS data, named Deep Neural Pursuit (DNP). DNP selects a subset of the high-dimensional features to alleviate overfitting and averages gradients over multiple dropouts to reduce their variance. As the first DNN method applied to HDLSS data, DNP enjoys the advantages of high nonlinearity, robustness to high dimensionality, the capability of learning from a small number of samples, stability in feature selection, and end-to-end training. We demonstrate these advantages of DNP via empirical results on both synthetic and real-world biological datasets.
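One ingredient summarized above, averaging gradients over several independent dropout masks to reduce their variance, can be sketched in a few lines. The snippet below is a minimal illustration only, not the authors' implementation: the network architecture, data shapes, dropout rate, number of masks (n_dropouts), and learning rate are all assumptions chosen for brevity.

# Illustrative sketch: average gradients across multiple dropout masks
# to obtain a lower-variance update, as motivated in the abstract.
# All names and hyperparameters here are hypothetical.
import torch
import torch.nn as nn

torch.manual_seed(0)

n_samples, n_features, n_hidden, n_dropouts = 32, 1000, 64, 10
X = torch.randn(n_samples, n_features)      # HDLSS setting: d >> n in practice
y = torch.randint(0, 2, (n_samples,)).float()

model = nn.Sequential(
    nn.Linear(n_features, n_hidden),
    nn.ReLU(),
    nn.Dropout(p=0.5),                       # a fresh mask is drawn per forward pass
    nn.Linear(n_hidden, 1),
)
loss_fn = nn.BCEWithLogitsLoss()

# Accumulate gradients from several stochastic dropout masks, then average.
model.zero_grad()
for _ in range(n_dropouts):
    logits = model(X).squeeze(1)
    loss = loss_fn(logits, y) / n_dropouts   # scaling makes the summed grads an average
    loss.backward()                          # gradients accumulate in p.grad

with torch.no_grad():
    for p in model.parameters():
        p -= 0.01 * p.grad                   # one plain SGD step with the averaged gradient

Because each dropout mask yields a noisy gradient estimate, averaging over several masks before the parameter update reduces the variance of the step, which is the effect the abstract attributes to DNP's multiple-dropout averaging.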
Keywords:
Machine Learning: Feature Selection/Construction
Multidisciplinary Topics and Applications: Computational Biology and e-Health
Machine Learning: Deep Learning