Subset selection for tuning of hyper-parameters in artificial neural networks
Citation: Aki, K. K. E., Erkoç, T. & Eskil, M. T. (2017). Subset selection for tuning of hyper-parameters in artificial neural networks. Paper presented at the 24th IEEE International Conference on Electronics, Circuits and Systems, 2018-January, 144-147. doi:10.1109/ICECS.2017.8292105
Hyper-parameters of a machine learning architecture define its design. Tuning of hyper-parameters is costly and, for large data sets, outright impractical, whether it is performed manually or algorithmically. In this study we propose a Neocognitron-based method for reducing the training set to a small fraction while preserving the dynamics and complexity of the domain. Our approach does not require processing the entire training set, making it feasible for larger data sets. In our experiments we successfully reduced the MNIST training set to less than 2.5% (1,489 images) while processing less than 10% of the 60K images. We showed that the reduced data set can be used to tune the number of hidden neurons in a multi-layer perceptron.
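The two-step workflow the abstract describes, reducing the training set and then tuning the hidden-layer size on the reduced set only, can be sketched as follows. This is a minimal illustration, not the paper's method: the Neocognitron-based selection criterion is not detailed in the abstract, so a plain random subsample stands in for it, and a synthetic data set stands in for MNIST.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for MNIST: two classes in 20 dimensions.
n_full = 5000
X = rng.normal(size=(n_full, 20))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Step 1: reduce the training set to a small fraction (~2.5%).
# A random subsample stands in for the paper's Neocognitron-based selection.
idx = rng.choice(n_full, size=n_full // 40, replace=False)
X_sub, y_sub = X[idx], y[idx]

def train_mlp(X, y, hidden, epochs=200, lr=0.1, seed=1):
    """Tiny one-hidden-layer MLP trained with full-batch gradient descent."""
    r = np.random.default_rng(seed)
    W1 = r.normal(scale=0.5, size=(X.shape[1], hidden))
    W2 = r.normal(scale=0.5, size=(hidden, 1))
    for _ in range(epochs):
        h = np.tanh(X @ W1)                  # hidden activations
        p = 1 / (1 + np.exp(-(h @ W2)))      # sigmoid output
        err = p - y[:, None]                 # gradient of BCE w.r.t. logit
        W2 -= lr * h.T @ err / len(X)
        W1 -= lr * X.T @ ((err @ W2.T) * (1 - h**2)) / len(X)
    return W1, W2

def accuracy(W1, W2, X, y):
    p = 1 / (1 + np.exp(-(np.tanh(X @ W1) @ W2)))
    return ((p[:, 0] > 0.5) == y).mean()

# Step 2: tune the hidden-layer size using the reduced set only.
scores = {}
for hidden in (2, 8, 32):
    W1, W2 = train_mlp(X_sub, y_sub, hidden)
    scores[hidden] = accuracy(W1, W2, X_sub, y_sub)

best = max(scores, key=scores.get)
print(best, scores[best])
```

The point of the reduction step is that the tuning loop, which trains one network per candidate hyper-parameter value, runs on 2.5% of the data rather than the full set, so its cost scales with the subset size rather than the original 60K images.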