CUED Publications database

Neural random subspace

Cao, YH and Wu, J and Wang, H and Lasenby, J (2021) Neural random subspace. Pattern Recognition, 112. ISSN 0031-3203

Full text not available from this repository.


The random subspace method, known as a pillar of random forests, produces accurate and robust predictions. However, there is as yet no straightforward way to combine it with deep learning. In this paper, we therefore propose Neural Random Subspace (NRS), a novel deep-learning-based random subspace method. In contrast to previous forest methods, NRS enjoys the benefits of end-to-end, data-driven representation learning, as well as pervasive support from deep learning software and hardware platforms, hence achieving faster inference and higher accuracy. Furthermore, as a non-linear component encoded into Convolutional Neural Networks (CNNs), NRS learns non-linear feature representations in CNNs more efficiently than contemporary higher-order pooling methods, producing excellent results with a negligible increase in parameters, floating point operations (FLOPs) and real running time. Compared with random subspaces, random forests and gradient boosting decision trees (GBDTs), NRS demonstrates superior performance on 35 machine learning datasets. Moreover, on both 2D image and 3D point cloud recognition tasks, integrating NRS with CNN architectures achieves consistent improvements at only incremental cost.
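For readers unfamiliar with the classical random subspace method that NRS builds on, the sketch below illustrates the core idea: train each ensemble member on a random subset of the input features and combine predictions by majority vote. This is a minimal toy in numpy, not the paper's NRS; the nearest-centroid base learner, subspace size, and synthetic two-blob data are assumptions chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two well-separated Gaussian blobs in 10-D (illustrative only).
n, d = 200, 10
X = np.vstack([rng.normal(loc=-1.0, size=(n // 2, d)),
               rng.normal(loc=+1.0, size=(n // 2, d))])
y = np.array([0] * (n // 2) + [1] * (n // 2))

def fit_subspace_ensemble(X, y, n_members=25, k=4):
    """Train one nearest-centroid classifier per random feature subspace.

    Each member sees only `k` randomly chosen feature dimensions,
    which is the defining trait of the random subspace method.
    """
    members = []
    for _ in range(n_members):
        feats = rng.choice(X.shape[1], size=k, replace=False)
        Xs = X[:, feats]
        # One centroid per class, computed in the member's subspace.
        centroids = np.stack([Xs[y == c].mean(axis=0) for c in (0, 1)])
        members.append((feats, centroids))
    return members

def predict(members, X):
    """Majority vote over the ensemble members."""
    votes = []
    for feats, centroids in members:
        # Squared distance from each sample to each class centroid.
        d2 = ((X[:, feats, None] - centroids.T[None]) ** 2).sum(axis=1)
        votes.append(d2.argmin(axis=1))
    return (np.mean(votes, axis=0) >= 0.5).astype(int)

members = fit_subspace_ensemble(X, y)
acc = (predict(members, X) == y).mean()
```

NRS replaces such fixed, independently trained members with end-to-end learned representations inside a neural network, which is what makes it differentiable and GPU-friendly, unlike this hand-built ensemble.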

Item Type: Article
Uncontrolled Keywords: cs.LG cs.CV stat.ML
Divisions: Div F > Signal Processing and Communications
Depositing User: Cron Job
Date Deposited: 09 Oct 2020 21:10
Last Modified: 02 Sep 2021 04:52
DOI: 10.1016/j.patcog.2020.107801