Title | Is nonparametric learning practical in very high dimensional spaces? |
Publication Type | Journal Article |
Year of Publication | 1997 |
Authors | Grudic, G. Z., and P. D. Lawrence
Secondary Authors | Pollack, M. E. |
Journal | IJCAI-97 - Proceedings of the 15th International Joint Conference on Artificial Intelligence, Vols 1 and 2
Pagination | 804–809 |
ISSN | 1045-0823 |
Abstract | Many of the challenges faced by the field of Computational Intelligence in building intelligent agents involve determining mappings between numerous and varied sensor inputs and complex and flexible action sequences. In applying nonparametric learning techniques to such problems, we must therefore ask: "Is nonparametric learning practical in very high dimensional spaces?" Contemporary wisdom states that variable selection and a "greedy" choice of appropriate functional structures are essential ingredients for nonparametric learning algorithms. However, neither of these strategies is practical when learning problems have thousands of input variables and tens of thousands of learning examples. We conclude that such nonparametric learning is practical by using a methodology that uses neither of these techniques. We propose a simple nonparametric learning algorithm to support our conclusion. The algorithm is evaluated first on 10 well-known regression data sets, where it is shown to produce regression functions that are as good as or better than published results on 9 of these data sets. The algorithm is further evaluated on 15 large, very high dimensional data sets (40,000 learning examples of 100-, 200-, 400-, 800-, and 1600-dimensional data) and is shown to construct effective regression functions despite the presence of noise in both inputs and outputs. |
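Notes | The abstract does not spell out the proposed algorithm, so the following is only a minimal illustrative sketch of the problem setting it describes: a generic nonparametric regressor (here plain k-nearest-neighbour averaging, not the authors' method) applied to high-dimensional, noisy data with no variable selection and no greedy structure search. The function names and the synthetic data are hypothetical.

```python
import numpy as np

def knn_regress(X_train, y_train, X_query, k=10):
    """Predict by averaging the targets of the k nearest training points.
    No variable selection or greedy structure search is performed."""
    preds = np.empty(len(X_query))
    for i, x in enumerate(X_query):
        d = np.linalg.norm(X_train - x, axis=1)   # distances in the full input space
        nearest = np.argsort(d)[:k]               # indices of the k closest examples
        preds[i] = y_train[nearest].mean()        # local average as the estimate
    return preds

# Synthetic high-dimensional data with noise on both inputs and outputs,
# loosely mirroring (at smaller scale) the experiments the abstract describes.
rng = np.random.default_rng(0)
n, d = 2000, 100                                  # the paper reports up to 40,000 x 1600
X = rng.normal(size=(n, d))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)    # noisy outputs
X_noisy = X + 0.05 * rng.normal(size=X.shape)     # noisy inputs

split = int(0.8 * n)
y_hat = knn_regress(X_noisy[:split], y[:split], X_noisy[split:], k=25)
print("test MSE:", np.mean((y_hat - y[split:]) ** 2))
```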