Author: | Zhu, Zhanxing |
Title: | Supervised distance preserving projections for dimensionality reduction |
Publication type: | Master's thesis |
Publication year: | 2011 |
Pages: | ix + 53 |
Language: | eng |
Department/School: | Department of Computer Science (Tietotekniikan laitos) |
Main subject: | Information Technology (Informaatiotekniikka, T-61) |
Supervisor: | Simula, Olli |
Instructor: | Corona, Francesco |
OEVS: | Electronic archive copy is available via Aalto Thesis Database.
|
Location: | P1 Ark Aalto 8666 | Archive |
Keywords: | supervised dimensionality reduction, regression, classification, optimization, kernel |
Abstract (eng): | When facing high-dimensional data, dimensionality reduction is an essential technique for overcoming the "curse of dimensionality". This work focuses on supervised dimensionality reduction, especially for regression tasks. The goal of dimensionality reduction for regression is to learn a low-dimensional representation of the original high-dimensional data such that the new representation leads to accurate regression predictions. Motivated by continuity preservation, we propose a novel algorithm for supervised dimensionality reduction, named Supervised Distance Preserving Projection (SDPP). To preserve continuity in the low-dimensional subspace, we consider the local geometrical structure of the original input and response spaces. Inside a neighborhood of each point in the input space, the optimization criterion of SDPP minimizes the difference between the distances of the projected covariates and the distances of the corresponding responses. This minimization of distance differences makes the local geometrical structure of the low-dimensional subspace optimally match the geometrical characteristics of the response space. The local match not only facilitates accurate regressor design but also uncovers the information necessary for visualization. Different optimization schemes are proposed for solving SDPP efficiently. Moreover, the learned parametric mapping easily handles out-of-sample data points. A kernelized version of SDPP is derived for nonlinear data, and an intuitive extension of SDPP is also presented for classification tasks. We compare the performance of our method with state-of-the-art algorithms on both synthetic and real-world data; these comparisons show the superiority of our approach for dimensionality reduction in regression and classification. |
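The criterion described in the abstract, matching projected-covariate distances to response distances inside input-space neighborhoods, can be sketched roughly as follows. This is a minimal illustrative NumPy sketch, not the thesis's implementation: the function name, the use of squared Euclidean distances, the k-NN neighborhood definition, and the uniform weighting are all assumptions made here for clarity.

```python
import numpy as np

def sdpp_objective(W, X, Y, k=5):
    """Hypothetical sketch of an SDPP-style criterion: within each k-NN
    neighborhood of the input space, penalize the squared difference
    between projected-covariate distances and response distances."""
    Z = X @ W  # linear projection into the low-dimensional subspace
    n = X.shape[0]
    # Pairwise squared distances in input, projected, and response spaces.
    d_x = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    d_z = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    d_y = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    total = 0.0
    for i in range(n):
        # k nearest input-space neighbors of point i (excluding itself).
        nbrs = np.argsort(d_x[i])[1:k + 1]
        total += ((d_z[i, nbrs] - d_y[i, nbrs]) ** 2).sum()
    return total / n
```

As a sanity check, if the responses happen to be an exact linear projection of the inputs, Y = X @ W, then projecting with that same W reproduces the response distances exactly and the objective is zero; minimizing the criterion over W (e.g. with a gradient-based or semidefinite scheme, as the thesis proposes several) seeks the projection closest to this ideal.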
ED: | 2011-12-14 |
INSSI record number: 43252