Paper Title

Linear Dimensionality Reduction

Paper Authors

Franc, Alain

Paper Abstract

These notes are an overview of some classical linear methods in Multivariate Data Analysis. This is a good old domain, well established since the 1960s, and recently refreshed as a key step in statistical learning. It can be presented as part of statistical learning, or as dimensionality reduction with a geometric flavor. Both approaches are tightly linked: it is easier to learn patterns from data in low-dimensional spaces than in high-dimensional spaces. It is shown how a diversity of methods and tools boils down to a single core method, PCA with SVD, so that efforts to optimize codes for analyzing massive data sets (e.g., distributed-memory and task-based programming) or to improve algorithmic efficiency (e.g., randomised SVD) can focus on this shared core method and benefit all methods.
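
To make the shared core concrete, here is a minimal sketch of PCA computed through a plain SVD in NumPy. The function name pca_svd, the number of retained components, and the synthetic data are illustrative assumptions, not code from the paper; for very large matrices the exact np.linalg.svd call could in principle be replaced by a randomized SVD (e.g., sklearn.utils.extmath.randomized_svd) without changing the rest of the pipeline.

import numpy as np

def pca_svd(X, k):
    # Center the columns, then factor the centered matrix with a thin SVD.
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]          # k leading principal axes
    scores = U[:, :k] * s[:k]    # coordinates of the samples in the reduced space
    return components, scores

# Illustrative synthetic data: 200 samples in 10 dimensions, reduced to 2.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
axes, coords = pca_svd(X, k=2)
print(coords.shape)  # (200, 2)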
