Title: Learning in High Dimension Always Amounts to Extrapolation
arXiv link: https://arxiv.org/abs/2110.09485
Reason for reading: recommended; from Yann LeCun and Facebook AI; machine learning theory
Abstract: The notion of interpolation and extrapolation is fundamental in various fields from deep learning to function approximation. Interpolation occurs for a sample x whenever this sample falls inside or on the boundary of the given dataset's convex hull. Extrapolation occurs when x falls outside of that convex hull. One fundamental (mis)conception is that state-of-the-art algorithms work so well because of their ability to correctly interpolate training data. A second (mis)conception is that interpolation happens throughout tasks and datasets, in fact, many intuitions and theories rely on that assumption. We empirically and theoretically argue against those two points and demonstrate that on any high-dimensional (>100) dataset, interpolation almost surely never occurs.

This paper examines the phenomenon that, on high-dimensional datasets, machine learning models are in fact almost always extrapolating rather than interpolating. The authors find that once the data dimension exceeds roughly 100, a new sample almost surely falls outside the convex hull of the training set, which challenges the common use of "interpolation" as an indicator of a model's generalization ability.
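The paper's definition of interpolation is convex-hull membership, which can be checked with a standard feasibility linear program: x interpolates the training set X iff there exist weights lam >= 0 with sum(lam) = 1 and X.T @ lam = x. The sketch below (my own illustration, not code from the paper, assuming numpy and scipy are available) implements that check and hints at the paper's thesis: with a fixed number of samples, hull membership becomes vanishingly rare as the dimension grows.

```python
import numpy as np
from scipy.optimize import linprog


def in_convex_hull(x, points):
    """Return True if x lies inside (or on) the convex hull of the rows of `points`.

    Solves the feasibility LP: find lam >= 0 with sum(lam) == 1 and
    points.T @ lam == x; the LP is feasible iff x is a convex
    combination of the training points, i.e. iff x interpolates them.
    """
    n = points.shape[0]
    A_eq = np.vstack([points.T, np.ones((1, n))])  # convex-combination constraints
    b_eq = np.concatenate([x, [1.0]])
    res = linprog(np.zeros(n), A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * n)
    return bool(res.success)


rng = np.random.default_rng(0)

# Low dimension: a point near the data centroid typically interpolates.
train_2d = rng.standard_normal((100, 2))
print(in_convex_hull(train_2d.mean(axis=0), train_2d))

# High dimension: with only 100 samples in d = 50, a fresh sample from the
# same Gaussian almost surely lands outside the hull (extrapolation).
train_50d = rng.standard_normal((100, 50))
print(in_convex_hull(rng.standard_normal(50), train_50d))
```

The dataset mean is always inside the hull, so the first check succeeds by construction; the second illustrates the paper's claim that, unless the sample count grows exponentially with dimension, interpolation essentially never happens.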