Learning Embedding Adaptation for Few-Shot Learning

These notes cover an adaptation strategy for few-shot learning built on the Transformer architecture: it learns an instance embedding function on tasks from seen classes, then applies that function to instances from unseen classes where labels are scarce. Through self-attention, each instance's embedding is refined with its contextual information taken into account, improving embedding quality. Experiments show that using a permutation-invariant set function instead of a sequence model improves performance.


Abstract

  • By learning an instance embedding function from seen classes, and applying that function to instances from unseen classes with limited labels.
  • Usually learn a discriminative instance embedding model from the SEEN categories, and apply the embedding model to visual data in UNSEEN categories.
  • Non-parametric classifiers avoid learning complicated recognition models from a small number of examples.
  • The most useful features for discerning “cat” versus “tiger” could be irrelevant and noisy for the task of discerning “cat” versus “dog”.

Introduction

  • What is lacking in the current approaches for few-shot learning is an adaptation strategy that tailors the visual knowledge extracted from the SEEN classes to the UNSEEN ones in a target task. In other words, we desire separate embedding spaces, each customized so that the visual features are most discriminative for a given task.
  • The key assumption is that the embeddings capture all the necessary discriminative representations of the data, so that simple classifiers suffice.
  • We use the Transformer architecture to implement T. In particular, we employ a self-attention mechanism to improve each instance embedding by taking its contextual embeddings into consideration.
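As a rough sketch of the idea (my own toy implementation, not the paper's): self-attention lets every instance embedding in a task attend to all the others, so the adapted embedding is a residual, attention-weighted mixture of the whole set. The function name and the scaled dot-product form below are assumptions for illustration.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def adapt_embeddings(embs):
    """Toy self-attention over a set of instance embeddings.

    Each output is the input embedding plus an attention-weighted
    combination of all embeddings in the set, so every instance is
    contextualized by the whole task.
    """
    d = len(embs[0])
    scale = math.sqrt(d)
    out = []
    for q in embs:
        scores = [dot(q, k) / scale for k in embs]     # scaled dot-product scores
        w = softmax(scores)                            # attention weights over the set
        ctx = [sum(wi * e[j] for wi, e in zip(w, embs)) for j in range(d)]
        out.append([qi + ci for qi, ci in zip(q, ctx)])  # residual connection
    return out
```

A real implementation would use learned query/key/value projections and multiple heads; this sketch keeps only the set-contextualization step the notes describe.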

Experiments

Do they use pre-training??

  • Results demonstrate the effectiveness of using a permutation-invariant set function instead of a sequence model; see the supplementary material for details.
  • The Transformer is a set-to-set transformation.
  • It customizes a task-specific embedding space via a self-attention architecture.
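A tiny self-contained illustration of why the permutation-invariant set function matters (toy functions of my own, not from the paper): pooling over a set gives the same answer for any ordering of the instances, while a recurrent-style sequence model does not, even though a few-shot task's support set has no natural order.

```python
import math

def set_pool(xs):
    # permutation-invariant set function: mean pooling
    return sum(xs) / len(xs)

def seq_model(xs):
    # toy recurrent update h = tanh(0.5*h + x): the result depends on order
    h = 0.0
    for x in xs:
        h = math.tanh(0.5 * h + x)
    return h

a = [1.0, -2.0, 3.0]
b = [3.0, 1.0, -2.0]   # same set, different order
# set_pool(a) == set_pool(b), but seq_model(a) != seq_model(b)
```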