How to Write an Abstract
Sentence 1: Background — state the research direction you work on and what problem it solves.
- Multi-label learning deals with training examples each represented by a single instance while associated with multiple class labels, and the task is to train a predictive model which can assign a set of proper labels to the unseen instance.
Sentence 2: List the existing work in this direction (one or two items suffice), then pivot into Sentence 3 with "However" (never "But").
- Existing approaches employ the common assumption of equal labeling-importance, i.e., all associated labels are regarded to be relevant to the training instance while their relative importance in characterizing its semantics is not differentiated.
Sentence 3: Summarize the shortcomings of the existing work listed above.
- However, this common assumption does not reflect the fact that the importance degree of each relevant label is generally different, though the importance information is not directly accessible from the training examples
Sentence 4: State your own contribution.
- In this article, we show that it is beneficial to leverage the implicit relative labeling-importance (RLI) information to help induce a multi-label predictive model with strong generalization performance.
Sentences 5–7: Describe your work step by step (first, next, finally).
- Specifically, RLI degrees are formalized as a multinomial distribution over the label space, which can be estimated by either a global label propagation procedure or local k-nearest neighbor reconstruction. Correspondingly, the multi-label predictive model is induced by fitting modeling outputs with estimated RLI degrees along with multi-label empirical loss regularization.
Sentence 8: State the effect of your work (e.g., improvement on certain metrics).
- Extensive experiments clearly validate that leveraging implicit RLI information serves as a favorable strategy to achieve effective multi-label learning.
This article discusses the multi-label learning problem, pointing out that existing approaches typically assume all relevant labels are equally important, overlooking the fact that the importance degree of each label may differ. To address this, we propose leveraging implicit relative labeling-importance (RLI) information to help build a multi-label predictive model with strong generalization performance. Specifically, RLI is formalized as a multinomial distribution over the label space and estimated via either global label propagation or local k-nearest neighbor reconstruction. Experimental results show that leveraging RLI information is a favorable strategy for achieving effective multi-label learning.
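To make the technical step in Sentences 5–7 concrete, here is a minimal sketch of estimating RLI degrees via the global label propagation route. All names (`estimate_rli`, `alpha`, `sigma`) and the specific propagation rule are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def estimate_rli(X, Y, alpha=0.5, n_iter=50, sigma=1.0):
    """Hypothetical sketch: estimate RLI degrees by label propagation.

    X : (n, d) feature matrix; Y : (n, q) binary relevant-label matrix.
    Returns a (n, q) matrix whose rows are multinomial distributions
    over the label space, supported only on the relevant labels.
    """
    # Fully connected similarity graph with a Gaussian kernel.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)

    # Symmetric normalization: S = D^{-1/2} W D^{-1/2}.
    d = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    S = W * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

    # Iterate F <- alpha * S F + (1 - alpha) * Y (standard propagation).
    F = Y.astype(float).copy()
    for _ in range(n_iter):
        F = alpha * (S @ F) + (1 - alpha) * Y

    # Keep only relevant labels and normalize each row into a
    # multinomial distribution, as the abstract describes.
    F = F * Y
    return F / np.maximum(F.sum(axis=1, keepdims=True), 1e-12)
```

A predictive model would then be fit against these soft degrees (plus the multi-label empirical loss term mentioned in the abstract), rather than against the raw 0/1 labels.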