cs.AI updates on arXiv.org, September 17
EMOE: An Expert Matching Framework Based on Support Expansion

This paper proposes a new framework, EMOE, that uses support expansion and extrapolatory pseudo-labeling to improve prediction and uncertainty-based rejection on out-of-distribution (OOD) points. The framework uses multiple base experts as pseudo-labelers and trains multiple MLP heads (one per expert) on a shared embedding with a novel per-head matching loss. Without relying on modality-specific augmentations or assuming access to OOD data, EMOE achieves robust OOD generalization for any real-valued vector data and demonstrates superior performance in the single-source domain generalization setting.

arXiv:2406.01825v3 Announce Type: replace-cross Abstract: Expansive Matching of Experts (EMOE) is a novel framework that utilizes support-expanding, extrapolatory pseudo-labeling to improve prediction and uncertainty-based rejection on out-of-distribution (OOD) points. EMOE utilizes a diverse set of multiple base experts as pseudo-labelers on the augmented data to improve OOD performance through multiple MLP heads (one per expert) with a shared embedding, trained with a novel per-head matching loss. Unlike prior methods that rely on modality-specific augmentations or assume access to OOD data, EMOE introduces extrapolatory pseudo-labeling on latent-space augmentations, enabling robust OOD generalization with any real-valued vector data. In contrast to prior modality-agnostic methods with neural backbones, EMOE is model-agnostic, working effectively with methods ranging from simple tree-based models to complex OOD generalization models. We demonstrate that EMOE achieves superior performance compared to state-of-the-art methods on diverse datasets in the single-source domain generalization setting.
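To make the training scheme in the abstract concrete, below is a minimal sketch, not the authors' implementation. It assumes PyTorch, a shared-embedding MLP with one linear head per expert, and experts supplied as black-box callables that return class logits; the names EMOESketch, latent_augment, and emoe_loss are hypothetical, and the support-expanding augmentation is simplified to input-space noise, whereas the paper augments in latent space.

import torch
import torch.nn as nn

class EMOESketch(nn.Module):
    def __init__(self, in_dim, emb_dim, n_classes, n_experts):
        super().__init__()
        # Shared embedding used by every expert-specific head.
        self.embed = nn.Sequential(
            nn.Linear(in_dim, emb_dim), nn.ReLU(),
            nn.Linear(emb_dim, emb_dim), nn.ReLU())
        # One MLP head per base expert.
        self.heads = nn.ModuleList(
            [nn.Linear(emb_dim, n_classes) for _ in range(n_experts)])

    def forward(self, x):
        z = self.embed(x)
        return [head(z) for head in self.heads]  # one logit tensor per head

def latent_augment(x, scale=0.5):
    # Hypothetical support-expanding perturbation; stands in for the
    # latent-space augmentation used in the paper.
    return x + scale * torch.randn_like(x)

def emoe_loss(model, x, y, experts):
    ce = nn.CrossEntropyLoss()
    # In-distribution loss: every head fits the true labels.
    logits = model(x)
    id_loss = sum(ce(l, y) for l in logits) / len(logits)

    # Per-head matching loss: head i matches expert i's pseudo-labels
    # on the augmented (support-expanded) points.
    with torch.no_grad():
        x_aug = latent_augment(x)
        pseudo = [e(x_aug).argmax(dim=1) for e in experts]
    aug_logits = model(x_aug)
    match_loss = sum(ce(l, p) for l, p in zip(aug_logits, pseudo)) / len(logits)
    return id_loss + match_loss

Because the base experts are only queried as black boxes for pseudo-labels, a setup like this stays model-agnostic: the experts could be tree ensembles or neural models alike, consistent with the abstract's claim.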

Related tags

EMOE, Expert Matching, Out-of-Distribution Generalization, Pseudo-Labeling, Machine Learning