Paper Title
Hypothesis Disparity Regularized Mutual Information Maximization
Paper Authors
Paper Abstract
We propose a hypothesis disparity regularized mutual information maximization (HDMI) approach to tackle unsupervised hypothesis transfer -- as an effort towards unifying hypothesis transfer learning (HTL) and unsupervised domain adaptation (UDA) -- where the knowledge from a source domain is transferred solely through hypotheses and adapted to the target domain in an unsupervised manner. In contrast to the prevalent HTL and UDA approaches that typically use a single hypothesis, HDMI employs multiple hypotheses to leverage the underlying distributions of the source and target hypotheses. To better utilize the crucial relationship among different hypotheses -- as opposed to unconstrained optimization of each hypothesis independently -- while adapting to the unlabeled target domain through mutual information maximization, HDMI incorporates a hypothesis disparity regularization that coordinates the target hypotheses to jointly learn better target representations while preserving more transferable source knowledge with better-calibrated prediction uncertainty. HDMI achieves state-of-the-art adaptation performance on benchmark datasets for UDA in the context of HTL, without the need to access the source data during adaptation.
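To make the objective described in the abstract concrete, the following is a minimal PyTorch sketch (not the authors' released implementation) of an HDMI-style loss over a batch of unlabeled target inputs. It assumes K classifier heads ("hypotheses") on a shared feature extractor. The mutual information term uses the standard decomposition I(X; Y) = H(E_x[p]) - E_x[H(p)]; the hypothesis disparity term is illustrated here as the mean pairwise KL divergence between the heads' soft predictions. The function name `hdmi_style_loss` and the weighting hyperparameter `lam` are hypothetical.

```python
import torch
import torch.nn.functional as F

def hdmi_style_loss(logits_per_head: list, lam: float = 1.0) -> torch.Tensor:
    """logits_per_head: list of K tensors, each of shape (batch, num_classes)."""
    probs = [F.softmax(z, dim=1) for z in logits_per_head]  # per-head p(y|x)
    eps = 1e-8

    # Mutual information, averaged over heads: maximize the entropy of the
    # marginal label distribution H(E_x[p]) (encourages diverse class usage)
    # and minimize the mean conditional entropy E_x[H(p)] (encourages
    # confident per-sample predictions).
    mi = 0.0
    for p in probs:
        marginal = p.mean(dim=0)                                 # E_x[p(y|x)]
        h_marginal = -(marginal * (marginal + eps).log()).sum()  # H(E_x[p])
        h_cond = -(p * (p + eps).log()).sum(dim=1).mean()        # E_x[H(p)]
        mi = mi + (h_marginal - h_cond)
    mi = mi / len(probs)

    # Hypothesis disparity: mean pairwise KL(p_i || p_j) across distinct
    # heads, penalizing hypotheses that drift apart when each one is
    # otherwise optimized without constraints.
    hd, pairs = 0.0, 0
    for i in range(len(probs)):
        for j in range(len(probs)):
            if i != j:
                kl = (probs[i] * ((probs[i] + eps).log()
                                  - (probs[j] + eps).log())).sum(dim=1).mean()
                hd = hd + kl
                pairs += 1
    hd = hd / max(pairs, 1)

    # Minimize the negative MI (i.e., maximize MI) plus the weighted disparity.
    return -mi + lam * hd
```

In this sketch, adaptation would proceed source-free, consistent with the abstract: only the source-trained hypotheses (not the source data) are needed, and the loss is backpropagated through the shared target feature extractor on unlabeled target batches.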