
ReCross | Unsupervised Cross-Task Generalization via Retrieval Augmentation


Paper GitHub Video Slides


This is the project site for the paper, Unsupervised Cross-Task Generalization via Retrieval Augmentation, by Bill Yuchen Lin, Kangmin Tan, Chris Miller, Beiwen Tian, and Xiang Ren.


Abstract

Humans can perform unseen tasks by recalling relevant skills acquired previously and then generalizing them to the target tasks, even when there is no supervision at all. In this paper, we aim to improve the cross-task generalization ability of massive multi-task language models, such as T0 (Sanh et al., 2021), in an unsupervised setting. We propose a retrieval-augmentation method named ReCross that takes a few unlabelled examples as queries to retrieve a small subset of upstream data and uses them to update the multi-task model for better generalization. Our empirical results show that the proposed ReCross consistently outperforms non-retrieval baselines by a significant margin.
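
To make the retrieval-augmentation idea concrete, here is a minimal sketch of the retrieval step described above: a few unlabelled query examples from the unseen task are used to score upstream instances, and the most similar ones are selected for further fine-tuning of the multi-task model. This is an illustrative assumption, not the paper's actual implementation: the TF-IDF encoder, the retrieve_upstream function, and the toy data are placeholders standing in for the real retriever and upstream corpus.

# A minimal sketch of the retrieval step, NOT the paper's exact method:
# TF-IDF + cosine similarity stands in for the dense retriever, and the
# upstream pool / query examples below are toy placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def retrieve_upstream(query_examples, upstream_pool, k=2):
    """Return the top-k upstream examples most similar to the unlabelled queries."""
    vectorizer = TfidfVectorizer()
    # Encode the upstream pool and the query examples in the same vector space.
    pool_vecs = vectorizer.fit_transform(upstream_pool)
    query_vecs = vectorizer.transform(query_examples)
    # Score each upstream example by its mean similarity to the query set.
    scores = cosine_similarity(query_vecs, pool_vecs).mean(axis=0)
    top_idx = scores.argsort()[::-1][:k]
    return [upstream_pool[i] for i in top_idx]

if __name__ == "__main__":
    upstream_pool = [  # upstream multi-task training data (placeholder strings)
        "Summarize the following news article: ...",
        "Is the sentiment of this review positive or negative? ...",
        "Answer the question given the passage: ...",
    ]
    query_examples = [  # a few unlabelled examples from the unseen target task
        "Given the passage, answer the question: ...",
    ]
    retrieved = retrieve_upstream(query_examples, upstream_pool)
    # The retrieved subset would then be used to update the multi-task model
    # (e.g., T0) before it is evaluated on the target task.
    print(retrieved)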


Problem Formulation

An introduction to the problem formulation.


ReCross: the upstream stage

An introduction to the upstream stage of ReCross.


ReCross: the generalization stage

An introduction to the generalization stage of ReCross.


Cite

If you’d like to cite us, please use this BibTeX entry:

@article{Lin2022UnsupervisedCG,
  title={Unsupervised Cross-Task Generalization via Retrieval Augmentation},
  author={Bill Yuchen Lin and Kangmin Tan and Chris Miller and Beiwen Tian and Xiang Ren},
  journal={ArXiv},
  year={2022},
  volume={abs/2204.07937}
}