[CVPR2024] The code of "UniPT: Universal Parallel Tuning for Transfer Learning with Efficient Parameter and Memory"
UniPT

PyTorch implementation for CVPR2024 paper of “UniPT: Universal Parallel Tuning for Transfer Learning with Efficient Parameter and Memory”.

It is built on top of VSE-infty, CLIP-ViL, CLIP4Clip, MDETR, LST, and Awesome_Pretraining_Transfering.

If you have any problems, please contact me at [email protected]. ([email protected] is deprecated)

Introduction

The framework of UniPT:

Overview of the framework with (a) parallel interaction ($\varphi$) and (b) confidence aggregation ($\theta$) layers. The former extracts more discriminative features at each layer independently, guided by the relatively most powerful output features, while the latter learns a dynamic and optimal combination strategy over the blended features across layers for the ultimate domain adaptation.
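To make the two components concrete, here is a heavily simplified PyTorch sketch of the idea described above, not the repository's actual implementation. The class names, the reduced dimension `reduced`, and the scalar-confidence design are assumptions for illustration: $\varphi$ lets the last-layer output attend over each intermediate feature map, and $\theta$ softmax-weights the per-layer results.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ParallelInteraction(nn.Module):
    """Sketch of the parallel interaction layer (phi): the backbone's
    final output guides the aggregation of one intermediate layer's
    features in a reduced dimension (hypothetical design)."""
    def __init__(self, dim: int, reduced: int):
        super().__init__()
        self.down_q = nn.Linear(dim, reduced)  # project guidance features
        self.down_k = nn.Linear(dim, reduced)  # project intermediate features

    def forward(self, guide: torch.Tensor, feat: torch.Tensor) -> torch.Tensor:
        # guide: (B, N, D) last-layer features; feat: (B, N, D) one layer's features
        q = self.down_q(guide)
        k = self.down_k(feat)
        attn = F.softmax(q @ k.transpose(-2, -1) / k.shape[-1] ** 0.5, dim=-1)
        return attn @ k  # (B, N, reduced): guidance-weighted features

class ConfidenceAggregation(nn.Module):
    """Sketch of confidence aggregation (theta): learn one confidence
    score per layer and blend the per-layer outputs accordingly."""
    def __init__(self, reduced: int):
        super().__init__()
        self.score = nn.Linear(reduced, 1)

    def forward(self, layer_feats: list) -> torch.Tensor:
        # layer_feats: list of (B, N, r) tensors, one per backbone layer
        stacked = torch.stack(layer_feats, dim=1)       # (B, L, N, r)
        conf = self.score(stacked.mean(dim=2))          # (B, L, 1)
        weights = F.softmax(conf, dim=1).unsqueeze(-1)  # (B, L, 1, 1)
        return (weights * stacked).sum(dim=1)           # (B, N, r)
```

Only these lightweight side modules are trained; the pre-trained backbone itself stays frozen, which is where the parameter and memory savings come from.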

Task & Model Details

Image-Text Retrieval: VSE-infty with the strongest combination of a BERT-base model and a ResNeXt-101 (32×8d) backbone pre-trained on Instagram (WSL).

Video-Text Retrieval: CLIP4Clip with the pre-trained CLIP network using Text Transformer and ViT-B/32 models.

Question Answering: CLIP-ViL that utilizes the CLIP image backbone and encodes the text into the word embedding sequence, followed by a cross-modal Transformer.

Visual Grounding: MDETR with a pre-trained ResNet-101 vision encoder, a RoBERTa-base text encoder, and a query-based encoder-decoder Transformer.

Please refer to their respective README.md files for the detailed settings.

Guidance for Applications

We summarize the positions where UniPT is defined and invoked in each work as follows.
We hope these pointers help you quickly realize your own ideas beyond UniPT.

  1. CLIP-ViL: UniPT is defined and called at class LXRTEncoder(nn.Module) from CLIP-ViL/src/lxrt/modeling.py.

  2. CLIP4Clip: UniPT is defined at CLIP4Clip/modules/module_adapter.py, and called at Lines 251-261 of CLIP4Clip/modules/modeling.py.

  3. VSE-infty: UniPT is defined at VSE-infty/lib/adapter_for_cnn.py and VSE-infty/lib/adapter_for_transformer.py, and called at VSE-infty/lib/encoders.py.

  4. MDETR: UniPT is defined and called at class Transformer(nn.Module) from MDETR/models/transformer.py.
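The sites above all follow the same pattern: the host model's encoder exposes its intermediate layer outputs to the UniPT modules. As a generic illustration (not the repository's actual code), one common way to gather those per-layer features from a frozen backbone is with forward hooks; the helper name `collect_layer_outputs` is hypothetical.

```python
import torch
import torch.nn as nn

def collect_layer_outputs(backbone: nn.Module, layers, x):
    """Run a frozen backbone once and capture each listed layer's
    output via forward hooks (generic sketch, not the repo's code)."""
    feats, handles = [], []
    for layer in layers:
        handles.append(layer.register_forward_hook(
            lambda m, inp, out: feats.append(out.detach())))
    with torch.no_grad():  # backbone stays frozen: no gradients stored
        final = backbone(x)
    for h in handles:
        h.remove()  # clean up hooks so later calls are unaffected
    return final, feats
```

The captured `feats` list would then be fed to the parallel side network, so backpropagation touches only the lightweight tuning modules rather than the backbone.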

Reference

If UniPT is useful for your research, please cite the following paper:

  @article{Diao2023UniPT,
      title={UniPT: Universal Parallel Tuning for Transfer Learning with Efficient Parameter and Memory},
      author={Diao, Haiwen and Wan, Bo and Zhang, Ying and Jia, Xu and Lu, Huchuan and Chen, Long},
      journal={arXiv preprint arXiv:2308.14316},
      year={2023}
  }

License

Apache License 2.0.
