Welcome to the Universal Transfer Learning Lab

We are the Universal Transfer Learning (UTL) Lab at Korea University, led by Prof. Donghyun Kim. Our research falls under the broad umbrella of transfer learning, with a particular emphasis on the transferability, generalization, and adaptability of robust artificial intelligence (AI) models across a wide range of AI domains and disciplines.

Research Area

Our overarching goal is to develop highly effective transfer learning algorithms that transcend the boundaries of disparate domains and modalities across many fields. We tailor these algorithms to a wide spectrum of real-world applications, driving innovation and advancement across industries and sectors.


Our main research interests include, but are not limited to, the following:

  • Domain Transfer (Domain Adaptation/Generalization)
  • Task Transfer
  • Pre-training for Transfer Learning
  • Foundation Model Adaptation
  • Efficient Adaptation
  • Multimodal Representation Learning
  • Transferable Representation Learning
  • Un/self-supervised Learning
  • Learning From Synthetic Data
  • Learning Compositionality



We are looking for passionate new MS, MS/PhD, and PhD students, as well as postdocs, to join the team (more info)!

News

Dec. 2024

A paper has been accepted to AAAI 2025.

Sep. 2024

A paper has been accepted to WACV 2025.

Sep. 2024

A paper has been accepted to NeurIPS 2024 Workshop.

Sep. 2024

A paper has been accepted to NeurIPS 2024.

Sep. 2024

A paper has been accepted to EMNLP 2024.

Jul. 2024

Two papers have been accepted to ECCV 2024.

Apr. 2024

A paper has been accepted to a CVPR 2024 Workshop.

Feb. 2024

Three papers have been accepted to CVPR 2024.

Jan. 2024

A paper has been accepted to ICLR 2024.

... see all News