Quanyu Long

I am a fourth-year Ph.D. student at Nanyang Technological University (NTU), advised by Prof. Sinno Jialin Pan, and I collaborate closely with Prof. Wenya Wang.

Before joining NTU, I received my bachelor's degree from Shanghai Jiao Tong University, where I was admitted to the IEEE Honor Class and the Zhiyuan Honor Program. There I was a member of the Apex Data and Knowledge Management Lab, advised by Prof. Yong Yu and Prof. Weinan Zhang.

Email: quanyu001 [at] e.ntu.edu.sg

CV  /  Google Scholar  /  Twitter  /  GitHub


Research Interest

My recent research focuses on contextualized augmentations and their applications, particularly in retrieval-augmented language models (RALMs), in-context learning (ICL) with retrieval, and aligning retrieval with large language models (LLMs).

Papers

Visual-RAG: Benchmarking Text-to-Image Retrieval Augmented Generation for Visual Knowledge Intensive Queries
Yin Wu, Quanyu Long, Jing Li, Jianfei Yu, Wenya Wang
arXiv preprint, 2025.

T2I-FactualBench: Benchmarking the Factuality of Text-to-Image Models with Knowledge-Intensive Concepts
Ziwei Huang, Wanggui He, Quanyu Long, Yandi Wang, Haoyuan Li, Zhelun Yu, Fangxun Shu, Long Chan, Hao Jiang, Leilei Gan, Fei Wu
arXiv preprint, 2025.

Decomposition Dilemmas: Does Claim Decomposition Boost or Burden Fact-Checking Performance?
Qisheng Hu, Quanyu Long, Wenya Wang
2025 Annual Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics (NAACL), 2025.

Whispers in Grammars: Injecting Covert Backdoors to Compromise Dense Retrieval Systems
Quanyu Long*, Yue Deng*, Leilei Gan, Wenya Wang, Sinno Jialin Pan
arXiv preprint, 2024.

Large Language Models Know What Makes Exemplary Contexts
Quanyu Long, Jianda Chen, Wenya Wang, Sinno Jialin Pan
arXiv preprint, 2024.

Does In-Context Learning Really Learn? Rethinking How Large Language Models Respond and Solve Tasks via In-Context Learning
Quanyu Long*, Yin Wu*, Wenya Wang, Sinno Jialin Pan
First Conference on Language Modeling (COLM), 2024.

Adapt in Contexts: Retrieval-Augmented Domain Adaptation via In-Context Learning
Quanyu Long, Wenya Wang, Sinno Jialin Pan
The 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2023.

Domain Confused Contrastive Learning for Unsupervised Domain Adaptation
Quanyu Long, Tianze Luo, Wenya Wang, Sinno Jialin Pan
2022 Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL), 2022.

Generative Imagination Elevates Machine Translation
Quanyu Long, Mingxuan Wang, Lei Li
2021 Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL), 2021.

On the Robustness of Language Encoders against Grammatical Errors
Fan Yin, Quanyu Long, Tao Meng, Kai-Wei Chang
The 58th Annual Meeting of the Association for Computational Linguistics (ACL), 2020.

QA4IE: A Question Answering Based System for Document-Level General Information Extraction
Lin Qiu, Dongyu Ru, Quanyu Long, Weinan Zhang, Yong Yu
IEEE Access, vol. 8, pp. 29677–29689, 2020.

Internship

  • Jun. 2024 - Sep. 2024: Research intern at I2R, A*STAR, advised by Nancy F. Chen
  • Mar. 2020 - Nov. 2020: Research intern at ByteDance AI Lab, advised by Prof. Lei Li
  • Jul. 2019 - Dec. 2019: Visiting intern at UCLA NLP lab, advised by Prof. Kai-Wei Chang

Awards

  • Singapore Research Scholarship, NTU. 2021-2025
  • Zhiyuan Honor Degree of Bachelor of Engineering in Computer Science and Technology, SJTU. 2020
  • Shanghai Scholarship (top 3%), SJTU. 2018
  • Matsushita Electric Education Scholarship (top 5%), SJTU. 2017
  • Zhiyuan College Honors Scholarship, SJTU. 2017-2019





Design and source code from Jon Barron's website