My research interests are Natural Language Processing/Understanding and Information Retrieval. Recently I have been focusing on the following topics:
Continual learning, evolution, and adaptation of Large Language Models: Large language models are powerful, but they need to be updated. How can we update them efficiently yet effectively so that they adapt to new tasks and new domains and incorporate new knowledge?
Language modeling with structured data: Large Language (Multimodal) Models have succeeded in modeling and reasoning over sequential or unstructured data, such as text, images, and videos. But how can we integrate language models with structured data, such as information networks, knowledge graphs, taxonomies, event triplets, and tabular data?
Representation learning, information retrieval, and their applications: Good representations matter: they connect different modalities and models. Training representations for various tasks is a long-standing problem.
Feel free to reach out to me if you want to discuss any of these topics or are interested in collaborating!
I am actively looking for research internship opportunities for Summer/Fall 2026 in Canada/US. If you believe I might be a good fit for your openings, feel free to reach out at wtzhang0820 [AT] gmail.com.
News
[2025-09-10] New demo paper, Interactive Training: Feedback-Driven Neural Network Optimization, accepted to the EMNLP 2025 System Demonstrations Track
[2025-09-04] Started my PhD at the University of Waterloo
[2025-08-21] From Chat Logs to Collective Insights: Aggregative Question Answering accepted to EMNLP 2025 (Oral)
[2025-05-21] New arXiv preprint: From Chat Logs to Collective Insights: Aggregative Question Answering, check here
[2024-05-01] Initial setup of this personal page
Publications
Full publication list on Google Scholar. ‡ indicates equal contribution.
Selected
From Chat Logs to Collective Insights: Aggregative Question Answering