Currently, I am working on building large language models (LLMs) and enabling their use in a broader range of application scenarios. Today's LLMs are often extremely computationally expensive and difficult to control, which limits their usefulness. To make them user-friendly and accessible to more individuals and organizations, my research focuses on: (1) making them better APIs for various real-world problems; (2) building smaller but more specialized LLM experts; (3) incorporating additional input signals, such as images and videos.
I am a Ph.D. student in the School of Computer Science and Engineering at the Hong Kong University of Science and Technology. I am affiliated with the Statistics and Machine Learning Research Group and the JC STEM Lab of Data Science Foundations.
Education
Hong Kong University of Science and Technology, Clear Water Bay, Hong Kong SAR
- Ph.D. Student in Computer Science and Engineering
- Sept. 2021 - Present
- Advisors: Prof. Tong Zhang and Prof. Xiaofang Zhou
University of Electronic Science and Technology of China
- Master of Computer Science in School of Computer Science and Engineering
- Sept. 2018 - Jun. 2021
- Advisors: Prof. Jie Shao and Prof. Xing Xu
University of Electronic Science and Technology of China
- Bachelor of Science in School of Mathematical Sciences
- Sept. 2014 - Jun. 2018
Experiences
- Research Assistant. Sept. 2019 - Sept. 2020
- Advisor:
- Multimodal Knowledge Graphs and Math Word Problem Solving
- Research Intern. Feb. 2021 - Aug. 2021
- Advisors:
- Data Wrangling based on Pre-Trained Language Models
Awards
Outstanding Graduate Student of Sichuan Province, 2021
National Scholarship (1/341), UESTC, 2020
Goodix Technology Scholarship, UESTC, 2020
Outstanding Graduate, UESTC, 2018
Excellent Graduate, UESTC, 2018
Excellent Undergraduate Thesis Award, UESTC, 2018
People's Scholarship for Undergraduates (First Class), 2017