Seminar by Dr Du Yingpeng, Research Fellow, CCDS NTU, 17 Oct 2024, N4-B1A-02

Time: 11.30 AM - 12.00 PM | Audience: Current Students, Industry/Academic Partners

Title: Incorporating LLMs for Effective and Efficient Recommendation

Abstract: Recently, there has been growing interest in harnessing the extensive knowledge and powerful reasoning abilities of large language models (LLMs) for recommender systems (RSs) to enable more effective decision-making [3]. However, integrating LLMs into RSs is not a one-size-fits-all solution: hallucinations in LLMs hinder the generation of reliable suggestions in the absence of domain-specific knowledge and effective guidance. To bridge this gap, our project proposes integrating domain-specific knowledge graphs (KGs) [4] into LLMs to enrich their knowledge and provide effective guidance, thereby producing more effective recommendations. KGs, with their structured representation of facts and relationships, can significantly strengthen the factual grounding of LLMs. This integration helps LLMs discern relevant knowledge effectively, overcoming their tendency to hallucinate and yielding reliable suggestions for users across domains. In addition, serving LLMs typically demands substantial computation time and memory, leading to high latency and resource requirements that limit real-world deployment. To this end, we propose an active LLM-based knowledge distillation (KD) method for sustainable AI. Specifically, we elicit student learning from a small proportion of instances, selecting instances that maximize the minimal distillation gain and thereby ensuring effective LLM-based KD with theoretical guarantees.
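The abstract does not specify the selection algorithm, but the max-min idea can be illustrated with a minimal sketch: under the assumption that an instance's "distillation gain" is measured by teacher-student disagreement (here, KL divergence between their predicted distributions), choosing the top-k highest-gain instances maximizes the minimal gain over any selected set of size k. All function names and the gain proxy below are illustrative, not the speaker's actual method.

```python
import math

def kl_divergence(p, q):
    # KL(p || q) between two discrete probability distributions,
    # used here as a proxy for how much the student can learn
    # from the teacher on one instance.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def select_instances(teacher_probs, student_probs, budget):
    """Pick 'budget' instances with the largest teacher-student
    disagreement. Taking the top-k gains maximizes the minimal
    gain among the selected instances."""
    gains = [kl_divergence(t, s)
             for t, s in zip(teacher_probs, student_probs)]
    ranked = sorted(range(len(gains)), key=lambda i: gains[i], reverse=True)
    return sorted(ranked[:budget])

# Example: the student already matches the teacher on instance 1,
# so distilling on instances 0 and 2 is more valuable.
teacher = [[0.9, 0.1], [0.5, 0.5], [0.6, 0.4]]
student = [[0.5, 0.5], [0.5, 0.5], [0.55, 0.45]]
chosen = select_instances(teacher, student, budget=2)
```

In this toy setting `chosen` is `[0, 2]`: the student is distilled only on the instances where the teacher still has something to teach, cutting the number of expensive teacher queries.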

Biography: Du Yingpeng received the Ph.D. degree in software engineering from Peking University, Beijing, China, in 2023. He is currently a research fellow with the College of Computing and Data Science, Nanyang Technological University. His research interests include recommender systems and ensemble learning. He has published more than 20 papers in top-tier journals and conferences, such as JMLR, Pattern Recognition, AAAI, SIGKDD, ICDM, IJCAI, and WWW.