Zhiwu Lu is a full professor with the Gaoling School of Artificial Intelligence, Renmin University of China, Beijing 100872, China. He received the Master of Science degree in applied mathematics from Peking University in 2005, and the PhD degree in computer science from City University of Hong Kong in 2011. He has published over 70 papers in international journals and conference proceedings, including TPAMI, IJCV, TIP, ICLR, NeurIPS, CVPR, ICCV, and ECCV. He won the IBM SUR Award in 2015 and the Best Paper Award at CGI 2014. His team took second place in the VID task of ILSVRC 2015.
Most recent few-shot learning (FSL) methods adopt a meta-learning framework. Since the performance of SOTA FSL methods is saturating on benchmark datasets, new meta-learning paradigms are needed in this field. In this talk, I will introduce two such paradigms, MELR (Modeling Episode-Level Relationships) and IEPT (Instance-Level and Episode-Level Pretext Tasks), for the standard inductive FSL setting. Specifically, MELR explicitly models episode-level relationships during meta-training, while IEPT seamlessly integrates self-supervised learning (SSL) into supervised FSL. Both MELR and IEPT achieve new SOTA results on several benchmarks, and both have been published as ICLR 2021 papers.