I am an Associate Professor and Master’s supervisor at the Institute of Information Engineering, Chinese Academy of Sciences (CAS), working on large-scale statistical learning theory and large language models (LLMs). In July 2020, I received my Ph.D. from the Institute of Information Engineering, CAS, advised by Associate Prof. Yong Liu and Prof. Weiping Wang.

My research focuses on foundational machine learning theory, particularly the generalization theory of large-scale methods. Because foundational theory lags behind empirical algorithms in large-scale machine learning, I aim to uncover the underlying principles and narrow the gap between theory and practical algorithms, ultimately guiding the design of large-scale algorithms that balance computational efficiency and generalization performance. Specific interests include:

  • Optimal Generalization Guarantees for Large-Scale ML: Investigating optimal generalization guarantees, relaxing assumptions, and enhancing large-scale algorithms, including federated learning, distributed learning, and random features (see the brief sketch after this list).

  • Generalization Theory of Deep Neural Networks: Exploring connections between neural networks and kernel methods, studying generalization in non-stationary spectral kernel networks, refining current neural network models, and using random matrix theory to understand phenomena in deep networks.

  • (Future Direction) Fundamental Research on Large Language Models: Delving into the foundational theory of large language models, explaining distinctive capabilities such as scaling laws, in-context learning, and complex reasoning; improving model architectures for computational efficiency and performance; and researching the next generation of efficient language models with fewer parameters.
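
To make the kernel-approximation theme above concrete, here is a minimal, hypothetical sketch of random Fourier features (Rahimi & Recht) approximating an RBF kernel with NumPy. It is only an illustration of the general technique, not code from any of the papers listed below; the function name and parameters (`n_features`, `gamma`) are chosen for the example.

```python
import numpy as np

def random_fourier_features(X, n_features=2000, gamma=0.5, seed=0):
    """Map X (n, d) to features whose inner products approximate
    the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies drawn from the Fourier transform of the RBF kernel.
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Usage: the explicit feature map lets kernel methods train on Z with
# linear-model cost instead of forming the full n-by-n kernel matrix.
X = np.random.default_rng(1).normal(size=(200, 10))
Z = random_fourier_features(X)
K_exact = np.exp(-0.5 * np.sum((X[:, None] - X[None]) ** 2, axis=-1))
K_approx = Z @ Z.T
print(np.abs(K_exact - K_approx).max())  # small; shrinks as n_features grows
```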

Curriculum Vitae (CV)

Time | Title | Institution | Research Direction
2023.10 - present | Associate Professor | Institute of Information Engineering, CAS | LLMs, Large-scale Statistical Machine Learning
2020.09 - 2023.10 | Tenure-track Professor | Institute of Information Engineering, CAS | Large-scale Statistical Machine Learning
2015.09 - 2020.06 | Ph.D. Candidate | Institute of Information Engineering, CAS | Large-scale Model Selection, Semi-supervised Learning
2011.09 - 2015.06 | Undergraduate | Northeastern University | Software Engineering (International Class)

Selected Papers [Full List] [Google Scholar]

  • Optimal Rates for Agnostic Distributed Learning. [pdf] [code]
    Jian Li, Yong Liu, Weiping Wang.
    IEEE Transactions on Information Theory (TIT), 2023. CCF-A Journal.

  • Optimal Convergence Rates for Distributed Nyström Approximation. [pdf] [code]
    Jian Li, Yong Liu, Weiping Wang.
    Journal of Machine Learning Research (JMLR), 2023. CCF-A Journal.

  • Convolutional Spectral Kernel Learning with Generalization Guarantees. [pdf] [code]
    Jian Li, Yong Liu, Weiping Wang.
    Artificial Intelligence (AI), 2022. CCF-A Journal.

  • Optimal Convergence Rates for Agnostic Nyström Kernel Learning. [pdf]
    Jian Li, Yong Liu, Weiping Wang.
    International Conference on Machine Learning (ICML), 2023. CCF-A Conference.

  • Multi-Class Learning: From Theory to Algorithm. [pdf] [poster] [slides] [3-minute video] [code]
    Jian Li, Yong Liu, Rong Yin, Hua Zhang, Lizhong Ding, Weiping Wang.
    Advances in Neural Information Processing Systems 31 (NeurIPS), 2018. CCF-A Conference.

  • Federated Learning for Non-IID Data: From Theory to Algorithm. [pdf] [presentation] [🏆 Best Student Paper Award] (1/92)
    Bojian Wei, Jian Li*, Yong Liu, Weiping Wang.
    Pacific Rim International Conference on Artificial Intelligence (PRICAI), 2021. CCF-C Conference.

Projects

  • National Key R&D Program of China (2022YFB3105302.2), 2022.12 - 2025.11, ¥1,200,000.
    Aggregation and Collaborative Techniques for Cross-Platform Heterogeneous Data.

  • National Natural Science Foundation of China (No. 62106257), 2022.01 - 2024.12, ¥300,000.
    Large-Scale Structured Prediction with Automated Spectral Kernel Learning.

  • China Postdoctoral Science Foundation (Special Support, No. 2023T160680), 2023.07 - 2024.03, ¥180,000.
    Research on Deep Differentiable Gaussian Processes for Structured Prediction.

  • Special Research Assistant Project of CAS, 2020.09 - 2022.09, ¥800,000.
    Large-scale Few-shot Automated Machine Learning.

  • Talent Program Class A of Institute of Information Engineering, CAS, Tenure-track Professor, 2023.10 - 2026.09.

  • Talent Program Class B of Institute of Information Engineering, CAS, Tenure-track Young Professor, 2020.09 - 2023.10.

Patents

Pending

  • Jian Li, Yong Liu, Liubin Wang, Yiguo Yang, Juhong Wang. Neural Network Architecture Search Method, Device, Computer Equipment, and Storage Medium. CN: 202011567991.3. App. Date: December 25, 2020.
  • Jian Li, Jiaoyang Li, Bojian Wei, Yong Liu, Weiping Wang. A Federated Learning Method and System Based on Attention Mechanism. CN: 202311073645.3. App. Date: August 24, 2023.
  • Jian Li, Jiaoyang Li, Zheng Lin, Yong Liu, Weiping Wang. A Vertical Domain Large Model Method and System Based on Knowledge Distillation and Prompt Engineering. CN: 202311073641.5. App. Date: August 24, 2023.

Granted

  • Hailun Lin, Yong Liu, Jian Li, Weiping Wang. A Large-Scale Ontology Merging Method that Integrates Representation Learning and Divide-and-Conquer Strategy (China). Granted No. CN110059194A. Granted Date: April 8, 2022.

Students

  • Ph.D. students
    • 🎓Yilin Kang (2020.09 - 2023.06), Differential Privacy. Papers: Computers & Security, CIKM 2022, ICCS 2023. Post-graduation: Researcher at Purple Mountain Laboratories.
    • Xunyu Zhu (2020.09 - present), Neural Architecture Search. Papers: ICDM 2021.
    • Boxuan Che (2022.09 - present), Efficient Graph Neural Networks.
  • Master’s students
    • 🎓Bojian Wei (2020.09 - 2022.06), Federated Learning on Heterogeneous Data. Papers: PRICAI 2021 (Best Student Paper Award), ECML-PKDD 2022, TNNLS, IJCNN 2023. Post-graduation: Management Trainee at the Bank of China Head Office.
    • Xuning Zhang (2022.09 - present), Federated Learning. Excellent Bachelor’s Thesis at Wuhan University, 2023.

Honors and Awards

  • PRICAI 2021 Best Student Paper Award, 2021.
  • Special Research Assistant of Chinese Academy of Sciences, 2020.
  • AIDU Talents of Baidu Research (declined), 2020.
  • Joint Ph.D. Program with Stanford University (discontinued due to COVID-19), 2020.02 - 2021.02.
  • Outstanding Graduates of Beijing, 2020.
  • Outstanding Graduates of University of Chinese Academy of Sciences (UCAS), 2020.
  • Outstanding Graduates of Institute of Information Engineering, CAS, 2020.
  • National Scholarship for Doctoral Students, 2019.
  • ZhuLiYueHua Scholarship for Excellent Doctoral Student, 2019.
  • CAS Presidential Scholarship, 2019.
  • National Scholarship for Doctoral Students, 2018.

Academic Service

  • Guest Editor of the journal Mathematics
  • Program Committee Member of conferences: ICML, NeurIPS, ICLR, AAAI, IJCAI, ECAI, etc.
  • Reviewer for journals: TPAMI, JMLR, Pattern Recognition, etc.