Education
2020.09 - 2025.06, Ph.D., Computer Science and Technology, Nanjing University. Supervisor: Professor Lijun Zhang (LAMDA Group).
2023.10 - 2024.05, Visiting Student, National University of Singapore.
2016.09 - 2020.06, B.E., Computer Science and Technology, Xi'an Jiaotong University.
2019.01 - 2019.06, Exchange Student, University of California, Berkeley.
2018.07 - 2018.08, Exchange Student, University of Manchester.

Work Experience
2025.10 - now, Professor, School of Computer Science and Engineering, Nanjing University of Science and Technology.

Academic Service
Reviewer for conferences: ICML 2025, 2024, 2023, 2022; NeurIPS 2024, 2023, 2022; ICLR 2025, 2024; AAAI 2025; AISTATS 2023.
Reviewer for journals: IEEE Transactions on Pattern Analysis and Machine Intelligence; IEEE Transactions on Information Forensics and Security; IEEE Transactions on Evolutionary Computation; Scientific Reports; Machine Learning; Applied Numerical Mathematics; Information Sciences; Neurocomputing; Transactions on Machine Learning Research.

Research Fields
Machine Learning, Stochastic Optimization, LLM Optimization.

Research Projects
Distributed Optimization of Compositional Loss Functions. Postgraduate Research & Practice Innovation Program of Jiangsu Province (KYCX24_0231), 2024.05 - 2025.05.

Publications
9. Optimizing Unnormalized Statistical Models through Compositional Optimization. [link]
   W. Jiang, J. Qin, L. Wu, C. Chen, T. Yang, and L. Zhang.
   IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI 2026), 48(2): 1949-1960, 2026.
8. Revisiting Stochastic Multi-Level Compositional Optimization. [link]
   W. Jiang, S. Yang, Y. Wang, T. Yang, and L. Zhang.
   IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI 2025), 47(7): 5613-5624, 2025.
7. Normalized Adaptive Variance Reduction Method. [link]
   W. Jiang, S. Yang, Y. Wang, and L. Zhang.
   Journal of Software, 36(11): 4893-4905, 2025.
6. Adaptive Variance Reduction for Stochastic Optimization under Weaker Assumptions. [link]
   W. Jiang, S. Yang, Y. Wang, and L. Zhang.
   In Advances in Neural Information Processing Systems 37 (NeurIPS 2024), pages 22047-22080, 2024.
5. Efficient Sign-Based Optimization: Accelerating Convergence via Variance Reduction. [link]
   W. Jiang, S. Yang, W. Yang, and L. Zhang.
   In Advances in Neural Information Processing Systems 37 (NeurIPS 2024), pages 33891-33932, 2024.
4. Projection-Free Variance Reduction Methods for Stochastic Constrained Multi-Level Compositional Optimization. [link]
   W. Jiang, S. Yang, W. Yang, Y. Wang, Y. Wan, and L. Zhang.
   In Proceedings of the 41st International Conference on Machine Learning (ICML 2024), pages 21962-21987, 2024.
3. Learning Unnormalized Statistical Models via Compositional Optimization. [link]
   W. Jiang, J. Qin, L. Wu, C. Chen, T. Yang, and L. Zhang.
   In Proceedings of the 40th International Conference on Machine Learning (ICML 2023), pages 15105-15124, 2023.
2. Multi-block-Single-probe Variance Reduced Estimator for Coupled Compositional Optimization. [link]
   W. Jiang, G. Li, Y. Wang, L. Zhang, and T. Yang.
   In Advances in Neural Information Processing Systems 35 (NeurIPS 2022), pages 32499-32511, 2022.
1. Optimal Algorithms for Stochastic Multi-Level Compositional Optimization. [link]
   W. Jiang, B. Wang, Y. Wang, L. Zhang, and T. Yang.
   In Proceedings of the 39th International Conference on Machine Learning (ICML 2022), pages 10195-10216, 2022.
Honors and Awards
Excellent Graduate of Nanjing University, 2025
National Scholarship, 2024
Excellent Student of Nanjing University, 2024
NeurIPS Top Reviewer, 2024
National Scholarship, 2023
Excellent Student of Nanjing University, 2023
LAMDA Elite Award, 2023
Tencent Scholarship, 2022
Excellent Student of Nanjing University, 2022
Industrial Bank Scholarship, 2021
Excellent Student of Nanjing University, 2021
Grand Champion of DeeCamp Artificial Intelligence Camp, 2020 (¥100,000)
Excellent Graduate of Xi'an Jiaotong University, 2020
