Jialin Zhao
Ph.D. Student
Education
2023.7–present  Ph.D. in Computer Science & Technology, Tsinghua University
2019.7–2021.6  M.Sc. in Data Science & Information Technology, Tsinghua University
2015.7–2019.6  B.Eng. in Computer Science & Technology, Tsinghua University
Research Interests
Efficient AI, Sparse Training, Topology
Publications
Zhang, Y., Zhao, J., Liao, Z., Wu, W., Michieli, U., & Cannistraci, C. V. (2024). Brain-inspired sparse training in MLP and Transformers with network science modeling via Cannistraci-Hebb soft rule. Preprints, 2024061136. https://doi.org/10.20944/preprints202406.1136.v1
Zhang, Y., Bai, H., Lin, H., Zhao, J., Hou, L., & Cannistraci, C. V. (2024). Plug-and-play: An efficient post-training pruning method for large language models. In The Twelfth International Conference on Learning Representations.
Zhao, J., Zhang, Y., Li, X., Liu, H., & Cannistraci, C. V. (2024). Sparse Spectral Training and Inference on Euclidean and Hyperbolic Neural Networks. arXiv preprint arXiv:2405.15481.
Zhang, Y., Zhao, J., Wu, W., Muscoloni, A., & Cannistraci, C. V. (2024). Epitopological learning and Cannistraci-Hebb network shape intelligence brain-inspired theory for ultra-sparse advantage in deep learning. In The Twelfth International Conference on Learning Representations.
Zhao, J., Dong, Y., Ding, M., Kharlamov, E., & Tang, J. (2021). Adaptive diffusion in graph neural networks. Advances in Neural Information Processing Systems, 34, 23321–23333.