
Wanyi Ning (宁婉仪)
2025-09-16 20:07  

Wanyi Ning, female, Lecturer / University-Appointed Professor. She received her Ph.D. in Computer Science and Technology from Beijing University of Posts and Telecommunications, including a joint doctoral training program at ETH Zurich. Her main research interests are distributed deep learning and model compression. As a first or co-first author, she has published multiple papers in top international journals and conferences, including JSAC, TSC, NeurIPS, TNNLS, Euro-Par, and ICML workshops. Contact email: ningwanyi@126.com

Publications

[1] Wanyi Ning, Jingyu Wang, Qi Qi, Haifeng Sun, Daixuan Cheng, Cong Liu, Lei Zhang, Zirui Zhuang, Jianxin Liao. Federated Fine-Tuning on Heterogeneous LoRAs With Error-Compensated Aggregation. [TNNLS '25], CAS Q1, CCF-B

[2] Minwei Zhang, Haifeng Sun, Jingyu Wang, Shaolong Li, Wanyi Ning, Qi Qi, Zirui Zhuang, Jianxin Liao. ClusterAttn: KV Cache Compression under Intrinsic Attention Clustering. [ACL '25], CCF-A

[3] Wanyi Ning, Jingyu Wang, Qi Qi, Mengde Zhu, Haifeng Sun, Daixuan Cheng, Jianxin Liao, Ce Zhang. FM-Delta: Lossless Compression for Storing Massive Fine-Tuned Foundation Models. [NeurIPS '24], CCF-A

[4] Mengde Zhu*, Wanyi Ning*, Qi Qi, Jingyu Wang, Zirui Zhuang, Haifeng Sun, Jun Huang, Jianxin Liao. FLUK: Protecting Federated Learning Against Malicious Clients for Internet of Vehicles. [Euro-Par '24], CCF-B

[5] Wanyi Ning, Qi Qi, Jingyu Wang, Mengde Zhu, Shaolong Li, Guang Yang, Jianxin Liao. One Teacher is Enough: A Server-Clueless Federated Learning With Knowledge Distillation. [TSC '24], CCF-A

[6] Berivan Isik*, Hermann Kumbong*, Wanyi Ning*, Xiaozhe Yao*, Sanmi Koyejo, Ce Zhang. GPT-Zip: Deep Compression of Finetuned Large Language Models. [ICML '23 workshop], CCF-A

[7] Wanyi Ning, Haifeng Sun, Xiaoyuan Fu, Xiang Yang, Qi Qi, Jingyu Wang, Jianxin Liao, Zhu Han. Following the Correct Direction: Renovating Sparsified SGD Towards Global Optimization in Distributed Edge Learning. [JSAC '21], CCF-A
