
Academic Lectures

"高屋建瓴AI公开课" (AI Open Course), Session 17: Sample Complexity and Regularized Schemes of Graph Convolutional Networks
Date: 2022-05-27



Time: June 1, 2022 (Wednesday), 15:00-16:00

Tencent Meeting: 962 346 579

Host: LIU Yong, Tenure-Track Associate Professor, Gaoling School of Artificial Intelligence, Renmin University of China

Speaker: LYU Shaogao, Professor, School of Statistics and Data Science, Nanjing Audit University

Speaker Bio: Shaogao Lyu received his Ph.D. in 2011 from a joint program between the University of Science and Technology of China and City University of Hong Kong, and worked at Southwestern University of Finance and Economics from 2011 to 2018. His main research area is statistical machine learning; his current interests include distributed learning, structured prediction, and deep learning. To date he has published more than 30 papers in SCI-indexed venues, including the top statistics journal Annals of Statistics, the top artificial intelligence journal Journal of Machine Learning Research, the top conference NeurIPS, and the top econometrics journal Journal of Econometrics.

Title: Sample Complexity and Regularized Schemes of Graph Convolutional Networks

Abstract: This talk studies the sample complexity and regularized algorithms of graph convolutional networks (GCNs). First, we provide a tight upper bound on the Rademacher complexity of GCN models with a single hidden layer. Under regularity conditions, these complexity bounds depend explicitly on the largest eigenvalue of the graph convolution filter and on the degree distribution of the graph. We also provide a lower bound on the Rademacher complexity of GCNs, showing that our upper bounds are optimal. Second, we aim to quantify the trade-off in GCNs between smoothness and sparsity, using a new regularized stochastic learning scheme proposed in this work. For a single-layer GCN, we develop an explicit theoretical understanding of regularized stochastic learning by analyzing the stability of our regularized stochastic algorithm. In particular, we prove that the uniform stability of our GCN depends on the largest absolute eigenvalue of its graph filter, and that there exists a stability-sparsity trade-off as p varies. Several empirical experiments validate our theoretical findings.
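The quantity the abstract emphasizes, the largest (absolute) eigenvalue of the graph convolution filter, is easy to compute concretely. Below is a minimal sketch of a single-hidden-layer GCN forward pass using the common symmetric normalized filter D^{-1/2}(A + I)D^{-1/2}; the specific filter, ReLU activation, and toy graph are illustrative assumptions, not the speaker's exact model.

```python
import numpy as np

def normalized_filter(A):
    """Symmetric normalized graph filter D^{-1/2} (A + I) D^{-1/2}
    (a common choice; the talk's bounds are stated for a general filter)."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    d = A_hat.sum(axis=1)                   # degrees of the augmented graph
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def gcn_layer(A, X, W):
    """One graph convolution layer: ReLU(g(A) X W)."""
    return np.maximum(normalized_filter(A) @ X @ W, 0.0)

# Toy example: a 4-node path graph.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
g = normalized_filter(A)

# Largest absolute eigenvalue of the filter -- the quantity the
# Rademacher-complexity and uniform-stability bounds depend on.
lam_max = np.max(np.abs(np.linalg.eigvalsh(g)))
print(lam_max)  # 1.0 for this normalized filter

# Forward pass with identity features and a random weight matrix.
rng = np.random.default_rng(0)
H = gcn_layer(A, np.eye(4), rng.standard_normal((4, 2)))
print(H.shape)  # (4, 2)
```

For this symmetric normalization with self-loops, the spectrum lies in [-1, 1] with largest eigenvalue exactly 1, which is one reason the filter's spectral norm appears as a clean constant in such bounds.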
