Title:

Statistical Learning Theory for Deep Functional Networks

Speaker:

Prof. Jun Fan (Hong Kong Baptist University)

Time:

Venue:

Lecture Hall 601, Lei Jun Science and Technology Building, Wuhan University

Abstract:

Recently, there has been a significant increase in research aimed at understanding the theoretical foundations of neural networks defined in infinite-dimensional function spaces. While existing studies have explored various aspects of this topic, our understanding of the approximation and learning abilities of these networks remains limited. In this talk, I will present our recent work on the generalization analysis of deep functional networks designed for learning nonlinear mappings from function spaces to R (i.e., functionals). By investigating the convergence rates of approximation and generalization errors, we uncover important insights into the theoretical properties of these networks. This analysis not only deepens our understanding of deep functional networks but also paves the way for their effective application in areas such as operator learning, functional data analysis, and scientific machine learning.
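To make the central object concrete, here is a minimal sketch of a "deep functional network" in the sense the abstract describes: a network whose input is a function (represented by its values on a grid) and whose output is a single real number, i.e. a functional. The architecture, layer widths, and quadrature weighting below are illustrative assumptions for exposition only, not the speaker's construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Uniform grid on [0, 1] used to discretize input functions.
m = 64
grid = np.linspace(0.0, 1.0, m)

def deep_functional_network(f_values, widths=(32, 16)):
    """Toy deep functional network: sampled function values -> scalar.

    The first layer mimics integral functionals <w_k, f> via a
    quadrature-weighted inner product (hypothetical design choice);
    subsequent random dense ReLU layers map the resulting features
    to one real output F(f).
    """
    h = f_values * (1.0 / m)            # quadrature weights for a uniform grid
    for width in widths:
        W = rng.standard_normal((width, h.shape[0]))
        b = rng.standard_normal(width)
        h = np.maximum(W @ h + b, 0.0)  # dense layer + ReLU
    w_out = rng.standard_normal(h.shape[0])
    return float(w_out @ h)             # scalar output: a functional of f

# Evaluate the network on a sample input function f(x) = sin(2*pi*x).
f = np.sin(2 * np.pi * grid)
y = deep_functional_network(f)
print(isinstance(y, float))
```

In the learning-theory setting of the talk, one would study how well such architectures approximate a target functional and how the generalization error decays with sample size; the sketch only fixes the input/output shape of the model class.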