Is a Complex-Valued Stepsize Advantageous in Complex-Valued Gradient Learning Algorithms?
Speaker: 张会生
Venue: Room 111, School of Mathematics and Statistics
Time: Tuesday, July 10, 2018, 16:00-17:00
Host:
Abstract:
According to optimization theory, the stepsize of a gradient algorithm is usually set to a small positive number. In contrast to the classical gradient algorithm, this talk is concerned with complex-valued gradient learning algorithms that use a complex-valued stepsize. We will report some new developments and new ideas on this topic.
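To illustrate the question in the title, the following is a minimal sketch (not the speaker's method) of complex-valued gradient descent on a toy least-squares problem, comparing a real stepsize with a complex-valued one. The toy problem, the function names, and the particular stepsize values are illustrative assumptions.

```python
# Minimal sketch (illustrative only): complex-valued gradient descent on a
# toy least-squares problem, with either a real or a complex stepsize.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5)) + 1j * rng.standard_normal((20, 5))
b = rng.standard_normal(20) + 1j * rng.standard_normal(20)

def loss(w):
    r = A @ w - b
    return np.real(r.conj() @ r)          # real-valued cost of a complex variable

def wirtinger_grad(w):
    # Gradient with respect to conj(w) (Wirtinger calculus); steepest descent
    # for the real cost ||Aw - b||^2 moves along its negative.
    return A.conj().T @ (A @ w - b)

def descend(stepsize, iters=200):
    # stepsize may be a real positive number (classical case) or a complex
    # number (the case discussed in the talk).
    w = np.zeros(A.shape[1], dtype=complex)
    for _ in range(iters):
        w = w - stepsize * wirtinger_grad(w)
    return loss(w)

print("real stepsize   :", descend(0.01))
print("complex stepsize:", descend(0.01 + 0.005j))
```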
About the speaker:
张会生 is a professor and doctoral supervisor at the School of Science, Dalian Maritime University. He has published more than 20 SCI-indexed papers in well-known domestic and international journals, and has led two projects funded by the National Natural Science Foundation of China, one funded by the China Postdoctoral Science Foundation, and one funded by the Natural Science Foundation of Liaoning Province.