New Gradient Methods for Smooth Unconstrained Optimization Problems

2023.09.22

Submitted by: Gong Huiying    Department: School of Science

Event Information

Title: New gradient methods for smooth unconstrained optimization problems

Speaker: Sun Cong, Associate Professor (Beijing University of Posts and Telecommunications)

Time: 10:00, September 26, 2023 (Tuesday)

Place: Room F309, Main Campus

Inviter: Professor Xu Zi

Organizer: Department of Mathematics, School of Science

Abstract: In this talk, a new gradient method for unconstrained optimization problems is proposed, in which the stepsizes are updated in a cyclic way and the Cauchy step is approximated by quadratic interpolation. Combining this scheme with an adaptive nonmonotone line search technique, we prove the global convergence of the method. Moreover, the algorithm attains a sublinear convergence rate for general convex functions and an R-linear convergence rate for strongly convex problems. Numerical results show that the proposed algorithm outperforms the benchmark methods.
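The abstract does not spell out the algorithm, so the following is only a minimal illustrative sketch of the general idea of a cyclic-stepsize gradient method. It reuses each stepsize for a fixed number of iterations and refreshes it with a Barzilai-Borwein formula; the talk's actual method instead approximates the Cauchy step by quadratic interpolation and adds an adaptive nonmonotone line search, neither of which is reproduced here. The function name and all parameters are hypothetical.

```python
import numpy as np

def cyclic_gradient_method(grad, x0, alpha0=0.1, cycle=4,
                           tol=1e-8, max_iter=2000):
    """Gradient descent with a cyclically reused stepsize (illustrative
    sketch only; NOT the speaker's algorithm).

    The stepsize `alpha` is kept fixed for `cycle` iterations, then
    recomputed from the most recent iterate/gradient displacements.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = alpha0
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        # Refresh the stepsize once per cycle; here a Barzilai-Borwein
        # formula stands in for the talk's interpolation-based Cauchy step.
        if k % cycle == cycle - 1:
            s, y = x_new - x, g_new - g
            if s @ y > 0:  # keep the new stepsize positive and finite
                alpha = (s @ s) / (s @ y)
        x, g = x_new, g_new
    return x
```

For a convex quadratic f(x) = x'Ax/2 with gradient Ax, the iterates converge to the minimizer at the origin, e.g. `cyclic_gradient_method(lambda z: A @ z, [1.0, 1.0])` with `A = np.diag([1.0, 4.0])`.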
