Advancing Orbital-Free DFT and DFTB for Large-Scale ab initio Materials Modeling with Machine Learning

2024.09.02

Submitted by: Gong Huiying (龚惠英)   Department: School of Science

Event Information

Title: Advancing Orbital-Free DFT and DFTB for Large-Scale ab initio Materials Modeling with Machine Learning

Speaker: Assoc. Prof. Sergei Manzhos (Tokyo Institute of Technology)

Time: Thursday, September 5, 2024, 15:30-17:30

Place: Room G313, Main Campus

Inviter: Prof. Wei Ren (任伟)

Organizer: Department of Physics, School of Science

Abstract:

Ab initio materials modeling is still largely based on Kohn-Sham DFT (density functional theory). The near-cubic scaling of KS-DFT makes routine calculations possible only at small scale, limited to 10²–10³ atoms. This becomes problematic when ab initio level insight (effects of or on electronic structure, mechanisms of various phenomena) is needed for intrinsically large-scale problems (e.g. microstructure effects, large molecules and interfaces). Large-scale DFT-based methods exist (order-N DFT, orbital-free (OF) DFT, DFTB (density functional tight binding)) but still require improvements to be routinely usable in various applications. In this talk, I will consider two large-scale approaches, OF-DFT and DFTB, and specifically focus on using machine learning (ML) to improve them, either to improve their accuracy or to extend their field of applicability. For DFTB, I will show how one can realize a QM-MM (quantum mechanics-molecular mechanics) hybrid not by spatial range but by type of interaction, modeling some of the interatomic interactions at the MM level with an ML-optimized potential function. For OF-DFT, I will show how ML can be used to help construct kinetic energy functionals (KEFs), which have been the bottleneck on the way to wider adoption of the OF-DFT method, a method capable of routine modeling of million-atom systems.
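For context (standard textbook background, not part of the speaker's abstract): in orbital-free DFT the total energy is written directly as a functional of the electron density,

E[\rho] = T_s[\rho] + E_H[\rho] + E_{xc}[\rho] + \int v_{ext}(\mathbf{r})\,\rho(\mathbf{r})\,d\mathbf{r},

where T_s[\rho] is the non-interacting kinetic energy, i.e. the kinetic energy functional (KEF) mentioned above. Kohn-Sham DFT evaluates T_s from orbitals at near-cubic cost, whereas OF-DFT must approximate it as an explicit functional of \rho(\mathbf{r}); the accuracy of that approximation is the main limitation of OF-DFT, which is why machine-learned KEFs are of interest.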
