Multilevel and Domain Decomposition Methods for Training Neural Networks
Video recording:
Speaker: Rolf Krause (Università della Svizzera italiana, Switzerland)
Title: Multilevel and Domain Decomposition Methods for Training Neural Networks.
Time: Wednesday, 2023.10.11, 10:00 a.m. (CET)
Place: fully virtual (contact Jakub Lengiewicz to register)
Format: 30 min. presentation + 30 min. discussion
Abstract: The constantly increasing sizes of DNNs, measured in terms of parameters and available training data, are putting high demands on the training process, as the minimization itself and the identification of suitable hyper-parameters are becoming more and more time consuming. A natural goal at this point is to devise scalable, parallel, and efficient training methodologies. We will present non-linear domain decomposition methods and multi-level techniques, which allow for scalability, convergence control, and automatic identification of training-related hyper-parameters. We discuss the presented methods and demonstrate their convergence and scalability properties on benchmark problems.
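To give a rough flavor of the idea, here is a minimal toy sketch (not the speaker's actual method) of decomposing a network's parameters into subdomains, here simply one per layer, and alternating "local" training steps on each subdomain while the others are frozen. All names and the specific update scheme are illustrative assumptions; the talk's non-linear domain decomposition and multi-level methods are considerably more sophisticated.

```python
import numpy as np

# Toy illustration only: a two-layer network whose parameters are split
# into two "subdomains" (W1 and W2). Each subdomain is updated locally
# with the other held fixed, loosely mimicking the alternating local
# solves of a parameter-space domain decomposition approach.

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))
y = np.sin(X.sum(axis=1, keepdims=True))  # toy regression target

W1 = rng.normal(scale=0.5, size=(3, 8))
W2 = rng.normal(scale=0.5, size=(8, 1))

def forward(W1, W2, X):
    H = np.tanh(X @ W1)
    return H, H @ W2

def loss(W1, W2):
    _, out = forward(W1, W2, X)
    return float(np.mean((out - y) ** 2))

def grads(W1, W2):
    H, out = forward(W1, W2, X)
    r = 2.0 * (out - y) / len(X)      # dL/d(out)
    gW2 = H.T @ r                     # gradient for subdomain 2
    gH = r @ W2.T * (1.0 - H ** 2)    # backprop through tanh
    gW1 = X.T @ gH                    # gradient for subdomain 1
    return gW1, gW2

lr = 0.1
loss0 = loss(W1, W2)
for _ in range(300):
    # "Local" solve on subdomain 1 (W1), with W2 frozen:
    gW1, _ = grads(W1, W2)
    W1 -= lr * gW1
    # "Local" solve on subdomain 2 (W2), using the updated W1:
    _, gW2 = grads(W1, W2)
    W2 -= lr * gW2

print(loss0, loss(W1, W2))  # loss after alternating local updates
```

In a parallel setting, such local subdomain solves would run concurrently and be combined by a global correction step; the sequential alternation above is only the simplest serial caricature of that structure.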
Prof. Rolf Krause holds the Chair of Advanced Scientific Computing in the Faculty of Informatics and is the director of the Institute of Computational Science (ICS) at the Università della Svizzera italiana. He is also the co-director of the Center for Computational Medicine in Cardiology (CCMC) at USI. His research focuses on numerical simulation, machine learning, optimization, and data-driven approaches. The complexity of real-world applications poses a challenge for model- and data-based prediction, turning the development of models and solution methods into a demanding task. In addition to a well-balanced combination of methodological and mathematical knowledge, it also requires experience with the subtle aspects of implementation.
Seminar slides: