Research Results

My research ranges from pure theoretical analysis and basic algorithmic development to practical applications. I mainly work on numerical methods for partial differential equations (PDEs) and big data, such as finite element, multigrid (MG), and domain decomposition (DD) methods and deep neural networks, covering their theoretical analysis, algorithmic development, and practical applications. Theoretical elegance and practical usefulness can go together, and the design and analysis of algorithms can be beautiful. Theory is the soul of what I do, and practical needs are what motivate me. In all my work, I try to strike a balance between rigor and practicality.

Examples of my better-known work include the Bramble-Pasciak-Xu (BPX) preconditioner (a basic algorithm for solving elliptic PDEs) and the Hiptmair-Xu preconditioner (an effective Maxwell solver that was featured in a 2008 report by the U.S. Department of Energy as one of the top ten breakthroughs in computational science in recent years). I developed the framework and theory of the Method of Subspace Corrections, which has been widely used in the literature for the design and analysis of iterative methods, and I later established the Xu-Zikatanov identity, which gives the optimal theory for these methods. I also designed the Morley-Wang-Xu (MWX) element, the only known class of finite elements constructed uniformly for elliptic partial differential equations of any order in any spatial dimension.
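To give a flavor of the subspace-correction idea, here is a minimal sketch of one successive subspace correction (SSC) sweep for a linear system Au = f: each subspace problem is solved exactly and the result is used to correct the current iterate in turn. With one-dimensional (singleton) subspaces this reduces to the classical Gauss-Seidel iteration. The function names and the 1D Poisson test problem are my own illustrative choices, not code from the papers mentioned above.

```python
import numpy as np

def ssc_sweep(A, f, u, subspaces):
    """One sweep of successive subspace correction.

    Each subspace is given as an index set; the restricted problem is
    solved exactly and the iterate is corrected immediately, so with
    singleton index sets this is exactly Gauss-Seidel.
    """
    for idx in subspaces:
        r = f - A @ u                          # current residual
        Ai = A[np.ix_(idx, idx)]               # restricted operator
        u[idx] += np.linalg.solve(Ai, r[idx])  # local exact solve
    return u

# 1D Poisson model problem (tridiagonal stiffness matrix)
n = 8
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
f = np.ones(n)
u = np.zeros(n)
for _ in range(200):
    u = ssc_sweep(A, f, u, [[i] for i in range(n)])
# after 200 sweeps the residual is far below 1e-6 on this small problem
print(np.linalg.norm(f - A @ u))
```

Replacing the singleton subspaces with overlapping blocks, or with coarse-grid spaces, recovers domain decomposition and multigrid methods as further instances of the same framework.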

In recent years, I have spent much time on deep learning research, focusing on approximation theory, deep learning models, training algorithms, and applications to numerical PDEs. In particular, we have observed a close connection between ReLU deep neural networks (ReLU-DNNs) and linear finite elements, and between convolutional neural networks (CNNs) and multigrid methods, and we developed MgNet, which can outperform comparable CNNs on various applications such as image classification.
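The ReLU-DNN/linear-FEM connection can already be seen in one dimension: a standard P1 hat (nodal basis) function is exactly a one-hidden-layer ReLU network with three neurons, so sums of such networks reproduce 1D linear finite element functions. The sketch below (function names and the interpolation example are illustrative) checks this.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x, a, b, c):
    """P1 hat function with nodes a < b < c (value 1 at b, 0 outside [a, c]),
    written as a three-neuron, one-hidden-layer ReLU network."""
    return (relu(x - a) / (b - a)
            - (c - a) / ((b - a) * (c - b)) * relu(x - b)
            + relu(x - c) / (c - b))

# Any continuous piecewise linear function on a grid is a sum of such hats,
# so shallow ReLU networks reproduce 1D linear finite element interpolation.
x = np.linspace(0.0, 1.0, 101)
nodes = np.linspace(0.0, 1.0, 6)
target = np.sin(np.pi * x)          # smooth function to interpolate
fem_interp = sum(np.sin(np.pi * b) * hat(x, a, b, c)
                 for a, b, c in zip(nodes[:-2], nodes[1:-1], nodes[2:]))
# fem_interp is the nodal P1 interpolant of sin(pi*x) on this grid
```

Analogous (deeper) constructions handle higher dimensions, which is the starting point for relating CNNs to multigrid cycles.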