November Talks
Colloquium, Prof. 林晉宏, Department of Applied Mathematics, National Sun Yat-sen University
Thursday, November 4, 16:10–17:00, Room 36104, Chemistry Building
Title: Inverse Eigenvalue Problem of a Graph
Abstract: We often encounter matrices whose pattern (zero-nonzero, or sign) is known while the precise value of each entry is not. A natural question, then, is what we can say about the spectral properties of matrices with a given pattern. When the matrix is real and symmetric, one may use a simple graph to describe its off-diagonal nonzero support. For example, it is known that an irreducible tridiagonal matrix (whose pattern is described by a path) only allows eigenvalues of multiplicity one. In contrast, a periodic Jacobi matrix (whose pattern is described by a cycle) allows multiplicity two but no more. The inverse eigenvalue problem of a graph (IEPG) focuses on the matrices whose pattern is described by a given graph and studies their possible spectra. In this talk, we will go through some of the history of the IEPG and see how combinatorial methods (zero forcing) and analytic methods (the implicit function theorem) come into play in modern-day research.
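The multiplicity facts quoted in the abstract can be checked numerically. Below is a minimal NumPy sketch (the specific matrices are illustrative choices, not taken from the talk): a random irreducible tridiagonal symmetric matrix (path pattern) has only simple eigenvalues, while the adjacency matrix of the 5-cycle, a periodic Jacobi matrix, attains multiplicity two.

```python
import numpy as np

# Path pattern: irreducible tridiagonal symmetric matrix with strictly
# nonzero off-diagonal entries. Its eigenvalues are always simple.
rng = np.random.default_rng(0)
n = 6
d = rng.standard_normal(n)
off = rng.uniform(0.5, 1.5, n - 1)          # nonzero => irreducible
A = np.diag(d) + np.diag(off, 1) + np.diag(off, -1)
ev = np.linalg.eigvalsh(A)
assert np.min(np.diff(ev)) > 1e-8           # all multiplicities equal one

# Cycle pattern: the adjacency matrix of C_5 is a periodic Jacobi matrix.
# Its spectrum is {2cos(2*pi*k/5)}, so every eigenvalue except the
# largest appears with multiplicity two.
C = np.zeros((5, 5))
for i in range(5):
    C[i, (i + 1) % 5] = C[(i + 1) % 5, i] = 1.0
evc = np.sort(np.linalg.eigvalsh(C))
assert abs(evc[0] - evc[1]) < 1e-8 and abs(evc[2] - evc[3]) < 1e-8
print(np.round(evc, 4))
```

The multiplicity-one property of the path is exactly the classical fact about irreducible tridiagonal matrices; no amount of tuning the entries can create a repeated eigenvalue, whereas the cycle reaches (but cannot exceed) multiplicity two.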
Colloquium, Research Fellow 林正洪, Institute of Mathematics, Academia Sinica
Thursday, November 18, 16:10–17:00, Room 36104, Chemistry Building
Title: Leech Lattice and Holomorphic Vertex Operator Algebras of Central Charge 24
Abstract: In this talk, I will first review the classification of positive definite even unimodular lattices of rank 24. It turns out that there is a one-to-one correspondence between the isometry types of positive definite even unimodular lattices of rank 24 and the deep holes of the Leech lattice. Then I will try to explain how one can generalize similar ideas to classify holomorphic vertex operator algebras of central charge 24. The Leech lattice also plays a very important role in this classification.
Colloquium, Prof. 楊鈞澔, Department of Mathematics, National Taiwan University
Thursday, November 25, 15:10–16:00, Room 36104, Chemistry Building
Title: Geometry, Statistics, and Deep Learning
Abstract: This talk has two parts: (1) the application of geometry in statistics and deep learning, and (2) the problem of dimension reduction for data on a Grassmann manifold. In the first part, I will briefly introduce some research areas where geometry meets statistics and deep learning. There are two main directions: (i) studying the geometry of statistical models (information geometry) and (ii) the statistical analysis of geometric data (geometric statistics and geometric deep learning). I will also point out some interesting research problems in these areas. In the second part, I will discuss a research problem that I have been working on recently: dimension reduction with nested Grassmann manifolds. In this work, we propose to use the nested structure of Grassmann manifolds to solve the dimension reduction problem for data residing on a Grassmann manifold. The main application is the dimension reduction of planar shapes.
Colloquium, Prof. 王振男, Department of Mathematics, National Taiwan University
Thursday, November 25, 16:10–17:00, Room 36104, Chemistry Building
Title: Well-Posedness vs. Ill-Posedness in PDEs
Abstract: When dealing with direct problems in PDEs, we normally expect the problems to be well-posed, i.e., a unique solution exists and depends continuously on the data. For most inverse problems, however, well-posedness may fail: we may be able to prove uniqueness, but continuous dependence does not hold in the usual sense. In this talk, I would like to discuss some interesting phenomena of continuous dependence in several inverse problems. In some cases, I will also show that the continuous dependence improves at high frequencies. We will view this increasing-stability phenomenon from the viewpoints of stability and instability estimates.
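The failure of continuous dependence can be illustrated with a textbook example that is not from the talk itself: numerically differentiating noisy data. The forward map (integration) is smoothing, so its inverse amplifies high-frequency perturbations; a tiny change in the data produces a much larger change in the recovered solution.

```python
import numpy as np

# Ill-posedness illustration: recover f' from noisy samples of f.
# Differentiation amplifies high-frequency noise, so the "solution"
# does not depend continuously on the data in the max norm.
x = np.linspace(0.0, 1.0, 1001)
h = x[1] - x[0]
f = np.sin(2 * np.pi * x)

rng = np.random.default_rng(1)
noise = 1e-3 * rng.standard_normal(x.size)   # tiny data perturbation

df_clean = np.gradient(f, h)
df_noisy = np.gradient(f + noise, h)

data_err = np.max(np.abs(noise))                  # on the order of 1e-3
sol_err = np.max(np.abs(df_noisy - df_clean))     # far larger: amplified by ~1/h
print(data_err, sol_err)
assert sol_err > 100 * data_err
```

Regularization (or, as in the talk's setting, working at high frequency where stability estimates improve) is what restores a usable form of continuous dependence in practice.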