Date | Speaker | Topic |
---|---|---|
M Sep 9 | N/A | Organizational Meeting |
M Sep 16 | Seulip Lee (Tufts) | Efficient and Robust Multiphysics Simulation: A uniform framework for fluid dynamics in porous media |
M Sep 23 | N/A | |
M Sep 30 | Rocio Diaz Martin (Tufts) | Constructing New Metrics from Optimal Transport Theory Abstract: The optimal transport (OT) problem seeks the most efficient way to transport a distribution of mass from one configuration to another, minimizing the cost associated with the transportation process. This framework has found diverse applications in machine learning due to its ability to define meaningful distances, known as Wasserstein distances, between probability distributions. However, Wasserstein distances can be computationally expensive, particularly in high-dimensional settings. In this seminar, we will introduce a new class of OT-based metrics designed to address these computational challenges. These metrics maintain the desirable properties of the Wasserstein distance while offering significant improvements in efficiency, making them more suitable for large-scale applications. |
M Oct 7 | | |
M Oct 14 | N/A | Indigenous Peoples’ Day |
M Oct 21 | Maryam Bagherian (Idaho State) | Distance Metric Learning Over Diverse Domains Abstract: Metric learning serves as an approach for uncovering hidden structures within high-dimensional spaces. By learning a suitable distance metric, algorithms that rely on distance measurements can more effectively capture the inherent structure of data points, resulting in enhanced performance. In contrast to single-metric learning methods, multi-metric and geometric metric learning are particularly effective at handling intricate data distributions and diverse data characteristics. These alternative approaches offer greater flexibility and interpretability, making them especially valuable for representation learning on complex non-linear multi-modal datasets. In this context, I provide a concise introduction to the concepts of distance metric learning and present methods for extending its applicability to high-dimensional spaces, graphs, and manifolds. |
M Oct 28 | Santhosh Karnik (Northeastern) | Tensor Completion for Low CP-Rank Tensors Via Random Sampling Abstract: Low-rank tensor completion is a natural extension of the classic problem of low-rank matrix completion. Tensor completion has been used in a variety of applications, including recommender systems, visual data, and hyperspectral imaging. However, low-rank tensor completion has been significantly less studied than low-rank matrix completion. We propose two provably accurate algorithms for low CP-rank tensor completion: one using adaptive sampling and one using nonadaptive sampling. Our algorithms combine matrix completion techniques on a small number of slices with the simultaneous diagonalization algorithm to learn the factors corresponding to the first two modes, and then solve systems of linear equations to learn the factors corresponding to the remaining modes. In the noise-free setting, both of our algorithms can be proven to work for tensors of any order, run in polynomial time, and require a number of samples that scales roughly linearly with the sum of the dimensions of the tensor. Numerical experiments demonstrate that our algorithms work on both noisy synthetic data and real-world data. This is joint work with Cullen Haselby, Mark Iwen, and Rongrong Wang. |
M Nov 4 | Rishi Sonthalia (Boston College) | Universal Approximation Results for Solving PDEs with Neural Networks Abstract: There has been significant recent work on solving PDEs using neural networks on infinite-dimensional spaces. In this talk we consider two examples. First, we prove that transformers can effectively approximate the mean-field dynamics of interacting particle systems exhibiting collective behavior, which are fundamental in modeling phenomena across physics, biology, and engineering. We provide theoretical bounds on the approximation error and validate the findings through numerical simulations. Second, we show that finite-dimensional neural networks can be used to approximate eigenfunctions of the Laplace-Beltrami operator on manifolds. We provide quantitative insights into the number of neurons needed to learn spectral information and shed light on the non-convex optimization landscape of training. |
M Nov 11 | N/A | Veterans’ Day |
M Nov 18 | | |
M Nov 25 | N/A | Thanksgiving week |
M Dec 2 | Melanie Weber (Harvard) | |