Fall 2020

Date Speaker Topic
M Sep 14 Organizational Meeting

M Sep 21

M Sep 28 Vince Lyzinski (UMD) The Importance of Being Correlated: Implications of Dependence in Joint Spectral Inference across Multiple Networks

Abstract: Spectral inference on multiple networks is a rapidly-developing subfield of graph statistics. Recent work has demonstrated that joint, or simultaneous, spectral embedding of multiple independent network realizations can deliver more accurate estimation than individual spectral decompositions of those same networks. Little attention has been paid, however, to the network correlation that such joint embedding procedures necessarily induce. In this paper, we present a detailed analysis of induced correlation in a generalized omnibus embedding for multiple networks. We show that our embedding procedure is flexible and robust, and, moreover, we prove a central limit theorem for this embedding and explicitly compute the limiting covariance. We examine how this covariance can impact inference in a network time series, and we construct an appropriately calibrated omnibus embedding that can detect changes in real biological networks that previous embedding procedures could not discern. Our analysis confirms that the effect of induced correlation can be both subtle and transformative, with import in theory and practice.
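As background for this talk, a minimal sketch of the classical pairwise-average omnibus matrix (the construction the generalized version extends): the m graphs are stacked into one mn x mn matrix whose (i, j) block is (A_i + A_j)/2, and a single truncated eigendecomposition embeds all graphs at once. The function name and rank parameter below are illustrative, not the authors' code.

```python
import numpy as np

def omnibus_embedding(adjacencies, d):
    """Joint spectral embedding of m graphs on the same n vertices.

    Builds the mn x mn omnibus matrix whose (i, j) block is the
    pairwise average (A_i + A_j) / 2, then embeds it with a rank-d
    truncated eigendecomposition (adjacency spectral embedding).
    """
    m = len(adjacencies)
    n = adjacencies[0].shape[0]
    M = np.zeros((m * n, m * n))
    for i in range(m):
        for j in range(m):
            M[i*n:(i+1)*n, j*n:(j+1)*n] = (adjacencies[i] + adjacencies[j]) / 2
    # top-d eigenpairs by magnitude; scale eigenvectors by sqrt(|eigenvalue|)
    vals, vecs = np.linalg.eigh(M)
    idx = np.argsort(np.abs(vals))[::-1][:d]
    X = vecs[:, idx] * np.sqrt(np.abs(vals[idx]))
    # rows k*n:(k+1)*n are the latent-position estimates for graph k
    return [X[k*n:(k+1)*n] for k in range(m)]
```

Because every graph's rows come from one shared decomposition, the m per-graph embeddings are correlated by construction, which is exactly the induced dependence the abstract analyzes.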

M Oct 5 Howard Elman (UMD) A Low-Rank Solver for the Stochastic Unsteady Navier-Stokes Problem

Abstract: We study a low-rank iterative solver for the unsteady Navier–Stokes equations for incompressible flows with a stochastic viscosity. The equations are discretized using the stochastic Galerkin method, and we consider an all-at-once formulation in which the algebraic systems at all time steps are collected and solved simultaneously. The problem is linearized with Picard's method. To solve the linear systems efficiently at each step, we use low-rank tensor representations within the Krylov subspace method, which leads to significant reductions in storage requirements and computational cost. Combined with effective mean-based preconditioners and inexact solves, we show that only a small number of linear iterations is needed at each Picard step. The proposed algorithm is tested on a model of flow in a two-dimensional symmetric step domain under different settings to demonstrate its computational efficiency.
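The core mechanism behind low-rank Krylov methods of this kind is arithmetic on iterates kept in factored form, with the rank re-truncated after each update so storage stays small. A generic numpy sketch of one such operation (adding two factored matrices and re-compressing), not the talk's solver; the function name and tolerance are illustrative:

```python
import numpy as np

def truncated_sum(U1, V1, U2, V2, tol=1e-8):
    """Add low-rank matrices X1 = U1 @ V1.T and X2 = U2 @ V2.T without
    ever forming them densely, then re-truncate the result.

    Concatenate the factors, orthogonalize with thin QR, SVD the small
    core, and drop singular values below tol (relative to the largest).
    """
    U = np.hstack([U1, U2])
    V = np.hstack([V1, V2])
    Qu, Ru = np.linalg.qr(U)
    Qv, Rv = np.linalg.qr(V)
    W, s, Zt = np.linalg.svd(Ru @ Rv.T)
    r = max(1, int(np.sum(s > tol * s[0])))
    # fold the singular values into the left factor
    Unew = (Qu @ W[:, :r]) * s[:r]
    Vnew = Qv @ Zt[:r].T
    return Unew, Vnew
```

In a Krylov iteration, every vector update (and preconditioner application) is followed by such a truncation, which is why the overall cost scales with the ranks of the iterates rather than with the full size of the all-at-once system.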

M Oct 12 No seminar University Holiday
M Oct 19 Rongjie Lai (RPI) Chart Auto-encoder for Manifold-Structured Data

Abstract: Deep generative models have made tremendous advances in image and signal representation learning and generation. These models employ the full Euclidean space or a bounded subset as the latent space, whose flat geometry, however, is often too simplistic to meaningfully reflect the manifold structure of the data. In this talk, I will discuss our recent work advocating the use of a multi-chart latent space for better data representation. We analyze the topological requirements on the latent space for a faithful latent representation of manifold-structured data. Inspired by differential geometry, we propose a Chart Auto-Encoder (CAE) and prove a universal approximation theorem on its representation capability. We show that the training data size and the network size scale exponentially in approximation error with an exponent depending on the intrinsic dimension of the data manifold. CAE admits desirable manifold properties that auto-encoders with a flat latent space fail to obey, predominantly proximity of data. We conduct extensive experiments with synthetic and real-life examples to demonstrate that CAE provides reconstructions with high fidelity, preserves proximity in the latent space, and generates new data remaining near the manifold. This is joint work with Stefan Schonsheck and Jie Chen.

M Oct 26 Gal Mishne (UCSD) Multiway Tensor Analysis with Neuroscience Applications

Abstract: Experimental advances in neuroscience enable the acquisition of increasingly large-scale, high-dimensional, and high-resolution neuronal and behavioral datasets; however, addressing the full spatiotemporal complexity of these datasets poses significant challenges for data analysis and modeling. We propose to model such datasets as multiway tensors with an underlying graph structure along each mode, learned from the data. In this talk, I will present three frameworks we have developed to model, analyze, and organize tensor data; they infer the coupled multi-scale structure of the data, reveal latent variables, and visualize short- and long-term temporal dynamics, with applications to calcium imaging analysis, fMRI, and artificial neural networks.

M Nov 2 Sui Tang (UCSB)

M Nov 9 Julien Fageot (McGill)

M Nov 16 Kasso Okoudjou (Tufts)

M Nov 23 Michael Perlmutter (UCLA)

M Nov 30 Rachel Ward (UT Austin)

M Dec 7 Monika Nitsche (UNM)

M Dec 14 Eyal Neuman (Imperial)