Date | Speaker | Topic |
---|---|---|
M Sep. 8 | Neta Rabin (Tel Aviv University) | **Title:** Multi-Scale Kernel Methods: Applications to Grid Refinement and Data Augmentation **Abstract:** Multi-scale models provide a simple yet powerful framework for approximating and extending functions defined over grid-based or scattered datasets. In this talk, we focus on multi-scale kernel methods, where convolution with Gaussian kernels of progressively decreasing bandwidths yields a multi-scale representation. In statistics, this approach is closely related to the Nadaraya–Watson estimator. The resulting high-order approximation is constructed by iterating until the difference between the function and its approximation falls below a predefined error threshold. *(An illustrative sketch of such a scheme appears below the schedule.)* |
M Sep. 15 | Katya Epshteyn (U. Utah) | **Title:** Structure-Preserving Algorithms for Hyperbolic Balance Laws and Related PDE-Based Models **Abstract:** In this talk, we will discuss progress in the design of structure-preserving numerical methods for hyperbolic and related nonlinear PDE-based models, including systems with uncertainty. As a primary example, shallow water equations will be considered, but the developed ideas extend to a wider class of conservation and balance laws. Shallow water systems are widely used in many scientific and engineering applications related to the modeling of water flows in rivers, lakes, and coastal areas. Thus, stable and accurate numerical methods for shallow water models are needed. Although some algorithms are well-studied for deterministic shallow water systems, more effort should be devoted to handling such equations with uncertainty. We will show that the structure-preserving numerical methods that we developed for these models deliver high resolution and satisfy important stability conditions. We will illustrate the performance of the designed algorithms on a number of challenging numerical tests. Current and future research will be discussed as well. Part of this talk is based on recent joint work with Dihan Dai, Akil Narayan, and Yinqian Yu, and is partially supported by NSF-DMS Award 2207207 and Simons Foundation Fellowship Award SFI-MPS-SFM-00010667. *(The shallow water system is written out below the schedule for reference.)* |
M Sep. 22 | Joseph Nasser (Brandeis) | **Title:** A Calculus for Transcription **Abstract:** What language should we use to describe the natural world: words, pictures, math, computer programs, something else? The discipline of physics has historically used mathematics with great success. The use of mathematics in biology has been more sporadic. I will begin by highlighting some historical uses of mathematics in biology. Then I will describe recent efforts to formulate a “calculus for transcription”. Transcription is the process by which RNA is synthesized from a DNA template. A “calculus for transcription” refers to a hypothetical mathematical framework that can be used to reason about transcription. This talk will be self-contained, and no previous knowledge of transcription will be assumed. |
M Sep. 29 | Daniel McKenzie (Mines) | **Title:** Faster Decision-Focused Learning over Polytopes **Abstract:** Many real-world problems can be distilled into an optimization problem, for which many good algorithms exist. However, it is often the case that certain key parameters in the optimization problem are not observed directly. Instead, one can observe large amounts of data that are correlated with these parameters, but in ways that are not easy to describe. This raises the possibility of combining machine learning (to predict the unknown parameters) with optimization (to solve the problem of interest). This combination is sometimes called decision-focused learning. In this talk I’ll give an introduction to this field and describe some recent work by my collaborators and me. *(A toy predict-then-optimize sketch appears below the schedule.)* |
M Oct. 6 | Deepanshu Verma (Clemson) | **Title:** Neural Network Approaches for Optimal Control: Implicit Hamiltonians and Transferable Policies **Abstract:** This talk presents two neural network methodologies advancing optimal control beyond current limitations. First, we address implicit Hamiltonians in practical problems like space shuttle reentry, where existing methods fail without explicit feedback control formulas. Our end-to-end implicit deep learning approach directly parameterizes value functions to handle the underlying implicit structure while enforcing physical principles through the relationship between optimal control and value function gradients, bridging Pontryagin’s Maximum Principle and Dynamic Programming. Using Jacobian-Free Backpropagation, we efficiently train the implicit networks for high-dimensional feedback controllers in previously intractable scenarios. Second, we tackle the computational burden of re-solving problems when objectives change. Our function encoder framework learns reusable neural basis functions enabling zero-shot adaptation through offline-online decomposition: basis functions are learned once, while adaptation requires only lightweight coefficient estimation. Experiments demonstrate near-optimal performance across diverse dynamics with minimal overhead. These approaches expand neural HJB applicability by handling structural complexity through implicit Hamiltonians and enabling operational flexibility through transferable policies for real-time deployment. *(A sketch of the offline-online coefficient estimation appears below the schedule.)* |
M Oct. 20 | Sarah Cannon (Claremont McKenna) | |
M Oct. 27 | Linh Huynh (Dartmouth) | |
M Nov. 3 | Miriam Gordin (Brown) | |
M Nov. 10 | Daniel Pickard (MIT) | |
M Nov. 17 | Varun Kharuna (Brown) | |
M Nov. 24 | Boris Landa (Yale) | |
M Dec. 1 | Lucas Janson (Harvard) | |
M Dec. 8 | Chris Criscitiello (UPenn) | |
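
For readers who want a concrete picture of the Sep. 8 scheme, here is a minimal numerical sketch: fit the data with a wide Gaussian kernel via the Nadaraya–Watson estimator, then repeatedly fit the residual at smaller bandwidths until the training error drops below the threshold. The halving schedule, function names, and parameters are illustrative assumptions, not taken from the talk.

```python
import numpy as np

def nw_estimate(x_train, r_train, x_query, bandwidth):
    """Nadaraya-Watson estimate: Gaussian-kernel weighted average of r_train."""
    d2 = (x_query[:, None] - x_train[None, :]) ** 2
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    return (w @ r_train) / w.sum(axis=1)

def multiscale_approx(x_train, y_train, x_query, h0=1.0, tol=1e-3, max_levels=12):
    """Sum of NW fits to successive residuals at decreasing bandwidths."""
    fit_train = np.zeros_like(y_train, dtype=float)
    fit_query = np.zeros(len(x_query))
    h = h0
    for _ in range(max_levels):
        residual = y_train - fit_train
        if np.max(np.abs(residual)) < tol:    # predefined error threshold
            break
        fit_train += nw_estimate(x_train, residual, x_train, h)
        fit_query += nw_estimate(x_train, residual, x_query, h)
        h /= 2.0                              # progressively decreasing bandwidth
    return fit_query

# Example: extend scattered samples of sin(3x) to a refined grid.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 2.0 * np.pi, 200))
y = np.sin(3.0 * x)
x_fine = np.linspace(0.0, 2.0 * np.pi, 1000)
y_fine = multiscale_approx(x, y, x_fine)
```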
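As context for the Sep. 15 talk, the one-dimensional shallow water (Saint-Venant) system with bottom topography $B(x)$ reads, in a standard form (the talk's exact setting, e.g. with uncertainty, may differ):

```latex
\begin{aligned}
  h_t + (hu)_x &= 0, \\
  (hu)_t + \left( hu^2 + \tfrac{g}{2}\,h^2 \right)_x &= -\,g\,h\,B_x ,
\end{aligned}
```

Here $h$ is the water depth, $u$ the depth-averaged velocity, and $g$ the gravitational constant. For this system, "structure-preserving" typically means well-balanced (exactly capturing steady states such as the lake at rest, $u = 0$, $h + B = \mathrm{const}$) and positivity-preserving for $h$.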
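To make the Sep. 29 pipeline concrete, below is a toy sketch of decision-focused learning for a linear objective over a polytope, using the SPO+ subgradient of Elmachtoub and Grigas as one standard training signal. This is an illustrative stand-in, not necessarily the speaker's method; the simplex polytope, the linear predictor, and all parameters are assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def solve_lp(cost):
    """Minimize cost @ w over the probability simplex (illustrative polytope)."""
    n = len(cost)
    res = linprog(cost, A_eq=np.ones((1, n)), b_eq=[1.0], bounds=[(0, None)] * n)
    return res.x

def spo_plus_subgrad(c_hat, c_true):
    """Subgradient of the SPO+ loss with respect to the predicted costs c_hat."""
    return 2.0 * (solve_lp(c_true) - solve_lp(2.0 * c_hat - c_true))

# Toy data: features x are correlated with the unobserved costs c in a way
# the learner must discover (here, linearly, plus noise).
rng = np.random.default_rng(0)
d, n = 5, 4                                   # feature dim, decision dim
W_true = rng.normal(size=(n, d))
X = rng.normal(size=(200, d))
C = X @ W_true.T + 0.1 * rng.normal(size=(200, n))

# Decision-focused training of a linear cost predictor c_hat = W @ x.
W = np.zeros((n, d))
lr = 0.05
for _ in range(20):
    for x, c in zip(X, C):
        g = spo_plus_subgrad(W @ x, c)        # subgradient w.r.t. c_hat
        W -= lr * np.outer(g, x)              # chain rule through c_hat = W @ x

# Decision regret on one sample: cost of the induced decision minus the optimum.
c0, x0 = C[0], X[0]
regret = c0 @ solve_lp(W @ x0) - c0 @ solve_lp(c0)
```

Training against the decision error directly, rather than the prediction error, is what distinguishes decision-focused learning from the two-stage "predict, then optimize" baseline.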
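The "offline-online decomposition" in the Oct. 6 abstract can be illustrated in a few lines: once a set of basis functions is fixed, adapting to a new objective is a linear least-squares solve for coefficients rather than retraining. In this sketch, random Fourier features stand in for the learned neural basis functions; every name and parameter here is an illustrative assumption.

```python
import numpy as np

# Offline phase: k basis functions over the input space. In the function-
# encoder framework these would be trained neural networks; random Fourier
# features stand in here purely for illustration.
rng = np.random.default_rng(1)
k = 32
omegas = rng.normal(scale=2.0, size=k)
phases = rng.uniform(0.0, 2.0 * np.pi, size=k)

def basis(x):
    """Feature matrix of shape (len(x), k); column j is basis function j at x."""
    return np.cos(np.outer(x, omegas) + phases)

# Online phase: zero-shot adaptation to a new target from a few samples is
# just a lightweight coefficient estimate; no gradient-based retraining.
x_obs = np.linspace(-2.0, 2.0, 25)
y_obs = np.tanh(3.0 * x_obs)                  # "new" objective, unseen offline
coef, *_ = np.linalg.lstsq(basis(x_obs), y_obs, rcond=None)

# The adapted surrogate is a fixed linear combination of the shared basis.
x_new = np.linspace(-2.0, 2.0, 200)
y_pred = basis(x_new) @ coef
```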