Course Description
Many seek to replicate the successes of AI/ML in computer vision and natural language processing in the sciences, aiming to tackle previously inaccessible problems in scientific discovery, engineering prediction, and optimal design. However, modern ML has been powered by "black-box" architectures tailored specifically to text and image data. These architectures lack the mathematical structure necessary to provide predictions that meet the requirements of high-consequence science and engineering: consistency with physical principles, numerical robustness, interpretability, and amenability to uncertainty quantification.
In this course we will survey theories of variational mechanics, geometric dynamics, and numerical analysis to understand how to construct simulators from data that respect mechanical and geometric principles.
While ML may improve engineering models (AI4Science), we can also use scientific-computing principles to improve the performance of ML models on physics-agnostic tasks (Science4AI). Many "black-box" architectures admit alternative representations from scientific computing: CNNs as finite differences, multilayer perceptrons as B-splines, ResNets as discrete differential equations, graph attention networks as finite element/volume methods, and generative models as stochastic differential equations.
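As a toy illustration of one such correspondence (an illustrative sketch, not course code; the choice of f and step size is arbitrary): a residual block x ← x + h·f(x) is exactly a forward-Euler step of the ODE dx/dt = f(x), so a deep stack of residual blocks integrates a differential equation.

```python
import numpy as np

# A "residual block" x_{k+1} = x_k + h * f(x_k) is a forward-Euler step of
# the ODE dx/dt = f(x). Here f(x) = -x, whose exact solution is x0 * exp(-t).

def residual_block(x, h):
    return x + h * (-x)        # skip connection + "learned" update f(x) = -x

x, h, steps = 1.0, 0.01, 100   # 100 "layers" integrate to t = 1
for _ in range(steps):
    x = residual_block(x, h)

print(x, np.exp(-1.0))  # the deep stack's output approximates exp(-1)
```

Deeper networks with smaller h correspond to finer time discretizations, which is the starting point for neural-ODE-style analyses of ResNets.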
The course will initially focus on reviewing material necessary for data-driven modeling, including probability, variational calculus, and discretizations of ordinary and partial differential equations. We will consider problems from both engineering settings and data analytics, focusing on problems of engineering relevance such as inverse problems, reduced-order modeling, and data assimilation.
Course Schedule
Lecture 1: Course Logistics & Mathematical Fundamentals
Course logistics. Fundamentals of analysis and PyTorch.
No class - MLK Jr. Day
Lecture 2: Universal Approximation Theory
Universal approximation with polynomials and neural networks, Lagrange interpolation, Bernstein polynomials, ReLU networks, convergence analysis.
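For instance, Bernstein's constructive proof of the Weierstrass approximation theorem can be demonstrated in a few lines (an illustrative sketch, not course material; the target function and sample counts are arbitrary choices):

```python
import numpy as np
from math import comb

# Bernstein polynomial approximation of f on [0, 1]:
#   B_n[f](x) = sum_k f(k/n) * C(n, k) * x^k * (1 - x)^(n - k)
# Bernstein's proof of the Weierstrass theorem shows B_n[f] -> f uniformly.

def bernstein(f, n, x):
    k = np.arange(n + 1)
    coeffs = np.array([comb(n, int(j)) for j in k], dtype=float)
    return np.sum(f(k / n) * coeffs * x**k * (1 - x)**(n - k))

f = np.sin
err_10 = max(abs(bernstein(f, 10, x) - f(x)) for x in np.linspace(0, 1, 101))
err_80 = max(abs(bernstein(f, 80, x) - f(x)) for x in np.linspace(0, 1, 101))
print(err_10, err_80)  # error shrinks as n grows (O(1/n) for smooth f)
```

The slow O(1/n) rate is part of the story: Bernstein polynomials converge for any continuous f, but other constructions (and neural networks) can trade this robustness for faster rates on smoother targets.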
Snow Day - No Class
Lecture 3: Fourier Analysis and Finite Difference Methods
PDE classification, Fourier analysis fundamentals, finite difference operators, stability and convergence analysis.
Lecture 4: Discrete Norms, Stability Analysis, and Lax Equivalence
Discrete inner products, operator norms, DFT, stability analysis, Lax equivalence theorem.
Exercise Session
In-class exercises.
Lecture 5: Numerical Stabilization and Polynomial Reproduction
Artificial viscosity, Lax-Friedrichs, Lax-Wendroff, implicit schemes, polynomial reproduction.
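A minimal sketch of the Lax-Friedrichs idea (illustrative only; the grid size and CFL number are arbitrary choices): averaging the neighboring values before applying the centered flux difference adds just enough artificial viscosity to keep the advected solution bounded.

```python
import numpy as np

# Lax-Friedrichs for the advection equation u_t + a u_x = 0 on a periodic
# grid: replace u_j by the average of its neighbors (artificial viscosity)
# before applying the centered flux difference.

a, N, T = 1.0, 200, 0.5
dx = 2 * np.pi / N
dt = 0.8 * dx / a                      # CFL number 0.8 < 1: stable
x = np.linspace(0, 2 * np.pi, N, endpoint=False)
u = np.sin(x)

for _ in range(int(T / dt)):
    up, um = np.roll(u, -1), np.roll(u, 1)   # u_{j+1}, u_{j-1} (periodic)
    u = 0.5 * (up + um) - a * dt / (2 * dx) * (up - um)

print(u.max())  # stays bounded by 1 (maximum principle under CFL <= 1)
```

Under the CFL condition the update is a convex combination of neighboring values, so the scheme satisfies a discrete maximum principle, unlike the unstabilized centered scheme.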
Lecture 6: Constrained Optimization and Polynomial Reproduction
Lagrange multipliers, Schur complement, moving least squares, learning stencils.
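The stencil-learning viewpoint can be previewed with a small constrained least-squares problem (an illustrative sketch, not course code): imposing polynomial-reproduction constraints through a KKT system recovers the classical second-derivative stencil.

```python
import numpy as np

# Deriving a finite-difference stencil by polynomial reproduction: find
# weights w minimizing ||w||^2 subject to sum_j w_j p(x_j) = p''(0) for all
# polynomials p up to degree 2. The KKT solve recovers [1, -2, 1] / h^2.

h = 0.1
pts = np.array([-h, 0.0, h])
V = np.vander(pts, 3, increasing=True).T   # rows: 1, x, x^2 at each point
b = np.array([0.0, 0.0, 2.0])              # p''(0) for p = 1, x, x^2

# KKT system: [[I, V^T], [V, 0]] [w; lam] = [0; b]
n, m = 3, 3
KKT = np.block([[np.eye(n), V.T], [V, np.zeros((m, m))]])
w = np.linalg.solve(KKT, np.concatenate([np.zeros(n), b]))[:n]

print(w * h**2)  # -> approximately [1, -2, 1]
```

With more points than constraints the same KKT structure yields moving-least-squares stencils, and replacing the quadratic objective with a learned one gives "learned stencils" that still reproduce polynomials exactly.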
Lecture 7: Hamiltonian Dynamics and Energy-Conserving Integrators
Canonical Hamiltonians, symplectic structure, Liouville theorem, discrete gradient method.
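A quick illustration of why structure preservation matters (illustrative sketch; step size and horizon are arbitrary): for the harmonic oscillator, forward Euler pumps energy into the system while symplectic Euler keeps it bounded for all time.

```python
import numpy as np

# Harmonic oscillator H(q, p) = (q^2 + p^2) / 2. Forward Euler injects
# energy every step; symplectic Euler (update p first, then q with the
# new p) keeps the energy bounded.

def energy(q, p):
    return 0.5 * (q**2 + p**2)

h, steps = 0.05, 2000
q_f, p_f = 1.0, 0.0   # forward Euler state
q_s, p_s = 1.0, 0.0   # symplectic Euler state
for _ in range(steps):
    q_f, p_f = q_f + h * p_f, p_f - h * q_f   # forward Euler: energy grows
    p_s = p_s - h * q_s                       # symplectic Euler:
    q_s = q_s + h * p_s                       #   p first, then q

print(energy(q_f, p_f), energy(q_s, p_s))  # both started at 0.5
```

Symplectic Euler exactly conserves a nearby "modified" Hamiltonian, which is why its energy error oscillates at O(h) instead of growing.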
Snow Day - No Class
Hackathon Day
Develop a nonlinear stencil for the Kelvin-Helmholtz problem.
Lecture 8: Lagrangian Mechanics and Noether's Theorem
Functional derivatives, Euler-Lagrange equations, principle of least action, Noether's theorem.
Lecture 9: Spatial Discretization via Discrete Action Principle
Discrete Lagrangian, shift operators, Noether constraints, stencil classification.
No class - Spring Break
Lecture 12: Introduction to Finite Element Methods
Weak formulation, Galerkin method, stiffness matrices, Gaussian quadrature.
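For the 1D Poisson problem with piecewise-linear elements, the Galerkin machinery in this lecture reduces to a small linear solve (an illustrative sketch using a mass-lumped load vector; not course code):

```python
import numpy as np

# 1D Poisson problem -u'' = f on (0, 1), u(0) = u(1) = 0, with piecewise-
# linear (P1) finite elements on a uniform grid. The weak form a(u, v) =
# (f, v) yields the classic tridiagonal stiffness matrix (1/h) * [-1, 2, -1].

N = 64                                  # number of elements
h = 1.0 / N
x = np.linspace(0, 1, N + 1)
xi = x[1:-1]                            # interior nodes

# stiffness matrix and (mass-lumped) load vector
K = (np.diag(2 * np.ones(N - 1)) - np.diag(np.ones(N - 2), 1)
     - np.diag(np.ones(N - 2), -1)) / h
f = np.pi**2 * np.sin(np.pi * xi) * h   # manufactured f with u = sin(pi x)

u = np.linalg.solve(K, f)
err = np.max(np.abs(u - np.sin(np.pi * xi)))
print(err)  # O(h^2) nodal accuracy
```

With mass lumping this coincides with the standard second-order finite-difference scheme; with exact load integration the P1 Galerkin solution for this 1D problem is even exact at the nodes.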
Lecture 13: Quasi-Optimality and Lax-Milgram Theory
Error estimation, duality argument, interpolation theory, Lax-Milgram theorem.
Hackathon Day
2D FEM using scikit-FEM, physics-informed neural networks, and starting HW4.
Lecture 14: Applications of Lax-Milgram and Mixed FEM
Biharmonic, reaction-diffusion, elasticity equations, locking, inf-sup condition.
Lecture 15: Conservation Laws and De Rham Complexes
Saddle-point problems, inf-sup stability, de Rham complex, Whitney forms.
Lecture 10: Multi-Stage Time Integration and Symplectic Methods
Runge-Kutta schemes, Butcher tableaux, symplectic integrators, stability analysis.
Lecture 11: Stochastic Differential Equations and Probabilistic Learning
Wiener process, SDEs, Euler-Maruyama, MLE/NLL, metriplectic formalism.
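Euler-Maruyama is a one-line modification of forward Euler (illustrative sketch; all parameters are arbitrary): add a Gaussian increment of variance dt at each step. For the Ornstein-Uhlenbeck process the long-time sample variance should approach sigma^2 / (2*theta).

```python
import numpy as np

# Euler-Maruyama for the Ornstein-Uhlenbeck SDE dX = -theta*X dt + sigma dW.
# Each step adds a drift term plus a Gaussian increment of variance dt.

rng = np.random.default_rng(0)
theta, sigma = 1.0, 0.5
dt, steps, paths = 0.01, 1000, 20000   # integrate an ensemble to T = 10

X = np.ones(paths)
for _ in range(steps):
    X += -theta * X * dt + sigma * np.sqrt(dt) * rng.standard_normal(paths)

# stationary statistics of OU: mean 0, variance sigma^2 / (2*theta) = 0.125
print(X.mean(), X.var())
```

Running an ensemble of paths rather than a single trajectory makes the connection to probabilistic learning concrete: the scheme defines a parametric family of distributions whose likelihood can be maximized.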
Lecture 16: Graph Calculus and Spectral Graph Theory
Exterior calculus, graph Laplacian, spectral properties, Fiedler eigenvalue.
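As a small example of the spectral ideas in this lecture (illustrative, not course code): for a path graph the Fiedler eigenvalue of the graph Laplacian has the closed form 2(1 - cos(pi/n)), shrinking as the graph becomes more weakly connected.

```python
import numpy as np

# Graph Laplacian L = D - A of a path graph on n nodes. Its second-smallest
# eigenvalue (the Fiedler value) measures connectivity; for the path graph
# the eigenvalues are 2 - 2*cos(k*pi/n), k = 0, ..., n-1.

n = 10
A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)  # adjacency
L = np.diag(A.sum(axis=1)) - A                                # Laplacian
evals = np.sort(np.linalg.eigvalsh(L))

fiedler = evals[1]
print(fiedler, 2 * (1 - np.cos(np.pi / n)))  # matches the closed form
```

The smallest eigenvalue is always 0 (constant vector); the gap to the Fiedler value controls diffusion rates on the graph, which is what graph-neural-network analyses of over-squashing exploit.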
Lecture 17: Helmholtz-Hodge Decomposition and Graph Calculus
Exact sequences, Hodge decomposition, Chorin projection, causal inference.
Lecture 18: Attention Mechanisms and Physics-Inspired Architectures
Multi-head attention, GATs, over-squashing, GRAND, Hamiltonian neural networks.
Lecture 19: Variational Inference and Variational Autoencoders
ELBO, KL divergence, VAEs, reparameterization trick, mixture/product of experts.
Lecture 20: Denoising Diffusion Probabilistic Models
Forward/reverse processes, score matching, DDPM architecture, sampling strategies.
Lecture 21: TBD
Lecture 22: TBD
Course Objectives
By the end of this course, you should be able to:
- Comfortably use PyTorch or another automatic differentiation library to fit traditional physics-based models to data
- Implement standard schemes (finite differences, volumes, elements) in a simple 1D code
- Use variational principles and numerical analysis to propose novel machine learning architectures
Prerequisites
Formally, this course forms a sequence with ENM5310. It is okay if you haven't taken that class, but I will assume mathematical maturity, fluency in Python, familiarity with PyTorch, and a background in probability fundamentals and linear algebra. Experience with numerical analysis will be beneficial but is not assumed.
Teaching Staff and Office Hours
Instructor: Dr. Nat Trask
Associate Professor, MEAM
Email: ntrask@seas.upenn.edu
Office Hours: Tuesday 9:15-10:30am and by Appointment
Location: 5th floor Amy Gutmann Hall (AGH 519)
Graduate TA: Ben Shaffer
Email: ben31@seas.upenn.edu
Office Hours: To be determined
Location: AGH
Reminder: All correspondence should go through the Ed forum. Emails are provided here for special circumstances (e.g., you cannot access the office-hours building) and will otherwise be ignored.
Course Requirements and Evaluation
Homework Assignments
Regular assignments consisting of both analysis and programming. You are encouraged to intelligently use LLMs to assist you in writing code, but not in writing up your reports. If you do use LLMs, or any other resources including internet resources or collaboration with other students, you must attribute them to comply with the Penn code of conduct.
Evaluations
Pencil-and-paper evaluations will be used to ensure command of the material without access to AI or other resources. These will be held in class and will be closed-book, with a single sheet of notes allowed.
Final Project
The course will culminate in a project relevant to your research interests, including a short written report and a presentation to the class. Final projects may be done either independently or in groups - consider building collaborations between experimentalists and computational folks. This is a great opportunity to lay the groundwork for a paper that you can finish over the summer!
Course Policies
Late Policy
One late assignment will be accepted up to two days late. Further late assignments will not be accepted without an excuse from Prof. Trask.
Collaboration Policy
You are encouraged to discuss the material with your classmates and to work in groups for any homework assignment, but the final product should be your own work. If you collaborate in any way, you must acknowledge the collaboration. You should be able to provide a brief explanation of how your learning was improved by the collaboration; if you find this difficult to do, then it is probably the wrong kind of collaboration. This includes using AI tools or consulting Stack Overflow.
University Policies and Resources
This course will be conducted in accordance with all university policies. The university and the School of Engineering & Applied Science also offer numerous resources to students that may be useful.
Code of Academic Integrity
In accordance with the University's Code of Academic Integrity, all work turned in by students should be an accurate reflection of their knowledge, and, with the exception of working in groups for homework assignments, should be conducted alone. Violation of University Code of Academic Integrity may result in failure of the course.
Students with Disabilities and Learning Differences
Students with disabilities are encouraged to contact Weingarten Learning Resource Center's Office for Student Disabilities Services for information and assistance with the process of accessing reasonable accommodations. For more information, visit their website, or email lrcmail@pobox.upenn.edu.
Counseling and Psychological Services (CAPS)
CAPS is the counseling center for the University of Pennsylvania. CAPS offers free and confidential services to all Penn undergraduate, graduate, and professional students. For more information, visit their website.
Course Resources
We will use Canvas for assignments, and all material will be hosted on the course GitHub. We will use Ed Discussion for questions and announcements about the course. Ed is accessible through a link on the left panel of our Canvas page.
If you have any questions about the class, please create a post on Ed! Use email only for sensitive topics. We will do our best to respond to your questions in a timely manner. If you see others' questions that you can answer or answers that you can improve, do it! Students who have contributed thoughtful comments, questions, and answers throughout the semester will earn extra credit in the class.
Important: While we encourage the use of AI at the graduate level to assist with learning and research, you must attribute AI-generated content and are solely responsible for your work. Pencil-and-paper exams will be used to ensure that you are actually learning the course material. If your performance on exams does not reflect your submitted assignments and AI use is not attributed, it may be considered a violation of the academic integrity policy.