
ENM5320

AI4Science/Science4AI: Combining theoretical mechanics, numerical analysis, and machine learning

Spring 2026

Meeting Time: MW 10:15am-11:44am
Location: TOWNE 309
Dates: January 15 - April 30, 2026
Office Hours: Tuesday 9:30-11:30am in AGH 519
TA Office Hours: Friday 1:00-3:00 outside AGH 521
Discussion Board: Ed Discussion

Course Description

Many seek to replicate the successes of AI/ML in computer vision and natural language processing in the sciences, aiming to tackle previously inaccessible problems in scientific discovery, engineering prediction, and optimal design. However, ML has been powered by "black-box" architectures tailored to text and image data, which lack the mathematical structure needed to provide predictions meeting the requirements of high-consequence science and engineering: consistency with physical principles, numerical robustness, interpretability, and amenability to uncertainty quantification.

In this course we will survey theories of variational mechanics, geometric dynamics, and numerical analysis to understand how to construct simulators from data that respect mechanical and geometric principles.

While ML may improve engineering models (AI4Science), we can also use scientific-computing principles to improve the performance of ML models on physics-agnostic tasks (Science4AI). Many "black-box" architectures admit alternative interpretations from scientific computing, e.g., CNNs as finite differences, multilayer perceptrons as B-splines, ResNets as discrete differential equations, graph attention networks as finite element/volume methods, and generative models as stochastic differential equations.
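As a small illustration of one of these correspondences, the sketch below (our own example, not course material; all function names are hypothetical) shows that a ResNet-style residual update x + f(x) coincides with a forward-Euler step of the ODE dx/dt = f(x) when the step size is one:

```python
import numpy as np

def residual_block(x, W):
    """One ResNet-style update: x + f(x), with f a simple nonlinear map."""
    return x + np.tanh(W @ x)

def forward_euler(x, W, dt):
    """Forward-Euler step of dx/dt = tanh(W x) with step size dt."""
    return x + dt * np.tanh(W @ x)

rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((3, 3))
x = rng.standard_normal(3)

# With dt = 1 the two updates coincide exactly.
assert np.allclose(residual_block(x, W), forward_euler(x, W, dt=1.0))
```

Viewing the network depth as a time axis in this way is what connects ResNet training to the discrete differential equations and symplectic integrators covered later in the course.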

The course will begin by reviewing material needed for data-driven modeling, including probability, variational calculus, and discretizations of ordinary and partial differential equations. We will consider problems from both engineering settings and data analytics, focusing on problems of engineering relevance such as inverse problems, reduced-order modeling, and data assimilation.

Course Schedule

Jan 19

No class - MLK Jr. Day

Jan 21

Lecture 2: Universal Approximation Theory

Universal approximation with polynomials and neural networks, Lagrange interpolation, Bernstein polynomials, ReLU networks, convergence analysis.

Jan 26

Snow Day - No Class

Jan 28

Lecture 3: Fourier Analysis and Finite Difference Methods

PDE classification, Fourier analysis fundamentals, finite difference operators, stability and convergence analysis.

Feb 2

Lecture 4: Discrete Norms, Stability Analysis, and Lax Equivalence

Discrete inner products, operator norms, DFT, stability analysis, Lax equivalence theorem.

Feb 9

Lecture 5: Numerical Stabilization and Polynomial Reproduction

Artificial viscosity, Lax-Friedrichs, Lax-Wendroff, implicit schemes, polynomial reproduction.

Feb 11

Lecture 6: Constrained Optimization and Polynomial Reproduction

Lagrange multipliers, Schur complement, moving least squares, learning stencils.

Feb 16

Lecture 7: Hamiltonian Dynamics and Energy-Conserving Integrators

Canonical Hamiltonians, symplectic structure, Liouville theorem, discrete gradient method.

Feb 18

Hackathon Day

Develop linear stencil for Kelvin-Helmholtz problem.

Feb 23

Snow Day - No Class

Feb 25

Hackathon Day

Develop nonlinear stencil for Kelvin-Helmholtz problem.

Mar 2

Lecture 8: Lagrangian Mechanics and Noether's Theorem

Functional derivatives, Euler-Lagrange equations, principle of least action, Noether's theorem.

Mar 4

Lecture 9: Spatial Discretization via Discrete Action Principle

Discrete Lagrangian, shift operators, Noether constraints, stencil classification.

Mar 7-15

No class - Spring Break

Mar 16

Lecture 12: Introduction to Finite Element Methods

Weak formulation, Galerkin method, stiffness matrices, Gaussian quadrature.

Mar 18

Lecture 13: Quasi-Optimality and Lax-Milgram Theory

Error estimation, duality argument, interpolation theory, Lax-Milgram theorem.

Mar 23

Hackathon Day

2D FEM using scikit-FEM, physics-informed neural networks, and starting HW4.

Mar 25

Lecture 14: Applications of Lax-Milgram and Mixed FEM

Biharmonic, reaction-diffusion, elasticity equations, locking, inf-sup condition.

Mar 30

Lecture 15: Conservation Laws and De Rham Complexes

Saddle-point problems, inf-sup stability, de Rham complex, Whitney forms.

Apr 1

Lecture 10: Multi-Stage Time Integration and Symplectic Methods

Runge-Kutta schemes, Butcher tableaux, symplectic integrators, stability analysis.

Apr 6

Lecture 11: Stochastic Differential Equations and Probabilistic Learning

Wiener process, SDEs, Euler-Maruyama, MLE/NLL, metriplectic formalism.

Apr 8

Lecture 16: Graph Calculus and Spectral Graph Theory

Exterior calculus, graph Laplacian, spectral properties, Fiedler eigenvalue.

Apr 13

Lecture 17: Helmholtz-Hodge Decomposition and Graph Calculus

Exact sequences, Hodge decomposition, Chorin projection, causal inference.

Apr 15

Lecture 18: Attention Mechanisms and Physics-Inspired Architectures

Multi-head attention, GATs, over-squashing, GRAND, Hamiltonian neural networks.

Apr 20

Lecture 19: Variational Inference and Variational Autoencoders

ELBO, KL divergence, VAEs, reparameterization trick, mixture/product of experts.

Apr 22

Lecture 20: Denoising Diffusion Probabilistic Models

Forward/reverse processes, score matching, DDPM architecture, sampling strategies.

Apr 27

Lecture 21: TBD

Apr 29

Lecture 22: TBD

Course Objectives

By the end of this course, you should be able to:

Prerequisites

Formally, this course forms a sequence with ENM5310. It is fine if you haven't taken that class, but I will assume mathematical maturity, fluency in Python, familiarity with PyTorch, and a background in probability fundamentals and linear algebra. Experience with numerical analysis will be helpful but is not assumed.

Teaching Staff and Office Hours

Instructor: Dr. Nat Trask

Associate Professor, MEAM

Email: ntrask@seas.upenn.edu

Office Hours: Tuesday 9:15-10:30am and by Appointment

Location: 5th floor Amy Gutmann Hall (AGH 519)

Graduate TA: Ben Shaffer

Email: ben31@seas.upenn.edu

Office Hours: To be determined

Location: AGH

Reminder: All correspondence should go through the Ed forum. Emails are provided here for special circumstances (e.g., you cannot access the office-hours building) and will otherwise be ignored.

Course Requirements and Evaluation

40%

Homework Assignments

Regular assignments consisting of both analysis and programming. You are encouraged to use LLMs intelligently to assist with writing code, but not with writing your reports. If you use LLMs, or any other resources, including internet resources or collaboration with other students, you must attribute them to comply with the Penn code of conduct.

40%

Evaluations

Pencil-and-paper evaluations will be used to ensure command of the material without access to AI or other resources. These will be held in class and closed-book, with a single sheet of notes allowed.

20%

Final Project

The course will culminate in a project relevant to your research interests, including a short written report and a presentation to the class. Final projects may be done either independently or in groups - consider building collaborations between experimentalists and computational folks. This is a great opportunity to lay the groundwork for a paper that you can finish over the summer!

Course Policies

Late Policy

One late assignment will be accepted up to two days late. Further late assignments will not be accepted without an excuse from Prof. Trask.

Collaboration Policy

You are encouraged to discuss the material with your classmates and to work in groups on any homework assignment, but the final product must be your own work. If you collaborate in any way, you must acknowledge the collaboration. You should be able to provide a brief explanation of how your learning was improved by the collaboration; if you find this difficult to do, it is probably the wrong kind of collaboration. This includes using AI tools or consulting Stack Overflow.

University Policies and Resources

This course will be conducted in accordance with all university policies. The university and the School of Engineering & Applied Science also offer numerous resources to students that may be useful.

Code of Academic Integrity

In accordance with the University's Code of Academic Integrity, all work turned in by students should be an accurate reflection of their knowledge, and, with the exception of working in groups for homework assignments, should be conducted alone. Violation of University Code of Academic Integrity may result in failure of the course.

Students with Disabilities and Learning Differences

Students with disabilities are encouraged to contact Weingarten Learning Resource Center's Office for Student Disabilities Services for information and assistance with the process of accessing reasonable accommodations. For more information, visit their website, or email lrcmail@pobox.upenn.edu.

Counseling and Psychological Services (CAPS)

CAPS is the counseling center for the University of Pennsylvania. CAPS offers free and confidential services to all Penn undergraduate, graduate, and professional students. For more information, visit their website.

Course Resources

We will use Canvas for assignments, and all material will be hosted on the course GitHub. We will use Ed Discussion for questions and announcements about the course. Ed is accessible through a link on the left panel of our Canvas page.

If you have any questions about the class, please create a post on Ed! Use email only for sensitive topics. We will do our best to respond to your questions in a timely manner. If you see others' questions that you can answer or answers that you can improve, do it! Students who have contributed thoughtful comments, questions, and answers throughout the semester will earn extra credit in the class.

Important: While we encourage the use of AI at the graduate level to assist with learning and research, you must attribute AI-generated content and are solely responsible for your work. Pencil-and-paper exams will be used to ensure that you are actually learning the course material. If your performance on exams does not reflect your submitted assignments and AI use is not attributed, it may be considered a violation of the academic integrity policy.