Parameter Estimation#
Given some data, \(\mathcal{D}\), we can estimate the parameters, \(\boldsymbol{\theta}\), of a model, \(\mathcal{M}\).
So the name of the game is to approximate the true state, \(\mathbf{u}\), using a parameterized model, \(\mathbf{u}_{\boldsymbol{\theta}}\), and then maximize the likelihood of the data (equivalently, minimize the negative log-likelihood).
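Concretely, the maximum-likelihood estimate can be written as (a standard formulation; the notation here is an assumption, not taken from elsewhere in this text):

\[
\boldsymbol{\theta}^* = \operatorname*{argmax}_{\boldsymbol{\theta}} \; p(\mathcal{D}|\boldsymbol{\theta}) = \operatorname*{argmin}_{\boldsymbol{\theta}} \; \left[- \log p(\mathcal{D}|\boldsymbol{\theta})\right]
\]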
Some example parameterizations of the state, \(\mathbf{u}_{\boldsymbol \theta}\):

- ODE/PDE
- Neural Field
- Hybrid Model (Parameterization)
Most of the prior information can be embedded within the problem through:

- Data, e.g. historical observations
- State representation / architectures, e.g. coordinates → NerFs, discretized → CNNs
- Loss, e.g. physics-informed, gradient-based
Examples#
Partial Differential Equations#
We could describe the state via a PDE, i.e. a set of constraints on derivatives of the field with respect to space and time.
Let’s assume we approximate the true state, \(\boldsymbol{u}\), with a parameterized approximation, \(\boldsymbol{u}_{\boldsymbol{\theta}}\).
Now, we can add a constraint on the field s.t. it respects some physical laws. These laws are prescribed by constraining the derivative wrt space and time. So we can explicitly constrain the state with the following set of equations:
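One generic way to write such a system is the following (the symbols \(\Omega\), \(\partial\Omega\), and \(\mathcal{T}\) for the spatial domain, its boundary, and the time interval are assumptions here):

\[
\begin{aligned}
\boldsymbol{R}\left[\boldsymbol{u}_{\boldsymbol{\theta}}\right](\mathbf{x}, t) &= 0, && \mathbf{x} \in \Omega, \; t \in \mathcal{T} \\
\boldsymbol{R_{bc}}\left[\boldsymbol{u}_{\boldsymbol{\theta}}\right](\mathbf{x}, t) &= 0, && \mathbf{x} \in \partial\Omega, \; t \in \mathcal{T} \\
\boldsymbol{R_{ic}}\left[\boldsymbol{u}_{\boldsymbol{\theta}}\right](\mathbf{x}, 0) &= 0, && \mathbf{x} \in \Omega
\end{aligned}
\]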
where \(\boldsymbol{R}(\cdot)\) is the constraint on the state itself, \(\boldsymbol{R_{bc}}(\cdot)\) is the constraint on the boundaries, and \(\boldsymbol{R_{ic}}(\cdot)\) is the constraint on the initial condition. To obtain the solution of this PDE, we can appeal to the fundamental theorem of calculus, which tells us the solution is of the form:
So assuming we can find such a solution, we can plug it in to obtain the state, and from there maximize the likelihood function for the data, \(p(\mathcal{D}|\boldsymbol{\theta})\), with respect to the parameters. An example likelihood function uses a Gaussian assumption with a constant noise variance.
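For example, with i.i.d. Gaussian observation noise of variance \(\sigma^2\) (the symbols \(\mathbf{y}_n\), \(\mathbf{x}_n\), \(t_n\), and \(\sigma\) are assumed here), the likelihood factorizes over the \(N\) observations:

\[
p(\mathcal{D}|\boldsymbol{\theta}) = \prod_{n=1}^{N} \mathcal{N}\!\left(\mathbf{y}_n \;;\; \boldsymbol{u}_{\boldsymbol{\theta}}(\mathbf{x}_n, t_n), \; \sigma^2\mathbf{I}\right)
\]

so the negative log-likelihood reduces, up to constants, to a mean-squared-error loss.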
Now, we can plug this into the full formulation for the loss function:
Code Formulation
```python
# initialize the domain
domain: Domain = Grid(N=(100, 100), dx=(0.01, 0.01))

# initialize the BC and IC functions
bc_fn: Callable = ...
ic_fn: Callable = ...

# initialize the field and spatial discretization
state_init: Discretization = FiniteDiffDiscretization(domain, ic_fn)

# describe the equation of motion (+ params)
params: Params = ...

def equation_of_motion(t, u, params):
    # parse the params
    diffusivity = params.nu
    # apply BCs
    u = bc_fn(u)
    # equation of motion (diffusion)
    u_rhs = diffusivity * laplacian(u)
    return u_rhs

# initialize the time stepper (solver)
solver: TimeStepper = Euler()

# solve the ODE
u = ODESolve(equation_of_motion, solver, state_init, params)
```
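To make this concrete, here is a minimal, self-contained sketch of the same idea using plain NumPy: a 2D diffusion (heat) equation stepped forward with the Euler method. All names and values (grid size, `nu`, `dt`, the point-source initial condition) are illustrative, not from a particular library:

```python
import numpy as np

# grid: N x N points with spacing dx (illustrative values)
N, dx, nu, dt = 100, 0.01, 1e-4, 0.1
u = np.zeros((N, N))
u[N // 2, N // 2] = 1.0  # initial condition: a point source in the center

def laplacian(u, dx):
    # 5-point finite-difference Laplacian; boundary cells held at zero (Dirichlet)
    lap = np.zeros_like(u)
    lap[1:-1, 1:-1] = (
        u[2:, 1:-1] + u[:-2, 1:-1] + u[1:-1, 2:] + u[1:-1, :-2]
        - 4.0 * u[1:-1, 1:-1]
    ) / dx**2
    return lap

# forward-Euler time stepping: u_{k+1} = u_k + dt * nu * Laplacian(u_k)
for _ in range(100):
    u = u + dt * nu * laplacian(u, dx)

# total mass stays ≈ 1.0 (the source hasn't diffused out to the boundary yet)
total = u.sum()
```

Note the stability constraint for explicit Euler here: the diffusion number \(\nu \, dt / dx^2 = 0.1\) is safely below the 2D limit of 0.25.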
Neural Fields#
Neural fields (NerFs) are coordinate-based models that parameterize the field directly as a function of the spatio-temporal coordinates.
The parameters, \(\boldsymbol{\theta}\), could parameterize a free-form function, or they could be the parameters of a PDE. Now, if we assume that we don’t have access to the true state, only noisy observations of it, we can fit the parameters by maximizing the likelihood of those observations.
Note: In this example, we assumed that we simply observe a corrupted version of the state. However, we might not observe the state directly, in which case we need an additional operator. For example, a composite operator could first parameterize the state and then transform the state quantity into the observed quantity, i.e. \(y = \boldsymbol{H}_q \circ \boldsymbol{H}_u (\mathbf{x},t)\)
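As a sketch, a neural field is just a small MLP mapping coordinates to field values. The architecture below (layer sizes, tanh activations, the `neural_field` / `init_mlp` names) is purely illustrative, assuming a scalar field over \((x, y, t)\):

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(sizes, rng):
    # one (W, b) pair per layer; sizes like [3, 32, 32, 1]
    return [
        (rng.normal(0.0, 1.0 / np.sqrt(m), (m, n)), np.zeros(n))
        for m, n in zip(sizes[:-1], sizes[1:])
    ]

def neural_field(params, coords):
    # coords: (batch, 3) array of (x, y, t); returns (batch, 1) field values
    h = coords
    for W, b in params[:-1]:
        h = np.tanh(h @ W + b)
    W, b = params[-1]
    return h @ W + b

params = init_mlp([3, 32, 32, 1], rng)
coords = rng.uniform(size=(5, 3))   # five (x, y, t) query points
u_pred = neural_field(params, coords)
print(u_pred.shape)  # (5, 1)
```

Training would then minimize the negative log-likelihood of the observations with respect to `params`, e.g. via gradient descent (omitted here).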
Conditional Generative Models#
First, let’s assume that we have a likelihood term that describes the posterior directly.
Now, we want to estimate this conditional distribution.
We can use the change-of-variables formulation to estimate the conditional distribution.
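For a bijective map \(\boldsymbol{T}\) pushing a latent base distribution \(p_z\) forward to the target (the symbols \(\boldsymbol{T}\), \(\mathbf{z}\), and \(p_z\) are assumptions here), the change-of-variables formula reads:

\[
p(\mathbf{u}|\mathcal{D}) = p_z\!\left(\boldsymbol{T}^{-1}(\mathbf{u})\right) \left| \det \frac{\partial \boldsymbol{T}^{-1}(\mathbf{u})}{\partial \mathbf{u}} \right|
\]

where \(\boldsymbol{T}^{-1}\) maps a sample back to the latent space and the Jacobian determinant accounts for the local change in volume.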