
GP Tutorial Master List

A reconciled, exhaustive curriculum spanning what currently exists in gaussx, pyrox, and research_notebook, plus gaps surfaced from the gaussx + pyrox public APIs, open GitHub issues, and pyrox design_docs/. Goal: the most complete GP tutorial sequence we could ship.

Bayesian NN / NeRF / basis-function-regression tutorials live in ../bayesian_nns/TUTORIAL_MASTER_LIST.md. Cross-listed items (RFF, deep kernels, BLR, last-layer-Bayes) are flagged 🔁.

Legend:

Source column: G = gaussx · P = pyrox · R = research_notebook.

Scope tag: 🧱 fundamental · 🔬 research · 🌉 bridge · 🔁 cross-listed.

Refs column: gh:repo#N = open GitHub issue · dd:path = pyrox design_docs/pyrox/<path> · api:foo = gaussx exported symbol.


Curriculum at a glance

A bird’s-eye view of the parts and their subparts. Skim this first to orient; the detailed per-tutorial tables live below.


Part 0 — Linear Algebra & Gaussian Foundations

0.A — The Multivariate Gaussian

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 0.1 | The Multivariate Gaussian: density, sampling, conditioning | R multivariate_gaussian | 🧱 | pedagogical entry — three sampling routes, marginal & Schur conditioning, jitter |
| 0.2 | MultivariateNormal & MultivariateNormalPrecision distribution API | R mvn_distribution_api | 🧱 | covariance vs precision parameterisation, GMRF / banded Λ, round-trip equivalence |
| 0.3 | Quadratic forms, entropy, KL between Gaussians | R gaussian_quantities | 🧱 | api: gaussian_entropy, dist_kl_divergence, kl_standard_normal, quadratic_form, gaussian_expected_log_lik — extended to cover score, cross-entropy, expected log-likelihood, mutual information, mini-ELBO |
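
The Schur-complement conditioning that 0.1 and 0.7 build up to can be written in a few lines. This is a minimal NumPy sketch for orientation only; `condition_mvn` is an illustrative name, not the `gaussx.conditional` API.

```python
import numpy as np

def condition_mvn(mu, Sigma, idx_obs, y_obs):
    """Condition N(mu, Sigma) on x[idx_obs] = y_obs via the Schur complement:
        mu_c    = mu_a + S_ab S_bb^{-1} (y - mu_b)
        Sigma_c = S_aa - S_ab S_bb^{-1} S_ba
    """
    idx_free = np.setdiff1d(np.arange(len(mu)), idx_obs)
    S_aa = Sigma[np.ix_(idx_free, idx_free)]
    S_ab = Sigma[np.ix_(idx_free, idx_obs)]
    S_bb = Sigma[np.ix_(idx_obs, idx_obs)]
    gain = S_ab @ np.linalg.inv(S_bb)          # "Kalman gain" of the update
    mu_c = mu[idx_free] + gain @ (y_obs - mu[idx_obs])
    Sigma_c = S_aa - gain @ S_ab.T             # Schur complement: never grows
    return mu_c, Sigma_c
```

GP regression is exactly this with `Sigma` the joint train/test Gram matrix, which is the through-line of tutorial 0.7.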

0.B — Parameterizations

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 0.4 | Three parameterizations: mean-cov ↔ natural ↔ expectation | R natural_parameters | 🧱 | api: mean_cov_to_natural, natural_to_mean_cov, natural_to_expectation, expectation_to_natural, damped_natural_update — round-trip identities, conjugate update as natural-form addition, moment matching, damped VI/EP primitive, use-case map across the curriculum |
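
The "conjugate update as natural-form addition" idea is easy to verify numerically. A standalone NumPy sketch (function names echo the listed api symbols but this is not the gaussx implementation):

```python
import numpy as np

def mean_cov_to_natural(mu, Sigma):
    # Natural parameters: precision Lambda = Sigma^{-1}, shift eta = Sigma^{-1} mu.
    Lam = np.linalg.inv(Sigma)
    return Lam @ mu, Lam

def natural_to_mean_cov(eta, Lam):
    Sigma = np.linalg.inv(Lam)
    return Sigma @ eta, Sigma

# Conjugate update: observe y ~ N(x, R) under prior x ~ N(mu0, S0).
# In natural form the posterior is just prior-naturals + likelihood-naturals.
mu0 = np.array([0.0, 0.0])
S0 = np.array([[2.0, 0.5], [0.5, 1.0]])
R = 0.25 * np.eye(2)
y = np.array([1.0, -2.0])

eta0, Lam0 = mean_cov_to_natural(mu0, S0)
eta_lik, Lam_lik = mean_cov_to_natural(y, R)
mu_post, S_post = natural_to_mean_cov(eta0 + eta_lik, Lam0 + Lam_lik)
```

The same addition, applied per-site, is the primitive behind the VI/EP updates in Part 6.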

0.C — Bayesian Updates & Conditioning

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 0.6 | Bayesian updates from scratch (sequential conjugate) | R bayesian_updates | 🧱 | natural-form addition recursion, batch = sequential = any order, GP regression as single application |
| 0.7 | Conditional distributions & Schur complement | R conditional_distributions | 🧱 | api: gaussx.conditional, schur_complement, conditional_variance, cov_transform; GP regression as joint conditioning |
| 0.8 | Structured MVN sampling dispatch | R structured_sampling | 🧱 | api: gaussx.cholesky, gaussx.sqrt; dispatch on Diagonal / Kronecker / BlockDiag / BlockTriDiag; LowRank additive sampling; fast-sampling tracking issues gaussx#168 (Toeplitz), #169 (KroneckerSum), #170 (SumKronecker) |

0.D — Numerical Mechanics

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 0.5 | Joseph-form covariance update | R joseph_form_update | 🧱 | four equivalent covariance updates (standard / symmetric / information / Joseph), float32 stress test, connection to natural-parameter addition; preservation counterpart to 0.11’s recovery tools |
| 0.9 | Cholesky, log-det, trace primitives tour | R cholesky_logdet_trace | 🧱 | api: gaussx.cholesky, gaussx.logdet, gaussx.trace, gaussx.diag — closed-form identities / compute / storage tables, theoretical-order plots, Hutchinson stochastic trace |
| 0.10 | Differentiating through solve | R differentiating_solve | 🧱 | implicit-function-theorem JVP/VJP via lineax, Jacobi’s formula for logdet gradients, GP marginal-likelihood ascent in one jax.grad call |
| 0.11 | Numerical stability: jitter, safe Cholesky, condition number | R numerical_stability | 🧱 | api: gaussx.add_jitter, gaussx.safe_cholesky — condition-number diagnostic, bias–stability U-curve trade-off, float32 stress; jitter as recovery vs Joseph as preservation |
| 0.12 | Stable RBF & squared distances | R stable_rbf_distances | 🧱 | api: gaussx.stable_rbf_kernel, gaussx.stable_squared_distances — mixed-precision recipe, catastrophic cancellation, three-stage robustness pipeline (stable distances → jitter / safe Cholesky → Joseph form) |
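
The jitter-escalation pattern behind 0.11 fits in one function. A minimal NumPy sketch of the idea, not the `gaussx.safe_cholesky` implementation:

```python
import numpy as np

def safe_cholesky(K, jitter=1e-10, max_tries=8):
    """Cholesky with escalating diagonal jitter.

    Tries chol(K + jitter*I) and multiplies jitter by 10 on each failure,
    trading a small diagonal bias for numerical stability (the U-curve
    trade-off discussed in 0.11).
    """
    for _ in range(max_tries):
        try:
            return np.linalg.cholesky(K + jitter * np.eye(len(K)))
        except np.linalg.LinAlgError:
            jitter *= 10.0
    raise np.linalg.LinAlgError("matrix not PD even after jitter escalation")
```

Near-duplicate inputs make an RBF Gram matrix numerically singular, which is exactly when this kicks in.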

Part 1 — Structured Linear Operators

1.A — Operator Zoo (catalog)

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 1.1 | Operator basics + structural tags & dispatch (Dense, Diagonal, Kronecker, BlockDiag, LowRankUpdate; tag inventory; isinstance dispatch; bring-your-own-operator Circulant demo) | G operator_basics | 🧱 | ✅ merged 1.1+1.7; replaces former basics/operator_zoo stubs |
| 1.2 | Lazy operator algebra (Sum, Scaled, Product) | G lazy_algebra | 🧱 | |
| 1.3 | KroneckerSum vs SumKronecker (additive vs superposed) | G kronecker_sum_vs_sum_kronecker | 🧱 | |
| 1.4 | Toeplitz operators for stationary 1-D / 2-D grids | G toeplitz | 🧱 | |
| 1.5 | BlockTriDiag (Markov / Kalman precision form) + Lower/Upper variants | G block_tridiag | 🧱 | |
| 1.6 | MaskedOperator for missing data on a structured grid (MVN / Toeplitz / Kron / BlockTriDiag bases) | G masked_operator | 🧱 | |

1.B — Matrix Identities & Decompositions

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 1.8 | Kronecker eigendecomposition | G kronecker_eigen | 🧱 | |
| 1.9 | Sherman–Morrison–Woodbury walkthrough | G woodbury_solve | 🧱 | |
| 1.10 | Operator sandwich A P Aᵀ without materialization | — | 🧱 | GAP — gh:gaussx#163 |
| 1.11 | UDL decomposition for block-tridiagonal precision | — | 🧱 | GAP — gh:gaussx#65 |
| 1.12 | Discrete Lyapunov solve (stationary covariance of LTI) | — | 🧱 | GAP — api: discrete_lyapunov_solve |
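
The Woodbury identity that 1.9 walks through is worth seeing as code: it turns an O(n³) solve into O(nk²) when the matrix is diagonal-plus-low-rank. A NumPy sketch under those assumptions (illustrative, not the gaussx operator API):

```python
import numpy as np

def woodbury_solve(A_diag, U, C, b):
    """Solve (A + U C U^T) x = b for diagonal A and low-rank U C U^T using
        (A + UCU^T)^{-1} = A^{-1} - A^{-1} U (C^{-1} + U^T A^{-1} U)^{-1} U^T A^{-1}.
    Only a k x k system is ever factorised."""
    Ainv_b = b / A_diag
    Ainv_U = U / A_diag[:, None]
    inner = np.linalg.inv(C) + U.T @ Ainv_U          # k x k
    return Ainv_b - Ainv_U @ np.linalg.solve(inner, U.T @ Ainv_b)
```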

1.C — Matrix-Free / Implicit

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 1.13 | Matrix-free / implicit operators | G implicit_kernel | 🧱 | extend with ImplicitCrossKernelOperator (GAP) |

1.D — Solvers

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 1.14 | Solver strategies overview (dense, CG, Lanczos) | G solver_strategies, G solver_comparison | 🧱 | merge candidates |
| 1.15 | Preconditioned CG | — | 🧱 | GAP — api: PreconditionedCGSolver |
| 1.16 | BBMM — Black-Box Matrix-Matrix Multiplication | — | 🧱 | GAP — api: BBMMSolver; GPyTorch-style |
| 1.17 | Indefinite/non-PSD: MINRES / LSMR | — | 🧱 | GAP — api: MINRESSolver, LSMRSolver |
| 1.18 | Auto-dispatch (AutoSolver, ComposedSolver) | — | 🧱 | GAP |

1.E — Trace, Log-Det, Roots

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 1.19 | Stochastic Lanczos Quadrature log-det | — | 🧱 | GAP — api: SLQLogdet, IndefiniteSLQLogdet |
| 1.20 | RNLA — randomized numerical linear algebra port | — | 🧱 | GAP — gh:gaussx#156 |
| 1.21 | Contour-integral sqrt_inv_matmul / sqrt_matmul | — | 🧱 | GAP — gh:gaussx#43 |
| 1.22 | Root & inverse-root decompositions | — | 🧱 | GAP — gh:gaussx#40 |
| 1.23 | Joint inverse-quadratic + log-det | — | 🧱 | GAP — gh:gaussx#39 |

Part 2 — Kernels

2.A — Standard kernels

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 2.1 | Kernel cookbook: RBF, Matérn, Periodic, Linear, Polynomial | — | 🧱 | GAP |
| 2.2 | Kernel composition: sum, product, warping | — | 🧱 | GAP |
| 2.3 | ARD & lengthscale interpretation | — | 🧱 | GAP |
| 2.4 | Stationary vs non-stationary kernels | — | 🧱 | GAP |
| 2.13 | Pytree kernel composition — sum / product / scaled as pytrees | — | 🧱 | GAP — canonical JAX pattern, important for gaussx/pyrox users |
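
The composition rules of 2.2 (sums and products of kernels are kernels) can be sketched with plain closures; the pytree version in 2.13 is the structured equivalent. All names here are illustrative, not the gaussx kernel API:

```python
import numpy as np

def rbf(lengthscale):
    return lambda x1, x2: np.exp(
        -0.5 * (x1[:, None] - x2[None, :]) ** 2 / lengthscale**2
    )

def periodic(period, lengthscale):
    return lambda x1, x2: np.exp(
        -2.0 * np.sin(np.pi * np.abs(x1[:, None] - x2[None, :]) / period) ** 2
        / lengthscale**2
    )

def ksum(*ks):
    # A sum of kernels is a kernel (independent additive processes).
    return lambda x1, x2: sum(k(x1, x2) for k in ks)

def kprod(*ks):
    # A product of kernels is a kernel (e.g. quasi-periodic = RBF x periodic).
    return lambda x1, x2: np.prod([k(x1, x2) for k in ks], axis=0)
```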

2.B — Deep kernels

Spectral kernels (Bochner / Spectral Mixture) live in Part 7 — Spectral GPs. This subsection covers neural-network-warped kernels only.

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 2.6 | Deep kernels (NN-warped inputs) | R pyroxgp/04_svgp_rff_nn | 🌉 🔁 | |
| 2.7 | ArcCosine kernel (NN-correspondence) | — | 🧱 🔁 | GAP — dd:features/gp/gpflow.md |

2.C — Multi-output kernels

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 2.8 | Multi-output: LMC, ICM, OILMM | P multioutput_gp | 🧱 | |
| 2.9 | OILMM mechanics: project / back-project | — | 🧱 | GAP — api: oilmm_project, oilmm_back_project |

2.D — Spherical / localized kernels (moved)

Spherical / harmonic / Slepian kernels are now grouped under Part 7.F — Spherical / periodic spectral, where they sit alongside the full spectral toolkit (Bochner, RFF, HSGP). Riemannian / graph kernels remain in 2.F.

2.E — Kernel-based statistics & utilities

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 2.11 | Kernel centering & KPCA | — | 🧱 | GAP — api: center_kernel, centering_operator |
| 2.12 | Kernel-based statistics: HSIC & MMD | — | 🧱 | GAP — api: hsic, mmd_squared |

2.F — Non-Euclidean & operator-valued kernels

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 2.15 | Operator-valued kernel regression — function-valued outputs (velocity fields, spectral curves) | — | 🔬 | GAP |
| 2.16 | GP regression on graphs — heat kernel $k(u,v)=\exp(-tL)_{uv}$ | — | 🔬 | GAP |
| 2.17 | GP regression on Riemannian manifolds — geodesic distance kernels | — | 🔬 | GAP |

Part 3 — Exact GP Regression

3.A — Foundations

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 3.1 | Kernel ridge regression / GP “hello world” | G kernel_regression | 🧱 | |
| 3.2 | Exact GP regression — three patterns | P exact_gp_regression | 🧱 | |
| 3.3 | Hyperparameter learning: marginal likelihood | — | 🧱 | GAP |
| 3.8 | GP regression with mean function — constant / linear / NN mean; posterior shift under strong prior | — | 🧱 | GAP |
| 3.9 | Empirical Bayes / type-II MLE for hyperparameter priors — log-normal priors on $\ell, \sigma^2$, joint optimisation | — | 🧱 | GAP |
| 3.10 | Batch GP regression — vmap over $B$ independent GPs simultaneously | — | 🧱 | GAP — canonical JAX pattern |
| 3.11 | GPU-accelerated exact GP / tile-based Cholesky — block-Cholesky for exact GPs up to $N \approx 50\text{k}$ | — | 🔬 | GAP |
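
The whole of 3.1–3.3 rests on one Cholesky-based routine: posterior mean, posterior variance, and log marginal likelihood from a single factorisation. A self-contained NumPy sketch (illustrative; the library tutorials use gaussx operators instead):

```python
import numpy as np

def rbf(x1, x2):
    return np.exp(-0.5 * (x1[:, None] - x2[None, :]) ** 2)

def gp_regression(X, y, Xs, kernel, noise_var):
    """Exact GP posterior and log marginal likelihood from one Cholesky."""
    K = kernel(X, X) + noise_var * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))   # K^{-1} y
    Ks = kernel(X, Xs)
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(kernel(Xs, Xs)) - np.sum(v**2, axis=0)
    # log p(y) = -1/2 y^T K^{-1} y - 1/2 log|K| - n/2 log(2 pi)
    lml = (-0.5 * y @ alpha - np.sum(np.log(np.diag(L)))
           - 0.5 * len(X) * np.log(2 * np.pi))
    return mean, var, lml
```

Maximising `lml` over kernel hyperparameters is exactly the type-II MLE of 3.3 and 3.9.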

3.B — Diagnostics

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 3.4 | LOVE: fast leave-one-out CV | G love_crossval | 🌉 | |
| 3.5 | Predictive variance & calibration diagnostics | — | 🧱 | GAP |
| 3.12 | Bayesian model selection — compare kernels via log marginal likelihood, Bayes factors, WAIC | — | 🧱 | GAP |
| 3.13 | Predictive distribution anatomy — decompose posterior mean vs variance; under/oversmoothing regimes | — | 🧱 | GAP — pedagogical |

3.C — Heteroscedastic noise

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 3.6 | Heteroscedastic GP — two coupled latent GPs | — | 🧱 | GAP — dd:examples/gp/moments.md |

3.D — High-level API

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 3.7 | sklearn-style GPEstimator facade | — | 🧱 | GAP — gh:pyrox#71 |

3.E — Constrained & Physics-informed GPs

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 3.14 | Monotone GP — derivative observations or monotone projection | — | 🔬 | GAP |
| 3.15 | Convex GP — Hessian positivity constraints | — | 🔬 | GAP |
| 3.16 | Boundary-condition GP — zero mean at domain boundary via eigenfunction basis | — | 🔬 | GAP |
| 3.17 | PDE-constrained GP — encode $\mathcal{L}f=0$ as linear operator observations (Raissi et al.) | — | 🔬 | GAP — generalises monotone GP to arbitrary linear operators |
| 3.18 | Student-t process — heavier-tailed alternative to GP with tractable posterior | — | 🧱 | GAP |

Part 4 — Structured GPs

GPs whose covariance has direct algebraic structure (Kronecker, Toeplitz, grid, sparse-precision) — exploited by Part 1 operators.

4.A — Kronecker GPs

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 4.1 | GPs on 2D grids with Kronecker structure | G gp_2d_grid | 🌉 | |
| 4.2 | Combined Kronecker + low-rank | G structured_gp | 🌉 | |
| 4.3 | Sum-of-Kronecker (additive space + time) | R kronecker/01_spain_extremes (uses) | 🔬 | could break out as a fundamental |
| 4.4 | Separable spatiotemporal & additive (trend + seasonal + residual) | — | 🧱 | GAP — dd:examples/gp/moments.md |
| 4.5 | Kronecker marginal log-likelihood & posterior predictive | — | 🧱 | GAP — api: kronecker_mll, kronecker_posterior_predictive |

4.B — Grid / Toeplitz GPs

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 4.6 | KISS-GP / SKI on grids | — | 🧱 | GAP — api: InterpolatedOperator, cubic_interpolation_weights, grid_data, create_grid |
| 4.7 | Lattice / Toeplitz GPs for stationary 1D | — | 🧱 | GAP — pairs with 1.4 |

4.C — Sparse-precision (mesh / GMRF)

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 4.8 | SPDE / FEM Matérn — triangulated mesh GMRF, $O(n^{3/2})$ | — | 🌉 | GAP — gh:pyrox#50, dd:features/gp/spde_fem.md |

Part 5 — Approximations & Scalability

GPs that scale to large N via inducing points, random features, or iterative solvers — distinct from Part 4 in that the structure is imposed rather than inherent to the data geometry.

5.A — Random features (moved)

Random Fourier Features, Nyström, and FastFood live under Part 7.C — Random Fourier features. This subsection is intentionally empty in Part 5 — the inducing-point / variational story (5.B–5.E) does not depend on RFF derivations, only on the spectral approximation result they deliver.

5.B — Inducing-point fundamentals

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 5.4 | Inducing point methods (FITC, DTC, VFE) — theory | — | 🧱 | GAP |
| 5.5 | Sparse Variational GP (Titsias/Hensman) | G sparse_variational_gp, R pyroxgp/01_svgp_standard | 🌉 | DUP |
| 5.6 | Whitening mechanics: whiten_covariance, unwhiten, unwhiten_covariance | — | 🧱 | GAP |
| 5.7 | Whitened SVGP & Bayesian linear regression view | G whitened_svgp | 🌉 🔁 | |
| 5.8 | Collapsed ELBO | — | 🧱 | GAP — api: collapsed_elbo |
| 5.9 | Mini-batched SVGP / stochastic VI | R pyroxgp/02_svgp_batched | 🔬 | |
| 5.10 | Full SVGP tutorial — 6 guide families incl. orthogonal decoupled | — | 🧱 | GAP — dd:examples/gp/svgp_numpyro.py |
| 5.19 | Collapsed vs uncollapsed SVGP — explicit comparison of Titsias vs Hensman objectives, bias/variance tradeoff | — | 🧱 | GAP — pedagogical |
| 5.20 | Online sparse GP (Csató & Opper 2002) — sequential Bayesian update of inducing set without full retraining | — | 🔬 | GAP — complements streaming filter tutorial 8.33 |

5.C — Inter-domain features

Inter-domain features that are fundamentally spectral (VFF, VISH, Laplacian eigenfunctions) live under Part 7.E — Variational spectral methods and Part 7.F — Spherical / periodic spectral. This subsection now covers only the generic decoupled-basis pattern.

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 5.14 | Decoupled inter-domain features (mixed spatial + spectral) | — | 🧱 | GAP — gh:pyrox#49 |

5.D — Iterative-solver scaling

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 5.15 | CG for exact GPs at scale + CGLB | — | 🧱 | GAP — pairs with 1.16; dd:features/gp/gpflow.md |
| 5.16 | EigenPro spectral preconditioning | — | 🧱 | GAP — gh:gaussx#63 |
| 5.17 | Falkon: Nyström preconditioner + solve recipe | — | 🌉 | GAP — gh:gaussx#49 |
| 5.18 | LogFalkon / GSC-Falkon — Newton outer + preconditioned CG | — | 🌉 | GAP — gh:pyrox#50, dd:features/gp/logfalkon.md |

5.E — Deep GPs

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 5.21 | Deep GP — doubly stochastic VI ELBO, layer-by-layer sampling (Salimbeni & Deisenroth 2017) | — | 🔬 | GAP — clear gap in Part 5 hierarchy |
| 5.22 | Convolutional GP — patch-level inducing features for image data (van der Wilk 2017) | — | 🔬 | GAP — natural extension after VISH/VFF |

Part 6 — Non-Conjugate Likelihoods & Inference

6.A — Likelihood & integrator zoos

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 6.1 | Likelihood zoo: Bernoulli, Poisson, StudentT, Softmax, Heteroscedastic, Exponential, Beta, Gamma, Multi-latent | — | 🧱 | GAP — gh:pyrox#48 |
| 6.2 | Integrator zoo: Gauss–Hermite, MC, Unscented, Taylor, Assumed-Density Filter | — | 🧱 | GAP — api: gaussx _quadrature |
| 6.3 | Sigma points & cubature | — | 🧱 | GAP — api: sigma_points, cubature_points |
| 6.4 | Fifth-order symmetric cubature integrator | — | 🧱 | GAP — gh:gaussx#26 |
| 6.5 | Statistical Linear Regression via cubature (SLR) | — | 🧱 | GAP — gh:gaussx#25 |
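
The workhorse integral throughout Part 6 is the expected log-likelihood under a Gaussian, and Gauss–Hermite quadrature (6.2) handles it in a few lines. A NumPy sketch of the standard change of variables (illustrative, not the gaussx integrator API):

```python
import numpy as np

def gauss_hermite_ell(log_lik, mean, var, order=20):
    """E_{f ~ N(mean, var)}[log_lik(f)] by Gauss-Hermite quadrature.

    Physicists' Hermite nodes t_i, weights w_i satisfy
    int e^{-t^2} g(t) dt ~ sum_i w_i g(t_i); substituting
    f = mean + sqrt(2 var) t turns this into a Gaussian expectation.
    """
    t, w = np.polynomial.hermite.hermgauss(order)
    f = mean + np.sqrt(2.0 * var) * t
    return (w / np.sqrt(np.pi)) @ log_lik(f)
```

For a Gaussian likelihood this reproduces the closed form exactly, which is a handy unit test before swapping in Bernoulli or Poisson.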

6.B — Classification

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 6.6 | Latent GP classification — three patterns (Bernoulli + softmax) | P latent_gp_classification | 🧱 | extend to multi-class per dd |

6.C — Newton / Gauss-Newton family

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 6.7 | Laplace approximation | P advanced_gp_laplace | 🧱 | |
| 6.8 | Gauss–Newton inference | P advanced_gp_gauss_newton | 🧱 | |
| 6.9 | Quasi-Newton inference (L-BFGS sites) | P advanced_gp_qn | 🧱 | |
| 6.10 | Posterior Linearization (Bayes-Newton) | P advanced_gp_pl | 🧱 | |
| 6.11 | Newton & damped natural updates | — | 🧱 | GAP — api: newton_update, damped_natural_update |
| 6.12 | Gauss–Newton & GGN diagonal | — | 🧱 | GAP — api: gauss_newton_precision, ggn_diagonal |
| 6.13 | Hutchinson Hessian diagonal & Riemannian PSD correction | — | 🧱 🔁 | GAP |

6.D — Variational inference

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 6.14 | Variational guides — full-rank, mean-field, low-rank, whitened, delta, flow | — | 🧱 | GAP — dd:examples/gp/vgp_numpyro.py + features/gp/variational_families.md |
| 6.15 | Natural gradient VI | G natural_gradient_vi | 🌉 | |
| 6.16 | Conjugate VI for GPs (CVI sites) | — | 🧱 | GAP — api: cvi_update_sites, site_natural_from_tilted |
| 6.23 | Full VGP (non-sparse) — $N$ variational parameters, $O(N^3)$, no inducing-point approximation | — | 🧱 | GAP — closes gap between sparse and exact tutorials |

6.E — Expectation Propagation

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 6.17 | Expectation Propagation | P advanced_gp_ep, G expectation_propagation | 🌉 | DUP |
| 6.18 | EP cavity & tilted moments mechanics | — | 🧱 | GAP — api: cavity_distribution, ep_tilted_moments; gh:gaussx#24 |

6.F — Bayesian linear regression & non-standard outputs

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 6.19 | Bayesian linear regression updates | — | 🧱 🔁 | GAP — api: blr_diag_update, blr_full_update |
| 6.20 | Log-Gaussian Cox Process (spatial point-process intensity) | — | 🔬 | GAP — dd:examples/gp/moments.md |
| 6.21 | Warped GP (Box–Cox for skewed targets) | — | 🧱 | GAP — dd:examples/gp/moments.md |
| 6.24 | Warped GP with normalizing flows — learnable bijection $g$ extends Box–Cox to NF-parameterized warpings | — | 🔬 | GAP |

6.G — Aggregate Bayesian methods

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 6.22 | R-INLA port — integrated nested Laplace approximation | — | 🌉 | GAP — gh:gaussx#155 |

Part 7 — Spectral GPs

Spectral methods unify several threads from earlier parts: stationary kernels via Bochner’s theorem (extending 2.A), spectral kernel families (former 2.B), random-feature approximations (former 5.A), and spectral inducing-feature methods (subset of former 5.C). This part collects them with a coherent through-line: every stationary kernel is the Fourier transform of a positive measure, and every approximation in this part is a clever way of sampling, parameterizing, or projecting that measure.

7.A — Spectral foundations

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 7.1 | Bochner’s theorem — stationary kernels as Fourier transforms of spectral densities | — | 🧱 | GAP — pedagogical anchor for the rest of Part 7 |
| 7.2 | Wiener–Khinchin & power spectral density estimation — covariance ↔ spectrum, periodogram, Welch | — | 🧱 | GAP — connects 1.4 Toeplitz–FFT machinery to spectrum estimation |
| 7.3 | Karhunen–Loève expansion — eigenpairs of the kernel integral operator | — | 🧱 | GAP — bridge to Mercer / HSGP |
| 7.4 | Sampling theorem & aliasing — bandlimited priors on regular grids | — | 🧱 | GAP — practical guidance for Toeplitz/Kronecker setups |
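
Bochner's theorem (7.1) is constructive: sampling frequencies from a kernel's spectral density yields random Fourier features whose inner products converge to the kernel, which is the seed of everything in 7.C. A NumPy sketch for the RBF case (illustrative helper names, not a library API):

```python
import numpy as np

def rff_features(X, n_features, lengthscale, seed=0):
    """Random Fourier features for the RBF kernel.

    The RBF spectral density is Gaussian, so frequencies are drawn
    W ~ N(0, lengthscale^{-2} I); then phi(x) phi(z)^T -> k(x, z)
    at rate O(1/sqrt(n_features)).
    """
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=1.0 / lengthscale, size=(n_features, X.shape[1]))
    b = rng.uniform(0.0, 2 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W.T + b)
```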

7.B — Spectral kernel models

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 7.5 | Spectral kernels — visual guide | P spectral_kernel_models | 🧱 🔁 | moved from 2.5 |
| 7.6 | Spectral Mixture (SM) kernel fitting — auto-discover periodicity from data (Wilson & Adams 2013) | — | 🔬 | GAP — visualise learned spectral components; moved from 2.14 |
| 7.7 | Sparse spectrum kernel — point-mass spectral approximation | — | 🧱 | GAP |

7.C — Random Fourier features

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 7.8 | Random Fourier Features → SSGP → VSSGP | P random_fourier_features | 🧱 🔁 | moved from 5.1 |
| 7.9 | Kernel approximations: Nyström vs RFF | G kernel_approximations, P kernel_approximation | 🧱 | DUP — pick one home; moved from 5.2 |
| 7.10 | FastFood structured random features | — | 🧱 | GAP — gh:gaussx#62; moved from 5.3 |
| 7.11 | Orthogonal random features (ORF) — variance reduction over vanilla RFF | — | 🧱 | GAP |

7.D — Hilbert-space methods

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 7.12 | Hilbert-space GP (HSGP, Solin–Särkkä) — Laplace-eigenbasis spectral approximation | — | 🧱 | GAP — heavy use in Bayesian time-series / NumPyro contrib |
| 7.13 | Periodic kernel via truncated Fourier basis — exact spectral representation | — | 🧱 | GAP |
| 7.14 | HSGP convergence diagnostics — error vs $M$, boundary-effect calibration | — | 🧱 | GAP |

7.E — Variational spectral methods

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 7.15 | Variational Fourier Features (VFF, Hensman 2018) — Fourier basis on bounded intervals, diagonal $K_{uu}$ | — | 🧱 | GAP — moved from 5.12; gh:pyrox#49, dd:features/gp/inducing_features.md |
| 7.16 | Laplacian-eigenfunction inducing features (manifolds, graphs) | — | 🧱 | GAP — moved from 5.13; gh:pyrox#49 |
| 7.17 | VSSGP — variational distribution over RFF frequencies (Gal & Turner 2015) | — | 🔬 | GAP — natural next step after 7.8 |

7.F — Spherical / periodic spectral

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 7.18 | VISH — Variational Inducing Spherical Harmonics | R pyroxgp/03_svgp_spherical_harmonics | 🔬 | moved from 5.11; gh:pyrox#49 |
| 7.19 | Slepian positional encodings (spherical, localized) | — | 🧱 🔁 | moved from 2.10; GAP — gh:pyrox#125 |
| 7.20 | Fourier features on the torus / circle — periodic BC, exact spectral kernel | — | 🧱 | GAP |
| 7.21 | Zonal spectral kernels on $S^2$ — Funk–Hecke diagonalization | — | 🧱 | GAP |

Part 8 — Markov / State-Space GPs

8.A — Foundations

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 8.1 | Kalman filter + RTS smoother (pure SSM) | G kalman_filter | 🧱 | |
| 8.2 | SSM ↔ natural / expectation parameterizations | — | 🧱 | GAP — api: ssm_to_naturals, naturals_to_ssm, expectations_to_ssm |
| 8.3 | Pairwise marginals & sites | — | 🧱 | GAP — api: pairwise_marginals, GaussianSites, sites_to_precision |
| 8.4 | SDE autocovariance & process noise | — | 🧱 | GAP — api: sde_autocovariance, process_noise_covariance |
| 8.5 | Joseph-form Kalman update standalone | — | 🧱 | GAP |
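
Everything in Part 8 elaborates on one predict/update step. A minimal NumPy sketch in standard covariance form (8.5's Joseph form and the square-root variants in 8.D are more stable rewrites of the same step):

```python
import numpy as np

def kalman_step(m, P, y, A, Q, H, R):
    """One Kalman filter step for x' = A x + w, y = H x' + v,
    with w ~ N(0, Q) and v ~ N(0, R)."""
    # Predict through the linear dynamics.
    m_pred = A @ m
    P_pred = A @ P @ A.T + Q
    # Update on the observation y.
    S = H @ P_pred @ H.T + R                  # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)       # Kalman gain
    m_new = m_pred + K @ (y - H @ m_pred)
    P_new = (np.eye(len(m)) - K @ H) @ P_pred # standard (not Joseph) form
    return m_new, P_new
```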

8.B — SDE kernel zoo

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 8.6 | Matérn kernels in state-space form | P markov_gp_sde_kernels | 🧱 | |
| 8.7 | Full SDE kernel zoo: Periodic, QuasiPeriodic, Cosine, Constant, Sum, Product, Subband Matérn | — | 🧱 | GAP — api: gaussx _ssm SDE kernels |
| 8.8 | SDE linearization & drift-KL helpers | — | 🧱 | GAP — gh:gaussx#70 |

8.C — Markov GP workflows

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 8.9 | Markov GP with Kalman filtering | P markov_gp_kalman | 🧱 | |
| 8.10 | Markov GP hyperparameter training | P markov_gp_training | 🧱 | |
| 8.11 | Non-Gaussian Markov GP | P markov_gp_nongauss | 🧱 | |
| 8.12 | Sparse variational Markov GP | P sparse_markov_gp | 🧱 | |
| 8.13 | KalmanGuide — Bayes-Newton via pseudo-observations + RTS | — | 🧱 | GAP — dd:features/gp/variational_families.md |

8.D — Parallel & scalable filtering

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 8.14 | Parallel / batched Kalman filter | G parallel_kalman | 🌉 | |
| 8.15 | Square-root parallel Kalman filter / RTS | — | 🧱 | GAP — gh:gaussx#165 |
| 8.16 | SpInGP — sparse parallel-in-time GP | — | 🧱 | GAP — api: spingp_log_likelihood, spingp_posterior |
| 8.17 | Mean-field block-diagonal Kalman filter | — | 🧱 | GAP — gh:gaussx#29 |

8.E — Nonlinear filtering

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 8.18 | Nonlinear Gaussian Filter (UKF/EKF generalization) | — | 🧱 | GAP — gh:gaussx#161 |
| 8.19 | Extended Kalman Smoother (Taylor(1)) | — | 🧱 | GAP — dd:examples/gp/integration_detail.md |
| 8.20 | Unscented Kalman Smoother (PL + SigmaPoints + Kalman) | — | 🧱 | GAP — dd:examples/gp/integration_detail.md |
| 8.21 | Cubature Kalman Smoother | — | 🧱 | GAP — dd:examples/gp/integration_detail.md |
| 8.22 | Innovation cov as structured LowRankUpdate | — | 🧱 | GAP — gh:gaussx#164 |

8.F — Ensemble methods

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 8.23 | Ensemble Kalman Filter on Lorenz-63 | G ensemble_kalman | 🔬 | |
| 8.24 | Bessel-corrected EnKF + ensemble_kalman_gain | — | 🌉 | GAP — gh:gaussx#127 |

8.G — Steady-state & structured-Gaussian surfaces

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 8.25 | Infinite-horizon Kalman & DARE | — | 🧱 | GAP — api: infinite_horizon_filter/smoother, dare |
| 8.26 | DARE via Optimistix fixed-point + implicit diff | — | 🧱 | GAP — gh:gaussx#97 |
| 8.27 | MarkovGaussian structured surface | — | 🧱 | GAP — gh:gaussx#76 |
| 8.28 | Spatiotemporal SDE GPs | — | 🔬 | GAP |

8.H — Non-conjugate temporal case studies

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 8.29 | GP classification with Laplace + Kalman | — | 🧱 | GAP — dd:examples/gp/state_space.md |
| 8.30 | Poisson counts with EP + Kalman | — | 🧱 | GAP — dd:examples/gp/state_space.md |
| 8.31 | Changepoint detection via additive temporal GPs | — | 🌉 | GAP — dd:examples/gp/state_space.md |
| 8.32 | Latent temporal GP in a BHM | — | 🌉 | GAP — dd:examples/gp/state_space.md |
| 8.33 | Online / streaming GP (filter-only mode) | — | 🌉 | GAP — dd:examples/gp/state_space.md |
| 8.34 | Non-LTI temporal model via numpyro.scan | — | 🧱 | GAP — dd:examples/gp/state_space.md |

Part 9 — Sampling, Pathwise, Conditioning

9.A — Pathwise sampling

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 9.1 | Pathwise GP posterior sampling (Wilson 2020) | P gp_pathwise | 🧱 | |
| 9.2 | Pathwise sampling with NumPyro | P gp_pathwise_numpyro | 🧱 | |
| 9.3 | Decoupled sampling for SVGP | — | 🧱 | GAP |

9.B — Matheron’s-rule conditioning

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 9.4 | Matheron’s-rule conditioning by sampling | — | 🧱 | GAP — gh:gaussx#77 |
| 9.5 | Partitioned joint conditional sampling | — | 🧱 | GAP — gh:gaussx#79 |
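
Matheron's rule (9.4) says a posterior sample is a joint prior sample plus a deterministic correction by the residual at the observed points. A dense NumPy sketch for a zero-mean GP with Gaussian noise (illustrative; the tutorial's point is to do this with structured operators instead):

```python
import numpy as np

def matheron_posterior_samples(X, y, Xs, kernel, noise_var, n_samples, seed=0):
    """Draw posterior samples via Matheron's rule:
        f_post(x*) = f_prior(x*)
                     + K(x*,X) (K(X,X) + s I)^{-1} (y - f_prior(X) - eps),
    where (f_prior(X), f_prior(x*)) is a joint prior draw and eps ~ N(0, s I)."""
    rng = np.random.default_rng(seed)
    n, ns = len(X), len(Xs)
    Z = np.concatenate([X, Xs])
    L = np.linalg.cholesky(kernel(Z, Z) + 1e-8 * np.eye(n + ns))
    prior = L @ rng.normal(size=(n + ns, n_samples))        # joint prior draws
    eps = rng.normal(scale=np.sqrt(noise_var), size=(n, n_samples))
    K = kernel(X, X) + noise_var * np.eye(n)
    corr = kernel(Xs, X) @ np.linalg.solve(K, y[:, None] - prior[:n] - eps)
    return prior[n:] + corr
```

Averaging many such samples recovers the usual posterior mean, which is a useful sanity check.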

Part 10 — Uncertainty Propagation & UQ

10.A — Foundations

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 10.1 | Moment matching, unscented transform, linearization | — | 🧱 | GAP |
| 10.2 | Gauss–Hermite quadrature for ELL | — | 🧱 | GAP — used in R kronecker series |

10.B — Uncertain inputs

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 10.3 | Uncertainty propagation through nonlinear functions | G uncertainty_propagation | 🌉 | |
| 10.4 | GPs with uncertain inputs (PILCO-style) | G uncertain_gp_inputs | 🔬 | |
| 10.5 | Multi-step-ahead PILCO autoregressive forecasting | — | 🔬 | GAP — dd:examples/gp/integration_detail.md |

10.C — Analytic moments

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 10.6 | Ψ-statistics & exact RBF closed form for uncertain inputs | — | 🧱 | GAP — api: compute_psi_statistics, AnalyticalPsiStatistics |
| 10.7 | Uncertain SVGP / VGP prediction (sigma-point + analytic) | — | 🧱 | GAP — api: uncertain_svgp_predict, uncertain_vgp_predict |
| 10.8 | Cost / mean / gradient expectations under Gaussian inputs | — | 🧱 | GAP — api: cost_expectation, mean_expectation, gradient_expectation |

10.D — BGPLVM

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 10.9 | Bayesian GPLVM with uncertain inputs | — | 🔬 | GAP — api: uncertain_bgplvm_predict |
| 10.12 | GP-LVM (Lawrence 2004) — unsupervised manifold learning vs BGPLVM; Bayesian nonlinear PCA vs PCA/UMAP | — | 🔬 | GAP — distinct from 10.9 (Bayesian extension) |
| 10.13 | Supervised GPLVM — classification via latent GP representation | — | 🔬 | GAP |

10.E — Special integrators & quantiles

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 10.10 | Mixture-quantile root-finder | — | 🧱 | GAP — gh:gaussx#121 |
| 10.11 | Custom integrator: importance-weighted MC for rare events | — | 🧱 | GAP — dd:examples/gp/integration_detail.md |
| 10.14 | GP quadrature / Bayesian cubature — $\int f(x)p(x)\,dx$ with a GP prior on $f$; Bayesian numerical integration | — | 🔬 | GAP |

Part 11 — Probabilistic Programming Integration

11.A — gaussx + NumPyro

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 11.1 | GP regression with NumPyro + gaussx | G numpyro_gp | 🧱 | |
| 11.2 | Bayesian linear regression in precision form | G numpyro_precision | 🧱 🔁 | |

11.B — pyrox patterns

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 11.3 | Three-pattern regression masterclass: tree_at / pyrox_sample / Parameterized | P regression_masterclass_treeat, _pyrox_sample, _parameterized | 🧱 🔁 | |

11.C — Hierarchical & sampling

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 11.4 | Hierarchical / multi-task GPs in NumPyro | — | 🌉 | GAP |
| 11.5 | MCMC for GP hyperparameters (NUTS) | — | 🧱 | GAP |

Part 12 — Ensembles

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 12.1 | Ensemble primitives — three ways | P ensemble_primitives_tutorial | 🧱 | |
| 12.2 | EnsembleMAP & EnsembleVI runners | P ensemble_runner_tutorial | 🧱 | |

Part 13 — Data Pipelines

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 13.1 | Spatiotemporal preprocessing (geo + time + pandas) | P spatiotemporal_preprocessing | 🧱 | |
| 13.2 | Loading climate data: xarray, zarr, ERA5 | — | 🔬 | GAP |

Part 14 — Applied Case Studies (research_notebook)

14.A — Spatial extremes

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 14.1 | Kronecker GP + GEV likelihood (Spain extremes) | R kronecker/01_spain_extremes | 🔬 | |
| 14.2 | Kronecker-multiplicative GP (spatial warming rates) | R kronecker/02_spain_multiplicative | 🔬 | |
| 14.3 | Non-stationary GEV (location-dependent tails) | R kronecker/03_spain_nonstationary | 🔬 | |
| 14.4 | Gaussian copula spatial dependence | R kronecker/04_spain_copula | 🔬 | |
| 14.5 | BHM with GEV + spatial GPs (methane / precipitation extremes) | — | 🔬 | GAP — dd:examples/gp/moments.md |
| 14.6 | Temporal extremes: GEV with time-varying μ(t), σ(t), ξ(t) | — | 🔬 | GAP — dd:examples/gp/state_space.md |

14.B — SVGP applied

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 14.7 | SVGP on real climate data (large-N) | R pyroxgp/01–04 | 🔬 | could split: standard / batched / SH / deep-kernel |

14.C — Geophysics & emulation

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 14.8 | GP for ocean / SST / sea-level extremes | — | 🔬 | GAP |
| 14.9 | GP emulator for a numerical model | — | 🔬 | GAP |
| 14.10 | GP + somax — spatially smooth GP priors for ocean parameters | — | 🔬 | GAP — dd:examples/integration.md |
| 14.11 | GP + ekalmX/vardax — learned GP dynamics for DA | — | 🔬 | GAP — dd:examples/integration.md |
| 14.16 | Multi-fidelity GP (Kennedy & O’Hagan 2000) — fuse cheap (coarse) + expensive (fine) simulators via autoregressive GP | — | 🔬 | GAP |
| 14.17 | ABC-GP emulator — use GP surrogate to bypass expensive likelihood; sample θ via ABC with GP-matched summary statistics | — | 🔬 | GAP |

14.D — Optimization & decision

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 14.12 | Bayesian optimization with GPs (Expected Improvement) | — | 🔬 | GAP — dd:examples/gp/integration_detail.md |
| 14.18 | GP-UCB acquisition — Upper Confidence Bound $\alpha_{\mathrm{UCB}}(x)=\mu(x)+\beta\sigma(x)$ | — | 🔬 | GAP |
| 14.19 | Probability of Improvement (PI) — simpler BO baseline alongside EI | — | 🔬 | GAP |
| 14.20 | Thompson sampling for BO — use pathwise posterior sampling (connects to 9.1) | — | 🔬 | GAP |
| 14.21 | Multi-objective BO — Pareto front approximation via GP surrogates | — | 🔬 | GAP — relevant for simulator calibration |
| 14.22 | GP for contextual bandits — GP reward model with online UCB/Thompson updates | — | 🔬 | GAP — connects BO and online learning |
| 14.23 | Optimal experimental design — sensor placement via mutual information maximisation $I(f; y_{\mathcal{S}})$ | — | 🔬 | GAP |
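
The Expected Improvement acquisition behind 14.12 has a closed form under a Gaussian posterior. A dependency-free NumPy sketch for minimisation (illustrative helper, not a library function):

```python
import math
import numpy as np

def expected_improvement(mu, sigma, best, xi=0.01):
    """EI for minimisation: E[max(best - f - xi, 0)] with f ~ N(mu, sigma^2),
    which evaluates to (best - mu - xi) Phi(z) + sigma phi(z),
    z = (best - mu - xi) / sigma."""
    sigma = np.maximum(sigma, 1e-12)
    z = (best - mu - xi) / sigma
    Phi = 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2.0)))  # normal CDF
    phi = np.exp(-0.5 * z**2) / math.sqrt(2.0 * math.pi)            # normal PDF
    return (best - mu - xi) * Phi + sigma * phi
```

GP-UCB (14.18) and PI (14.19) are one-liners on the same `(mu, sigma)` pair, which is why the BO tutorials share a single posterior-prediction backbone.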

14.E — Causal & event data

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 14.13 | Causal inference / counterfactual GPs | — | 🔬 | GAP |
| 14.14 | Marked temporal point process + GP intensity (seismology, methane plumes) | — | 🔬 | GAP — dd:examples/gp/moments.md |

14.F — Practical

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 14.15 | Missing data / partial observations with masked likelihood | — | 🌉 | GAP — dd:examples/gp/moments.md |
| 14.24 | Covariate-shift GP — importance-weighted marginal likelihood for train/test distribution mismatch | — | 🔬 | GAP |

Part 15 — Metrics & Calibration

| # | Tutorial | Source | Scope | Refs / Notes |
|---|----------|--------|-------|--------------|
| 15.1 | NLPD decomposition: calibration + sharpness | — | 🧱 | GAP — dd:features/gp/metrics.md |
| 15.2 | Expected Calibration Error (ECE) & coverage diagnostics | — | 🧱 | GAP |
| 15.3 | Continuous Ranked Probability Score (CRPS) | — | 🧱 | GAP |
| 15.4 | RMSE / MAE / R² / interval width | — | 🧱 | GAP |
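
For Gaussian predictive distributions the CRPS of 15.3 has a well-known closed form, so GP predictions can be scored without Monte Carlo. A dependency-free sketch (illustrative helper name):

```python
import math
import numpy as np

def crps_gaussian(y, mu, sigma):
    """Closed-form CRPS of N(mu, sigma^2) at observation y:
        CRPS = sigma * [ z (2 Phi(z) - 1) + 2 phi(z) - 1/sqrt(pi) ],
    with z = (y - mu) / sigma. Lower is better; it rewards both
    calibration and sharpness simultaneously."""
    z = (y - mu) / sigma
    Phi = 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2.0)))
    phi = np.exp(-0.5 * z**2) / math.sqrt(2.0 * math.pi)
    return sigma * (z * (2.0 * Phi - 1.0) + 2.0 * phi - 1.0 / math.sqrt(math.pi))
```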

Summary of dups to reconcile

| Topic | Locations | Suggestion |
|-------|-----------|------------|
| Kernel approximations / RFF / Nyström | G kernel_approximations, P kernel_approximation, P random_fourier_features | Keep pyrox as canonical; gaussx version → low-level mechanics |
| Sparse VGP | G sparse_variational_gp, G whitened_svgp, R pyroxgp/01_svgp_standard | research_notebook = applied; gaussx ones = linear-algebra view |
| Expectation Propagation | G expectation_propagation, P advanced_gp_ep | gaussx = mechanics-from-scratch; pyrox = library API |
| Schur / conditioning | G conditional_distributions, G sugar_ops | merge |
| Operator basics | G basics, G operator_zoo | merge |
| Solver strategies | G solver_strategies, G solver_comparison | merge |

Proposed final homes (high-level)