
Sparse variational GPs with pyrox

Four SVGP notebooks, same scaffolding — each one changes exactly one thing about the setup.

| Notebook | What changes | What you take away |
| --- | --- | --- |
| 01_svgp_standard | — (baseline) | Canonical Titsias/Hensman SVGP: ELBO, WhitenedGuide, inducing-point migration. |
| 02_svgp_batched | Full-data ELBO → unbiased mini-batch ELBO | Why you'd reach for SVGP in the first place: (N/B) · ELL − KL scaling, wall-clock comparison, batch-size sweep. |
| 03_svgp_spherical_harmonics | Point inducing Z ∈ ℝ^(M×D) → spherical-harmonic basis on S² (VISH, Dutordoir 2020) | Diagonal K_uu, closed-form Funk–Hecke coefficients, inter-domain feature viewpoint; O(M) vs O(M³) solve-cost sweep. |
| 04_svgp_rff_nn | RBF kernel → RFF inner product on NN-warped inputs | Deep kernels, finite-dimensional RFF projections, when the base RBF isn't expressive enough. |
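The (N/B) · ELL − KL scaling in notebook 02 is the standard unbiased mini-batch ELBO: the per-point expected log-likelihood summed over a batch of size B is rescaled by N/B, and the KL term is subtracted once. A minimal JAX sketch of just that bookkeeping (the `expected_ll_per_point` placeholder is illustrative, not pyrox API):

```python
import jax
import jax.numpy as jnp

def minibatch_elbo(expected_ll_per_point, kl, batch_idx, n_total):
    """Unbiased mini-batch ELBO estimate: (N/B) * sum_batch ELL_i - KL.

    expected_ll_per_point maps a data index i to E_q[log p(y_i | f_i)];
    it stands in for whatever likelihood term the model defines.
    """
    batch_size = batch_idx.shape[0]
    ell_batch = jnp.sum(jax.vmap(expected_ll_per_point)(batch_idx))
    return (n_total / batch_size) * ell_batch - kl

# Toy check: with a constant per-point ELL, the estimate matches the
# full-data ELBO for any batch size, since the KL is counted exactly once.
ell = lambda i: 2.0 + 0.0 * i
full = minibatch_elbo(ell, kl=5.0, batch_idx=jnp.arange(100), n_total=100)
small = minibatch_elbo(ell, kl=5.0, batch_idx=jnp.arange(10), n_total=100)
print(full, small)  # both 195.0
```

In expectation over uniformly sampled batches the estimate equals the full-data ELBO, which is what makes stochastic optimization of it valid.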

Shared ingredients

Every notebook uses the same three pyrox primitives.

Training is pure optax + equinox.filter_value_and_grad — no NumPyro handlers in sight. Kernels are defined as plain eqx.Modules with log_variance / log_lengthscale as unconstrained trainable scalars.

Reading order

Do them in sequence — each notebook assumes you’ve seen the previous one’s ELBO + training scaffold.