jNO Documentation
Build neural operator solvers for PDEs in JAX with a symbolic DSL, flexible model architectures, and multi-stage training controls.
- **New to jNO?** Follow the shortest path from install to first solved PDE.
- **Want runnable scripts?** Browse curated examples and the imported tutorial repository by chapter.
- **Optimizing model quality?** Jump directly to architecture, controls, and tuning references.
Start Here
If you are new to jNO, follow this path:
- Installation: set up jNO with conda, pip, or uv.
- Getting Started: run your first problem and learn the basic workflow.
- Domain and Geometry: define your domain and collocation strategy.
- Training: configure optimizer schedules and solver phases.
- Tutorials: run chapter-based scripts from basics to operator architectures.
- Foundation Models: see the currently exposed foundation-model families and their expected inputs.
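The workflow above centers on defining a PDE residual and evaluating it at collocation points. As a minimal, library-agnostic sketch of that idea (it uses plain JAX autodiff rather than jNO's own API, and the ansatz `u` is a stand-in for whatever model you train):

```python
import jax
import jax.numpy as jnp

def u(params, x):
    # Tiny analytic ansatz standing in for a trained model: u(x) = a * sin(pi x).
    return params["a"] * jnp.sin(jnp.pi * x)

def poisson_residual(params, x, f):
    # Residual of the 1D Poisson problem -u''(x) = f(x), via nested jax.grad.
    u_x = jax.grad(u, argnums=1)                       # first derivative in x
    u_xx = jax.grad(lambda p, y: u_x(p, y), argnums=1)  # second derivative in x
    return -u_xx(params, x) - f(x)

# Source term for which u with a = 1 is the exact solution, so the residual vanishes.
f = lambda x: jnp.pi**2 * jnp.sin(jnp.pi * x)
r = poisson_residual({"a": 1.0}, 0.3, f)
```

During training, a residual like this would be evaluated at sampled collocation points and minimized; the guides linked above cover how jNO expresses this symbolically.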
Core Guides
| Guide | Focus |
|---|---|
| Domain and Geometry | Built-in geometries, mesh loading, variable sampling, time-dependent setups |
| Neural Network Architectures | MLP, FNO, U-Net, DeepONet, Poseidon, custom wrappers |
| Differential Operators | grad, laplacian, jacobian, hessian, divergence, curl, math ops |
| Training | Solver setup, optimizers, LR schedules, parallel execution |
| Model Controls | Freeze, masks, LoRA, parameter groups, dtype and initialization |
| Adaptive Resampling | RAD, RARD, HA, CR3, PINNFluence |
| Hyperparameter Tuning | Architecture and training sweeps with Nevergrad |
| Save, Load and Configuration | Serialization, signed artifacts, TOML config and logging |
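The differential operators listed in the table (grad, laplacian, hessian, and so on) all reduce to JAX's composable autodiff transforms. A short sketch in plain JAX, independent of jNO's wrappers, using the identity that the Laplacian of a scalar field is the trace of its Hessian:

```python
import jax
import jax.numpy as jnp

def phi(x):
    # Scalar field phi(x) = |x|^2 on R^3; its Laplacian is the constant 6.
    return jnp.sum(x**2)

# Laplacian as trace of the Hessian, built from jax.hessian.
def laplacian(g):
    return lambda x: jnp.trace(jax.hessian(g)(x))

grad_phi = jax.grad(phi)          # gradient: here 2x
x = jnp.array([1.0, 2.0, 3.0])
lap = laplacian(phi)(x)           # evaluates to 6.0 for this field
```

Operators such as divergence and curl compose the same way from `jax.jacobian`; the Differential Operators guide documents the forms jNO exposes directly.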
Tutorials
Browse the chapter-based walkthroughs in Tutorials, covering the core PDE problem classes and operator architecture examples.
Foundation Models
| Guide | Focus |
|---|---|
| Foundation Models | Poseidon, Morph, Walrus, and other foundation models, with their expected input layouts |
Experimental
Experimental workflows such as FEM and Variational PINNs are listed separately from the main tutorial track.
About
jNO is a research-oriented framework for solving PDEs with neural operators, developed by the AI-augmented Simulation Group at Fraunhofer IISB. It supports symbolic residual definitions, architecture experimentation, adaptive point sampling, and distributed execution. The documentation is a work in progress and will be expanded with more examples, guides, and API references over time.
Warning
This is an actively evolving research codebase. APIs and behavior may change.
Citation
If you use jNO, we would appreciate a citation to the following paper:
```bibtex
@article{armbruster2026jNO,
  author  = {Armbruster, Leon, ....},
  title   = {{jNO}: A JAX Library for Neural Operator and PDE Foundation Model Training},
  journal = {},
  year    = {},
}
```