# Getting Started

This page walks you through running your first example and understanding the core jNO workflow.

Before you begin, complete the setup steps in Installation.
## Running Your First Example

The first example solves the 1D Poisson equation ∂²u/∂x² = sin(πx) with a Physics-Informed Neural Network (PINN) and saves results to `./runs/laplace1D/`.
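Before touching any solver, it helps to know what the answer should be. Assuming homogeneous Dirichlet boundary conditions u(0) = u(1) = 0 on [0, 1] (the usual setup for this example; the page does not state them), the exact solution is u(x) = −sin(πx)/π². A quick finite-difference check in plain Python (no jNO) confirms it, and gives you a reference to compare the trained PINN against:

```python
import math

def u_exact(x):
    # Closed-form solution of u''(x) = sin(pi * x) with u(0) = u(1) = 0
    # (assumed boundary conditions; the example page does not state them)
    return -math.sin(math.pi * x) / math.pi**2

# Verify the PDE residual with a central finite-difference second derivative
h = 1e-4
x0 = 0.3
u_xx = (u_exact(x0 - h) - 2 * u_exact(x0) + u_exact(x0 + h)) / h**2
residual = u_xx - math.sin(math.pi * x0)

assert abs(residual) < 1e-6  # the exact solution satisfies the equation
```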
## Core Workflow

Every jNO program follows this five-step pattern:
### 1. Define the Domain

```python
import jno

# 2D rectangular domain with mesh spacing 0.05
domain = jno.domain(constructor=jno.domain.rect(mesh_size=0.05))
```

See Domain & Geometry for all supported geometries.
### 2. Sample Variables

```python
x, y = domain.variable("interior")    # interior collocation points
xb, yb = domain.variable("boundary")  # boundary points
```
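To make the interior/boundary distinction concrete, here is a plain-NumPy sketch (not jNO API) of what the two point sets look like on a square domain with mesh spacing 0.05, assuming the unit square:

```python
import numpy as np

# Unit square discretized with mesh spacing 0.05 -> 21 points per axis
xs = np.linspace(0.0, 1.0, 21)
X, Y = np.meshgrid(xs, xs)
pts = np.stack([X.ravel(), Y.ravel()], axis=1)

# Boundary points lie on an edge of the square; the rest are interior
on_boundary = (pts[:, 0] == 0) | (pts[:, 0] == 1) | (pts[:, 1] == 0) | (pts[:, 1] == 1)
boundary_pts = pts[on_boundary]
interior_pts = pts[~on_boundary]

print(interior_pts.shape, boundary_pts.shape)  # (361, 2) (80, 2)
```

The PDE constraint in step 4 is enforced on the 361 interior points, the boundary condition on the 80 edge points.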
### 3. Define the Neural Network

```python
import jax
import jno.numpy as jnn

u_net = jnn.nn.mlp(in_features=2, hidden_dims=64, num_layers=3, key=jax.random.PRNGKey(0))
u = u_net(x, y)
```
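The internals of `jnn.nn.mlp` are not shown here, but the shape contract implied by the arguments (2 inputs, 3 hidden layers of width 64, scalar output) can be sketched in plain NumPy. The tanh activation and initialization below are illustrative assumptions, not jNO's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(in_features=2, hidden=64, num_layers=3, out_features=1):
    # Layer sizes: 2 -> 64 -> 64 -> 64 -> 1 for three hidden layers
    sizes = [in_features] + [hidden] * num_layers + [out_features]
    return [(rng.normal(0, np.sqrt(2 / m), size=(m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp_apply(params, xy):
    h = xy
    for W, b in params[:-1]:
        h = np.tanh(h @ W + b)  # smooth activation, common in PINNs (assumed)
    W, b = params[-1]
    return h @ W + b            # linear output layer

params = init_mlp()
out = mlp_apply(params, np.array([[0.5, 0.5]]))
print(out.shape)  # (1, 1)
```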
### 4. Formulate Constraints

```python
pde = -jnn.laplacian(u, [x, y]) - 1.0  # ∇²u + 1 = 0 on the interior
boc = u(xb, yb) - 0.0                  # u = 0 on the boundary
```

Constraints are symbolic expressions — no computation happens at this stage.
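The `.mse` reduction used in the next step presumably averages the squared residual of a constraint over its sampled points; in plain NumPy the idea is:

```python
import numpy as np

# Hypothetical residual values of a constraint at four collocation points
residuals = np.array([0.1, -0.2, 0.05, 0.0])

# A .mse-style reduction: mean of the squared residuals (assumed semantics)
mse = np.mean(residuals ** 2)
assert abs(mse - 0.013125) < 1e-9
```

A constraint is satisfied exactly when its residual is zero everywhere, so driving these mean squared residuals to zero is what training optimizes.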
### 5. Solve

```python
import optax
from jno import LearningRateSchedule as lrs

u_net.optimizer(optax.adam, lr=lrs.exponential(1e-3, 0.9, 2000, 1e-5))
crux = jno.core([pde.mse, boc.mse], domain)
stats = crux.solve(2000)
stats.plot("history.png")
```
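The exact semantics of `lrs.exponential(1e-3, 0.9, 2000, 1e-5)` are documented on the Training page; a common convention for these four arguments (initial rate, decay factor, steps per decay, floor) would behave like this sketch, which is an assumption rather than jNO's implementation:

```python
def exponential_lr(step, init=1e-3, decay_rate=0.9, transition_steps=2000, min_lr=1e-5):
    # Decay the rate by `decay_rate` every `transition_steps` steps,
    # never dropping below `min_lr` (assumed argument meaning)
    return max(init * decay_rate ** (step / transition_steps), min_lr)

assert exponential_lr(0) == 1e-3      # starts at the initial rate
assert exponential_lr(10**8) == 1e-5  # eventually clipped at the floor
```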
## Project Setup Helper

`jno.setup()` initialises logging and returns the run directory in one call:

```python
dire = jno.setup(__file__)                        # creates ./runs/<script_name>/
dire = jno.setup(__file__, name="experiment_v1")  # custom name
```
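Directory-wise, such a helper roughly amounts to deriving a name from the script path and creating the folder. This pathlib sketch is an approximation, not jNO's implementation, and it omits the logging setup:

```python
from pathlib import Path

def setup_run_dir(script_path, name=None):
    # Use the script's filename (without extension) unless a name is given
    run_name = name or Path(script_path).stem
    run_dir = Path("runs") / run_name
    run_dir.mkdir(parents=True, exist_ok=True)
    return run_dir

d = setup_run_dir("laplace1D.py")
assert d.name == "laplace1D" and d.is_dir()
```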
## Understanding the Output

During training, jNO prints a progress line each epoch:

- `L` — total weighted loss
- `C0`, `C1` — individual constraint losses
- `T0`, `T1` — tracker values (if any)

After training, `stats.plot("history.png")` saves the loss curves.
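How `L` relates to `C0` and `C1` depends on the constraint weights (covered in Training); under the common weighted-sum convention it would combine as in this sketch, with hypothetical values throughout:

```python
weights = [1.0, 10.0]              # hypothetical constraint weights
constraint_losses = [0.02, 0.001]  # hypothetical C0, C1 values

# Total weighted loss as printed in the L field (assumed convention)
L = sum(w * c for w, c in zip(weights, constraint_losses))
assert abs(L - 0.03) < 1e-9
```

A falling `L` with one stagnant `Ci` usually means that constraint needs a larger weight.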
## Next Steps

| Topic | Page |
|---|---|
| All geometry types | Domain & Geometry |
| Neural operator architectures | Neural Network Architectures |
| Differential operators | Differential Operators |
| Optimizers, LR schedules, constraint weights | Training |
| Residual-adaptive point selection | Adaptive Resampling |
| Architecture and hyperparameter search | Hyperparameter Tuning |
| Saving and loading solvers | Save, Load & Configuration |