An interactive 3D visualization of Multi-Layer Perceptrons (MLPs) with real-time training, gradient descent visualization, and loss landscape exploration.
- Configurable inputs: 1-10 input neurons
- Dynamic hidden layers: Add/remove layers, adjust neuron count (1-16 per layer)
- Configurable outputs: 1-10 output neurons
- tanh: Hyperbolic tangent (default)
- ReLU: Rectified Linear Unit
- Softmax: For multi-class output
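The three activations can be sketched as plain JavaScript functions (illustrative only; the bundled micrograd.js applies them to autograd `Value` nodes so gradients can flow back):

```javascript
// tanh squashes to (-1, 1); ReLU zeroes out negatives;
// softmax turns a vector of scores into probabilities.
function tanh(x) {
  return Math.tanh(x);
}

function relu(x) {
  return Math.max(0, x);
}

// Shift by the max before exponentiating for numerical stability,
// then normalize so the outputs sum to 1.
function softmax(xs) {
  const m = Math.max(...xs);
  const exps = xs.map((x) => Math.exp(x - m));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map((e) => e / sum);
}
```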
- MSE (Quadratic): Mean Squared Error
- NLL: Negative Log Likelihood
- Hinge: SVM-style margin loss
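Plain-number sketches of the three losses (simplified for illustration; in the app they are computed on autograd `Value`s so they are differentiable):

```javascript
// MSE: average squared difference between predictions and targets.
function mse(preds, targets) {
  return preds.reduce((s, p, i) => s + (p - targets[i]) ** 2, 0) / preds.length;
}

// NLL: negative log of the probability assigned to the true class.
// Assumes `probs` is already normalized, e.g. a softmax output.
function nll(probs, trueIndex) {
  return -Math.log(probs[trueIndex]);
}

// Hinge: average of max(0, 1 - y * score) with labels y in {-1, +1};
// scores beyond the margin contribute zero loss.
function hinge(scores, labels) {
  return (
    scores.reduce((s, sc, i) => s + Math.max(0, 1 - labels[i] * sc), 0) /
    scores.length
  );
}
```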
- L2 Regularization: Optional weight penalty with configurable Ξ»
- Adjustable Learning Rate: Log-scale slider (0.0001 to 1.0)
- Real-time Training: Start/Stop controls with step-by-step option
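A minimal sketch of one training step with these controls, assuming flat arrays of parameters and gradients (the real app updates micrograd `Value` objects): the L2 penalty λ‖w‖² contributes 2λw to each gradient, and the log-scale slider maps its position t ∈ [0, 1] onto [0.0001, 1.0].

```javascript
// One SGD step; lambda = 0 disables the L2 weight penalty.
function sgdStep(params, grads, lr, lambda = 0) {
  return params.map((p, i) => p - lr * (grads[i] + 2 * lambda * p));
}

// Log-scale slider: t in [0, 1] -> learning rate in [1e-4, 1].
function sliderToLr(t) {
  return Math.pow(10, -4 + 4 * t);
}
```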
- Interactive Network Graph: Movable, rotatable, zoomable
- Connection Colors: Green (positive) → Red (negative) weights
- Gradient Particles: Visual representation of gradient magnitudes
- Loss Surface: 3D landscape with gradient descent trail
- Hover Info: See weight/bias/gradient values on hover
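The connection coloring can be sketched as a simple weight-to-RGB mapping, green for positive weights and red for negative, scaled by magnitude (a hypothetical helper; the actual Three.js material setup may differ):

```javascript
// Map a weight to an RGB triple in [0, 1]; brighter = larger |w|.
function weightToColor(w, maxAbs = 1) {
  const t = Math.min(Math.abs(w) / maxAbs, 1); // normalized magnitude
  return w >= 0 ? { r: 0, g: t, b: 0 } : { r: t, g: 0, b: 0 };
}
```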
- Three.js: 3D WebGL rendering
- Micrograd.js: JavaScript port of Karpathy's autograd engine
- Vanilla CSS: Glassmorphic design system
1. Start a local server: `npx serve . -l 3000`
2. Configure your network architecture in the left panel
3. Click "Start Training" to begin gradient descent
4. Interact with the 3D visualization:
   - Drag: Rotate view
   - Scroll: Zoom in/out
   - Hover: See neuron/weight details
```
nn-visualizer/
├── index.html                 # Main HTML
├── css/
│   └── styles.css             # Glassmorphic design system
├── js/
│   ├── main.js                # Main application
│   ├── micrograd.js           # Autograd engine (Value, MLP, Loss)
│   ├── scene-manager.js       # Three.js scene setup
│   ├── network-visualizer.js  # 3D network rendering
│   └── loss-surface.js        # 3D loss landscape
└── README.md
```
This project uses a JavaScript implementation of micrograd by Andrej Karpathy, ported to run in the browser for real-time visualization.
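The core idea can be sketched in a few lines: each `Value` records its inputs and a local backward rule, and `backward()` walks the graph in reverse topological order. (Heavily simplified; the bundled micrograd.js supports more operations plus `MLP` and loss classes.)

```javascript
// Minimal autograd node: wraps a number, tracks children and a
// closure that propagates this node's grad to its inputs.
class Value {
  constructor(data, children = []) {
    this.data = data;
    this.grad = 0;
    this._children = children;
    this._backward = () => {};
  }
  add(other) {
    const out = new Value(this.data + other.data, [this, other]);
    out._backward = () => {
      this.grad += out.grad;  // d(a+b)/da = 1
      other.grad += out.grad; // d(a+b)/db = 1
    };
    return out;
  }
  mul(other) {
    const out = new Value(this.data * other.data, [this, other]);
    out._backward = () => {
      this.grad += other.data * out.grad; // d(a*b)/da = b
      other.grad += this.data * out.grad; // d(a*b)/db = a
    };
    return out;
  }
  backward() {
    // Build topological order, then apply local rules in reverse.
    const topo = [];
    const seen = new Set();
    const build = (v) => {
      if (seen.has(v)) return;
      seen.add(v);
      v._children.forEach(build);
      topo.push(v);
    };
    build(this);
    this.grad = 1;
    topo.reverse().forEach((v) => v._backward());
  }
}
```

For example, for `c = a*b + a` with `a = 2`, `b = 3`, calling `c.backward()` yields `a.grad = b + 1 = 4` and `b.grad = a = 2`.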
MIT
