Lesson 1: Understanding a Single Neuron
A neuron takes multiple inputs, multiplies each by a corresponding weight, adds the products together along with a bias term, and passes the result through an activation function.
Formula: output = activation(input₁ × weight₁ + input₂ × weight₂ + input₃ × weight₃ + bias)
Example: with a weighted sum of 0.25 and a sigmoid activation, the output is sigmoid(0.25) ≈ 0.56.
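In code, this computation looks like the following sketch. It assumes a sigmoid activation, and the inputs, weights, and bias are hypothetical values chosen so the weighted sum comes out to 0.25, matching the example above.

```javascript
// Sigmoid activation: squashes any real number into the range (0, 1).
function sigmoid(x) {
  return 1 / (1 + Math.exp(-x));
}

// One neuron: weighted sum of inputs plus bias, then activation.
function neuronOutput(inputs, weights, bias) {
  let sum = bias;
  for (let i = 0; i < inputs.length; i++) {
    sum += inputs[i] * weights[i];
  }
  return sigmoid(sum);
}

// Hypothetical inputs and weights whose weighted sum is
// 0.5*0.4 + 0.2*0.3 + 0.1*(-0.1) + 0.0 = 0.25.
const out = neuronOutput([0.5, 0.2, 0.1], [0.4, 0.3, -0.1], 0.0);
console.log(out.toFixed(2)); // 0.56
```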
Lesson 2: Multi-Layer Neural Network
A neural network consists of layers of neurons. Each neuron in a layer connects to all neurons in the next layer. The network learns by adjusting weights through backpropagation using multiple training examples.
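A forward pass through such fully connected layers can be sketched as below. The network shape and all weights here are made-up illustration values, not the ones used in the demo.

```javascript
function sigmoid(x) { return 1 / (1 + Math.exp(-x)); }

// Forward pass through one fully connected layer: every neuron
// receives every input from the previous layer (dense connectivity).
function layerForward(inputs, weights, biases) {
  return weights.map((neuronWeights, j) => {
    let sum = biases[j];
    for (let i = 0; i < inputs.length; i++) {
      sum += inputs[i] * neuronWeights[i];
    }
    return sigmoid(sum);
  });
}

// A tiny 2-input → 2-hidden → 1-output network with made-up weights.
const hidden = layerForward(
  [1.0, 0.5],
  [[0.4, -0.2], [0.3, 0.8]], // one weight row per hidden neuron
  [0.1, -0.1]
);
const output = layerForward(hidden, [[0.6, -0.5]], [0.05]);
console.log(output[0]); // a single value in (0, 1)
```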
Training Data
Add multiple input-output pairs for the network to learn from
| Input 1 | Input 2 | Target | Output | Error | Squared Error | ∇W (Avg) | Action |
|---|---|---|---|---|---|---|---|

Sum of Squared Errors: 0.0000
Mean Squared Error: 0.0000
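The summary values under the table can be computed from the per-example errors like this. The targets and outputs below are hypothetical, standing in for a populated training table.

```javascript
// Per-example error metrics, mirroring the table columns:
// Error = Target - Output, Squared Error = Error².
function errorMetrics(examples) {
  const rows = examples.map(({ target, output }) => {
    const error = target - output;
    return { error, squaredError: error * error };
  });
  const sse = rows.reduce((s, r) => s + r.squaredError, 0);
  return { rows, sse, mse: sse / rows.length };
}

// Hypothetical training examples (target vs. network output).
const { sse, mse } = errorMetrics([
  { target: 1, output: 0.9 },
  { target: 0, output: 0.2 },
]);
console.log(sse.toFixed(4)); // 0.0500
console.log(mse.toFixed(4)); // 0.0250
```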
Backpropagation in Action
1. Forward Pass
Inputs flow through the network
Each neuron computes weighted sum + bias
Activation function produces output
2. Calculate Error
Compare output to target
Error = Target - Output
Square the error to form the loss
→ See Mean Squared Error in table
3. Compute Gradients
Calculate how much each weight contributed to error
Use chain rule to backpropagate
→ See ∇W (Avg) in table
4. Update Weights
Weight -= Learning Rate × Gradient
Adjust weights to reduce error
Repeat for all training examples
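The four steps above can be sketched end-to-end for a single sigmoid neuron trained with gradient descent on squared error. The training data here is hypothetical (a rough OR function); a multi-layer network applies the same chain rule once per layer.

```javascript
function sigmoid(x) { return 1 / (1 + Math.exp(-x)); }

// Train one sigmoid neuron with gradient descent on squared error.
function train(examples, weights, bias, learningRate, epochs) {
  for (let epoch = 0; epoch < epochs; epoch++) {
    for (const { inputs, target } of examples) {
      // 1. Forward pass: weighted sum + bias, then activation.
      let sum = bias;
      for (let i = 0; i < inputs.length; i++) sum += inputs[i] * weights[i];
      const output = sigmoid(sum);

      // 2. Calculate error: Error = Target - Output.
      const error = target - output;

      // 3. Compute gradients via the chain rule:
      // dLoss/dw_i = dLoss/dOutput * dOutput/dSum * dSum/dw_i
      //            = -2*error     * output*(1-output) * input_i
      const dSum = -2 * error * output * (1 - output);

      // 4. Update weights: Weight -= Learning Rate × Gradient.
      for (let i = 0; i < inputs.length; i++) {
        weights[i] -= learningRate * dSum * inputs[i];
      }
      bias -= learningRate * dSum;
    }
  }
  return { weights, bias };
}

// Hypothetical data: learn a rough OR function.
const data = [
  { inputs: [0, 0], target: 0 },
  { inputs: [0, 1], target: 1 },
  { inputs: [1, 0], target: 1 },
  { inputs: [1, 1], target: 1 },
];
const model = train(data, [0, 0], 0, 0.5, 2000);
```

After training, the neuron's output for [0, 0] falls below 0.5 while the other three inputs rise above it, reproducing OR.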
Lesson 3: Handwritten Digit Recognition (MNIST)
Train a convolutional neural network (CNN) to recognize handwritten digits (0-9) using the MNIST dataset. Draw a digit and see the network predict it!
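A CNN of this kind can be defined with the TensorFlow.js layers API roughly as follows. This is a minimal sketch with assumed layer sizes; the demo's exact architecture and hyperparameters may differ.

```javascript
import * as tf from '@tensorflow/tfjs';

// Small CNN for 28×28 grayscale MNIST digits (layer sizes are assumptions).
const model = tf.sequential();
model.add(tf.layers.conv2d({
  inputShape: [28, 28, 1], filters: 8, kernelSize: 5, activation: 'relu',
}));
model.add(tf.layers.maxPooling2d({ poolSize: 2, strides: 2 }));
model.add(tf.layers.conv2d({ filters: 16, kernelSize: 5, activation: 'relu' }));
model.add(tf.layers.maxPooling2d({ poolSize: 2, strides: 2 }));
model.add(tf.layers.flatten());
model.add(tf.layers.dense({ units: 10, activation: 'softmax' })); // one unit per digit 0-9
model.compile({
  optimizer: 'adam',
  loss: 'categoricalCrossentropy',
  metrics: ['accuracy'],
});
```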
🚀 Want to build this yourself? Follow the official TensorFlow.js tutorial to learn how to create your own digit recognition model from scratch:
TensorFlow.js — Handwritten Digit Recognition Tutorial →

Dataset Information
Training Status
Status: Not trained
Epoch: 0 / 10
Training Loss: -
Training Accuracy: -
Validation Loss: -
Validation Accuracy: -