
Forward Propagation

Jul 18, 2024

machine-learning

The forward pass in a neural network is also known as the “prediction” step: you give the network some input and it produces an output.

How it works

  • Pass the input to the neural network
    • The input’s shape has to match the first layer of the network
  • The $j$-th neuron in a layer computes the activation $a_j = \sigma\left(\sum_k w_{jk} a_k + b_j\right)$
    • In other words, the activation of the current layer’s $j$-th neuron is the weighted sum of the previous layer’s activations (using the current layer’s weights), plus a bias, all passed through an activation function $\sigma$.
      • In vectorized form, the weighted sum is computed by matrix multiplication between the layer’s weight matrix and the previous layer’s activation vector.
    • The input layer’s “activations” are simply the input values.
    • This is repeated for all layers; the final layer’s activations are the network’s output.
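The steps above can be sketched in a few lines of NumPy. This is a minimal illustration, not the post's own code: it assumes a sigmoid activation at every layer and a hypothetical 2-3-1 network with random weights.

```python
import numpy as np

def sigmoid(z):
    # Squashes each weighted sum into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, weights, biases):
    # The input layer's "activations" are just the input values
    a = x
    # Each layer: weighted sum (matrix multiply) + bias, then activation
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)
    return a

# Hypothetical 2-3-1 network: shapes are (next_layer, prev_layer)
rng = np.random.default_rng(0)
weights = [rng.standard_normal((3, 2)), rng.standard_normal((1, 3))]
biases = [np.zeros(3), np.zeros(1)]

output = forward(np.array([0.5, -0.2]), weights, biases)
print(output.shape)  # the single output neuron's activation
```

Note that each weight matrix has shape (neurons in current layer, neurons in previous layer), so the input's shape must match the first matrix's column count, as the first bullet requires.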
