Perceptron



Overview and implementation of the most fundamental Neural Network model.

nn perceptron synthdata

Perceptron


The Perceptron algorithm is very similar to the Logistic Regression one.

Forward propagation


$$ \large Z=w^TX + b $$

The activation function (Sigmoid) is:

$$ \large \hat{y}=A=\frac{1}{1 + e^{-Z}} $$

The cost function is:

$$ \large J = -\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\log(a^{(i)})+(1-y^{(i)})\log(1-a^{(i)})\right] $$
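The forward pass above can be sketched in NumPy as follows. This is a minimal illustration, assuming examples are stored column-wise in `X` of shape `(n_features, m)`, labels in `Y` of shape `(1, m)`, and weights in `w` of shape `(n_features, 1)`; the names `sigmoid` and `forward` are chosen here for clarity, not taken from the original code.

```python
import numpy as np

def sigmoid(z):
    # Elementwise sigmoid: 1 / (1 + e^{-z})
    return 1.0 / (1.0 + np.exp(-z))

def forward(w, b, X, Y):
    """One forward pass; variable names follow the formulas above.
    X: (n_features, m), Y: (1, m), w: (n_features, 1), b: scalar."""
    Z = np.dot(w.T, X) + b   # Z = w^T X + b
    A = sigmoid(Z)           # A = 1 / (1 + e^{-Z})
    # Cross-entropy cost, averaged over the m examples
    cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))
    return A, float(cost)
```

With all-zero parameters every activation is 0.5, so the cost reduces to $\log 2 \approx 0.693$, which is a quick sanity check for the implementation.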

Backward propagation


Gradients:

$$ \large \frac{\partial J}{\partial w} = \frac{1}{m}X(A-Y)^T $$

$$ \large \frac{\partial J}{\partial b} = \frac{1}{m} \sum_{i=1}^m (a^{(i)}-y^{(i)}) $$

where $A$ is the row vector of activations $a^{(i)}$ and $Y$ is the row vector of labels $y^{(i)}$.
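These two gradients translate directly into NumPy. A minimal sketch, assuming the same column-wise shapes as in forward propagation (`X` of shape `(n, m)`, `A` and `Y` of shape `(1, m)`); the function name `backward` is an illustrative choice:

```python
import numpy as np

def backward(A, X, Y):
    """Gradients of the cross-entropy cost w.r.t. w and b.
    A is assumed to come from the forward pass: A = sigmoid(w^T X + b)."""
    m = X.shape[1]
    dw = np.dot(X, (A - Y).T) / m   # dJ/dw = (1/m) X (A - Y)^T, shape (n, 1)
    db = np.sum(A - Y) / m          # dJ/db = (1/m) sum_i (a_i - y_i), scalar
    return dw, db
```

Note that `dw` has the same shape as `w`, which is what makes the update step a plain elementwise subtraction.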


The optimization function is: $$ \large \theta = \theta - \alpha\, d\theta $$ where $\alpha$ is the learning rate.
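Putting the forward pass, the gradients, and the update rule together gives the full training loop. This is a sketch under the same shape assumptions as before; the learning rate and iteration count are arbitrary illustrative defaults, not values from the original notebook:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, Y, lr=0.1, n_iters=1000):
    """Gradient descent: theta <- theta - alpha * d theta.
    X: (n_features, m), Y: (1, m). Returns learned (w, b)."""
    w = np.zeros((X.shape[0], 1))
    b = 0.0
    m = X.shape[1]
    for _ in range(n_iters):
        A = sigmoid(np.dot(w.T, X) + b)   # forward propagation
        dw = np.dot(X, (A - Y).T) / m     # backward propagation
        db = np.sum(A - Y) / m
        w -= lr * dw                      # parameter update
        b -= lr * db
    return w, b
```

On a small linearly separable dataset this loop drives the sigmoid outputs toward the correct side of 0.5, which is the classification rule used below.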

Classification

