Backpropagation in Data Mining
Backpropagation is a method used to train neural networks in which the model learns from its mistakes. It works by measuring how wrong the output is and then adjusting the weights step by step so that the next predictions are better. In this article we will learn how backpropagation works in Data Mining.
Working of Backpropagation
A neural network generates an output vector from the input vector it operates on. It compares the generated output with the desired output and derives an error signal when the two do not match, then adjusts the weights accordingly to move closer to the desired output. The method is based on gradient descent and updates the weights by minimizing the error between the predicted and the actual output. Training with backpropagation consists of three stages (sketched in code right after this list):
- Forward propagation of input data.
- Backward propagation of error.
- Updating weights to reduce the error.
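These three stages can be captured in a few lines of code. The sketch below is a minimal illustration for a network with two inputs, two hidden neurons (h1, h2) and one output (O3), matching the example that follows; the input values and initial weights are illustrative placeholders, while the target 0.5 and learning rate 1 are taken from the example.

```python
import numpy as np

def sigmoid(a):
    # Activation: squashes the weighted sum into (0, 1)
    return 1.0 / (1.0 + np.exp(-a))

# Illustrative placeholder values (not the ones from the worked example)
x = np.array([0.35, 0.7])            # inputs x1, x2
W_h = np.array([[0.2, 0.3],          # w_{1,1}, w_{1,2}: x1 -> h1, h2
                [0.3, 0.2]])         # w_{2,1}, w_{2,2}: x2 -> h1, h2
w_o = np.array([0.1, 0.9])           # w_{1,3}, w_{2,3}: h1, h2 -> O3
y_target, eta = 0.5, 1.0             # target output and learning rate

# 1. Forward propagation of input data
y_hidden = sigmoid(x @ W_h)          # hidden outputs y3, y4
y5 = sigmoid(y_hidden @ w_o)         # network output

# 2. Backward propagation of error
delta5 = y5 * (1 - y5) * (y_target - y5)            # output unit error
delta_h = y_hidden * (1 - y_hidden) * w_o * delta5  # hidden unit errors

# 3. Updating weights to reduce the error
w_o += eta * delta5 * y_hidden       # hidden -> output weights
W_h += eta * np.outer(x, delta_h)    # input -> hidden weights

y5_new = sigmoid(sigmoid(x @ W_h) @ w_o)
print(y5, "->", y5_new)              # the output moves toward the target
```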
Let’s walk through an example of backpropagation in machine learning. Assume the neurons use the sigmoid activation function for the forward and backward pass. The target output is 0.5 and the learning rate is 1.

1. Forward Propagation
1. Initial Calculation
The weighted sum at each node is calculated using:

a_j = \sum_i (w_{i,j} \cdot x_i)

Where,

- a_j is the weighted sum of all the inputs and weights at each node,
- w_{i,j} represents the weight between the i^{th} input and the j^{th} neuron,
- x_i represents the value of the i^{th} input.

o_j (output): after applying the activation function to the weighted sum a_j, we get the output of the neuron:

o_j = y_j = F(a_j) = \frac{1}{1 + e^{-a_j}}
2. Sigmoid Function
The sigmoid function returns a value between 0 and 1, introducing non-linearity into the model:

F(a) = \frac{1}{1 + e^{-a}}
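A useful property of the sigmoid, relied on in the backward pass below, is that its derivative can be expressed through its own output: F'(a) = F(a)(1 - F(a)). A small sketch:

```python
import math

def sigmoid(a):
    # Returns a value between 0 and 1
    return 1.0 / (1.0 + math.exp(-a))

def sigmoid_derivative(a):
    # F'(a) = F(a) * (1 - F(a)): this is why the error terms
    # later in the example contain the factor y * (1 - y)
    y = sigmoid(a)
    return y * (1 - y)

print(sigmoid(0.0))             # 0.5 (midpoint of the curve)
print(sigmoid_derivative(0.0))  # 0.25 (steepest slope, at a = 0)
```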

3. Computing Outputs
At the h1 node, the weighted sum is:

a_1 = (w_{1,1} \cdot x_1) + (w_{2,1} \cdot x_2)

Once we have calculated the a_1 value, we can proceed to find the y_3 value:

y_3 = F(a_1) = \frac{1}{1 + e^{-a_1}}

Similarly, find the values of y_4 at h2 and y_5 at O3, where the output node takes y_3 and y_4 as its inputs:

y_4 = F(a_2), \quad y_5 = F(a_3), \quad a_3 = (w_{1,3} \cdot y_3) + (w_{2,3} \cdot y_4)
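In code, these computations mirror the formulas one-to-one. The inputs and initial weights below are placeholders standing in for the values of the network diagram:

```python
import math

def F(a):
    return 1.0 / (1.0 + math.exp(-a))

# Placeholder inputs and initial weights (stand-ins for the diagram's values)
x1, x2 = 0.35, 0.7
w11, w21 = 0.2, 0.3   # x1, x2 -> h1
w12, w22 = 0.3, 0.2   # x1, x2 -> h2
w13, w23 = 0.1, 0.9   # y3, y4 -> O3

# At h1: weighted sum, then activation
a1 = w11 * x1 + w21 * x2
y3 = F(a1)

# At h2
a2 = w12 * x1 + w22 * x2
y4 = F(a2)

# At O3: the hidden outputs act as inputs
a3 = w13 * y3 + w23 * y4
y5 = F(a3)
print(y3, y4, y5)
```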

4. Error Calculation
Our target output is 0.5, but the forward pass produced 0.67. The error can be calculated with the formula:

E = y_{\text{target}} - y_5 = 0.5 - 0.67 = -0.17

Using this error value we will backpropagate.
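As a quick check of this step in code:

```python
y_target = 0.5   # desired output given in the example
y5 = 0.67        # output obtained from the forward pass
error = y_target - y5
print(f"{error:.2f}")   # -0.17: negative because the output is too high
```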
2. Backpropagation
1. Calculating Gradients
The change in each weight is calculated as:

\Delta w_{i,j} = \eta \cdot \delta_j \cdot x_i

Where:

- \delta_j is the error term for each unit,
- \eta is the learning rate,
- x_i is the input carried by the weight w_{i,j} into unit j.
2. Output Unit Error
For O3, the error term multiplies the sigmoid derivative y_5(1 - y_5) by the difference between the target and the output:

\delta_5 = y_5 (1 - y_5)(y_{\text{target}} - y_5)
3. Hidden Unit Error
For h1:

\delta_3 = y_3 (1 - y_3)(w_{1,3} \cdot \delta_5)

For h2:

\delta_4 = y_4 (1 - y_4)(w_{2,3} \cdot \delta_5)
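The three error terms in code, using y_5 = 0.67 from the example; y_3, y_4 and the hidden-to-output weights are illustrative placeholders:

```python
# Illustrative values; y5 = 0.67 is the output obtained in the example
y3, y4, y5 = 0.55, 0.58, 0.67
w13, w23 = 0.1, 0.9      # placeholder hidden -> output weights
y_target = 0.5

# Output unit error for O3
delta5 = y5 * (1 - y5) * (y_target - y5)

# Hidden unit errors: each hidden neuron receives a share of delta5,
# scaled by its connection weight to the output unit
delta3 = y3 * (1 - y3) * (w13 * delta5)
delta4 = y4 * (1 - y4) * (w23 * delta5)
print(delta5, delta3, delta4)
```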
3. Weight Updates
For the weights from the hidden layer to the output layer, for example w_{2,3}:

\Delta w_{2,3} = \eta \cdot \delta_5 \cdot y_4

New weight:

w_{2,3}(\text{new}) = w_{2,3}(\text{old}) + \Delta w_{2,3}

For the weights from the input layer to the hidden layer, for example w_{1,1}:

\Delta w_{1,1} = \eta \cdot \delta_3 \cdot x_1

New weight:

w_{1,1}(\text{new}) = w_{1,1}(\text{old}) + \Delta w_{1,1}
Similarly, the other weights are updated:

- w_{1,2}(\text{new}) = 0.273225
- w_{1,3}(\text{new}) = 0.086615
- w_{2,1}(\text{new}) = 0.269445
- w_{2,2}(\text{new}) = 0.18534
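A sketch of both update rules in code; apart from the learning rate \eta = 1 and \delta_5, which follows from the example's outputs (0.67 versus the target 0.5), the values are illustrative placeholders:

```python
eta = 1.0                    # learning rate from the example

# delta5 = 0.67 * (1 - 0.67) * (0.5 - 0.67), from the example's outputs;
# the remaining values are placeholders for illustration
delta5, delta3 = -0.0376, -0.0021
w23_old, w11_old = 0.9, 0.2  # one hidden->output and one input->hidden weight
y4, x1 = 0.58, 0.35          # the activation / input feeding those weights

# Hidden -> output: move the weight by eta * delta5 * y4
w23_new = w23_old + eta * delta5 * y4

# Input -> hidden: same rule one layer down, using the hidden error term
w11_new = w11_old + eta * delta3 * x1
print(round(w23_new, 6), round(w11_new, 6))
```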
(Figure: the network redrawn with the updated weights.)
After updating the weights, the forward pass is repeated, yielding:

- y_3 = 0.57
- y_4 = 0.56
- y_5 = 0.61
Since y_5 = 0.61 is still not the desired output of 0.5, the errors are propagated backward and the weights are adjusted again; this loop continues until the network's output is sufficiently close to the target (a compact version of the loop is sketched below). This is how backpropagation makes a neural network learn: by propagating errors backward and adjusting the weights and biases, the network gradually improves its predictions.
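A compact version of this training loop, reusing the sketch from the beginning of the section (all starting values remain illustrative placeholders):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# Illustrative placeholder network (same 2-2-1 layout as the example)
x = np.array([0.35, 0.7])
W_h = np.array([[0.2, 0.3], [0.3, 0.2]])
w_o = np.array([0.1, 0.9])
y_target, eta = 0.5, 1.0

for step in range(1000):
    # Forward pass
    y_hidden = sigmoid(x @ W_h)
    y5 = sigmoid(y_hidden @ w_o)
    # Stop once the output is close enough to the target
    if abs(y_target - y5) < 1e-3:
        break
    # Backward pass and weight updates
    delta5 = y5 * (1 - y5) * (y_target - y5)
    delta_h = y_hidden * (1 - y_hidden) * w_o * delta5
    w_o += eta * delta5 * y_hidden
    W_h += eta * np.outer(x, delta_h)

print(step, y5)   # iterations used and the final output
```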