Does the final solution of PLA (Perceptron Learning Algorithm) depend on weight initialization?
Solution:-
As we know, the Perceptron Learning Algorithm takes its inputs as a set of features x1, x2, x3, x4, ..., xn, together with a special constant input for the bias,
where each 'xi' is a feature value and 'n' is the number of features.
We use weights in this algorithm to score the inputs, and every time an error occurs during training these weights are updated. The weights are represented by the variables w1, w2, w3, w4, ..., wn (plus the bias weight).
The weights are initialized with some starting value before training begins, and the prediction is made from the weighted sum, sum(wi * xi), plus the bias. Because every update is applied on top of those starting weights, it is safe to say that in the Perceptron Learning Algorithm the final solution does depend on the weight initialization: with linearly separable data PLA will always converge to some separating hyperplane, but different initializations can converge to different final weight vectors.
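To see this concretely, here is a minimal sketch in Python (the tiny dataset and the function name pla_train are hypothetical, just for illustration): it trains a perceptron twice on the same linearly separable data, once from zero weights and once from an arbitrary non-zero initialization, and the two runs generally end with different final weight vectors even though both separate the data.

```python
import numpy as np

def pla_train(X, y, w_init, max_epochs=100):
    """Standard PLA: X is (m, d) features, y has labels in {-1, +1},
    w_init is a (d+1,) initial weight vector whose last entry is the bias weight."""
    # Append a constant 1 to every example so the bias is learned as w[-1]
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    w = w_init.astype(float).copy()
    for _ in range(max_epochs):
        errors = 0
        for xi, yi in zip(Xb, y):
            # Prediction uses the weighted sum  sum_i w_i * x_i
            if yi * np.dot(w, xi) <= 0:   # misclassified point
                w += yi * xi              # PLA update: w <- w + y * x
                errors += 1
        if errors == 0:                   # no mistakes in a full pass: converged
            break
    return w

# Toy linearly separable data (made up for this example)
X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])

w_a = pla_train(X, y, w_init=np.zeros(3))
w_b = pla_train(X, y, w_init=np.array([5.0, -3.0, 2.0]))

print("final weights from zero init:   ", w_a)
print("final weights from another init:", w_b)
# Both weight vectors classify the data correctly, but they are not the same,
# which is what "the final solution depends on the initialization" means here.
```

On this toy data the zero initialization ends at one separating weight vector and the non-zero initialization at another, so the classifier you get is correct in both cases but not unique.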
Note:- Hope this clears your doubt. If not, drop a comment below and I will help clear up this concept.