
Machine Learning2 - UNIT-2(B): Artificial Neural Networks Notes

UNIT – II (B): Artificial Neural Networks
Topics: Introduction, Neural Network Representation, Appropriate Problems for Neural Network Learning, Perceptrons, Multilayer Networks, The Back-Propagation Algorithm

Neural Network 3

[Image: code implementing the forward and backward passes of the sigmoid function, referenced in Q1 below]
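The code referred to in Q1 is contained in the image above and is not reproduced in these notes. As a point of reference only, a minimal sketch of such a sigmoid layer is given below; it assumes a NumPy-based, class-style implementation and borrows the names self.sig and grad that appear in the question, so the details may differ from the code shown in the image.

import numpy as np

class Sigmoid:
    # Minimal illustrative sketch of a sigmoid layer (assumed structure, not the original code).
    def forward(self, z):
        # Cache the activation; the backward pass reuses it.
        self.sig = 1.0 / (1.0 + np.exp(-z))
        return self.sig

    def backward(self, grad):
        # grad is the error gradient with respect to this layer's output.
        # Chain rule: dZ = dA * sigma(z) * (1 - sigma(z)).
        grad_input = self.sig * (1.0 - self.sig) * grad
        return grad_input

# Example usage (hypothetical values):
layer = Sigmoid()
a = layer.forward(np.array([-1.0, 0.0, 2.0]))
dz = layer.backward(np.ones_like(a))  # assume an upstream gradient of ones

Caching self.sig in the forward pass is what lets the backward pass evaluate the sigmoid derivative without recomputing the exponential.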
Q1. Complete the code

For the above code implementation of forward and backward propagation of the sigmoid function, complete the backward pass [ ???? ] so that it computes the analytical gradients. Note: grad in backward is the error gradient with respect to the layer's output.

Choose the correct answer from below:
A. grad_input = self.sig * (1-self.sig) * grad
B. grad_input = self.sig / (1-self.sig) * grad
C. grad_input = self.sig / (1-self.sig) + grad
D. grad_input = self.sig + (1-self.sig) - grad

Ans: A

Correct Answer: grad_input = self.sig * (1-self.sig) * grad

Explanation: By the chain rule, the gradient passed back through the sigmoid is

dZ = dA · σ(x) · (1 − σ(x))

where:
dZ = the error gradient with respect to the input Z,
dA = the error gradient with respect to the output A (grad in the code),
σ(x) · (1 − σ(x)) = the derivative of the sigmoid activation function, with σ(x) denoting the sigmoid function.

Q2. Trained Perceptron

A perceptron was trained to distinguish between two classes, "+1" and "-1". The result is