NN : Forward and Back Propagation MCQs & Program

Q1. Sigmoid and softmax functions

Which of the following statements is true for a neural network having more than one output neuron?

Choose the correct answer from below:

A. In a neural network where the output neurons have the sigmoid activation, the sum of all the outputs from the neurons is always 1.
B. In a neural network where the output neurons have the sigmoid activation, the sum of all the outputs from the neurons is 1 if and only if we have just two output neurons.
C. In a neural network where the output neurons have the softmax activation, the sum of all the outputs from the neurons is always 1.
D. The softmax function is a special case of the sigmoid function.

Ans: C

Correct option: In a neural network where the output neurons have the softmax activation, the sum of all the outputs from the neurons is always 1.

Explanation: For the sigmoid activation, when we have more than one output neuron, each output is computed independently and lies in (0, 1), so there is no guarantee that the outputs sum to 1. The softmax activation, by contrast, divides each exponentiated output by the sum of all the exponentiated outputs, so the outputs always sum to 1.
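A quick numerical check of this (a minimal sketch; the pre-activation values are made up for illustration):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def softmax(z):
        # Subtract the max before exponentiating for numerical stability.
        e = np.exp(z - np.max(z))
        return e / e.sum()

    z = np.array([2.0, -1.0, 0.5])  # hypothetical pre-activations of 3 output neurons

    print(sigmoid(z).sum())  # ~1.77: independent sigmoid outputs need not sum to 1
    print(softmax(z).sum())  # 1.0: softmax outputs always sum to 1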

Neural Network 3

Q1. Complete the code

For the above code implementation of forward and backward propagation for the sigmoid function, complete the backward pass [ ???? ] to compute the analytical gradients.

Note: grad in backward is actually the output error gradient.

Choose the correct answer from below:

A. grad_input = self.sig * (1-self.sig) * grad
B. grad_input = self.sig / (1-self.sig) * grad
C. grad_input = self.sig / (1-self.sig) + grad
D. grad_input = self.sig + (1-self.sig) - grad

Ans: A

Correct answer: grad_input = self.sig * (1-self.sig) * grad

Explanation: By the chain rule, the error gradient passed back to the input is

dZ = σ(x) · (1 − σ(x)) · dA

where dZ is the error gradient with respect to the input Z, dA is the error gradient with respect to the output A (grad in the code), and σ(x) · (1 − σ(x)) is the derivative of the sigmoid activation function σ(x). A minimal reconstruction of the layer is sketched after Q2 below.

Q2. Trained Perceptron

A perceptron was trained to distinguish between two classes, "+1" and "-1". The result is
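The code for Q1 above was shown as an image and is not reproduced here, but a minimal sketch of such a sigmoid layer, with the backward blank filled in by option A, might look like this (the class and method names are assumptions):

    import numpy as np

    class Sigmoid:
        def forward(self, x):
            # Cache the activation; the backward pass reuses it.
            self.sig = 1.0 / (1.0 + np.exp(-x))
            return self.sig

        def backward(self, grad):
            # grad is the error gradient w.r.t. this layer's output (dA).
            # Chain rule: dZ = sig * (1 - sig) * dA
            grad_input = self.sig * (1 - self.sig) * grad
            return grad_input

    layer = Sigmoid()
    out = layer.forward(np.array([0.5, -1.2]))
    grad_in = layer.backward(np.ones_like(out))  # gradient flowing back to the input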