NN : Forward and Back Propagation MCQs & Program

NN : Forward and Back Propagation

Q1. Sigmoid and softmax functions
Which of the following statements is true for a neural network having more than one output neuron?
Choose the correct answer from below:
A. In a neural network where the output neurons have the sigmoid activation, the sum of all the outputs from the neurons is always 1.
B. In a neural network where the output neurons have the sigmoid activation, the sum of all the outputs from the neurons is 1 if and only if we have just two output neurons.
C. In a neural network where the output neurons have the softmax activation, the sum of all the outputs from the neurons is always 1.
D. The softmax function is a special case of the sigmoid function.
Ans: C
Correct option: In a neural network where the output neurons have the softmax activation, the sum of all the outputs from the neurons is always 1.
Explanation : ...
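The distinction in Q1 can be checked directly: softmax normalizes its outputs so they sum to 1, while sigmoid is applied element-wise and its outputs need not sum to anything in particular. A minimal sketch (the example logits are arbitrary, chosen only for illustration):

```python
import numpy as np

def sigmoid(z):
    # Element-wise sigmoid: each output neuron is independent,
    # so the outputs of several sigmoid neurons need not sum to 1.
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # Subtract the max for numerical stability; the exponentials
    # are normalized, so the outputs always sum to 1.
    e = np.exp(z - np.max(z))
    return e / e.sum()

logits = np.array([2.0, 1.0, -1.0])
print(sigmoid(logits).sum())  # generally not 1 (here about 1.88)
print(softmax(logits).sum())  # 1.0, up to floating-point rounding
```

Note that softmax being a "special case" of sigmoid is backwards: sigmoid is the two-class special case of softmax, not the other way around.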

Neural Network MCQs & Programs

Neural Network MCQs
👉 Introduction to Neural Network MCQs
👉 NN : Forward and Back Propagation MCQs & Program

NN: Introduction to Neural Network MCQs

NN : Introduction to Neural Network

Q1. Weights impact
For the neural network shown above, which of these statements is true?
Choose the correct answer from below:
A. The -5 weight is bad for the neural network.
B. The neuron with weight 10 will have the most impact on the output.
C. The neuron with weight -5 will have the most impact on the output.
D. The neuron with weight 2 will have the most impact on the output.
Ans: B
Correct option: The neuron with weight 10 will have the most impact on the output.
Explanation : A negative weight is not "bad" for the network; the sign of a weight only indicates whether the neuron pushes the output up or down. It is the weight with the largest magnitude that has the most significant effect on the output value, and here |10| > |-5| > |2|.

Q2. Calculate Forward Pass
The ne...
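The magnitude argument in Q1 can be illustrated numerically: perturb each input of a single neuron by the same small amount and compare how much the output moves. This is a sketch using the weights from the question (the inputs and the absence of an activation function are assumptions made purely for illustration):

```python
import numpy as np

# Weights from the question: 10, -5, and 2.
weights = np.array([10.0, -5.0, 2.0])

def output(x):
    # Plain weighted sum; sufficient for comparing sensitivities.
    return float(weights @ x)

x = np.array([1.0, 1.0, 1.0])  # arbitrary baseline inputs
base = output(x)

# Nudge each input by 0.1 and record the change in the output.
changes = []
for i, w in enumerate(weights):
    x2 = x.copy()
    x2[i] += 0.1
    changes.append(output(x2) - base)
    print(f"weight {w:+.0f}: output change = {changes[-1]:+.2f}")
```

The input connected to weight 10 shifts the output by 1.0, versus -0.5 and 0.2 for the others: largest magnitude, largest impact, regardless of sign.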