NN : Forward and Back Propagation MCQs & Program
Q1. Sigmoid and softmax functions

Which of the following statements is true for a neural network having more than one output neuron?

Choose the correct answer from below:

A. In a neural network where the output neurons have the sigmoid activation, the sum of all the outputs from the neurons is always 1.
B. In a neural network where the output neurons have the sigmoid activation, the sum of all the outputs from the neurons is 1 if and only if we have just two output neurons.
C. In a neural network where the output neurons have the softmax activation, the sum of all the outputs from the neurons is always 1.
D. The softmax function is a special case of the sigmoid function.

Ans: C

Correct option: In a neural network where the output neurons have the softmax activation, the sum of all the outputs from the neurons is always 1.

Explanation: For the sigmoid activation, when we have more than one output neuron, each neuron's output is computed independently in the range (0, 1), so there is no guarantee that the outputs sum to 1. The softmax activation, by contrast, normalizes the exponentiated logits by their total, so its outputs always sum to 1 and can be interpreted as a probability distribution over the classes.
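A minimal NumPy sketch illustrating the point above: sigmoid is applied element-wise, so its outputs generally do not sum to 1, while softmax normalizes by the sum of exponentials, so its outputs always do. The example logits are arbitrary values chosen for illustration.

```python
import numpy as np

def sigmoid(z):
    # Element-wise logistic function; each output lies in (0, 1) independently
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # Subtract the max logit for numerical stability before exponentiating
    e = np.exp(z - np.max(z))
    return e / e.sum()

logits = np.array([2.0, -1.0, 0.5])  # arbitrary example logits

sig_out = sigmoid(logits)
soft_out = softmax(logits)

print("sigmoid outputs sum to:", sig_out.sum())   # generally not equal to 1
print("softmax outputs sum to:", soft_out.sum())  # always 1 (up to float rounding)
```

Running this shows the softmax outputs summing to 1 regardless of the logits, while the sigmoid outputs sum to whatever the independent activations happen to produce.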