Posts


NN : Forward and Back Propagation MCQs & Program

Q1. Sigmoid and softmax functions

Which of the following statements is true for a neural network having more than one output neuron? Choose the correct answer from below:

A. In a neural network where the output neurons have the sigmoid activation, the sum of all the outputs from the neurons is always 1.
B. In a neural network where the output neurons have the sigmoid activation, the sum of all the outputs from the neurons is 1 if and only if we have just two output neurons.
C. In a neural network where the output neurons have the softmax activation, the sum of all the outputs from the neurons is always 1.
D. The softmax function is a special case of the sigmoid function.

Ans: C

Correct option: In a neural network where the output neurons have the softmax activation, the sum of all the outputs from the neurons is always 1.

Explanation: With the sigmoid activation, each output neuron is squashed to (0, 1) independently of the others, so when there is more than one output neuron the outputs are not constrained to sum to 1. Softmax, by contrast, divides each neuron's exponentiated output by the total over all output neurons, so the outputs always sum to 1.
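To make the distinction concrete, here is a minimal sketch comparing the two activations on the same logits. The post's own program is not shown above, so this is illustrative plain-NumPy code, not the post's implementation:

import numpy as np

def sigmoid(z):
    # Each output squashed to (0, 1) independently
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # Shift by the max for numerical stability, then normalize
    e = np.exp(z - np.max(z))
    return e / e.sum()

logits = np.array([2.0, -1.0, 0.5])

print(sigmoid(logits).sum())  # ~1.77 -- not constrained to 1
print(softmax(logits).sum())  # 1.0 (up to floating-point error)

Running this prints roughly 1.77 for the sigmoid sum but exactly 1.0 for softmax, which is why option C holds and options A and B do not.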

Deep Learning: UNIT 2: CNN: Short Answer Questions

UNIT II CNN Short Answer Questions

1. List the applications of CNN.
2. Define Convolution.
3. Define Stride.
4. Explain padding.
5. What is the purpose of padding?
6. Define Kernel.
7. Define Pooling.
8. List the different pooling techniques.
9. Define Flattening.
10. Define Fully Connected Layer.
11. List the differences between CNN and Fully Connected Layer.
12. What is a filter (or kernel) in the context of a CNN?
13. Discuss the role of fully connected layers in CNNs.
14. Explain the concept of pooling in CNNs and how it sometimes impacts output size and can cause underfitting.
15. What is the purpose of the pooling layer in a CNN?
16. What are two special cases of padding? Explain them with a neat diagram.
17. Discuss the concept of convolution in CNNs.
18.
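As a quick aid for questions 2 through 5 and 16, here is a minimal sketch of a 2-D convolution with stride and zero padding. This is my own illustrative NumPy code under assumed square inputs and kernels, not part of the course material; it demonstrates the standard output-size relation (W - K + 2P) // S + 1, including the "same"-padding special case:

import numpy as np

def conv2d(image, kernel, stride=1, padding=0):
    # Zero padding on all sides (question 4/5: padding preserves border info)
    image = np.pad(image, padding)
    k = kernel.shape[0]
    # Output size: (W - K + 2P) // S + 1, with padding already applied above
    out = (image.shape[0] - k) // stride + 1
    result = np.zeros((out, out))
    for i in range(out):
        for j in range(out):
            # Slide the kernel by `stride` (question 3) and sum elementwise products
            patch = image[i*stride:i*stride+k, j*stride:j*stride+k]
            result[i, j] = (patch * kernel).sum()
    return result

img = np.arange(25, dtype=float).reshape(5, 5)
kern = np.ones((3, 3)) / 9.0  # simple averaging kernel (question 6)
print(conv2d(img, kern, stride=1, padding=1).shape)  # (5, 5): "same" padding
print(conv2d(img, kern, stride=2, padding=0).shape)  # (2, 2): "valid", stride 2

The two printed shapes show the two common special cases of padding asked about in question 16: "same" padding (output size equals input size) and "valid" padding (no padding, so the output shrinks).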