
Neural network 4

Q1. Tanh and Leaky ReLU

Which of the following statements with respect to Leaky ReLU and Tanh are true?

a. When the derivative becomes zero for negative values in ReLU, no learning happens; Leaky ReLU rectifies this.
b. Tanh is a zero-centered activation function.
c. Tanh produces normalized inputs for the next layer, which makes training easier.
d. Tanh also has the vanishing gradient problem.

Choose the correct answer from below:

A. All the mentioned statements are true.
B. All the mentioned statements are true except c.
C. All the mentioned statements are true except b.
D. All the mentioned statements are true except d.

Ans: A

Correct option: All the mentioned statements are true.

Explanation:

1) The problem of no learning for negative inputs in ReLU is called the dying ReLU problem, which Leaky ReLU takes care of by allowing a small nonzero gradient for negative values.
2) Yes, tanh is a zero-centered activation function: its output lies in (-1, 1) and is symmetric around zero.
3) As tanh is symmetric and its mean is around zero, it produces normalized inputs for the next layer, which makes training easier.
4) Tanh saturates for large positive or negative inputs, so its derivative approaches zero there and gradients vanish as they are propagated backward.
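To make the four statements concrete, here is a minimal NumPy sketch (the negative-side slope alpha=0.01 is an assumed, typical default, not something specified in the question). It shows that tanh outputs are zero-centered and its gradient vanishes when it saturates, while Leaky ReLU keeps a small nonzero gradient for negative inputs instead of the flat zero gradient of plain ReLU.

```python
import numpy as np

def tanh_grad(x):
    # d/dx tanh(x) = 1 - tanh(x)^2; approaches 0 for large |x| (vanishing gradient)
    return 1.0 - np.tanh(x) ** 2

def leaky_relu(x, alpha=0.01):
    # alpha (assumed 0.01 here) gives a small slope for x < 0, avoiding dying ReLU
    return np.where(x > 0, x, alpha * x)

def leaky_relu_grad(x, alpha=0.01):
    # gradient is alpha (not 0) for negative inputs, so learning still happens
    return np.where(x > 0, 1.0, alpha)

x = np.array([-10.0, -1.0, 0.5, 10.0])
print(np.tanh(x))          # outputs in (-1, 1), centered around zero (statements b, c)
print(tanh_grad(x))        # ~0 at x = ±10: tanh saturates and gradients vanish (statement d)
print(leaky_relu_grad(x))  # 0.01 for negatives instead of 0, unlike plain ReLU (statement a)
```

Running this prints a tanh gradient of roughly 8e-9 at x = ±10, which is exactly the saturation behind the vanishing gradient problem, while the Leaky ReLU gradient never drops below alpha.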