Neural Network 3
Q1. Complete the code
For the given implementation of forward and backward
propagation for the sigmoid function, complete the backward pass [????]
to compute the analytical gradients.
Note: grad in backward is the error gradient flowing back from the output.
Choose the correct answer from below:
A. grad_input = self.sig * (1-self.sig) * grad
B. grad_input = self.sig / (1-self.sig) * grad
C. grad_input = self.sig / (1-self.sig) + grad
D. grad_input = self.sig + (1-self.sig) - grad
Ans: A
Correct answer: grad_input = self.sig * (1-self.sig) * grad
Explanation: By the chain rule, the gradient with respect to the input Z is
dZ = dA · σ(x) · (1 − σ(x))
where:
- dZ = the error gradient with respect to the input Z.
- dA = the error gradient with respect to the output A (i.e., grad).
- σ(x) · (1 − σ(x)) = the derivative of the sigmoid activation function.
Here σ(x) denotes the sigmoid function, whose forward-pass output is cached in self.sig.
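The original snippet is not reproduced above, so the following is a minimal sketch of such a layer, assuming (as the answer implies) a class that caches its forward output in self.sig:

```python
import math

class Sigmoid:
    def forward(self, z):
        # Cache the sigmoid output; the backward pass reuses it.
        self.sig = 1.0 / (1.0 + math.exp(-z))
        return self.sig

    def backward(self, grad):
        # Chain rule: dL/dz = dL/da * da/dz, where da/dz = sig * (1 - sig)
        grad_input = self.sig * (1 - self.sig) * grad
        return grad_input

layer = Sigmoid()
a = layer.forward(0.0)        # sigmoid(0) = 0.5
g = layer.backward(1.0)       # 0.5 * (1 - 0.5) * 1.0 = 0.25
print(a, g)
```

Caching the forward output is the standard trick here: the sigmoid derivative can be written entirely in terms of its own output, so no extra computation of σ(z) is needed in the backward pass.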
Q2. Trained Perceptron
A perceptron was trained to distinguish between two classes,
"+1" and "-1". The result is shown in the plot given below.
Which of the following might be the reason for poor performance of the trained
perceptron?
Choose the correct answer from below:
A. The perceptron cannot separate linearly separable data
B. The perceptron works only if the two classes are linearly separable which is not the case here.
C. The smaller learning rate with less number of epochs of perceptron could have restricted it from producing good results.
D. The "-1" class dominates the dataset, thereby pulling the decision boundary closer to itself.
Ans: C
Correct option: The smaller learning rate with fewer training epochs could have restricted the perceptron from producing good results.
Explanation:
- Both classes have enough data, and the difference in their counts is not significant enough to cause misclassification.
- Since the dot product between the weights w and the input x is linear in x, the perceptron is a linear classifier. It is not capable of separating classes that are not linearly separable.
- Observing the plot, the classes appear to be linearly separable with a few exceptions. For classes that are linearly separable, the algorithm is guaranteed to converge to a correct decision boundary.
- The decision boundary is not pulled toward class "-1" by a majority effect; both classes seem to have a fairly equal number of training samples.
- Since the model underfits the data, the number of training epochs is likely too low or the learning rate too small, making the model perform poorly.
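The training loop in question can be sketched as follows. This is a minimal classic perceptron on hypothetical toy data, not the dataset from the plot; with enough epochs on linearly separable data it converges, which is the guarantee the explanation cites:

```python
def train_perceptron(points, labels, lr, epochs):
    # Classic perceptron rule: update w and b only on misclassified points.
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(points, labels):
            if y * (w[0] * x1 + w[1] * x2 + b) <= 0:  # misclassified
                w[0] += lr * y * x1
                w[1] += lr * y * x2
                b += lr * y
    return w, b

def accuracy(points, labels, w, b):
    correct = sum(
        1 for (x1, x2), y in zip(points, labels)
        if y * (w[0] * x1 + w[1] * x2 + b) > 0
    )
    return correct / len(points)

# Toy, clearly separable data (hypothetical, not the plot from the question).
points = [(2, 1), (3, 2), (1, 2), (-2, -1), (-3, -2), (-1, -2)]
labels = [1, 1, 1, -1, -1, -1]

w, b = train_perceptron(points, labels, lr=0.1, epochs=20)
print(accuracy(points, labels, w, b))  # → 1.0
```

Cutting `epochs` (or stopping before convergence on harder data) leaves the boundary close to its zero initialization, which is exactly the underfitting behaviour option C describes.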
Q3. Identify the Function
Mark the correct option for the below-mentioned statements:
(a) It is possible for a perceptron to add
up all the weighted inputs it receives and, if the sum exceeds a specific
value, output a 1. Otherwise, it just outputs a 0.
(b) Both artificial and biological neural
networks learn from past experiences.
Choose the correct answer from below:
A. Both the mentioned statements are true.
B. Both the mentioned statements are false.
C. Only statement (a) is true.
D. Only statement (b) is true.
Ans: A
Correct option: Both the statements are true.
Explanation:
Statement (a) describes the step (threshold) activation function, and such a perceptron is indeed possible.
Both artificial and biological neural networks learn from past experiences.
Artificial networks are trained on data to make predictions: the weights of each neuron are continuously updated during training to reduce the error.
Q4. Find the Value of 'a'
Given below is a neural network with one neuron that takes
two float numbers as inputs.
If the model uses the sigmoid activation function, what will be the value of 'a' for the given x1 and x2 _____ (rounded off to 2 decimal places)?
Choose the correct answer from below:
A. 0.57
B. 0.22
C. 0.94
D. 0.75
Ans: A
Correct option: 0.57
Explanation:
The value of z will be:
- z = w1·x1 + w2·x2 + b
- z = (0.5 × 0.55) + (−0.35 × 0.45) + 0.15 = 0.2675
The value of a will be:
- a = f(z) = σ(0.2675) = 1 / (1 + e^(−0.2675)) = 1 / 1.76529 ≈ 0.5665 ≈ 0.57
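The arithmetic above can be checked directly, using the weight and input values stated in the explanation:

```python
import math

# Values from the explanation: w1 = 0.5, x1 = 0.55, w2 = -0.35, x2 = 0.45, b = 0.15
w1, x1 = 0.5, 0.55
w2, x2 = -0.35, 0.45
b = 0.15

z = w1 * x1 + w2 * x2 + b          # pre-activation: 0.2675
a = 1.0 / (1.0 + math.exp(-z))     # sigmoid activation

print(round(z, 4))  # → 0.2675
print(round(a, 2))  # → 0.57
```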