UNIT II
CNN
2. Striding and padding
3. Pooling layers
4. Structure
5. Operations and prediction of CNN with layers
6. CNN - Case study with MNIST
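Topics 2 and 3 above (striding, padding, pooling) can be sketched in plain NumPy. This is an illustrative toy, not a full CNN implementation; the image, kernel, stride, and padding values are made up for the example:

```python
import numpy as np

def conv2d(image, kernel, stride=1, padding=0):
    """2-D cross-correlation with optional zero padding and striding."""
    if padding > 0:
        image = np.pad(image, padding)           # zero padding on all sides
    kh, kw = kernel.shape
    oh = (image.shape[0] - kh) // stride + 1     # output height
    ow = (image.shape[1] - kw) // stride + 1     # output width
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = image[i*stride:i*stride+kh, j*stride:j*stride+kw]
            out[i, j] = np.sum(patch * kernel)   # elementwise multiply-accumulate
    return out

def max_pool(fmap, size=2, stride=2):
    """Max pooling: keep the largest activation in each window."""
    oh = (fmap.shape[0] - size) // stride + 1
    ow = (fmap.shape[1] - size) // stride + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = fmap[i*stride:i*stride+size, j*stride:j*stride+size].max()
    return out

img = np.arange(36, dtype=float).reshape(6, 6)   # toy 6x6 "image"
k = np.ones((3, 3)) / 9.0                        # 3x3 averaging kernel
fmap = conv2d(img, k, stride=1, padding=1)       # "same" padding keeps 6x6
pooled = max_pool(fmap, size=2, stride=2)        # 6x6 -> 3x3
print(fmap.shape, pooled.shape)                  # (6, 6) (3, 3)
```

Note how padding=1 with a 3x3 kernel preserves the spatial size, while the 2x2 pool with stride 2 halves each dimension.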
1. A Support Vector Machine can be used for
A. Performing linear or nonlinear classification
B. Performing regression
C. Outlier detection
D. All of the above
Ans: D
2. The decision boundaries in a Support Vector Machine are fully determined (or "supported") by the instances located on the edge of the street.
A. True
B. False
Ans: A
3. Support Vector Machines are not sensitive to feature scaling.
A. True
B. False
Ans: B
4. If we strictly impose that all instances must be off the street and on the right side, this is called
Ans: B
5. The main issues with hard margin classification are
Ans: D
6. The objectives of Soft Margin Classification are to find a good balance between
Ans: C
7. The balance between keeping the street as large as possible and limiting margin violations is controlled by this hyperparameter:
Ans: D
8. A smaller C value leads to a wider street but more margin violations.
A. True
B. False
Ans: A
9. If your SVM model is overfitting, you can try regularizing it by reducing the value of
Ans: B
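The claim in question 8 (a smaller C gives a wider street at the cost of more margin violations) can be checked numerically with a minimal from-scratch linear SVM trained by subgradient descent. The data, learning rate, and epoch count below are made up for illustration; the street width is 2/||w||:

```python
import numpy as np

# Toy linearly separable 2-D data: two clusters, labels -1 / +1.
X = np.array([[1., 2.], [2., 3.], [2., 1.], [3., 2.],
              [6., 5.], [7., 6.], [6., 7.], [7., 5.]])
y = np.array([-1., -1., -1., -1., 1., 1., 1., 1.])

def train_linear_svm(C, lr=0.001, epochs=5000):
    """Subgradient descent on the soft-margin objective
    0.5 * ||w||^2 + C * sum(max(0, 1 - y * (w.x + b)))."""
    w = np.zeros(2)
    b = 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1                                    # margin violators
        grad_w = w - C * (y[viol, None] * X[viol]).sum(axis=0)
        grad_b = -C * y[viol].sum()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

for C in (0.01, 10.0):
    w, _ = train_linear_svm(C)
    print(f"C={C}: street width = {2 / np.linalg.norm(w):.2f}")
```

With the small C, the regularization term dominates, ||w|| stays small, and the street (2/||w||) comes out wider, even though more instances end up inside it.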
10. Problems with adding polynomial features are
Ans: D
11. The hyperparameter coef0 of SVC controls how much the model is influenced by high-degree polynomials versus low-degree polynomials.
A. True
B. False
Ans: A
12. A similarity function like the Gaussian Radial Basis Function is used to
A. Measure how many features are related to each other
B. Find the most important features
C. Find the relationship between different features
D. Measure how much each instance resembles a particular landmark
Ans: D
13. When adding features with a similarity function, creating a landmark at the location of each and every instance in the training set, a training set with m instances and n features gets transformed to (assuming you drop the original features)
Ans: C
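Questions 12 and 13 can be made concrete: with a Gaussian RBF similarity function and a landmark at every training instance, an m-by-n training set becomes m-by-m. A small NumPy sketch (the data points and gamma value are chosen arbitrarily):

```python
import numpy as np

def rbf_features(X, landmarks, gamma=0.3):
    """Gaussian RBF similarity phi(x, l) = exp(-gamma * ||x - l||^2)
    between every instance and every landmark."""
    sq_dists = ((X[:, None, :] - landmarks[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * sq_dists)

# m = 5 instances, n = 2 original features.
X = np.array([[0., 0.], [1., 0.], [0., 1.], [2., 2.], [3., 1.]])

# One landmark per training instance: dropping the original features,
# the transformed set has m instances and m features.
X_new = rbf_features(X, landmarks=X)
print(X.shape, "->", X_new.shape)   # (5, 2) -> (5, 5)
```

Each entry of X_new is between 0 and 1, and the diagonal is exactly 1, since every instance perfectly resembles its own landmark.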
14. When using SVMs, we can apply an almost miraculous mathematical technique for adding polynomial features and similarity features, called the
Ans: A
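The "miraculous technique" in question 14 is the kernel trick: a kernel computes the dot product of the expanded feature vectors without ever building them. A minimal illustration for the degree-2 polynomial kernel (the example vectors are arbitrary):

```python
import numpy as np

def phi(x):
    """Explicit degree-2 polynomial feature map for a 2-D input:
    phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2)."""
    return np.array([x[0]**2, np.sqrt(2) * x[0] * x[1], x[1]**2])

def poly_kernel(a, b):
    """Degree-2 polynomial kernel, computed in the original 2-D space."""
    return (a @ b) ** 2

a = np.array([2., 3.])
b = np.array([4., 1.])

# Both give the same number: explicit expansion vs. kernel shortcut.
print(phi(a) @ phi(b), poly_kernel(a, b))   # 121.0 121.0
```

The kernel side touches only the original two features, which is why SVMs can work with very high-degree (or even infinite-dimensional, as with the RBF kernel) feature spaces cheaply.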
15. Which is right for the gamma parameter of SVC, which acts as a regularization hyperparameter?
Ans: B
16. LinearSVC is much faster than SVC(kernel="linear").
A. True
B. False
Ans: A
17. In SVM regression, the model tries to
Ans: B
18. The SVR class is the regression equivalent of the SVC class, and the LinearSVR class is the regression equivalent of the LinearSVC class.
A. True
B. False
Ans: A