Machine Learning 1: UNIT 3 : Support Vector Machines MCQs

UNIT 3

Support Vector Machines

MCQs 

-----------------------------------------------------------------------------------------------------------------------------

1. A Support Vector Machine can be used for

A.    Performing linear or nonlinear classification

B.    Performing regression

C.    For outlier detection

D.    All of the above

Ans: D

2. The decision boundary of a Support Vector Machine is fully determined (or “supported”) by the instances located on the edge of the street.

  A. True
  B. False

Ans: A

3. Support Vector Machines are not sensitive to feature scaling

  A. True
  B. False

Ans: B

4. If we strictly impose that all instances be off the street and on the right side, this is called

  A. Soft margin classification
  B. Hard margin classification
  C. Strict margin classification
  D. Loose margin classification

Ans: B

5. The main issues with hard margin classification are

  A. It only works if the data is linearly separable
  B. It is quite sensitive to outliers
  C. It is impossible to find a margin if the data is not linearly separable
  D. All of the above

Ans: D

6. The objectives of Soft Margin Classification are to find a good balance between

  A. Keeping the street as large as possible
  B. Limiting the margin violations
  C. Both of the above
  D. None of the above

Ans: C

7. The balance between keeping the street as large as possible and limiting margin violations is controlled by this hyperparameter

  A. Tol
  B. Loss
  C. Penalty
  D. C

Ans: D

8. A smaller C value leads to a wider street but more margin violations.

  A. True
  B. False

Ans: A
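A minimal sketch of questions 7 and 8 (assuming scikit-learn; the toy dataset and the C values are illustrative, not from the source): a smaller C regularizes more, widening the street at the cost of more margin violations.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler  # SVMs are sensitive to feature scaling
from sklearn.svm import LinearSVC

rng = np.random.RandomState(42)
X = rng.randn(200, 2)
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # linearly separable toy labels

for C in (0.01, 1.0, 100.0):
    clf = make_pipeline(StandardScaler(), LinearSVC(C=C))
    clf.fit(X, y)
    # small C = wider street, more margin violations; large C = narrower street
    print(C, clf.score(X, y))
```

Lowering C is exactly the regularization move that question 9 asks about.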

9. If your SVM model is overfitting, you can try regularizing it by reducing the value of

  A. Tol
  B. C hyperparameter
  C. intercept_scaling
  D. None of the above

Ans: B

10. Problems with adding polynomial features are

  A. At a low polynomial degree, it cannot deal with very complex datasets
  B. With a high polynomial degree, it creates a huge number of features
  C. Adding a high polynomial degree makes the model too slow
  D. All of the above

Ans: D

11. The hyperparameter coef0 of SVC controls how much the model is influenced by high-degree polynomials versus low-degree polynomials

A.     True

B.     False

Ans: A

12. A similarity function like the Gaussian Radial Basis Function (RBF) is used to

A.    Measure how many features are related to each other

B.    Find the most important features

C.    Find the relationship between different features

D.    Measure how much each instance resembles a particular landmark

Ans: D
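A sketch of the Gaussian RBF similarity function from question 12, phi(x, l) = exp(-gamma * ||x - l||^2) (the landmark and the gamma value here are illustrative), which also shows the m-instances-to-m-features transformation of question 13:

```python
import numpy as np

def gaussian_rbf(x, landmark, gamma):
    """Similarity of instance x to a landmark: exp(-gamma * ||x - landmark||^2)."""
    return np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(landmark)) ** 2))

landmark = np.array([1.0, 0.0])
print(gaussian_rbf([1.0, 0.0], landmark, gamma=0.3))  # 1.0: identical to the landmark
print(gaussian_rbf([3.0, 0.0], landmark, gamma=0.3))  # decays toward 0 with distance

# Using every training instance as a landmark turns m instances with n features
# into m instances with m features (question 13):
X = np.array([[-2.0], [-1.0], [1.0], [2.0]])  # m=4 instances, n=1 feature
features = np.array([[gaussian_rbf(x, l, 0.3) for l in X] for x in X])
print(features.shape)  # -> (4, 4)
```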

13. When adding similarity features by creating a landmark at the location of each and every instance in the training set, a training set with m instances and n features is transformed into (assuming you drop the original features)

  A. A training set with n instances and n features
  B. A training set with m/2 instances and n/2 features
  C. A training set with m instances and m features
  D. A training set with m instances and n features

Ans: C

14. When using SVMs we can apply an almost miraculous mathematical technique for adding polynomial features and similarity features called the

  A. Kernel trick
  B. Shell trick
  C. Mapping and Reducing
  D. None of the above

Ans: A
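A sketch of the kernel trick from question 14 (assuming scikit-learn; the moons dataset and the hyperparameter values are illustrative): SVC with a polynomial kernel behaves as if high-degree polynomial features had been added, without ever materializing them.

```python
from sklearn.datasets import make_moons
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.15, random_state=42)

# degree-3 polynomial kernel; coef0 (question 11) balances how much the model
# is influenced by high-degree versus low-degree terms
poly_kernel_svm = make_pipeline(
    StandardScaler(),
    SVC(kernel="poly", degree=3, coef0=1, C=5),
)
poly_kernel_svm.fit(X, y)
print(poly_kernel_svm.score(X, y))
```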

15. Which statement about the gamma parameter of SVC, which acts as a regularization hyperparameter, is correct?

  A. If the model is overfitting, increase it; if it is underfitting, reduce it
  B. If the model is overfitting, reduce it; if it is underfitting, increase it
  C. If the model is overfitting, keep it the same
  D. If the model is underfitting, keep it the same

Ans: B
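A quick illustration of question 15 (assuming scikit-learn; the two gamma values are chosen for contrast): a large gamma narrows each instance's bell-shaped influence, so the boundary bends around individual instances and the training score climbs, which is the overfitting symptom the answer refers to.

```python
from sklearn.datasets import make_moons
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.2, random_state=42)

low_gamma = SVC(kernel="rbf", gamma=0.1, C=1).fit(X, y)   # smoother boundary
high_gamma = SVC(kernel="rbf", gamma=100, C=1).fit(X, y)  # wiggly, fits the noise

# the high-gamma model fits the training set at least as tightly
print(low_gamma.score(X, y), high_gamma.score(X, y))
```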

16. LinearSVC is much faster than SVC(kernel="linear")

  A. True
  B. False

Ans: A

17. In SVM regression the model tries to

  A. Fit the largest possible street between two classes while limiting margin violations
  B. Fit as many instances as possible on the street while limiting margin violations
  C. Both
  D. None of the above

Ans: B

18. The SVR class is the regression equivalent of the SVC class, and the LinearSVR class is the regression equivalent of the LinearSVC class

  A. True
  B. False

Ans: A
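A sketch of questions 17-18 (assuming scikit-learn; the synthetic data and the epsilon value are illustrative): LinearSVR tries to fit as many instances as possible inside the street.

```python
import numpy as np
from sklearn.svm import LinearSVR

rng = np.random.RandomState(42)
X = 2 * rng.rand(100, 1)                    # one feature in [0, 2]
y = 4 + 3 * X[:, 0] + 0.1 * rng.randn(100)  # noisy line y = 4 + 3x

# epsilon controls the width of the street; instances inside it add no loss
svm_reg = LinearSVR(epsilon=0.1, random_state=42)
svm_reg.fit(X, y)
print(svm_reg.predict([[1.0]]))             # should be close to 4 + 3*1 = 7
```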

 


Machine Learning 1: UNIT 3 (B) NOTES: Support Vector Machines

                                                                           Unit III

Support Vector Machines

8.     SVM Regression

9.     Under the Hood

10.  Decision function and Predictions

11.  Training Objective

12.  Quadratic Programming

13.  The Dual problem

14.  Kernelized SVM

15.  Online SVMs


Machine Learning 1: UNIT 3 (A) PPTs: Support Vector Machines

                                                                          Unit III - A

Support Vector Machines

1.     Linear SVM Classification

2.     Soft Margin Classification

3.     Nonlinear SVM Classification

4.     Polynomial Kernel

5.     Adding Similarity Features

6.     Gaussian RBF Kernel

7.     Computational Complexity



Machine Learning 1: UNIT 2: Classification Questions

                                                                                UNIT-II

                                                                        CLASSIFICATION

Long Answer Questions

 

1.     Apply a Binary Classifier for a given dataset and evaluate its performance using Accuracy, Precision, Recall, F1 Score, and the Precision/Recall Trade-off.

 

                  Predicted Not 5   Predicted 5
Actual Not 5           53057            1522
Actual 5                1325            4096
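From the confusion matrix above (TN = 53057, FP = 1522, FN = 1325, TP = 4096), the metrics asked for in question 1 can be computed directly; a plain-Python worked example:

```python
# Cell counts from the confusion matrix (actual rows, predicted columns)
tn, fp, fn, tp = 53057, 1522, 1325, 4096

accuracy = (tp + tn) / (tp + tn + fp + fn)
precision = tp / (tp + fp)                  # 4096 / 5618
recall = tp / (tp + fn)                     # 4096 / 5421
f1 = 2 * precision * recall / (precision + recall)

print(round(accuracy, 3), round(precision, 3), round(recall, 3), round(f1, 3))
# -> 0.953 0.729 0.756 0.742
```

Raising the decision threshold trades recall for precision, which is the Precision/Recall Trade-off the question refers to.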

 

2.     Explain kNN Classifier Algorithm with an example.

3.     How can you measure accuracy using cross-validation? Explain.

4.     Predict the class label for new instance x1=6, x2=8 for the following dataset using kNN Classifier. Explain kNN Algorithm.

X1    X2    Class Label
 4     3    F
 6     7    P
 7     8    P
 5     5    F
 8     8    P
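A sketch of question 4 (assuming scikit-learn, k = 3, and Euclidean distance): the three nearest neighbours of (6, 8) are (6, 7), (7, 8), and (8, 8), all labelled P, so the majority vote predicts P.

```python
from sklearn.neighbors import KNeighborsClassifier

# the dataset from question 4
X = [[4, 3], [6, 7], [7, 8], [5, 5], [8, 8]]
y = ["F", "P", "P", "F", "P"]

knn = KNeighborsClassifier(n_neighbors=3)  # k = 3, Euclidean distance by default
knn.fit(X, y)
print(knn.predict([[6, 8]])[0])  # -> P
```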

5.     How can you train a Binary Classifier? Explain with example.

6.     List the different performance metrics? Explain them with example.

7.     What is the confusion matrix? How can you get the confusion matrix?

8.     Define Precision and Recall. How can you get these values?

9.     Explain Precision/Recall Trade-off with example.

10.  What is the ROC Curve? What is the importance of the ROC Curve?

11.  Explain multiclass classification with example.

12.  How can you perform error analysis? Explain.

13.  Explain Multilabel Classification with example.

 

14.  Explain Multioutput Classification with example.

 
