Machine Learning MCQs - 5 (Ensemble Models)
---------------------------------------------------------------------
- Group
- Entity
- Ensemble
- Set
- True
- False
- Obtain the predictions of all individual trees
- Predict the class that gets the most votes
- Both of the above
- The probabilities of output from each classifier
- The majority votes from the classifiers
- The mean of the output from each classifier
- The sum of the output from each classifier
- True
- False
- Sufficiently diverse
- As independent from one another as possible
- Making very different types of errors
- All of the above
- True
- False
- True
- False
- Hard Voting
- Soft Voting
- The majority of votes from the classifiers
- The highest class probability averaged over all the individual classifiers
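The difference between the two voting modes above can be shown with a minimal pure-Python sketch (the probabilities are made-up values, not output from real classifiers):

```python
# Illustrative sketch: hard vs soft voting over three hypothetical
# binary classifiers, each reporting a probability for class 1.
from collections import Counter

probs_class1 = [0.45, 0.45, 0.90]  # made-up per-classifier estimates

# Hard voting: each classifier casts one vote for its most likely class.
hard_votes = [1 if p >= 0.5 else 0 for p in probs_class1]   # [0, 0, 1]
hard_prediction = Counter(hard_votes).most_common(1)[0][0]  # majority -> 0

# Soft voting: average the class probabilities, then pick the argmax.
avg_p1 = sum(probs_class1) / len(probs_class1)              # 0.60
soft_prediction = 1 if avg_p1 >= 0.5 else 0                 # -> 1

print(hard_prediction, soft_prediction)  # 0 1
```

Note how the single highly confident vote (0.90) flips the soft-voting result, which is exactly why soft voting often outperforms hard voting.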
- True
- False
- Majority votes classifications are often wrong
- It gives more weight to highly confident votes
- Finding majority is computationally expensive
- This statement is false
- Bagging
- Pasting
- Pasting
- Bagging
- True
- False
- True
- False
- True
- False
- True
- False
- True
- False
- True
- False
- max_samples and bootstrap
- max_features and bootstrap_features
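The hyperparameters above come from scikit-learn's `BaggingClassifier`; the underlying sampling difference between bagging and pasting can be sketched in plain Python (the training set and subset size are made-up values):

```python
# Illustrative sketch: the sampling step behind bagging (with
# replacement) and pasting (without replacement).
import random

random.seed(0)
training_set = list(range(10))  # made-up instance indices
m = 7                           # subset size drawn for each predictor

# Bagging: sample WITH replacement, so duplicates may appear.
bagging_sample = random.choices(training_set, k=m)

# Pasting: sample WITHOUT replacement, so every instance is distinct.
pasting_sample = random.sample(training_set, k=m)

print(sorted(bagging_sample))
print(sorted(pasting_sample))
```

In `BaggingClassifier`, `bootstrap=True` gives bagging and `bootstrap=False` gives pasting; `max_features`/`bootstrap_features` apply the same idea to the feature axis.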
- Bagging
- Pasting
- No
- Yes, and these are called Extremely Randomised Trees ensemble
- Leaf of the tree
- Middle of the tree
- Root of the tree
- It is slow
- It cannot be parallelized
- It cannot be performed on larger training sets
- It requires a lot of memory and processing power
- More than two leaf nodes
- Max depth of 1, i.e. single decision node with two leaf nodes
- Having more than 2 decision nodes
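A decision stump (max depth of 1) is just a single threshold test; the sketch below uses a made-up threshold and class labels to show the idea:

```python
# Illustrative sketch: a decision stump is a tree of max depth 1 --
# one decision node splitting into two leaf nodes. Stumps are a
# common weak learner for AdaBoost.
def stump_predict(x, threshold=2.5, left_class=0, right_class=1):
    """Single decision node: compare one feature against a threshold."""
    return left_class if x <= threshold else right_class

predictions = [stump_predict(x) for x in [1.0, 2.0, 3.0, 4.0]]
print(predictions)  # [0, 0, 1, 1]
```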
- True
- False
- True
- False
- Boosting
- Bagging
- Stacking
- Pasting
Machine Learning MCQs - 4 (Clustering, Dimensionality Reduction)
---------------------------------------------------------------------
1. Which of the following is finally produced by Hierarchical Clustering?
- final estimate of cluster centroids
- tree showing how close things are to each other
- assignment of each point to clusters
- all of the mentioned
Ans: 2
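The "tree showing how close things are" answer can be illustrated with a minimal single-linkage agglomerative sketch (made-up 1-D points, not a real dendrogram library):

```python
# Illustrative sketch: hierarchical (agglomerative) clustering repeatedly
# merges the two closest clusters; the merge order is what a dendrogram
# records.
clusters = [[1.0], [1.2], [5.0], [5.3]]  # made-up 1-D points
merges = []                               # merge order = tree structure

def dist(a, b):
    """Single-linkage distance: closest pair of points across clusters."""
    return min(abs(x - y) for x in a for y in b)

while len(clusters) > 1:
    # Find the pair of clusters with the smallest linkage distance.
    i, j = min(
        ((i, j) for i in range(len(clusters))
                for j in range(i + 1, len(clusters))),
        key=lambda p: dist(clusters[p[0]], clusters[p[1]]),
    )
    merges.append((clusters[i], clusters[j]))
    clusters[i] = clusters[i] + clusters[j]
    del clusters[j]

print(len(merges))  # 3 merges for 4 starting points
```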
2. Which of the following is required by K-means clustering?
- defined distance metric
- number of clusters
- initial guess as to cluster centroids
- all of the mentioned
Ans: 4
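All three required ingredients listed above appear in a minimal K-means sketch (plain Python, made-up 1-D data and initial centroids):

```python
# Illustrative sketch: K-means needs a distance metric (here abs()),
# the number of clusters k, and an initial guess at the centroids.
data = [1.0, 1.5, 8.0, 9.0]   # made-up 1-D points
centroids = [0.0, 10.0]        # initial guess, k = 2

for _ in range(10):  # iterate assignment + update steps
    clusters = [[], []]
    for x in data:
        # Assignment step: nearest centroid under the distance metric.
        j = min(range(2), key=lambda i: abs(x - centroids[i]))
        clusters[j].append(x)
    # Update step: move each centroid to the mean of its cluster.
    centroids = [sum(c) / len(c) for c in clusters]

print(centroids)  # [1.25, 8.5]
```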
3. Point out the wrong statement.
- k-means clustering is a method of vector quantization
- k-means clustering aims to partition n observations into k clusters
- k-nearest neighbor is same as k-means
- none of the mentioned
Ans: 3
4. Which of the following combinations is incorrect?
- Continuous – euclidean distance
- Continuous – correlation similarity
- Binary – manhattan distance
- None of the mentioned
Ans: 4
5. Hierarchical clustering should primarily be used for exploration.
- True
- False
Ans: 1
6. Which of the following functions is used for k-means clustering?
- k-means
- k-mean
- heatmap
- none of the mentioned
Ans: 1
7. Which of the following clustering methods requires a merging approach?
- Partitional
- Hierarchical
- Naive Bayes
- None of the mentioned
Ans: 2
8. K-means is not deterministic, and it involves a number of iterations.
- True
- False
Ans: 1
9. Which of the following can act as possible termination conditions in K-Means?
Ans: 4