
Machine Learning Programs

👉 Data Preprocessing in Machine Learning
👉 Data Preprocessing in Machine Learning (Handling Missing Values)
👉 Linear Regression - ML Program - Weight Prediction
👉 Naïve Bayes Classifier - ML Program
👉 Logistic Regression - ML Program
👉 KNN Machine Learning Program
👉 Support Vector Machine (SVM) - ML Program
👉 Decision Tree Classifier on Iris Dataset
👉 Classification of Iris flowers using Random Forest
👉 DBSCAN
👉 Implement and demonstrate the FIND-S algorithm for finding the most specific hypothesis based on a given set of training data samples. Read the training data from a .CSV file
👉 For a given set of training data examples stored in a .CSV file, implement and demonstrate the Candidate-Elimination algorithm to output a description of the set of all hypotheses consistent with the training examples.
👉 Write a program to demonstrate the working of the decision tree based ID3 algorithm. Use an appropriate data set for building the decision tree and

Machine Learning MCQs

👉 1. Machine Learning MCQs - UNIT 1
👉 2. Machine Learning MCQs-2 (Performance Metrics, Linear Regression, Naïve Bayes Classifier)
👉 3. Machine Learning MCQs-3 (Logistic Regression, KNN, SVM, Decision Tree)
👉 4. Machine Learning MCQs-4 (Clustering, Dimensionality Reduction)
👉 5. Machine Learning MCQs-5 (Ensemble Models)

Decision Tree Characteristics

Context: Decision Trees are a fundamental machine learning algorithm used for both classification and regression tasks. Understanding their characteristics, capabilities, and limitations is crucial for effectively applying them to solve real-world problems.

Question: Which of the following statements are true regarding the properties and behavior of Decision Trees?

Statements to Evaluate:
1. A decision tree makes no assumptions about the data.
2. The decision tree model can learn non-linear decision boundaries.
3. Decision trees cannot explain how the target will change if a variable is changed by 1 unit (marginal effect).
4. Hyperparameter tuning is not required in decision trees.
5. In a decision tree, increasing entropy implies increasing purity.
6. In a decision tree, the entropy of a node decreases as we go down the decision tree.

Choose the correct answer from below:
A) 1, 2, and 5
B) 3, 5 and 6
C) 2, 3, 4 and 5
D) 1, 2, 3 and 6

Ans: D) 1, 2, 3 and 6
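Statements 5 and 6 hinge on the relationship between entropy and purity: entropy is highest for an evenly mixed node and zero for a pure one, which is why it falls as splits separate the classes. A minimal sketch in plain Python (the `entropy` helper is ours, not from any library):

```python
import math

def entropy(class_counts):
    """Shannon entropy (base 2) of a node's class distribution."""
    total = sum(class_counts)
    probs = [c / total for c in class_counts if c > 0]
    return sum(-p * math.log2(p) for p in probs)

print(entropy([50, 50]))   # 1.0 -> evenly mixed node: maximal impurity
print(entropy([100, 0]))   # 0.0 -> pure node: entropy is zero
print(entropy([90, 10]))   # between 0 and 1: purer than 50/50, so lower entropy
```

So increasing entropy means *decreasing* purity (statement 5 is false), while good splits drive entropy down toward the leaves (statement 6 is true).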

Decision Tree MCQs

1. Decision Trees can be used for
A. Classification Tasks
B. Regression Tasks
C. Multi-output tasks
D. All of the above
Ans: D

2. The iris dataset has
A. 5 features and 3 classes
B. 4 features and 3 classes
C. 2 features and 3 classes
D. 4 features and 2 classes
Ans: B

3. A node's value attribute tells you how many training instances of each class this node applies to.
A. True
B. False
Ans: A (True)

4. A node's gini attribute measures
A. The number of training instances in the node
B. The ratio of training instances in the node
C. Its impurity
D. None of these
Ans: C

5. If all the training instances of a node belong to the same class, then the value of the node's Gini attribute will be
A. 1
B. 0
C. Any value between 0 and 1
D. A negative value
Ans: B

6. A Gini coefficient of 1 expresses maximal inequality amo
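Questions 4 and 5 can be verified directly from the definition of Gini impurity: one minus the sum of squared class proportions in the node. A small illustrative sketch (the `gini` helper is our own, not a library function):

```python
def gini(class_counts):
    """Gini impurity: 1 - sum of squared class proportions."""
    total = sum(class_counts)
    return 1.0 - sum((c / total) ** 2 for c in class_counts)

# All instances in one class -> impurity 0, matching the answer to MCQ 5.
print(gini([49, 0, 0]))    # 0.0 (pure node)

# An evenly mixed 3-class node is maximally impure for 3 classes.
print(gini([10, 10, 10]))  # ~0.667
```

This mirrors the `gini` values scikit-learn reports per node when a fitted `DecisionTreeClassifier` is visualized.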