
Machine Learning Programs

👉 Data Preprocessing in Machine Learning
👉 Data Preprocessing in Machine Learning (Handling Missing Values)
👉 Linear Regression - ML Program - Weight Prediction
👉 Naïve Bayes Classifier - ML Program
👉 LOGISTIC REGRESSION - PROGRAM
👉 KNN Machine Learning Program
👉 Support Vector Machine (SVM) - ML Program
👉 Decision Tree Classifier on Iris Dataset
👉 Classification of Iris flowers using Random Forest
👉 DBSCAN
👉 Implement and demonstrate the FIND-S algorithm for finding the most specific hypothesis based on a given set of training data samples. Read the training data from a .CSV file
👉 For a given set of training data examples stored in a .CSV file, implement and demonstrate the Candidate-Elimination algorithm to output a description of the set of all hypotheses consistent with the training examples.
👉 Write a program to demonstrate the working of the decision tree based ID3 algorithm. Use an appropriate data set for building the decision tree and
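Of the items above, the FIND-S exercise spells out its task in enough detail to sketch here. The code below is a minimal illustration only, not the post's actual program; it assumes a hypothetical CSV layout in which every column except the last holds an attribute value and the last column is a yes/no target, and the file name training_data.csv is made up.

import csv

def find_s(csv_path):
    """Return the most specific hypothesis consistent with the positive
    examples in a CSV file (FIND-S). Assumes the last column is the
    target label ('yes' marks a positive example) and the remaining
    columns are attribute values -- a hypothetical file layout."""
    hypothesis = None
    with open(csv_path, newline="") as f:
        for row in csv.reader(f):
            if not row:
                continue
            *attributes, label = row
            if label.strip().lower() != "yes":
                continue  # FIND-S ignores negative examples
            if hypothesis is None:
                hypothesis = list(attributes)  # initialise with the first positive example
            else:
                # Generalize: keep attribute values that match, replace the rest with '?'
                hypothesis = [h if h == a else "?" for h, a in zip(hypothesis, attributes)]
    return hypothesis

if __name__ == "__main__":
    print(find_s("training_data.csv"))  # hypothetical file name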

Machine Learning MCQs-3 (Logistic Regression, KNN, SVM, Decision Tree)

1. A Support Vector Machine can be used for
   1) Performing linear or nonlinear classification
   2) Performing regression
   3) For outlier detection
   4) All of the above
   Ans: 4

2. The decision boundaries in a Support Vector Machine are fully determined (or "supported") by the instances located on the edge of the street.
   1) True
   2) False
   Ans: 1

3. Support Vector Machines are not sensitive to feature scaling.
   1) True
   2) False
   Ans: 2

4. If we strictly impose that all instances be off the street and on the right side, this is called
   1) Soft margin classification
   2) Hard margin classification
   3) Strict margin classification
   4) Loose margin classification
   Ans: 2

5. The main issues with hard margin classification are
   1) It only works if the data is linearly separable
   2) It is quite sensitive to outliers
   3) It is impossible to find a margin if the data is not linearly separable
   4) All of the above
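The margin and scaling questions above can be made concrete with a short scikit-learn sketch. This is only an illustration, not part of the quiz: the StandardScaler step addresses the feature-scaling sensitivity from question 3, and sweeping the C parameter shows how a small C gives a softer margin while a very large C approximates hard margin classification.

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# C controls margin softness: a small C tolerates margin violations (soft margin),
# while a very large C approximates hard margin classification.
for C in (0.01, 1.0, 1000.0):
    model = make_pipeline(StandardScaler(), SVC(kernel="linear", C=C))
    model.fit(X_train, y_train)
    print(f"C={C}: test accuracy = {model.score(X_test, y_test):.3f}")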

KNN Machine Learning Program

KNN Classifier for IRIS Data Set

Steps:
1. Import the library files
2. Read the dataset (Iris dataset) and analyze the data
3. Preprocessing the data
4. Divide the data into Training and Testing
5. Build the model - KNN Classifier
6. Model Evaluation

Build the model - KNN Classifier

KNN Classifier class:

sklearn.neighbors.KNeighborsClassifier(n_neighbors=5, *, weights='uniform', algorithm='auto', leaf_size=30, p=2, metric='minkowski', metric_params=None, n_jobs=None)

Parameters:

n_neighbors : int, default=5
    Number of neighbors to use by default for kneighbors queries.

weights : {'uniform', 'distance'} or callable, default='uniform'
    Weight function used in prediction. Possible values:
    'uniform' : uniform weights. All points in each neighborhood are weighted equally.
    'distance' : weight points
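Since the excerpt cuts off before the post's own code, here is a minimal end-to-end sketch of the six steps above. The 80/20 split, the StandardScaler preprocessing, and keeping the default n_neighbors=5 are assumptions for illustration, not necessarily the post's exact choices.

# 1. Import the library files
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score, classification_report

# 2. Read the dataset (Iris dataset) and analyze the data
iris = load_iris()
X, y = iris.data, iris.target

# 3. Preprocessing the data (feature scaling)
X = StandardScaler().fit_transform(X)

# 4. Divide the data into Training and Testing (80/20 split assumed)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# 5. Build the model - KNN Classifier with the default parameters shown above
knn = KNeighborsClassifier(n_neighbors=5, weights="uniform", metric="minkowski", p=2)
knn.fit(X_train, y_train)

# 6. Model Evaluation
y_pred = knn.predict(X_test)
print("Accuracy:", accuracy_score(y_test, y_pred))
print(classification_report(y_test, y_pred, target_names=iris.target_names))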