Posts
Showing posts with the label accuracy
Convolutional Neural Network 2
Q1. Sparse Connection
What does sparsity of connections mean as a benefit of using convolutional layers? Choose the correct answer from below:
A. Each filter is connected to every channel in the previous layer
B. Each layer in a convolutional network is connected only to two other layers
C. Each activation in the next layer depends on only a small number of activations from the previous layer
D. Regularization causes gradient descent to set many of the parameters to zero
Ans: C
Correct answer: Each activation in the next layer depends on only a small number of activations from the previous layer.
Reason: In neural-network terms, "dense" connections connect every output to all inputs. By contrast, a CNN is "sparse" because each activation is connected only to a local patch of pixels instead of all pixels. High correlation can be found between the sparseness of the outputs of different layers, which makes a CNN better than t…
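As a minimal NumPy sketch of this locality (not from the original post; the 6x6 image and 3x3 filter sizes are illustrative assumptions), each output activation below reads only 9 input pixels, whereas a dense layer would connect every output to all 36 inputs:

import numpy as np

image = np.arange(36, dtype=float).reshape(6, 6)   # 6x6 input image
kernel = np.ones((3, 3)) / 9.0                     # 3x3 averaging filter

out_h, out_w = image.shape[0] - 2, image.shape[1] - 2
output = np.zeros((out_h, out_w))
for i in range(out_h):
    for j in range(out_w):
        patch = image[i:i + 3, j:j + 3]            # only a local 3x3 patch
        output[i, j] = np.sum(patch * kernel)      # sparse: 9 inputs, not 36

print(output.shape)                                # (4, 4)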
Confusion Matrix
Confusion Matrix
A much better way to evaluate the performance of a classifier is to look at the confusion matrix. The general idea is to count the number of times instances of class A are classified as class B. For example, to know the number of times the classifier confused images of 5s with 3s, you would look in the fifth row and third column of the confusion matrix.
To compute the confusion matrix, you first need a set of predictions so that they can be compared to the actual targets. You could make predictions on the test set, but remember that you want to use the test set only at the very end of your project, once you have a classifier that you are ready to launch. Instead, you can use the cross_val_predict() function (a complete sketch follows below):
from sklearn.model_selection import cross_val_predict
y_train_pred = cross_val_predict(sgd_clf, X_t…
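Since the excerpt's snippet is cut off, here is a hedged, self-contained sketch of the same idea; the names sgd_clf, X_train, and y_train follow the excerpt, but the synthetic dataset is a stand-in assumption rather than the post's actual data:

from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix

# Stand-in data; in the excerpt these would be the training features and
# labels, with sgd_clf an already-configured classifier.
X_train, y_train = make_classification(n_samples=1000, n_features=20,
                                        random_state=42)
sgd_clf = SGDClassifier(random_state=42)

# cross_val_predict returns out-of-fold predictions for every training
# instance, so each prediction comes from a model that never saw that row.
y_train_pred = cross_val_predict(sgd_clf, X_train, y_train, cv=3)

# Rows = actual classes, columns = predicted classes (in the digits example,
# 5s confused with 3s would appear in the fifth row, third column).
print(confusion_matrix(y_train, y_train_pred))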
Performance Measures
Performance Measures
Evaluating a classifier is often significantly trickier than evaluating a regressor. There are many performance measures available:
i. Confusion Matrix
ii. True Positive Rate
iii. True Negative Rate
iv. False Positive Rate
v. False Negative Rate
vi. Precision
vii. Recall
viii. Accuracy
ix. F1-Score
x. Specificity
xi. Receiver Operating Characteristic (ROC)
xii. Area Under Curve (AUC)
YouTube Link: https://youtu.be/jL39fMC_I28
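As a hedged illustration (not from the original post), the sketch below computes several of these measures with scikit-learn on a synthetic binary task; the dataset and the choice of SGDClassifier are assumptions made only to keep the example runnable:

from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, roc_auc_score, confusion_matrix)

# Hypothetical data and model; any fitted binary classifier works the same way.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

clf = SGDClassifier(random_state=42).fit(X_train, y_train)
y_pred = clf.predict(X_test)
scores = clf.decision_function(X_test)             # ranking scores for ROC AUC

tn, fp, fn, tp = confusion_matrix(y_test, y_pred).ravel()
print("Accuracy         :", accuracy_score(y_test, y_pred))
print("Precision        :", precision_score(y_test, y_pred))  # TP / (TP + FP)
print("Recall (TPR)     :", recall_score(y_test, y_pred))     # TP / (TP + FN)
print("Specificity (TNR):", tn / (tn + fp))
print("F1-Score         :", f1_score(y_test, y_pred))
print("ROC AUC          :", roc_auc_score(y_test, scores))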