Convolutional Neural Network 2

Q1. Sparse Connection

What does sparsity of connections mean as a benefit of using convolutional layers? Choose the correct answer from below:

A. Each filter is connected to every channel in the previous layer
B. Each layer in a convolutional network is connected only to two other layers
C. Each activation in the next layer depends on only a small number of activations from the previous layer
D. Regularization causes gradient descent to set many of the parameters to zero

Ans: C

Correct answer: Each activation in the next layer depends on only a small number of activations from the previous layer.

Reason: In a fully connected ("dense") layer, every output is connected to every input. A convolutional layer, by contrast, is "sparse": each output activation is connected only to a small local patch of the input rather than to all of its pixels. This local connectivity greatly reduces the number of parameters, which is one reason CNNs handle images better than fully connected networks.
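To make the parameter savings concrete, here is a minimal sketch in Python using PyTorch (PyTorch, the 32x32 RGB input size, and the layer sizes are illustrative assumptions, not from the original post). It compares a small convolutional layer with a fully connected layer that produces the same number of output activations:

```python
import torch
import torch.nn as nn

# A 3x3 convolution: each output activation depends only on a 3x3 patch
# of the input (across the input channels), no matter how large the image is.
conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1)

# A fully connected (dense) layer mapping the same 32x32 RGB image to the
# same number of activations connects every output to every input pixel.
dense = nn.Linear(3 * 32 * 32, 16 * 32 * 32)

x = torch.randn(1, 3, 32, 32)             # one 32x32 RGB image
print(conv(x).shape)                       # torch.Size([1, 16, 32, 32])

conv_params = sum(p.numel() for p in conv.parameters())
dense_params = sum(p.numel() for p in dense.parameters())
print(conv_params)     # 448   (16 filters * 3 channels * 3 * 3 weights + 16 biases)
print(dense_params)    # about 50 million (3072 * 16384 weights + 16384 biases)
```

The sparse, local connections keep the convolutional layer at a few hundred parameters, while the dense layer needs tens of millions for the same output size.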

Main Challenges of Machine Learning

The main task is to select a learning algorithm and train it on some data, so the two things that can go wrong are a "bad algorithm" and "bad data". The main challenges are:

- Insufficient quantity of training data
- Non-representative training data
- Poor-quality data
- Irrelevant features
- Overfitting the training data
- Underfitting the training data

The contrast between overfitting and underfitting is illustrated in the toy sketch at the end of this post.

YouTube Link: https://www.youtube.com/watch?v=7qLek-ZV7J4
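The sketch below is a hypothetical NumPy example (the sin(x) data, polynomial degrees, and the fit_and_eval helper are illustrative assumptions, not from the post). A low-degree polynomial underfits noisy samples of sin(x), while a very high-degree polynomial fits the training points almost perfectly but generalizes poorly:

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy training samples of an underlying function y = sin(x).
x_train = np.linspace(0, 6, 15)
y_train = np.sin(x_train) + rng.normal(scale=0.2, size=x_train.size)

# Test points evaluated against the true (noise-free) function.
x_test = np.linspace(0, 6, 200)
y_test = np.sin(x_test)

def fit_and_eval(degree):
    """Fit a polynomial of the given degree; report train and test RMSE."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_rmse = np.sqrt(np.mean((np.polyval(coeffs, x_train) - y_train) ** 2))
    test_rmse = np.sqrt(np.mean((np.polyval(coeffs, x_test) - y_test) ** 2))
    print(f"degree={degree:2d}  train RMSE={train_rmse:.3f}  test RMSE={test_rmse:.3f}")

fit_and_eval(1)     # underfitting: too simple, high error on train and test
fit_and_eval(3)     # a reasonable fit: both errors are comparatively low
fit_and_eval(10)    # overfitting: very low train error, noticeably worse test error
```

A model that does much better on training data than on unseen data is overfitting; one that does poorly on both is underfitting.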