
Deep Learning: UNIT 2: CNN: Short Answer Questions

UNIT II CNN Short Answer Questions

1. List the applications of CNN.
2. Define Convolution.
3. Define Stride.
4. Explain padding.
5. What is the purpose of padding?
6. Define Kernel.
7. Define Pooling.
8. List the different pooling techniques.
9. Define Flattening.
10. Define Fully Connected Layer.
11. List the differences between a CNN and a Fully Connected Layer.
12. What is a filter (or kernel) in the context of a CNN?
13. Discuss the role of fully connected layers in CNNs.
14. Explain the concept of pooling in CNNs and how it impacts output size and can sometimes cause underfitting.
15. What is the purpose of the pooling layer in a CNN?
16. What are two special cases of padding? Explain them with a neat diagram.
17. Discuss the concept of convolution in CNNs.
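Several of the questions above (convolution, stride, padding, kernel) can be answered concretely with a small worked example. The following is a minimal NumPy sketch of a 2-D convolution; the function name `conv2d` and the sample inputs are illustrative, not from any particular library.

```python
import numpy as np

def conv2d(image, kernel, stride=1, padding=0):
    """2-D cross-correlation (the "convolution" used in CNNs)
    with optional zero padding and stride."""
    if padding > 0:
        image = np.pad(image, padding, mode="constant")
    kh, kw = kernel.shape
    ih, iw = image.shape
    # Output size formula: (input - kernel + 2*padding) // stride + 1
    oh = (ih - kh) // stride + 1
    ow = (iw - kw) // stride + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = image[i*stride:i*stride+kh, j*stride:j*stride+kw]
            out[i, j] = np.sum(patch * kernel)  # elementwise multiply, then sum
    return out

img = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 "image"
k = np.ones((2, 2))                             # 2x2 kernel

print(conv2d(img, k).shape)             # valid (no padding): (3, 3)
print(conv2d(img, k, padding=1).shape)  # zero padding of 1:  (5, 5)
print(conv2d(img, k, stride=2).shape)   # stride 2:           (2, 2)
```

Note how padding grows the output (padding=1 gives 5x5, larger than the 4x4 input) while a larger stride shrinks it, which is exactly what questions 3 to 5 are probing.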

Deep Learning: UNIT-2 : CNN: Long Answer Questions

UNIT II CNN Long Answer Questions

1. Explain CNN with an example.
2. List the different applications of CNN.
3. Write example functions for the Convolution and Pooling operations and explain them in detail.
4. Draw and explain the architecture of convolutional neural networks.
5. Explain the convolutional layers in CNN.
6. Explain striding and padding in CNN with examples.
7. Draw the structure of a CNN.
8. Apply a CNN architecture to classify the MNIST handwritten digit dataset.
9. List the differences between CNNs and Fully Connected Layers.
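For question 3, a pooling operation can be sketched in a few lines of NumPy. This is an illustrative max-pooling helper (the name `max_pool2d` and the sample feature map are my own, not from a library):

```python
import numpy as np

def max_pool2d(x, size=2, stride=2):
    """Max pooling: slide a size x size window and keep the maximum."""
    h, w = x.shape
    oh = (h - size) // stride + 1
    ow = (w - size) // stride + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = x[i*stride:i*stride+size,
                          j*stride:j*stride+size].max()
    return out

fmap = np.array([[1., 3., 2., 4.],
                 [5., 6., 1., 2.],
                 [7., 2., 9., 8.],
                 [3., 1., 4., 5.]])
print(max_pool2d(fmap))  # max of each 2x2 block: [[6, 4], [7, 9]]
```

Average pooling is the same loop with `.mean()` in place of `.max()`; both halve each spatial dimension here, which is the downsampling role the long-answer question asks you to explain.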

Deep Learning: UNIT-2 CNN

UNIT II CNN

1. Introduction
2. Striding and padding
3. Pooling layers
4. Structure
5. Operations and prediction of CNN with layers
6. CNN - Case study with MNIST
7. CNN vs Fully Connected

πŸ‘‰ Deep Learning: UNIT-2: CNN PPTs
πŸ‘‰ Deep Learning: UNIT-2 CNN Notes
πŸ‘‰ Deep Learning: UNIT-2: CNN: Long Answer Questions
πŸ‘‰ Deep Learning: UNIT-2: CNN: Short Answer Questions
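Topics 5 and 6 (operations and prediction with layers, MNIST case study) can be tied together with a tiny forward pass over one MNIST-sized input. The sketch below uses random, untrained weights purely to show how the shapes flow through conv, ReLU, pooling, flatten, and a fully connected softmax; the layer sizes are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

x = rng.standard_normal((28, 28))   # one MNIST-style grayscale digit
k = rng.standard_normal((8, 3, 3))  # 8 conv filters of size 3x3

# Convolution (valid): 28x28 input, 3x3 kernel -> 8 maps of 26x26
fmaps = np.zeros((8, 26, 26))
for f in range(8):
    for i in range(26):
        for j in range(26):
            fmaps[f, i, j] = np.sum(x[i:i+3, j:j+3] * k[f])
fmaps = np.maximum(fmaps, 0)  # ReLU activation

# 2x2 max pooling: 26x26 -> 13x13 per feature map
pooled = fmaps.reshape(8, 13, 2, 13, 2).max(axis=(2, 4))

# Flatten, then a fully connected layer to 10 digit classes + softmax
flat = pooled.reshape(-1)                      # 8 * 13 * 13 = 1352 values
W = rng.standard_normal((10, flat.size)) * 0.01
logits = W @ flat
probs = np.exp(logits - logits.max())
probs /= probs.sum()
print(probs.shape)  # shape (10,), one probability per digit class
```

A trained network would learn `k` and `W` by backpropagation; the point here is only the shape bookkeeping at each layer.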

Deep Learning: UNIT-2 PPT

UNIT II CNN

1. Introduction
2. Striding and padding
3. Pooling layers
4. Structure
5. Operations and prediction of CNN with layers
6. CNN - Case study with MNIST
7. CNN vs Fully Connected

Deep Learning: UNIT 2- CNN Notes

UNIT II CNN

1. Introduction
2. Striding and padding
3. Pooling layers
4. Structure
5. Operations and prediction of CNN with layers
6. CNN - Case study with MNIST
7. CNN vs Fully Connected

About Deep Learning

Deep Learning

UNIT I Deep Learning Fundamentals: Introduction, Building Blocks of Neural Networks, Layers, MLPs, Forward Pass, Backward Pass, Class, Trainer and Optimizer, the Vanishing and Exploding Gradient Problems, Difficulties in Convergence, Local and Spurious Optima, Preprocessing, Momentum, Learning Rate Decay, Weight Initialization, Regularization, Dropout, SoftMax, Cross-Entropy Loss Function, Activation Functions.

πŸ‘‰ Deep Learning: UNIT 1 (A) Notes: Deep Learning Fundamentals Part 1 Notes
πŸ‘‰ Deep Learning: UNIT 1 (A) PPTs: Deep Learning Fundamentals Part 1 PPTs
πŸ‘‰ Deep Learning: UNIT 1 (B) Notes: Deep Learning Fundamentals Part 2 Notes
πŸ‘‰ Deep Learning: UNIT 1 (B): Deep Learning Fundamentals Part 2 PPTs
πŸ‘‰ Deep Learning: UNIT 1: Deep Learning Fundamentals: Long Answer Questions
πŸ‘‰ Deep Learning: UNIT 1: Deep Learning Fundamentals: Short Answer Questions

UNIT II CNN: Introduction, striding and padding, pooling layers, structure, operations and prediction of CNN with layers, CNN - Case study with MNIST, CNN vs Fully Connected

Deep Learning

Deep Learning

πŸ‘‰ Deep Learning Syllabus
πŸ‘‰ Deep Learning: Fundamentals
πŸ‘‰ CNN
πŸ‘‰ RNN
πŸ‘‰ Autoencoders
πŸ‘‰ Transfer Learning

UNIT I Deep Learning Fundamentals: Introduction, Building Blocks of Neural Networks, Layers, MLPs, Forward Pass, Backward Pass, Class, Trainer and Optimizer, the Vanishing and Exploding Gradient Problems, Difficulties in Convergence, Local and Spurious Optima, Preprocessing, Momentum, Learning Rate Decay, Weight Initialization, Regularization, Dropout, SoftMax, Cross-Entropy Loss Function, Activation Functions.

UNIT II CNN: Introduction, striding and padding, pooling layers, structure, operations and prediction of CNN with layers, CNN - Case study with MNIST, CNN vs Fully Connected

UNIT III RNN: Handling Branches, Layers, Nodes, Essential Elements - Vanilla RNNs, GRUs, LSTM

UNIT IV Autoencoders: Denoising Autoencoders, Sparse Autoencoders, Deep Autoencoders, Variational Autoencoders, GANs

UNIT V Transfer Learning: Types, Methodologies, Diving into Transfer Learning

Convolutional Neural Network 2

Q1. Sparse Connections

What does "sparsity of connections" mean as a benefit of using convolutional layers? Choose the correct answer from below:

A. Each filter is connected to every channel in the previous layer
B. Each layer in a convolutional network is connected only to two other layers
C. Each activation in the next layer depends on only a small number of activations from the previous layer
D. Regularization causes gradient descent to set many of the parameters to zero

Ans: C. Each activation in the next layer depends on only a small number of activations from the previous layer.

Reason: In a fully connected ("dense") layer, every output unit is connected to every input. A CNN is "sparse" because each output activation is computed from only a local patch of pixels instead of from all pixels. This sparsity, together with weight sharing across patches, greatly reduces the number of parameters and computations, which makes CNNs more efficient than fully connected networks for image data.
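The benefit of sparse connections is easy to quantify with a parameter count. The comparison below (sizes chosen to match a 28x28 input and a 26x26 output, as with a 3x3 valid convolution) is a back-of-the-envelope sketch:

```python
# One shared 3x3 kernel plus its bias serves every output position:
conv_params = 3 * 3 + 1  # 10 parameters

# A dense layer mapping the same 28x28 input to the same 26x26 output
# connects every input pixel to every output unit, plus one bias each:
dense_params = (28 * 28) * (26 * 26) + 26 * 26  # 530,660 parameters

print(conv_params, dense_params)
# Sparsity: each conv output depends on only 9 inputs, not all 784.
```

The roughly 50,000x gap in parameter count is why option C (local dependence) is the defining advantage, rather than anything about regularization or layer wiring.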