
TensorFlow and Keras - 3

Q1. Functional model

Complete the code snippet in order to get the following model summary.

from tensorflow.keras.layers import Dense, Flatten, Input
from tensorflow.keras.models import Model

def create_model_functional():
    inp = Input(shape=(28, ))
    h1 = Dense(64, activation="relu", name="hidden_1")(inp)
    h2 = Dense( _a_ , activation="relu", name="hidden_2")(h1)
    out = Dense(4, activation="softmax", name="output")( _b_ )
    model = Model(inputs=inp, outputs=out, name="simple_nn")
    return model

model_functional = create_model_functional()
model_functional.summary()

Choose the correct answer from below:
A. a - 512, b - h2
B. a - 64, b - h2
C. a - 10, b - h1
D. a - 512, b - inp

Ans: A

Correct option: a - 512, b - h2
Explanation: To get the model summary shown in the question, a must be 512 and b must be h2. This will create a neural network whose second hidden layer has 512 units and whose output layer takes hidden_2 as its input.
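For reference, here is a runnable sketch of the snippet with the answer substituted in (a = 512, b = h2); everything else is taken unchanged from the code above, and the printed summary should show hidden_2 with 512 units feeding the 4-unit softmax output.

from tensorflow.keras.layers import Dense, Input
from tensorflow.keras.models import Model

def create_model_functional():
    inp = Input(shape=(28, ))
    h1 = Dense(64, activation="relu", name="hidden_1")(inp)
    h2 = Dense(512, activation="relu", name="hidden_2")(h1)   # _a_ = 512
    out = Dense(4, activation="softmax", name="output")(h2)   # _b_ = h2
    return Model(inputs=inp, outputs=out, name="simple_nn")

model_functional = create_model_functional()
model_functional.summary()  # hidden_2 appears with output shape (None, 512)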

TensorFlow and Keras - 2

Q1. Sigmoid and softmax functions

Which of the following statements is true for a neural network having more than one output neuron?

Choose the correct answer from below:
A. In a neural network where the output neurons have the sigmoid activation, the sum of all the outputs from the neurons is always 1.
B. In a neural network where the output neurons have the sigmoid activation, the sum of all the outputs from the neurons is 1 if and only if we have just two output neurons.
C. In a neural network where the output neurons have the softmax activation, the sum of all the outputs from the neurons is always 1.
D. The softmax function is a special case of the sigmoid function.

Ans: C

Correct option: In a neural network where the output neurons have the softmax activation, the sum of all the outputs from the neurons is always 1.
Explanation: With the sigmoid activation, each output neuron is squashed independently of the others, so when we have more than one neuron it is possible for the sum of outputs to be any value between 0 and the number of neurons; nothing forces it to equal 1. Softmax, by contrast, normalizes across all the output neurons, so the outputs always sum to 1.
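As a minimal NumPy sketch of this point (the logit values below are made up purely for illustration), sigmoid outputs are computed element-wise and need not sum to 1, while softmax outputs always do:

import numpy as np

# Hypothetical raw outputs (logits) of a 3-neuron output layer.
logits = np.array([2.0, -1.0, 0.5])

# Sigmoid squashes each neuron independently, so the sum is unconstrained.
sigmoid_out = 1.0 / (1.0 + np.exp(-logits))
print(sigmoid_out.sum())   # roughly 1.77 here, not 1

# Softmax normalizes across all neurons, so the outputs always sum to 1.
softmax_out = np.exp(logits) / np.exp(logits).sum()
print(softmax_out.sum())   # 1.0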

TensorFlow and Keras - 1

Q1. Binary classification

In order to perform binary classification on a dataset (class 0 and 1) using a neural network, which of the options is correct regarding the outcomes of code snippets a and b? Here the labels of observations are in the form: [0, 0, 1, ...].

Common model:

import tensorflow
from keras.models import Sequential
from keras.layers import Dense
from tensorflow.keras.optimizers import SGD

model = Sequential()
model.add(Dense(50, input_dim=2, activation='relu', kernel_initializer='he_uniform'))
opt = SGD(learning_rate=0.01)

Code snippet a:

model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer=opt, metrics=['accuracy'])

Code snippet b:

model.add(Dense(1, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer=opt, metrics=['accuracy'])

The term "Required results" in the options means that the accuracy …
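The options, answer, and explanation for this post are cut off above, so the sketch below is not the quoted answer; it only demonstrates a verifiable property that this comparison hinges on: softmax computed over a single output neuron is exp(z) / exp(z) = 1 for any input z, whereas a single sigmoid neuron produces a probability that actually varies with its input.

import numpy as np

# Softmax over one value is exp(z) / exp(z) = 1 for any z, so a
# Dense(1, activation='softmax') head can only ever output 1.0.
for z in [-3.0, 0.0, 5.0]:
    print(np.exp(z) / np.exp(z))        # always 1.0

# A single sigmoid neuron varies with z, which is what a binary
# classifier with labels like [0, 0, 1, ...] needs.
for z in [-3.0, 0.0, 5.0]:
    print(1.0 / (1.0 + np.exp(-z)))     # ~0.047, 0.5, ~0.993

Because of this, the single-unit softmax head in snippet b predicts the same value for every example, while the sigmoid head with binary_crossentropy in snippet a is the standard setup for this label format.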