
    • #1180
      Yash Arora
      Keymaster

      P.S. If you know the remaining answers, feel free to add them.

      Question 1 –
      The weights and biases in a neural network are optimized using:

      Answer :
      Gradient Descent
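
      To make that concrete, below is a minimal sketch of gradient descent fitting a single weight and bias for a squared-error cost (the data, learning rate, and starting values are illustrative assumptions, not part of the quiz):

      import numpy as np

      # illustrative inputs x and targets z (assumed values, z = 2*x)
      x = np.array([1.0, 2.0, 3.0])
      z = np.array([2.0, 4.0, 6.0])

      w, b = 0.5, 0.0   # initial weight and bias
      alpha = 0.01      # learning rate

      for _ in range(1000):
          pred = w * x + b                  # model output
          error = z - pred                  # residuals z_i - (w*x_i + b)
          # gradients of J = sum((z_i - w*x_i - b)^2) with respect to w and b
          grad_w = -2.0 * np.sum(x * error)
          grad_b = -2.0 * np.sum(error)
          # gradient-descent updates
          w -= alpha * grad_w
          b -= alpha * grad_b

      print(w, b)  # w ends up close to 2.0 and b close to 0.0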

      Question 2 –
      For a cost function J = ∑ᵢ (zᵢ − w·xᵢ − b)² that we would like to minimize, which of the following expressions represents updating the parameter w using gradient descent?

      Answer :
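      The answer options aren't reproduced in this post, but the correct one should have the general form of the gradient-descent update: w := w − α·∂J/∂w, where α is the learning rate. For this cost function ∂J/∂w = −2 ∑ᵢ xᵢ (zᵢ − w·xᵢ − b), so the update becomes w := w + 2α ∑ᵢ xᵢ (zᵢ − w·xᵢ − b).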

      Question 3 –
      What type of activation function is this?

      Answer :
      ReLU

      Question 4 –
      What type of activation function is this?

      Answer –
      Hyperbolic Tangent Function
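
      The plots from the original quiz aren't reproduced here, but the two answers above are the standard functions; for reference, a quick sketch of both (not the quiz graphics):

      import numpy as np

      def relu(x):
          # ReLU: zero for negative inputs, identity for positive inputs
          return np.maximum(0.0, x)

      def tanh(x):
          # hyperbolic tangent: smooth S-shaped curve bounded between -1 and 1
          return np.tanh(x)

      print(relu(np.array([-2.0, 0.0, 3.0])))   # [0. 0. 3.]
      print(tanh(np.array([-2.0, 0.0, 3.0])))   # approximately [-0.96, 0.0, 0.995]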

      Question 5 –
      The softmax activation function is most commonly used in hidden layers. True or false?

      Answer –
      False
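
      For context on why this is False: softmax is normally used in the output layer of a multi-class classifier, because it turns the final scores into probabilities that sum to 1, while hidden layers usually use functions like ReLU or tanh. A quick sketch of the function with made-up scores:

      import numpy as np

      def softmax(scores):
          # subtract the max for numerical stability, then normalize the exponentials
          exps = np.exp(scores - np.max(scores))
          return exps / np.sum(exps)

      print(softmax(np.array([2.0, 1.0, 0.1])))  # roughly [0.66, 0.24, 0.10], sums to 1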
