      #1220
      Yash Arora
      Keymaster

      Question 1 –
      Which activation function would you typically use in a network with more than 10 hidden layers?

      Answer:
      ReLU
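
      As a minimal sketch (not from the original post), the snippet below builds a feed-forward network with 10 hidden layers, each followed by nn.ReLU; the layer widths and input/output sizes are illustrative assumptions.

      import torch
      import torch.nn as nn

      # Illustrative sizes (assumptions): input width 32, hidden width 64, scalar output.
      layers = []
      in_features = 32
      for _ in range(10):
          layers.append(nn.Linear(in_features, 64))
          layers.append(nn.ReLU())   # ReLU keeps gradients from shrinking layer after layer
          in_features = 64
      layers.append(nn.Linear(in_features, 1))
      deep_net = nn.Sequential(*layers)

      x = torch.randn(8, 32)
      print(deep_net(x).shape)  # torch.Size([8, 1])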

      Question 2 –
      What is the problem with the tanh and sigmoid activation functions?

      Answer:
      Their derivatives are near zero over much of the input range, which causes vanishing gradients in deep networks.
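
      As a quick illustration (the input values are illustrative, not from the post), the snippet below uses autograd to show that the gradients of sigmoid and tanh are close to zero once the input is far from 0.

      import torch

      x = torch.tensor([-6.0, 0.0, 6.0], requires_grad=True)

      # Sigmoid: derivative is at most 0.25 and nearly vanishes at the tails.
      torch.sigmoid(x).sum().backward()
      print(x.grad)  # approx [0.0025, 0.25, 0.0025]

      # Tanh: derivative peaks at 1.0 but is around 2.5e-5 at |x| = 6.
      x.grad = None
      torch.tanh(x).sum().backward()
      print(x.grad)  # approx [2.5e-5, 1.0, 2.5e-5]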

      Question 3 –
      What activation function is used in the following class?

      import torch
      import torch.nn as nn

      class NetRelu(nn.Module):
          def __init__(self, D_in, H, D_out):
              super(NetRelu, self).__init__()
              self.linear1 = nn.Linear(D_in, H)   # input -> hidden
              self.linear2 = nn.Linear(H, D_out)  # hidden -> output

          def forward(self, x):
              x = torch.relu(self.linear1(x))     # ReLU activation on the hidden layer
              x = self.linear2(x)
              return x

      Answer:
      ReLU
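
      As a usage sketch (the sizes D_in=4, H=8, D_out=2 are assumptions, not from the post), instantiating the class above and running one forward pass:

      model = NetRelu(D_in=4, H=8, D_out=2)
      x = torch.randn(3, 4)
      yhat = model(x)
      print(yhat.shape)  # torch.Size([3, 2])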
