ReLU vs Tanh
Let’s say you work at Amazon. Your job is to build and train a neural network that classifies images of different chairs, with classes such as “Office Chair” and “Dining Chair”.
Which activation function would you choose for the hidden layers of your neural net: ReLU or Tanh? Why would you prefer one over the other?
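One property worth reasoning about when comparing the two is gradient behavior. A minimal numerical sketch (using NumPy; the helper function names are my own) of the key difference: tanh’s derivative shrinks toward zero for large |x|, which can cause vanishing gradients in deep networks, while ReLU’s derivative stays exactly 1 for any positive input.

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x)
    return np.maximum(0.0, x)

def relu_grad(x):
    # Derivative of ReLU: 1 for x > 0, 0 otherwise
    return (x > 0).astype(float)

def tanh_grad(x):
    # Derivative of tanh: 1 - tanh(x)^2, which approaches 0 as |x| grows
    return 1.0 - np.tanh(x) ** 2

x = np.array([-5.0, -1.0, 0.5, 5.0])
print(relu_grad(x))  # constant 1 on the positive side
print(tanh_grad(x))  # nearly 0 at |x| = 5 -> gradients vanish
```

This is one common line of argument, not a complete answer: ReLU also brings cheaper computation and sparse activations, while tanh offers zero-centered outputs, and ReLU has its own failure mode (“dying ReLU” units stuck at zero).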