
The activation functions in PyTorch (5)


*Memos:
- My post explains Step function, Identity and ReLU.
- My post explains Leaky ReLU, PReLU and FReLU.
- My post explains ELU, SELU and CELU.
- My post explains GELU, Mish, SiLU and Softplus.
- My post explains Vanishing Gradient Problem, Exploding Gradient Problem and Dying ReLU Problem.

(1) Tanh:
- can convert an input value (x) to an output value between -1 and 1. *-1 and 1 are exclusive.
- 's formula is y = (e^x - e^-x) / (e^x + e^-x).
- is also called Hyperbolic Tangent Function.
- is Tanh() in PyTorch.
- is used in RNN (Recurrent Neural Network). *RNN in PyTorch.…
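A minimal sketch of Tanh in PyTorch, showing both the `nn.Tanh()` module and the equivalent `torch.tanh()` function squashing inputs into the open interval (-1, 1). The sample tensor values are illustrative only:

```python
import torch
import torch.nn as nn

# Tanh maps any real input into the open interval (-1, 1).
tanh = nn.Tanh()

x = torch.tensor([-2.0, 0.0, 2.0])
y = tanh(x)
print(y)  # ≈ tensor([-0.9640, 0.0000, 0.9640])

# torch.tanh() is the functional equivalent of the nn.Tanh() module.
print(torch.allclose(y, torch.tanh(x)))  # True

# Matches the formula y = (e^x - e^-x) / (e^x + e^-x).
manual = (torch.exp(x) - torch.exp(-x)) / (torch.exp(x) + torch.exp(-x))
print(torch.allclose(y, manual))  # True
```

Note that large-magnitude inputs saturate near -1 or 1, which is one source of the vanishing-gradient problem mentioned above.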