This is an excerpt from the online course AWS Certified Machine Learning Specialty 2020 Hands On. In this video, we cover the different activation functions used in neural networks to produce the output of a given node, or neuron, from its set of inputs: linear, step, sigmoid (logistic), tanh (hyperbolic tangent), ReLU, Leaky ReLU, PReLU, Maxout, and more.
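As a quick illustration (not code from the course itself), most of the activation functions named above can be sketched as scalar Python functions; the names and the `alpha` parameter below are the conventional ones, not taken from the video:

```python
import math

def linear(x):
    # Identity: passes the weighted sum through unchanged.
    return x

def step(x):
    # Binary step: fires (1) when the input is non-negative, else 0.
    return 1.0 if x >= 0 else 0.0

def sigmoid(x):
    # Logistic sigmoid: squashes the input into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # Hyperbolic tangent: squashes the input into the range (-1, 1).
    return math.tanh(x)

def relu(x):
    # Rectified Linear Unit: zero for negative inputs, identity otherwise.
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: a small slope alpha for negative inputs instead of zero.
    # PReLU is the same shape, but alpha is learned during training.
    return x if x >= 0 else alpha * x
```

For example, `sigmoid(0.0)` returns 0.5 and `relu(-3.0)` returns 0.0, which matches how these functions clip or squash a neuron's weighted input.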