Activation Functions in Machine Learning
Sigmoid, Hyperbolic Tangent (Tanh), Rectified Linear Unit (ReLU), Leaky ReLU, Softmax
5 min read · Jan 31, 2023
Artificial neural networks rely heavily on activation functions, which determine a neuron’s output from the weighted sum of its inputs. In this post, we will examine the fundamentals of activation functions and how they are used in machine learning.
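To make this concrete, here is a minimal sketch of a single neuron: it computes a weighted sum of its inputs plus a bias, then passes that value through an activation function. The helper names (`sigmoid`, `relu`) and the sample numbers are illustrative, not taken from the article:

```python
import math

def sigmoid(x):
    # Squashes any real input into the open interval (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    # Passes positive inputs through unchanged, zeroes out negatives
    return max(0.0, x)

# A neuron's pre-activation: weighted sum of inputs plus a bias term
inputs = [0.5, -1.2, 3.0]
weights = [0.4, 0.1, -0.6]
bias = 0.2
z = sum(w * x for w, x in zip(weights, inputs)) + bias

# The activation function maps the raw sum z to the neuron's output
print(sigmoid(z))
print(relu(z))
```

Swapping one activation function for another changes how the same pre-activation value `z` is mapped to an output, which is exactly the design choice the functions below differ on.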