The Swish function is an activation function for neural networks, introduced by researchers at Google as an alternative to traditional activation functions such as ReLU (Rectified Linear Unit) and the sigmoid. It is defined as f(x) = x · sigmoid(βx), where β is often fixed to 1 (the SiLU variant) or treated as a learnable parameter.
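To make the definition concrete, here is a minimal NumPy sketch of Swish; the function name `swish` and the `beta` keyword are illustrative choices, not taken from any particular library.

```python
import numpy as np

def swish(x, beta=1.0):
    """Swish activation: f(x) = x * sigmoid(beta * x).

    With beta = 1 this is also known as SiLU; beta may instead be
    treated as a learnable parameter during training.
    """
    return x * (1.0 / (1.0 + np.exp(-beta * x)))

# Swish is roughly linear for large positive inputs and smoothly
# approaches zero for large negative inputs, unlike ReLU's hard cutoff.
x = np.array([-4.0, -1.0, 0.0, 1.0, 4.0])
print(swish(x))
```

Unlike ReLU, Swish is smooth and non-monotonic (it dips slightly below zero for small negative inputs), which is part of why it was proposed as a drop-in replacement.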