Activation functions are nonlinear functions applied elementwise in neural networks. Here are two examples, each satisfying a useful symmetry identity:
$\operatorname{ReLU}(x) = \begin{cases} x & \text{if } x \ge 0 \\ 0 & \text{if } x < 0 \end{cases} = \max(0, x)$
Useful property: $\operatorname{ReLU}(x) - \operatorname{ReLU}(-x) = x$
(You can prove this by considering two cases: if $x \ge 0$, then $\operatorname{ReLU}(-x) = 0$ and the difference is $x - 0 = x$; if $x < 0$, then $\operatorname{ReLU}(x) = 0$ and the difference is $0 - (-x) = x$.)
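As a quick numerical check of this identity (a minimal sketch; the helper name `relu` and the sample points are just illustrative choices):

```python
def relu(x: float) -> float:
    """ReLU(x) = max(0, x)."""
    return max(0.0, x)

# Verify ReLU(x) - ReLU(-x) = x at points covering both branches
# of the piecewise definition. The differences are exact here,
# since one of the two terms is always 0.
for x in [-2.5, -1.0, 0.0, 1.0, 3.7]:
    assert relu(x) - relu(-x) == x
```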
The softplus function: $\operatorname{sp}(x) = \ln(1+e^x)$
Useful property: $\operatorname{sp}(x) - \operatorname{sp}(-x) = x$
(You can prove this using logarithm properties: $\operatorname{sp}(x) - \operatorname{sp}(-x) = \ln\frac{1+e^x}{1+e^{-x}} = \ln\frac{e^x(e^{-x}+1)}{1+e^{-x}} = \ln e^x = x$.)
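A similar check for softplus (again a sketch; `math.log1p` is chosen here because it computes $\ln(1+t)$ accurately for small $t$, and the sample points are kept modest so `math.exp` does not overflow):

```python
import math

def sp(x: float) -> float:
    """Softplus: sp(x) = ln(1 + e^x)."""
    return math.log1p(math.exp(x))

# Verify sp(x) - sp(-x) = x up to floating-point rounding error.
for x in [-3.0, -0.5, 0.0, 0.5, 3.0]:
    assert math.isclose(sp(x) - sp(-x), x, abs_tol=1e-12)
```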