Rectified Linear Unit (ReLU): simple, but by far the most widely used activation function in machine learning. It is popular because it is both computationally efficient and effective in practice.
Defined as:

ReLU(x) = max(0, x)
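A minimal sketch of the definition above, using NumPy's elementwise maximum (the function name `relu` is just an illustrative choice):

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: elementwise max(0, x)."""
    return np.maximum(0, x)

# Negative inputs are clipped to zero; non-negative inputs pass through unchanged.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
```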
Visualized geometrically, ReLU creates a "fold" in the network's activation space: each unit folds the input space along a hyperplane, zeroing out one side, and composing these folds lets otherwise-linear layers carve out complex boundaries. Origami!
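The fold can be seen in one dimension: a single ReLU unit applied to a linear function kinks the line at one point and flattens one side. A minimal sketch (the weight `w`, bias `b`, and sample points `xs` are illustrative values, not from the text):

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)

# One hidden ReLU unit folds the input line at x = -b/w:
# output is flat (0) on one side of the fold, linear on the other.
w, b = 2.0, -1.0                           # fold located at x = 0.5
xs = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
ys = relu(w * xs + b)                      # [0, 0, 0, 0.5, 1.0]
```

Stacking layers composes many such folds, which is how a pile of linear maps ends up expressing a piecewise-linear function with many regions.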