Activation Functions and Their Purpose: Binary, Linear, ReLU, Sigmoid, Tanh, and Softmax
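As an orienting sketch (an addition, not from the original text), the six activation functions named in the title can each be written in a few lines of NumPy. The names and default parameters here are illustrative choices:

```python
import numpy as np

def binary_step(x):
    # Outputs 1 where the input is non-negative, else 0.
    return np.where(x >= 0, 1, 0)

def linear(x, a=1.0):
    # Identity-style activation scaled by a (a=1 is a common default).
    return a * x

def relu(x):
    # Rectified Linear Unit: zero for negative inputs, identity otherwise.
    return np.maximum(0, x)

def sigmoid(x):
    # Squashes any real input into the open interval (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes any real input into (-1, 1), centered at zero.
    return np.tanh(x)

def softmax(x):
    # Converts a vector of scores into a probability distribution.
    e = np.exp(x - np.max(x))  # subtract the max for numerical stability
    return e / e.sum()
```

For example, `softmax(np.array([1.0, 2.0, 3.0]))` returns a vector of positive values that sums to 1, which is why softmax is typically used in the output layer of a classifier.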