Hidden Unit Specialization in Layered Neural Networks: ReLU vs. Sigmoidal Activation
