Fake News Classification with Keras - Analytics Vidhya
Batch normalization is implemented (if desired) as outlined in the original paper that introduced it: after the Dense linear transformation but before the non-linear (ReLU) activation. The output layer is a standard Dense layer with a single neuron and a sigmoid activation, which squashes predictions to between 0 and 1, so the model ultimately predicts 0 or 1: fake or true. Batch normalization can help speed up training and provides a mild regularizing effect.

Both the Keras- and spaCy-embedded models take a good amount of time to train, but ultimately we end up with something we can evaluate on our test data. Overall, the Keras-embedded model performed better, achieving a test accuracy of 99.1% versus the spaCy model's 94.8%.
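The classifier head described above can be sketched in Keras roughly as follows. This is a minimal illustration, not the article's exact model: the hidden-layer width and the input dimension are placeholder assumptions, and the real models sit on top of an embedding layer that is omitted here.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def build_classifier(input_dim=300, hidden_units=128):
    """Dense -> BatchNorm -> ReLU head with a 1-neuron sigmoid output.

    input_dim and hidden_units are illustrative assumptions.
    """
    model = keras.Sequential([
        keras.Input(shape=(input_dim,)),
        # Dense with no activation: batch normalization goes between the
        # linear transformation and the non-linearity, as in the original
        # batch-norm paper.
        layers.Dense(hidden_units, use_bias=False),
        layers.BatchNormalization(),
        layers.Activation("relu"),
        # Single sigmoid neuron squashes predictions into (0, 1):
        # fake (0) vs true (1).
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_classifier()
preds = model.predict(np.random.rand(4, 300).astype("float32"), verbose=0)
```

Note that `use_bias=False` on the Dense layer is a common optimization when it is immediately followed by `BatchNormalization`, since the batch-norm shift parameter makes the bias redundant.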
Jun-27-2022, 17:50:46 GMT