Why Do We Use Cross-entropy in Deep Learning -- Part 2

#artificialintelligence 

Entropy, Cross-entropy, Binary Cross-entropy, and Categorical Cross-entropy are crucial concepts in Deep Learning, and they are among the main loss functions used to train Neural Networks. All of them derive from the same concept: Entropy, which may be familiar to you from physics and chemistry. However, few courses or articles explain these terms in depth, since doing so correctly requires some time and mathematics. In the first post, I presented three different but related interpretations of entropy and showed where its formula comes from. However, there is still one key concept to address, since Deep Learning does not use Entropy itself but a close relative of it called Cross-entropy.
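To make the distinction concrete, here is a minimal sketch (the distributions `p` and `q` are made-up example values, not from the article) computing the entropy of a distribution and the cross-entropy between a "true" distribution and a model's prediction:

```python
import math

def entropy(p):
    # H(p) = -sum_i p_i * log(p_i), in nats
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    # H(p, q) = -sum_i p_i * log(q_i), in nats
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.7, 0.2, 0.1]  # hypothetical "true" label distribution
q = [0.5, 0.3, 0.2]  # hypothetical model prediction

print(entropy(p))          # entropy of p
print(cross_entropy(p, q)) # always >= entropy(p)
# Cross-entropy equals the entropy exactly when q matches p,
# which is why minimizing it pushes the model toward the true labels.
print(cross_entropy(p, p))
```

Cross-entropy is never smaller than entropy (the gap between them is the KL divergence), which is why it works as a training loss: the minimum is reached when the predicted distribution matches the true one.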
