XAI for Forecasting: Basis Expansion


Additionally, this approach falls a bit short from an Explainable AI (XAI) perspective. While these models all have attention mechanisms that can be visualized, many academics have argued that attention may not constitute an explanation, and this remains an active debate. Denis Vorotyntsev has written a great article summarizing the debate, and I highly encourage checking it out as well [10]. In contrast to the attention-based approach of transformers, the other primary direction for tackling the forecasting problem is the neural basis expansion analysis approach first proposed by Oreshkin et al.
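To make the basis-expansion idea concrete, here is a minimal sketch in NumPy. It assumes a trend block with a polynomial basis (one of the interpretable bases used in neural basis expansion models such as N-BEATS); in the real model, a fully connected stack learns the coefficients, whereas here they are fixed for illustration. All function and variable names are my own, not from any library.

```python
import numpy as np

def polynomial_basis(horizon: int, degree: int) -> np.ndarray:
    """Polynomial trend basis: rows are time steps t/H, columns are powers 0..degree."""
    t = np.arange(horizon) / horizon
    return np.stack([t ** p for p in range(degree + 1)], axis=1)  # shape (H, degree+1)

def forecast_from_theta(theta: np.ndarray, horizon: int, degree: int) -> np.ndarray:
    """The forecast is a linear combination of fixed basis functions.

    Because the basis is known and smooth, each learned coefficient is
    directly interpretable (level, slope, curvature, ...), which is the
    XAI appeal of basis expansion over attention heatmaps.
    """
    return polynomial_basis(horizon, degree) @ theta

# Illustrative (hand-picked) coefficients: level, slope, curvature.
theta = np.array([1.0, 0.5, -0.2])
y_hat = forecast_from_theta(theta, horizon=6, degree=2)
```

The key design point is that the network's output dimension is the (small) number of basis coefficients rather than the forecast horizon, so the prediction is constrained to lie in the span of interpretable basis functions.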
