Compact Autoregressive Network

Di Wang, Feiqing Huang, Jingyu Zhao, Guodong Li, Guangjian Tian

arXiv.org Machine Learning 

Recurrent neural networks (RNNs) and their variants, such as Long Short-Term Memory (Hochreiter and Schmidhuber, 1997) and Gated Recurrent Units (Cho et al., 2014), are commonly used as the default architecture, or even as a synonym, for sequence modeling by deep learning practitioners (Goodfellow et al., 2016). Meanwhile, especially for high-dimensional time series, we may also consider autoregressive modeling or multi-task learning,
$$\hat{y}_t = f(y_{t-1}, y_{t-2}, \ldots, y_{t-P}), \quad (1)$$
where the output $\hat{y}_t$ and each input $y_{t-i}$ are $N$-dimensional, and the lag $P$ can be very large to accommodate sequential dependence. Some non-recurrent feed-forward networks with convolutional or other specific architectures have been proposed recently for sequence modeling and have been shown to achieve state-of-the-art accuracy. For example, autoregressive networks such as PixelCNN (Van den Oord et al., 2016b) and WaveNet (Van den Oord et al., 2016a), for image and audio sequence modeling respectively, are compelling alternatives to recurrent networks. This paper aims at the autoregressive model (1) with a large number of sequences.
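To make model (1) concrete, below is a minimal sketch, assuming PyTorch, of a feed-forward autoregressive network that maps the $P$ stacked lags of an $N$-dimensional series to the next observation. The class name, hidden width, and layer choices are illustrative assumptions, not the compact architecture proposed in the paper.

```python
# Illustrative sketch of the autoregressive model (1): y_hat_t = f(y_{t-1}, ..., y_{t-P}).
# Architecture and sizes are assumptions for exposition, not the paper's design.
import torch
import torch.nn as nn

class FeedForwardAR(nn.Module):
    def __init__(self, n_series: int, n_lags: int, hidden: int = 64):
        super().__init__()
        # f in (1): maps the N * P lagged values to the N-dimensional prediction.
        self.net = nn.Sequential(
            nn.Linear(n_series * n_lags, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_series),
        )

    def forward(self, lags: torch.Tensor) -> torch.Tensor:
        # lags: (batch, P, N), holding y_{t-1}, ..., y_{t-P}
        return self.net(lags.flatten(start_dim=1))

# Usage: predict y_t from the previous P observations of an N-dimensional series.
N, P = 10, 20
model = FeedForwardAR(n_series=N, n_lags=P)
y_lags = torch.randn(32, P, N)   # a batch of 32 lag windows
y_hat = model(y_lags)            # shape (32, N)
```

Note that even this simple sketch has an input layer with roughly $N \times P \times$ hidden weights, which grows quickly with both the dimension $N$ and the lag $P$; this is the kind of parameter blow-up that motivates a compact parameterization.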
