
Complex Gated Recurrent Neural Networks

Moritz Wolter, Angela Yao

Neural Information Processing Systems

Gating, as used in gated recurrent units (GRUs) [4] and long short-term memory (LSTM) networks [12], has become commonplace in recurrent architectures. Gates facilitate the learning of longer-term temporal relationships [12]. Furthermore, in the presence of noise in the input signal, gates can protect the cell state from undesired updates, thereby improving overall stability and convergence.
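To make the gating mechanism concrete, here is a minimal NumPy sketch of one step of a standard real-valued GRU. This is an illustration of the generic gating idea the abstract describes, not the complex-valued variant proposed in the paper; all weight names and sizes are assumptions chosen for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, params):
    """One GRU step: the update gate z and reset gate r control how much
    of the previous hidden state h is kept versus overwritten."""
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(Wz @ x + Uz @ h)               # update gate, values in (0, 1)
    r = sigmoid(Wr @ x + Ur @ h)               # reset gate, values in (0, 1)
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))   # candidate state
    return (1.0 - z) * h + z * h_tilde         # z near 0 shields h from updates

# Toy dimensions and random weights (illustrative only).
rng = np.random.default_rng(0)
n_in, n_h = 3, 4
params = [rng.normal(scale=0.1, size=(n_h, n_in)) if i % 2 == 0
          else rng.normal(scale=0.1, size=(n_h, n_h)) for i in range(6)]

h = np.zeros(n_h)
for t in range(5):                             # run a short noisy sequence
    h = gru_step(rng.normal(size=n_in), h, params)
print(h.shape)
```

The final line of `gru_step` shows why gates aid stability: when `z` saturates near zero for a noisy input, the previous state passes through nearly unchanged.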




Plug-in Estimation in High-Dimensional Linear Inverse Problems: A Rigorous Analysis

Alyson K. Fletcher, Parthe Pandit, Sundeep Rangan, Subrata Sarkar, Philip Schniter

Neural Information Processing Systems

Estimating a vector x from noisy linear measurements Ax + w often requires the use of prior knowledge or structural constraints on x for accurate reconstruction. Several recent works have considered combining linear least-squares estimation with a generic or "plug-in" denoiser function that can be designed in a modular manner based on the prior knowledge about x.
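The modular idea of alternating a least-squares step with a generic denoiser can be sketched as a plug-and-play proximal-gradient loop. Note this is a simple illustrative scheme, not the VAMP-style algorithm the paper rigorously analyzes; soft-thresholding stands in for the plug-in denoiser, and all problem sizes below are assumptions for the demo.

```python
import numpy as np

def soft_threshold(v, tau):
    """Stand-in plug-in denoiser encoding a sparsity prior on x."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def plugin_estimate(A, y, denoiser, step, n_iters=100):
    """Alternate a least-squares gradient step on ||Ax - y||^2
    with an application of a generic (pluggable) denoiser."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        x = denoiser(x - step * A.T @ (A @ x - y))
    return x

# Toy compressed-sensing problem: recover a 5-sparse x from m < n measurements.
rng = np.random.default_rng(1)
m, n = 80, 100
A = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, 5, replace=False)] = 1.0
y = A @ x_true + 0.01 * rng.normal(size=m)    # noisy measurements Ax + w

step = 1.0 / np.linalg.norm(A, 2) ** 2        # step size from the Lipschitz constant
x_hat = plugin_estimate(A, y, lambda v: soft_threshold(v, step * 0.05), step)
print(np.linalg.norm(x_hat - x_true))
```

Swapping `soft_threshold` for any other denoiser (e.g. one matched to a different prior on x) changes only the plug-in argument, which is exactly the modularity the abstract highlights.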