

Hierarchical Channel-spatial Encoding for Communication-efficient Collaborative Learning

Neural Information Processing Systems

Existing systems mostly compress features at pixel level and ignore the characteristics of feature structure, which could be further exploited for more efficient compression.




3D Gaussian Splatting as Markov Chain Monte Carlo

Neural Information Processing Systems

While 3D Gaussian Splatting has recently become popular for neural rendering, current methods rely on carefully engineered cloning and splitting strategies for placing Gaussians, which can lead to poor-quality renderings and a reliance on a good initialization.



Breaking the Activation Function Bottleneck through Adaptive Parameterization

Sebastian Flennerhag, Hujun Yin, John Keane, Mark Elliot

Neural Information Processing Systems

Adaptive parameterization is a means of increasing this flexibility and thereby increasing the model's capacity to learn non-linear patterns. We focus on the feed-forward layer, f(x) := φ(Wx + b), for some activation function φ: R → R. Define the pre-activation layer as a = A(x) := Wx + b and denote by g(a) := φ(a)/a the activation effect of φ given a, where division is element-wise.
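The decomposition above can be sketched in a few lines of NumPy: since g(a) = φ(a)/a element-wise, the layer output φ(a) can equivalently be recovered as g(a) * a. The choice of φ = tanh, the layer sizes, and all variable names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Sketch of the feed-forward layer f(x) = phi(W x + b) and the
# "activation effect" g(a) = phi(a) / a (division element-wise).
# phi = tanh, layer sizes, and names are illustrative assumptions.

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))   # weight matrix
b = rng.standard_normal(4)        # bias vector
x = rng.standard_normal(3)        # input

a = W @ x + b                     # pre-activation a = A(x)
phi = np.tanh                     # activation function phi
g = phi(a) / a                    # element-wise activation effect g(a)

# The layer output phi(a) factors as g(a) * a
assert np.allclose(phi(a), g * a)
```

The factorization f(x) = g(a) ⊙ a is what makes the activation effect a natural target for adaptive parameterization: replacing the fixed g with a learned, input-dependent scaling changes the effective non-linearity per element.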