Zero-Level-Set Encoder for Neural Distance Fields
Jeske, Stefan Rhys; Klein, Jonathan; Michels, Dominik L.; Bender, Jan
Neural shape representation generally refers to representing 3D geometry using neural networks, e.g., to compute a signed distance or occupancy value at a specific spatial position. In this paper, we present a novel encoder-decoder neural network for embedding 3D shapes in a single forward pass. Our architecture is based on a multi-scale hybrid system incorporating graph-based and voxel-based components, as well as a continuously differentiable decoder. Furthermore, the network is trained to solve the Eikonal equation and only requires knowledge of the zero-level set for training and inference. This means that, in contrast to most previous work, our network is able to output valid signed distance fields without explicit prior knowledge of non-zero distance values or shape occupancy. We further propose a modification of the loss function for cases where surface normals are not well defined, e.g., for non-watertight surfaces and non-manifold geometry. Overall, this helps reduce the computational overhead of training and evaluating neural distance fields and enables the application to difficult shapes. Finally, we demonstrate the efficacy, generalizability and scalability of our method on datasets of deforming shapes, based both on simulated data and on raw 3D scans. We further show single-class and multi-class encoding on both fixed and variable vertex-count inputs, showcasing a wide range of possible applications.
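A minimal sketch of the kind of loss that trains a signed distance field from zero-level-set samples alone (a surface constraint, an optional normal-alignment term, and an Eikonal term enforcing unit gradient norm) is given below in PyTorch. The decoder signature, sample names, and weights are illustrative assumptions, not the authors' implementation.

    import torch

    def zero_level_set_loss(decoder, latent, surface_pts, normals=None,
                            domain_pts=None, w_surf=1.0, w_norm=1.0, w_eik=0.1):
        # The predicted signed distance should vanish on the surface
        # (zero-level-set) samples.
        surface_pts = surface_pts.requires_grad_(True)
        d_surf = decoder(latent, surface_pts)
        loss = w_surf * d_surf.abs().mean()

        # Optional normal alignment; this is the term one would relax when
        # surface normals are not well defined (non-watertight or
        # non-manifold input).
        if normals is not None:
            grad_surf = torch.autograd.grad(d_surf.sum(), surface_pts,
                                            create_graph=True)[0]
            cos = torch.nn.functional.cosine_similarity(grad_surf, normals, dim=-1)
            loss = loss + w_norm * (1.0 - cos).mean()

        # Eikonal term: the gradient norm of a valid distance field is 1.
        if domain_pts is not None:
            domain_pts = domain_pts.requires_grad_(True)
            d_dom = decoder(latent, domain_pts)
            grad_dom = torch.autograd.grad(d_dom.sum(), domain_pts,
                                           create_graph=True)[0]
            loss = loss + w_eik * ((grad_dom.norm(dim=-1) - 1.0) ** 2).mean()

        return loss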
Wavelet-based Loss for High-frequency Interface Dynamics
Prantl, Lukas; Bender, Jan; Kugelstadt, Tassilo; Thuerey, Nils
Generating highly detailed, complex data is a long-standing and frequently considered problem in the machine learning field. In the context of generative neural networks, simple distance metrics such as L1 or L2 distances play an important role; however, developing detail-aware generators remains a challenging and open problem. Generative adversarial networks are the basis of many state-of-the-art methods, but they introduce a second network to be trained as a loss function, making the interpretation of the learned functions much more difficult. As an alternative, we present a new method based on a wavelet loss formulation, which remains transparent in terms of what is optimized. This wavelet-based loss overcomes the limitations of conventional distance metrics, such as L1 or L2 distances, when it comes to generating data with high-frequency details. We show that our method can successfully reconstruct high-frequency details in an illustrative synthetic test case. Additionally, we evaluate its performance on more complex surfaces based on physical simulations: taking a roughly approximated simulation as input, our method infers the corresponding spatial details while taking into account how they evolve. We consider this problem in terms of spatial and temporal frequencies, and leverage generative networks trained with our wavelet loss to learn the desired spatio-temporal signal for the surface dynamics. We test the capabilities of our method with a set of synthetic wave function tests and complex 2D and 3D dynamics of elasto-plastic materials.
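A hedged illustration of such a wavelet-based loss is sketched below: a differentiable Haar decomposition written directly in PyTorch, with an L1 distance over the coefficients and extra weight on the detail (high-frequency) bands. The helper names, band weighting, and level count are assumptions for illustration, not the paper's exact formulation.

    import torch

    def haar_2d(x):
        # One level of a 2D Haar transform on a (B, C, H, W) tensor;
        # H and W are assumed to be even.
        a = x[..., 0::2, 0::2]
        b = x[..., 0::2, 1::2]
        c = x[..., 1::2, 0::2]
        d = x[..., 1::2, 1::2]
        ll = (a + b + c + d) / 2.0   # low-frequency approximation
        lh = (a - b + c - d) / 2.0   # horizontal details
        hl = (a + b - c - d) / 2.0   # vertical details
        hh = (a - b - c + d) / 2.0   # diagonal details
        return ll, (lh, hl, hh)

    def wavelet_loss(pred, target, levels=3, detail_weight=2.0):
        # L1 distance on multi-level Haar coefficients, emphasizing the
        # high-frequency detail bands; spatial dimensions are assumed to be
        # divisible by 2**levels.
        loss = 0.0
        for _ in range(levels):
            pred, pred_details = haar_2d(pred)
            target, target_details = haar_2d(target)
            for dp, dt in zip(pred_details, target_details):
                loss = loss + detail_weight * (dp - dt).abs().mean()
        # Also compare the remaining coarse approximation.
        loss = loss + (pred - target).abs().mean()
        return loss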