Event-3DGS: Event-based 3D Reconstruction Using 3D Gaussian Splatting
Neural Information Processing Systems
Event cameras, offering high temporal resolution and high dynamic range, have brought a new perspective to addressing 3D reconstruction challenges in fast-motion and low-light scenarios. Most existing methods use Neural Radiance Fields (NeRF) for event-based photorealistic 3D reconstruction. However, these NeRF-based methods suffer from time-consuming training and inference, as well as the limited scene-editing capabilities of implicit representations. To address these problems, we propose Event-3DGS, the first event-based reconstruction method using 3D Gaussian splatting (3DGS) to synthesize novel views freely from event streams. Technically, we first propose an event-based 3DGS framework that directly processes event data and reconstructs 3D scenes by simultaneously optimizing scene and sensor parameters.
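To make the supervision idea concrete, the following is a minimal, hypothetical sketch (not the paper's actual implementation) of the standard event-generation model often used to train renderers from event streams: an event fires when the per-pixel log-intensity change crosses a contrast threshold, so the rendered log-intensity change between two timestamps can be compared against the accumulated signed event counts. The function name, the `threshold_C` parameter, and the loss form are illustrative assumptions.

```python
import numpy as np

def event_supervision_loss(render_t0, render_t1, event_map,
                           threshold_C=0.2, eps=1e-6):
    """Hypothetical event-based photometric loss (illustrative only).

    render_t0, render_t1 : rendered grayscale intensities at times t0, t1
    event_map            : per-pixel signed event counts accumulated in [t0, t1]
    threshold_C          : assumed contrast threshold of the event camera
    """
    # Event cameras respond to log-intensity change, not raw intensity.
    predicted_change = np.log(render_t1 + eps) - np.log(render_t0 + eps)
    # Each signed event corresponds to a log-intensity step of size C.
    target_change = threshold_C * event_map
    # Mean-squared error between predicted and event-implied change.
    return float(np.mean((predicted_change - target_change) ** 2))
```

Under this model, sensor parameters such as `threshold_C` can be optimized jointly with the scene, which is the kind of simultaneous scene-and-sensor optimization the abstract describes.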
May-27-2025, 20:11:00 GMT
- Technology:
- Information Technology > Artificial Intelligence > Vision (1.00)