TöRF: Time-of-Flight Radiance Fields for Dynamic Scene View Synthesis Supplemental Document
Neural Information Processing Systems
More results are shown in Figure 8 for the StudyBook and Dishwasher scenes, and in Figure 10 for the DinoPear scene. Figure 9 also highlights our ability to account for multi-path interference. We show animated results and comparisons for all sequences on our website. To evaluate a more practical camera setup than our prototype, we captured one real-world sequence (the Dishwasher scene) with a standard handheld Apple iPhone 12 Pro. This consumer smartphone contains a LiDAR ToF sensor that measures sparse metric depth, which ARKit processes into a dense metric depth map video alongside the captured RGB color video. Unfortunately, the raw ToF measurements are not exposed by the ARKit SDK; were they available, our approach could in principle be applied to them directly.
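For context on why raw measurements matter, continuous-wave ToF sensors recover depth from the phase shift of a modulated light signal, and the processed depth maps discard this underlying phase information. A minimal sketch of the standard phase-to-depth relation follows; the function names are illustrative and not taken from our codebase:

```python
import math

# Speed of light in m/s.
C = 3.0e8

def phase_to_depth(phase_rad: float, mod_freq_hz: float) -> float:
    """Convert a continuous-wave ToF phase measurement (radians) to
    metric depth, valid up to the sensor's ambiguity range."""
    # The phase encodes the round-trip travel distance, hence the
    # factor of 4*pi rather than 2*pi.
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

def ambiguity_range(mod_freq_hz: float) -> float:
    """Maximum unambiguous depth for a single modulation frequency;
    beyond this, the phase wraps around."""
    return C / (2.0 * mod_freq_hz)

# Example: a 30 MHz modulation frequency gives a 5 m ambiguity range.
print(ambiguity_range(30e6))  # → 5.0
```

Multi-path interference arises because several propagation paths of different lengths can contribute to one pixel's measurement, corrupting the single-phase assumption above.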