EvaGaussians: Event Stream Assisted Gaussian Splatting from Blurry Images

Wangbo Yu *1,2,    Chaoran Feng *1,    Jiye Tang 3,    Xu Jia 3,    Li Yuan 1,2,    Yonghong Tian 1,2  

1 Peking University,    2 Peng Cheng Laboratory,    3 Dalian University of Technology

Abstract

3D Gaussian Splatting (3D-GS) has demonstrated exceptional capabilities in 3D scene reconstruction and novel view synthesis. However, its training heavily depends on high-quality, sharp images and accurate camera poses. Fulfilling these requirements can be challenging in non-ideal real-world scenarios, where motion-blurred images commonly arise from high-speed camera motion or from low-light environments that require long exposure times. To address these challenges, we introduce Event Stream Assisted Gaussian Splatting (EvaGaussians), a novel approach that integrates event streams captured by an event camera to assist in reconstructing high-quality 3D-GS from blurry images. Capitalizing on the high temporal resolution and high dynamic range of the event camera, we leverage the event streams to explicitly model the formation process of motion-blurred images and to guide the deblurring reconstruction of 3D-GS. By jointly optimizing the 3D-GS parameters and recovering the camera motion trajectories during the exposure time, our method robustly facilitates the acquisition of high-fidelity novel views with intricate texture details. We comprehensively evaluate our method against previous state-of-the-art deblurring rendering methods. Both qualitative and quantitative comparisons demonstrate that our method surpasses existing techniques in restoring fine details from blurry images and producing high-fidelity novel views.

Method Overview

Overview of EvaGaussians. Our method seamlessly integrates the event streams captured by an event camera into the training of 3D-GS to robustly handle motion-blurred images. We adopt the Event-based Double Integral (EDI) model for blur modeling and preprocessing, yielding initial camera trajectories and a sparse point cloud for 3D-GS training. By jointly optimizing the 3D-GS parameters and the camera motion trajectories with a blur reconstruction loss and an event reconstruction loss, our method facilitates high-quality 3D-GS reconstruction and novel view synthesis.
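To illustrate the EDI blur model used for preprocessing, here is a minimal NumPy sketch. It assumes the standard EDI formulation: a blurry image B is the average of latent frames over the exposure, and each latent frame relates to a reference frame L(t0) by exp(c * E(t)), where E(t) is the polarity-weighted event count accumulated from t0 to t and c is the event contrast threshold. The function name `edi_deblur`, the array layout, and the value of c are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def edi_deblur(blurry, events, c=0.2, eps=1e-6):
    """Recover a latent sharp frame via the Event-based Double Integral.

    blurry : (H, W) blurry image, modeled as the mean of latent frames
             over the exposure window.
    events : (N, H, W) signed event counts per sub-interval of the exposure,
             ordered in time from the reference timestamp t0.
    c      : event contrast threshold (assumed value; calibrated in practice).
    """
    # E_i: cumulative polarity-weighted event count from t0 up to timestamp i.
    E = np.cumsum(events, axis=0)
    # EDI blur model: B ~= L(t0) * (1/N) * sum_i exp(c * E_i).
    integral = np.exp(c * E).mean(axis=0)
    # Invert the model to obtain the latent frame at the reference time.
    return blurry / np.maximum(integral, eps)
```

In the pipeline, such EDI-recovered latent frames provide the sharp views from which initial camera trajectories and the sparse point cloud are estimated before joint optimization.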


Reconstruction Results

The left shows the blurry training images, and the right shows the deblurring rendering results of our method.

BibTeX


        @article{yu2024eva,
            title={EvaGaussians: Event Stream Assisted Gaussian Splatting from Blurry Images},
            author={Yu, Wangbo and Feng, Chaoran and Tang, Jiye and Jia, Xu and Yuan, Li and Tian, Yonghong},
            journal={arXiv preprint},
            year={2024}
        }