Omni-NeRF: Neural Radiance Field from 360° Image Captures
IEEE International Conference on Multimedia and Expo (ICME 2022)
Authors: Kai Gu, Thomas Maugey, Sebastian Knorr, Christine Guillemot
TU Berlin, INRIA, EAH Jena
TL;DR
We propose Omni-NeRF, an end-to-end framework that trains Neural Radiance Field (NeRF) models directly from 360° RGB images with only rough camera poses, jointly learning scene geometry and refining the camera parameters for novel view synthesis.
Abstract
This paper tackles the problem of novel view synthesis (NVS) from 360° images with imperfect camera poses or intrinsic parameters. We propose a novel end-to-end framework for training Neural Radiance Field (NeRF) models given only 360° RGB images and their rough poses, which we refer to as Omni-NeRF. We extend the pinhole camera model of NeRF to a more general camera model that better fits omnidirectional fisheye lenses. The approach jointly learns the scene geometry and optimizes the camera parameters without prior knowledge of the fisheye projection model.
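To illustrate the key difference from standard NeRF, consider how rays are cast. A pinhole NeRF shoots rays through an image plane, whereas a 360° capture maps pixels onto the full viewing sphere. The sketch below shows ray generation for the simplest omnidirectional case, an equirectangular panorama; it is a minimal illustration, not the paper's learned fisheye camera model, and the function name and axis conventions are our own assumptions.

```python
import numpy as np

def equirect_ray_directions(height, width):
    """Hypothetical sketch: per-pixel unit ray directions for an
    equirectangular 360-degree image (not the paper's learned model)."""
    # Pixel centers mapped to [0, 1)
    u = (np.arange(width) + 0.5) / width
    v = (np.arange(height) + 0.5) / height
    theta = (u - 0.5) * 2.0 * np.pi   # longitude in [-pi, pi)
    phi = (0.5 - v) * np.pi           # latitude in (-pi/2, pi/2)
    theta, phi = np.meshgrid(theta, phi)  # (H, W) grids
    # Spherical-to-Cartesian: y is up, z is the forward axis (assumed convention)
    dirs = np.stack([
        np.cos(phi) * np.sin(theta),  # x
        np.sin(phi),                  # y
        np.cos(phi) * np.cos(theta),  # z
    ], axis=-1)
    return dirs  # (H, W, 3), unit-length directions in the camera frame

dirs = equirect_ray_directions(4, 8)
```

Each direction, combined with an (optimizable) camera pose, defines one ray to sample along during NeRF training; a general fisheye model, as used in the paper, would replace this fixed mapping with learnable projection parameters.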
BibTex
If you use our work in your research, please cite our publication:
@inproceedings{ICME52920.2022.9859817,
  author    = {Gu, Kai and Maugey, Thomas and Knorr, Sebastian and Guillemot, Christine},
  title     = {Omni-NeRF: Neural Radiance Field from 360 image captures},
  booktitle = {IEEE International Conference on Multimedia and Expo (ICME)},
  address   = {Taipei, Taiwan},
  year      = {2022},
  pages     = {1--6},
  doi       = {10.1109/ICME52920.2022.9859817}
}