OpenIllumination: A Multi-Illumination Dataset for
Inverse Rendering Evaluation on Real Objects


Isabella Liu1*, Linghao Chen1,2*, Ziyang Fu1, Liwen Wu1, Haian Jin2, Zhong Li3,
Chin Ming Ryan Wong3, Yi Xu3, Ravi Ramamoorthi1, Zexiang Xu4, Hao Su1

Accepted to NeurIPS 2023
*Equal contribution
1UC San Diego, 2Zhejiang University, 3OPPO US Research Center, 4Adobe Research

Abstract




We introduce OpenIllumination, a real-world dataset containing over 108K images of 64 objects with diverse materials, captured under 72 camera views and a large number of different illuminations. For each image in the dataset, we provide accurate camera parameters, illumination ground truth, and foreground segmentation masks. Our dataset enables the quantitative evaluation of most inverse rendering and material decomposition methods for real objects. We examine several state-of-the-art inverse rendering methods on our dataset and compare their performance.
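
Below is a minimal Python sketch for loading one view (RGB image, foreground mask, and camera parameters). The directory layout, file names (transforms.json, images/, masks/), and JSON keys are assumptions made for illustration only, not the dataset's documented structure; adapt them to the released file organization.

# Hypothetical loading sketch; folder and key names below are assumptions.
import json
from pathlib import Path

import numpy as np
from PIL import Image


def load_frame(scene_dir: str, frame_name: str):
    """Load one view: RGB image, foreground mask, and camera parameters."""
    scene = Path(scene_dir)

    # Per-scene camera metadata stored in a JSON file (assumed name and keys).
    with open(scene / "transforms.json") as f:
        meta = json.load(f)
    frame_meta = next(fr for fr in meta["frames"] if fr["name"] == frame_name)
    c2w = np.array(frame_meta["camera_to_world"], dtype=np.float32)  # 4x4 camera pose
    K = np.array(meta["intrinsics"], dtype=np.float32)               # 3x3 intrinsics

    # RGB image and binary foreground mask (assumed folder names).
    image = np.asarray(Image.open(scene / "images" / f"{frame_name}.png"))
    mask = np.asarray(Image.open(scene / "masks" / f"{frame_name}.png")) > 0

    return image, mask, K, c2w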


Dataset preview


The preview lists each captured object by ID, name, and material under the two capture settings: the 13 lighting patterns and the OLAT (one-light-at-a-time) illuminations.
Citation


@misc{liu2024openillumination,
    title={OpenIllumination: A Multi-Illumination Dataset for Inverse Rendering Evaluation on Real Objects}, 
    author={Isabella Liu and Linghao Chen and Ziyang Fu and Liwen Wu and Haian Jin and Zhong Li and Chin Ming Ryan Wong and Yi Xu and Ravi Ramamoorthi and Zexiang Xu and Hao Su},
    year={2024},
    eprint={2309.07921},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}