3D modeling of an outdoor environment from images captured by a static camera is useful in a number of applications: 3D map generation, background modeling for visual surveillance, and image relighting and manipulation. Millions of cameras already record outdoor scenes continuously, and their number grows every year. A general photometric stereo framework that recovers the 3D information of an outdoor scene from such cameras would therefore be valuable, since the algorithm could immediately be applied to millions of scenes at no additional cost.
In this dissertation, we present a photometric stereo framework for 3D recovery of an outdoor scene from time-lapse images. We exploit natural illumination by modeling the outdoor lighting environment: we estimate the skylight distribution corresponding to the time and location at which each image was captured. We also investigate adding depth priors to the framework; these can be obtained from other available resources, such as satellite images or depth sensors, depending on the scale of the scene.
First, we present an outdoor photometric stereo method using images captured in a single day. We simulate a sky hemisphere for each image according to its GPS coordinates and timestamp, and parameterize the resulting hemisphere into a quadratic skylight distribution and a Gaussian sunlight distribution. Whereas previous methods typically model outdoor illumination as the sum of a constant ambient light and a distant point light source, our method models natural illumination according to a well-established sky model and thus provides sufficient constraints for shape reconstruction from single-day images. The estimated surface normals are refined by MRF optimization. We test our method on objects and scenes of various sizes under real-world outdoor daylight.
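To make the illumination model concrete, the following sketch shades a Lambertian point under a sky parameterized as a quadratic skylight term plus a Gaussian sunlight lobe, integrated over a discretized hemisphere. All function names and parameter values are illustrative assumptions, not the dissertation's fitted model.

```python
import numpy as np

def hemisphere_directions(n_theta=16, n_phi=32):
    """Discretize the upper hemisphere into unit directions and solid angles."""
    thetas = (np.arange(n_theta) + 0.5) * (np.pi / 2) / n_theta   # zenith angle
    phis = (np.arange(n_phi) + 0.5) * 2 * np.pi / n_phi           # azimuth angle
    T, P = np.meshgrid(thetas, phis, indexing="ij")
    dirs = np.stack([np.sin(T) * np.cos(P),
                     np.sin(T) * np.sin(P),
                     np.cos(T)], axis=-1).reshape(-1, 3)
    # Solid angle of each cell: sin(theta) * dtheta * dphi
    domega = (np.sin(T) * (np.pi / 2 / n_theta) * (2 * np.pi / n_phi)).reshape(-1)
    return dirs, domega

def sky_radiance(dirs, sun_dir, a, b, c, sun_peak, sun_sigma):
    """Quadratic skylight in the zenith angle plus a Gaussian sunlight lobe
    centered on the sun direction (a, b, c, sun_peak, sun_sigma are
    illustrative parameters)."""
    theta = np.arccos(np.clip(dirs[:, 2], -1.0, 1.0))
    skylight = a * theta**2 + b * theta + c
    ang = np.arccos(np.clip(dirs @ sun_dir, -1.0, 1.0))
    sunlight = sun_peak * np.exp(-(ang**2) / (2 * sun_sigma**2))
    return np.maximum(skylight, 0.0) + sunlight

def shade(normal, albedo, dirs, domega, radiance):
    """Lambertian intensity: integrate radiance * max(n . w, 0) over the sky."""
    cosine = np.clip(dirs @ normal, 0.0, None)
    return albedo * np.sum(radiance * cosine * domega)
```

Stacking one such shading equation per timestamped image yields the per-pixel constraints on the surface normal; a uniform unit-radiance sky, for instance, shades an upward-facing normal to approximately pi, as the analytic integral predicts.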
Second, we present an analysis of the environment lighting conditions that affect the quality of outdoor photometric stereo. The appearance of the sky hemisphere varies with the position of the sun and with the weather, and the amount and location of clouds in the sky alter how well the environment lighting can recover the surface normals of the scene. We examine the stability of photometric stereo using real sky images captured under different weather conditions, and introduce a modified sky model that estimates a skylight distribution well suited to outdoor photometric stereo.
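One standard way to quantify such stability (a sketch, not the dissertation's exact criterion) is the condition number of the matrix of per-image effective light vectors: in Lambertian photometric stereo the intensities satisfy i = L n, so a nearly rank-deficient L, as produced by uniformly overcast skies, makes normal recovery ill-posed. The example light vectors below are hypothetical.

```python
import numpy as np

def lighting_condition(light_vectors):
    """Condition number of the stacked per-image effective light vectors.
    A large value means the set of skies barely constrains the surface normal."""
    L = np.asarray(light_vectors, dtype=float)   # shape (num_images, 3)
    s = np.linalg.svd(L, compute_uv=False)
    return np.inf if s[-1] < 1e-12 else s[0] / s[-1]

# Hypothetical effective lights: a clear day sweeps the sun across the sky,
# while an overcast day leaves every image lit almost identically from above.
clear = [[0.8, 0.1, 0.6], [0.0, 0.0, 1.0], [-0.8, 0.1, 0.6]]
overcast = [[0.01, 0.0, 1.0], [0.0, 0.01, 1.0], [-0.01, 0.0, 1.0]]
```

Under this proxy, the clear-day lighting is far better conditioned than the overcast one, matching the observation that cloud cover degrades normal recovery.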
Finally, we apply depth priors to our outdoor photometric stereo framework for the final 3D scene recovery. The depth priors are obtained from an open-source 3D map that generates rough building shapes from satellite images with user assistance. The depth priors can also be obtained with a depth sensor; in that case, user assistance is unnecessary provided that the color camera supplying the photometric stereo input images and the depth sensor supplying the range measurements are correctly aligned. We present a calibration method for a time-of-flight (ToF) sensor and color camera pair that aligns the 3D measurements with the color image. We evaluate our calibration method quantitatively and qualitatively on various datasets, and validate its impact by demonstrating an RGB-D shape refinement application.
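The alignment the calibration enables can be sketched as the standard pinhole projection of ToF points into the color image, given the estimated rigid transform (R, t) between the two sensors and the color intrinsics K. This is a generic illustration of the projection step, not the dissertation's calibration procedure itself.

```python
import numpy as np

def project_tof_to_color(points_tof, R, t, K):
    """Map 3D points measured in the ToF frame into color-image pixel
    coordinates. R (3x3) and t (3,) are the estimated ToF-to-color rigid
    transform; K (3x3) is the color camera intrinsic matrix."""
    P = points_tof @ R.T + t        # points expressed in the color frame
    uv_h = P @ K.T                  # homogeneous pixel coordinates
    return uv_h[:, :2] / uv_h[:, 2:3]   # perspective divide
```

With the sensors calibrated, each color pixel receives a depth prior by looking up the projected ToF point, which is the input the RGB-D shape refinement application consumes.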