The color of a scene may vary from image to image because photographs are taken at different times, with different cameras, and under different camera settings. To align the colors of a scene across images, we introduce a novel color transfer framework based on a scattered point interpolation scheme. Unlike conventional color transfer methods that use a parametric mapping or color distribution matching, we solve for a fully nonlinear and nonparametric color mapping in the 3D RGB color space by employing the moving least squares framework. We further strengthen the transfer through probabilistic modeling in the 3D color space and spatial constraints that handle misalignments, noise, and spatially varying illumination. Experiments demonstrate the effectiveness of our method over previous color transfer methods, both quantitatively and qualitatively. In addition, our framework applies to various instances of color transfer, such as transfer between different camera models, camera settings, and illumination conditions, as well as to video color transfer.
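To make the core idea concrete, the following is a minimal sketch of a moving least squares color mapping over scattered correspondences, assuming paired source/target RGB samples as input. It fits, for each query color, a locally weighted affine transform in 3D RGB space; the function name and the inverse-distance weighting are illustrative choices, and the sketch omits the probabilistic modeling and spatial constraints of the full method.

```python
import numpy as np

def mls_color_transfer(src_colors, dst_colors, query, eps=1e-6):
    """Map a single RGB color `query` using an affine moving least squares
    fit over scattered correspondences src_colors[i] -> dst_colors[i].

    Hypothetical sketch: the full framework additionally uses probabilistic
    modeling and spatial constraints, which are not modeled here.
    """
    # Inverse-distance weights centered at the query color, so nearby
    # correspondences dominate the local fit.
    d2 = np.sum((src_colors - query) ** 2, axis=1)
    w = 1.0 / (d2 + eps)

    # Weighted least squares for a local affine map A: [r, g, b, 1] -> [r, g, b].
    X = np.hstack([src_colors, np.ones((len(src_colors), 1))])  # (n, 4)
    sw = np.sqrt(w)[:, None]
    A, *_ = np.linalg.lstsq(sw * X, sw * dst_colors, rcond=None)

    return np.append(query, 1.0) @ A
```

In a full mapping, this fit would be evaluated per query color (or on a 3D lattice for speed), yielding a nonparametric transform because the affine map changes smoothly with the query point.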