In this paper, we propose a method for editing the scene appearance of light-field images. Our method enables users to manipulate the illumination and material properties of scenes captured in light-field format, offering versatile control over image appearance, including dynamic relighting and material appearance modification. This control is made possible by an inverse rendering framework designed specifically for light-field images. By decomposing light fields into appearance parameters, such as diffuse albedo, normal, specular intensity, and roughness, within a multi-plane image domain, we overcome the challenges that have traditionally hindered neural inverse rendering on light-field data, namely fronto-parallel views and a limited number of input images. Our method also approximates environmental illumination using spherical Gaussians, significantly enhancing the realism of rendered scene reflectance. Furthermore, by separating scene illumination into far-bound and near-bound light environments, our method enables highly realistic editing of scene appearance and illumination, particularly for local illumination effects. This separation also allows efficient real-time relighting and integrates seamlessly with existing layered light-field rendering frameworks. We demonstrate our method on casually captured light-field images.
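To make the spherical-Gaussian illumination model concrete, the sketch below shows one common way an environment can be represented as a small mixture of spherical Gaussian (SG) lobes, each defined by a lobe axis, a sharpness, and an RGB amplitude. The function names (`sg_eval`, `environment_radiance`) and the three-lobe setup are illustrative assumptions, not the paper's actual parameterization or fitting procedure.

```python
# Illustrative sketch (not the authors' implementation): approximating an
# environment map with a mixture of spherical Gaussian (SG) lobes and
# evaluating incoming radiance along a query direction.
import numpy as np

def sg_eval(direction, lobe_axis, sharpness, amplitude):
    """Evaluate one SG lobe: G(v) = a * exp(lambda * (dot(v, mu) - 1))."""
    d = direction / np.linalg.norm(direction)
    mu = lobe_axis / np.linalg.norm(lobe_axis)
    return amplitude * np.exp(sharpness * (np.dot(d, mu) - 1.0))

def environment_radiance(direction, lobes):
    """Sum the contributions of all SG lobes approximating the environment."""
    return sum(sg_eval(direction, mu, lam, a) for mu, lam, a in lobes)

if __name__ == "__main__":
    # Hypothetical 3-lobe environment: a bright overhead light plus two fills.
    lobes = [
        (np.array([0.0, 1.0, 0.0]), 8.0, np.array([2.0, 1.9, 1.7])),   # sky
        (np.array([1.0, 0.2, 0.0]), 3.0, np.array([0.4, 0.3, 0.2])),   # warm fill
        (np.array([-0.5, 0.1, 0.8]), 2.0, np.array([0.2, 0.25, 0.35])),  # cool fill
    ]
    view_direction = np.array([0.3, 0.8, 0.1])
    print(environment_radiance(view_direction, lobes))
```

Because each lobe has a closed-form expression, shading integrals against such a mixture can be evaluated analytically or with cheap approximations, which is one reason SG representations are often used for real-time relighting.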