Reconstructing spatially resolved information about an extended object from an observed intensity diffraction pattern is a challenging problem in holographic imaging. By incorporating an explicit physical model, Lee and colleagues propose a deep learning method for holographic image reconstruction that remains robust under physical perturbations and generalizes well beyond the object-to-sensor distances and pixel sizes seen during training.

Holographic imaging poses the ill-posed inverse problem of retrieving complex amplitude maps from measured diffraction intensity patterns. Existing deep learning methods for holographic imaging often rely solely on the statistical relation between the given data distributions, which compromises their reliability in practical imaging configurations where physical perturbations arise in various forms, such as mechanical movement and optical fluctuation. Here, we present a deep learning method based on a parameterized physical forward model that reconstructs both the complex amplitude and the range of objects under highly perturbative configurations, where the object-to-sensor distance is set beyond the range of the training data. To establish reliability in practical biomedical applications, we demonstrate holographic imaging of red blood cells flowing in a cluster and of diverse types of tissue section, presented without any ground-truth data. Our results suggest that the proposed approach enables deep learning methods to adapt to deterministic perturbations, and therefore extends their applicability to a wide range of inverse problems in imaging.
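To make the idea of a parameterized physical forward model concrete, the sketch below implements standard angular-spectrum free-space propagation, in which the object-to-sensor distance `z` appears as an explicit, continuously tunable parameter of the hologram-formation model. This is a generic illustration of such a forward model, not the authors' implementation; the wavelength, pixel size, and distance values are illustrative assumptions.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel_size, z):
    """Propagate a complex field by distance z (angular spectrum method).

    The distance z is an explicit physical parameter, so the same model
    describes hologram formation at any object-to-sensor distance.
    """
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    # Squared longitudinal spatial frequency; negative values are evanescent.
    arg = (1.0 / wavelength) ** 2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    transfer = np.exp(1j * kz * z) * (arg > 0)  # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * transfer)

# Example: intensity-only (in-line) hologram of a weak phase object.
wavelength, pixel_size, z = 532e-9, 2e-6, 1e-3  # metres (illustrative values)
n = 256
yy, xx = np.mgrid[-n // 2 : n // 2, -n // 2 : n // 2] * pixel_size
phase = 0.5 * np.exp(-(xx**2 + yy**2) / (2 * (10 * pixel_size) ** 2))
obj = np.exp(1j * phase)  # unit-amplitude, phase-only object
hologram = np.abs(angular_spectrum_propagate(obj, wavelength, pixel_size, z)) ** 2
```

In a physics-informed reconstruction, a network's estimate of the complex amplitude is passed through a forward model of this kind, and the mismatch with the measured intensity `hologram` drives the optimization; because `z` is a model parameter rather than a property baked into training data, it can itself be estimated alongside the object.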