The sense of touch plays an important role in reinforcing the presence of distant physical targets in telepresence applications. Among the various data representations used to define haptic cues, the point cloud is well suited to dynamically changing environments by virtue of its simplicity. However, ambiguous and noisy data, caused by the lack of geometric structure and the characteristics of the sensors, often lead to unreliable haptic feedback during interaction. To address this issue, we propose a robust point-cloud-based haptic rendering method for palpable exploration of remote environments that does not require reconstruction of surface meshes. Surface information, a key element of haptic rendering, is estimated directly from the raw point-cloud data by principal component analysis. The method is applied to a prototype server-client system over a real network as a proof of concept and for performance evaluation. The experimental results confirm the accuracy and stability of the proposed method both in estimating surface information and in handling contact situations.
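The abstract states that surface information is estimated directly from raw point-cloud data by principal component analysis. A common realization of this idea, sketched below under illustrative assumptions (the function name, the neighborhood size `k`, and the use of k-nearest neighbors are not specified in the abstract), takes the eigenvector of the local covariance matrix with the smallest eigenvalue as the surface normal:

```python
# Sketch: PCA-based surface-normal estimation from a raw point cloud.
# All names and the neighborhood size k are illustrative assumptions,
# not details taken from the paper.
import numpy as np

def estimate_normal(points: np.ndarray, query: np.ndarray, k: int = 8) -> np.ndarray:
    """Return a unit normal of the local surface patch around `query`.

    points : (N, 3) raw point cloud
    query  : (3,) position at which to estimate the surface normal
    k      : number of nearest neighbors forming the local patch
    """
    # Select the k nearest neighbors of the query point.
    dists = np.linalg.norm(points - query, axis=1)
    nbrs = points[np.argsort(dists)[:k]]
    # PCA: the eigenvector of the neighborhood covariance with the smallest
    # eigenvalue is the direction of least variance, i.e. the surface normal.
    centered = nbrs - nbrs.mean(axis=0)
    cov = centered.T @ centered / k
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigh: eigenvalues in ascending order
    normal = eigvecs[:, 0]
    return normal / np.linalg.norm(normal)

# Example on a noisy planar patch: the estimated normal should be close to ±z.
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(-1, 1, 200),
                       rng.uniform(-1, 1, 200),
                       rng.normal(0.0, 0.01, 200)])  # points near the z = 0 plane
n = estimate_normal(pts, np.array([0.0, 0.0, 0.0]))
print(abs(n[2]))  # close to 1 for a nearly horizontal patch
```

Because the normal is computed per query from a small local neighborhood, this kind of estimate can be refreshed continuously as the point cloud changes, which matches the abstract's emphasis on dynamically changing environments.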