BERT has recently shown substantial performance improvements on a variety of NLP tasks and has been applied to many domains, including the biomedical field. In the clinical domain in particular, the semantic relationships between sentences are essential for understanding a patient's medical record and health history from physical examination notes. However, the pre-training method of the current Clinical BERT model struggles to capture sentence-level semantics. To address this problem, we propose contrastive representations pre-training (CRPT), which enhances contextual meaning across sentences by replacing the cross-entropy loss with a contrastive loss in the next sentence prediction (NSP) task. We also aim to improve performance by replacing random masking with whole word masking (WWM) in the masked language model (MLM). In particular, we focus on enhancing the language representations of the BERT model by pre-training on discharge summaries to better suit clinical studies. We demonstrate that our CRPT strategy yields performance improvements on clinical NLP tasks in the BLUE (Biomedical Language Understanding Evaluation) benchmark.
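To give a rough sense of the idea of swapping cross-entropy for a contrastive objective on sentence pairs, the sketch below shows a generic InfoNCE-style loss over sentence embeddings, where each sentence's true next sentence is the positive and the other next sentences in the batch serve as negatives. This is only an illustration under assumed names (contrastive_nsp_loss, temperature) and is not the exact loss defined in the paper.

import torch
import torch.nn.functional as F

def contrastive_nsp_loss(first_emb: torch.Tensor,
                         next_emb: torch.Tensor,
                         temperature: float = 0.1) -> torch.Tensor:
    """Illustrative InfoNCE-style contrastive loss for NSP-like sentence pairs.

    first_emb, next_emb: (batch, dim) embeddings of sentence i and its true
    next sentence i+1 (e.g., [CLS] vectors from a BERT encoder).
    """
    # L2-normalize so the dot product is cosine similarity.
    first_emb = F.normalize(first_emb, dim=-1)
    next_emb = F.normalize(next_emb, dim=-1)
    # Similarity of every first sentence to every candidate next sentence.
    logits = first_emb @ next_emb.t() / temperature      # (batch, batch)
    # The diagonal entries correspond to the true consecutive pairs.
    targets = torch.arange(first_emb.size(0), device=first_emb.device)
    return F.cross_entropy(logits, targets)

# Usage with random tensors standing in for BERT sentence embeddings.
a = torch.randn(8, 768)
b = torch.randn(8, 768)
loss = contrastive_nsp_loss(a, b)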