The performance of Neyman-Pearson detection of correlated random signals from noisy observations is considered. Applying the large deviations principle, performance is characterized by the error exponent of the miss probability under a fixed false-alarm probability. Exploiting the state-space structure of the signal and observation model, a closed-form expression for the error exponent is derived via the innovations approach, establishing a connection between the asymptotic behavior of the optimal detector and that of the Kalman filter. The properties of the error exponent are then investigated for the scalar case. The error exponent exhibits distinct behavior with respect to correlation strength: for signal-to-noise ratio (SNR) ≥ 1, the error exponent decreases monotonically as the correlation becomes stronger, whereas for SNR < 1 there is an optimal correlation strength that maximizes the error exponent at each SNR.
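As a concrete illustration of the innovations/Kalman connection (a sketch of the general idea, not the paper's derivation), consider the scalar Gauss-Markov model s[k+1] = a·s[k] + u[k] observed as y[k] = s[k] + w[k]. By Stein's lemma, the miss-probability exponent at a fixed false-alarm level equals the Kullback-Leibler divergence rate D(P0‖P1) between the noise-only and signal-present observation processes, and the steady-state Kalman filter under H1 makes this rate computable. The model parameters and the function name below are illustrative assumptions:

```python
import math

def error_exponent(a, var_u, var_w, iters=10_000):
    """Evaluate the miss-probability error exponent (the KL rate D(P0 || P1))
    for detecting a scalar Gauss-Markov signal s[k+1] = a*s[k] + u[k] in
    white Gaussian noise, using the steady-state Kalman filter under H1."""
    # Steady-state one-step prediction variance: iterate the scalar Riccati
    # recursion P <- a^2 * P * var_w / (P + var_w) + var_u to its fixed point.
    P = var_u
    for _ in range(iters):
        P = a * a * P * var_w / (P + var_w) + var_u
    Re = P + var_w          # steady-state innovation variance under H1
    K = P / (P + var_w)     # steady-state Kalman gain
    # Under H0 (noise only, y iid N(0, var_w)), the H1 one-step predictor
    # yhat[k+1] = a*(1-K)*yhat[k] + a*K*y[k] is an AR(1) system driven by y;
    # its stationary output variance is:
    var_yhat = (a * K) ** 2 * var_w / (1.0 - (a * (1.0 - K)) ** 2)
    # Per-sample KL divergence rate D(P0 || P1), expanding the innovations
    # form of the H1 likelihood and taking expectations under H0
    # (note yhat[k] depends only on past samples, so it is uncorrelated
    # with y[k] under H0):
    return 0.5 * math.log(Re / var_w) - 0.5 + (var_w + var_yhat) / (2.0 * Re)
```

For a = 0 the signal is white and the formula reduces to the familiar per-sample divergence D(N(0, σ_w²) ‖ N(0, σ_u² + σ_w²)). Fixing the stationary signal power Π = var_u / (1 − a²) so that SNR = Π/σ_w² stays constant while a varies reproduces the behavior described above: at SNR = 1, stronger correlation yields a smaller exponent.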