The performance of energy detection under multipath fading is analyzed and compared with that of locally optimal detection using Pitman's asymptotic relative efficiency. Under an L-tap finite impulse response channel model with zero-mean, independent and identically distributed tap coefficients, it is shown that the average performance loss of energy detection, measured in the sample size required to achieve the same performance, is no greater than 50% relative to locally optimal detection exploiting signal correlation. In addition, an algorithm that improves detection performance by estimating and exploiting the signal correlation is proposed. Numerical results show that the proposed algorithm nearly achieves the performance of locally optimal detection.
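As a rough illustration of the two detectors compared above, the sketch below contrasts the energy statistic (sum of squared samples, blind to correlation) with a quadratic statistic that weights lagged sample products by an estimated signal autocorrelation. All concrete choices here (tap count, sample size, unit noise variance, the particular statistic form) are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative setup: an L-tap FIR channel with i.i.d.
# zero-mean taps correlates a white input signal.
L, N = 4, 2000                      # channel taps, sample size (assumed)
h = rng.normal(0, 1 / np.sqrt(L), L)            # channel impulse response
s = np.convolve(rng.normal(size=N), h, "same")  # correlated received signal
y = s + rng.normal(size=N)          # samples under H1, unit noise variance

# Energy detector: sum of squared samples, ignores sample correlation.
T_ed = np.sum(y ** 2)

# Correlation-exploiting quadratic statistic (a stand-in for the locally
# optimal detector): add lagged products weighted by the autocorrelation
# estimated from the same samples, as in the proposed estimate-then-detect idea.
r_hat = np.array([np.mean(y[: N - k] * y[k:]) for k in range(1, L)])
T_corr = T_ed + 2 * sum(
    r_hat[k - 1] * np.sum(y[: N - k] * y[k:]) for k in range(1, L)
)

print(T_ed > 0, np.isfinite(T_corr))
```

In practice each statistic would be compared against a threshold set for a target false-alarm rate; the point of the comparison is only that the second statistic uses the correlation structure the energy detector discards.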