An errors-in-variables model (EVM) differs from the classical regression model in that in the former the independent variable is also subject to error. This paper shows that, to assess the applicability of the ordinary least squares (OLS) estimation procedure to the EVM, the dispersion of the independent variable relative to its error variance must also be considered in addition to Mandel's criterion. The effect of physically reducing the variance of errors in the independent variable on the performance of the OLS slope estimator is also discussed.
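As an illustrative sketch (not taken from the paper), the standard attenuation result for the EVM can be checked by simulation: when the observed regressor carries additive measurement error, the OLS slope shrinks by the reliability ratio var(xi)/(var(xi) + var(u)), so it is the regressor's dispersion relative to its error variance that governs how well OLS performs. All parameter values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

beta = 2.0      # true slope (hypothetical)
sigma_xi = 3.0  # dispersion of the latent (error-free) regressor
sigma_u = 1.0   # standard deviation of the measurement error
sigma_e = 0.5   # standard deviation of the equation error
n = 200_000

xi = rng.normal(0.0, sigma_xi, n)     # latent regressor
x = xi + rng.normal(0.0, sigma_u, n)  # observed regressor with error
y = beta * xi + rng.normal(0.0, sigma_e, n)

# OLS slope of y on the error-contaminated x
b_ols = np.cov(x, y)[0, 1] / np.var(x)

# Theoretical attenuated slope: beta times the reliability ratio
lam = sigma_xi**2 / (sigma_xi**2 + sigma_u**2)
print(round(b_ols, 3), round(beta * lam, 3))
```

With these values the reliability ratio is 9/10, so the OLS slope converges to about 1.8 rather than the true slope 2.0; increasing sigma_xi relative to sigma_u pushes the ratio toward 1 and restores the classical OLS behavior.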