It is well known that adopting an antenna array at the base station can effectively improve system capacity, service quality, and coverage in wireless communications. Space-time signal processing for code division multiple access (CDMA) systems is one of the most promising technologies, and many space-time receivers for the CDMA uplink have been proposed. One approach, based on constrained optimization, exploits only the knowledge of the desired user's spreading code and requires no other information such as a training sequence. This approach, called the minimum output energy (MOE) detector, is a blind adaptive technique that minimizes the interfering signal power while constraining the response to the desired user to be constant. In the ideal case of no signal modeling error and perfect chip timing, its performance approaches that of the MMSE detector at high signal-to-noise ratio.
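The constrained minimization behind the MOE detector can be sketched in a few lines: minimize the output energy w^H R w subject to the distortionless constraint w^H s = 1, whose closed-form solution is w = R^{-1} s / (s^H R^{-1} s). The setup below (code length, interferer amplitude, noise power) is purely illustrative, not the dissertation's simulation scenario.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: N-chip spreading codes for the desired user and one
# interferer, plus white noise; all names and parameters are illustrative.
N = 16
s = rng.choice([-1.0, 1.0], size=N) / np.sqrt(N)   # desired user's code (unit norm)
g = rng.choice([-1.0, 1.0], size=N) / np.sqrt(N)   # interferer's code

# Received-signal covariance R = E[r r^T] for unit-amplitude desired user,
# a strong interferer of amplitude A_i, and noise power sigma2.
sigma2 = 0.01
A_i = 3.0
R = np.outer(s, s) + A_i**2 * np.outer(g, g) + sigma2 * np.eye(N)

# MOE weight: minimize w^T R w subject to w^T s = 1,
# with closed-form solution w = R^{-1} s / (s^T R^{-1} s).
Rinv_s = np.linalg.solve(R, s)
w = Rinv_s / (s @ Rinv_s)

print(w @ s)        # distortionless response to the desired user: 1.0
print(abs(w @ g))   # response to the strong interferer is heavily suppressed
```

Note that only R (estimable from received data) and the desired user's code s enter the solution, which is what makes the detector blind.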
In high-data-rate CDMA systems, however, the MOE detector is highly sensitive to signal modeling error as well as to chip timing error. A slight shift from the ideal chip sampling time picks up inter-chip interference (ICI) from nearby chips, so the resulting sampled chip sequence of the desired user differs from the ideal one assumed at the receiver.
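The effect of a timing offset can be illustrated with a toy model: assuming a triangular chip pulse (a hypothetical choice for this sketch), sampling a fraction tau of a chip away from the ideal instant linearly blends each chip with its neighbor, and the correlation with the code assumed by the receiver falls below one.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sketch: with a triangular chip pulse, a timing offset of tau
# chips makes each sample a blend of the current and next chip (ICI).
c = rng.choice([-1.0, 1.0], size=32)   # ideal chip sequence
tau = 0.2                              # timing offset, fraction of a chip

# Mis-timed samples: linear mixture of adjacent chips.
c_shift = (1 - tau) * c + tau * np.roll(c, -1)

# Normalized correlation with the code the receiver assumes.
rho = (c_shift @ c) / (np.linalg.norm(c_shift) * np.linalg.norm(c))
print(rho)   # strictly less than 1: the sampled sequence mismatches the code
```

The mismatch grows with tau, which is why the MOE constraint built on the nominal code starts to treat part of the desired signal as interference.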
In this dissertation, we propose an improved MOE detector, called the robust blind space-time minimum variance detector, for a CDMA-based antenna array. In the presence of signal modeling error or timing error, conventional methods suppress the desired signal as if it were interference, and performance degrades accordingly. Since the desired signal lies in the intersection of the received-signal subspace and the range of the desired user's code matrix, we impose an additional requirement on the receiver's weight vector: it must lie in the signal subspace.
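The subspace requirement can be sketched as follows, under illustrative assumptions (known number of active users, sample covariance replaced by its exact model): estimate the signal subspace from the dominant eigenvectors of the covariance, then project the MOE weight onto that subspace and rescale to restore the distortionless constraint. This is a minimal sketch of the idea, not the dissertation's detector.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative scenario: N chips per symbol, K signal sources
# (desired user plus one interferer), white noise of power sigma2.
N, K = 16, 2
s = rng.choice([-1.0, 1.0], size=N) / np.sqrt(N)   # desired user's code
g = rng.choice([-1.0, 1.0], size=N) / np.sqrt(N)   # interferer's code
sigma2 = 0.01
R = np.outer(s, s) + 4.0 * np.outer(g, g) + sigma2 * np.eye(N)

# Signal subspace = span of the K largest eigenvectors of R.
eigval, eigvec = np.linalg.eigh(R)   # eigenvalues in ascending order
Us = eigvec[:, -K:]                  # N x K signal-subspace basis
P = Us @ Us.T                        # orthogonal projector onto the subspace

# MOE weight, then projected onto the signal subspace and rescaled so the
# distortionless response is restored; the projection keeps the weight out
# of noise directions that would otherwise cancel the desired signal.
w = np.linalg.solve(R, s)
w_proj = P @ w
w_proj /= w_proj @ s

print(w_proj @ s)                    # 1.0 by construction
```

Constraining the weight to the signal subspace removes the noise-subspace components through which a mismatched constraint would suppress the desired user.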
In the case of time varying multi-path channel, the sig...