Many studies have focused on choosing a model that represents the underlying process of a time series and on using that model to forecast the future. In real applications, however, a single model may not capture all relevant characteristics of the original series, and in such circumstances combining the forecasts from several models can yield better performance. The most popular combining methods take a weighted average of the individual forecasts, but these weights are usually unstable. When the forecast errors are normally distributed and unbiased, a Bayesian method can be used to update the weights; in many practical settings, however, these assumptions do not hold. This paper proposes a PNN (Probabilistic Neural Network) approach to combining forecasts that can be applied when the assumptions of normality or unbiasedness of the forecast errors are not satisfied.
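The weighted-average combination described above can be illustrated with a minimal sketch. The inverse-MSE weighting used here is one common scheme (and is illustrative only, not the specific method of this paper); the function names are our own.

```python
# Sketch: combining two forecasts by a weighted average whose weights
# are inversely proportional to each model's past mean squared error.
# This particular weighting rule is illustrative, not the paper's method.

def inverse_mse_weights(errors_a, errors_b):
    """Weights proportional to the inverse mean squared error of each model."""
    mse_a = sum(e * e for e in errors_a) / len(errors_a)
    mse_b = sum(e * e for e in errors_b) / len(errors_b)
    inv = [1.0 / mse_a, 1.0 / mse_b]
    total = sum(inv)
    return [w / total for w in inv]

def combine(forecast_a, forecast_b, weights):
    """Weighted average of two point forecasts."""
    return weights[0] * forecast_a + weights[1] * forecast_b

# Model A has been more accurate historically, so it receives the larger weight.
w = inverse_mse_weights([0.5, -0.4, 0.6], [1.5, -1.2, 1.8])
combined = combine(10.0, 12.0, w)
```

Because the weights depend on recent error histories, they shift as new observations arrive, which is exactly the instability that motivates an updating method such as the Bayesian or PNN approach.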
The PNN method has traditionally been used in pattern recognition. Because it resembles the Bayesian approach, we suggest it as a method for updating the unstable weights that arise when combining forecasts. Unlike the Bayesian approach, it does not require a specific prior distribution to be assumed, because it estimates the probability distribution directly from the data. Empirical results reveal that the PNN method offers superior predictive capabilities.
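The core of a PNN is a Parzen-window density estimate per class: each class's density at a query point is the average of a Gaussian kernel centered on that class's training samples, and the class with the highest estimated density wins. The sketch below shows this mechanism in its simplest one-dimensional form; the labels, data, and smoothing parameter are hypothetical, and this is not the paper's implementation.

```python
import math

def pnn_classify(x, training, sigma=0.5):
    """Minimal probabilistic neural network classifier.

    For each class, estimate the density at x as the average of a
    Gaussian kernel over that class's training samples (a Parzen
    estimate), then return the class with the highest density.
    `training` maps a class label to a list of 1-D training samples.
    `sigma` is a hand-picked smoothing parameter (illustrative only).
    """
    best_label, best_density = None, -1.0
    for label, samples in training.items():
        density = sum(
            math.exp(-((x - s) ** 2) / (2.0 * sigma ** 2)) for s in samples
        ) / len(samples)
        if density > best_density:
            best_label, best_density = label, density
    return best_label

# Hypothetical setup: past situations labeled by which model forecast better.
train = {
    "model_A_better": [0.1, 0.2, 0.3],
    "model_B_better": [1.8, 2.0, 2.2],
}
label = pnn_classify(0.25, train)
```

Because the class densities are built directly from the observed samples, no prior distribution needs to be specified, which is the property that makes the PNN attractive when the normality and unbiasedness assumptions behind the Bayesian update fail.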