Artificial neural networks (ANNs) have been successfully applied to a wide variety of problems. However, determining a suitable set of structural and learning parameter values for an ANN remains a difficult task. This article is concerned with the robust design of multilayer feedforward neural networks trained by the backpropagation algorithm (backpropagation net, BPN) and develops a systematic, experimental strategy that emphasizes the simultaneous optimization of BPN parameters under various noise conditions. Unlike previous work, the present robust design problem is formulated as a Taguchi dynamic parameter design problem, together with fine-tuning of the BPN output when necessary. A series of computational experiments is also conducted using data sets from various sources. From the computational results, statistically significant effects of the BPN parameters on the robustness measure (i.e., the signal-to-noise ratio) are identified, based upon which an economical experimental strategy is derived. It is also shown that fine-tuning the BPN output is effective in improving the signal-to-noise ratio. Finally, the step-by-step procedure for implementing the proposed approach is illustrated with an example. (C) 2004 Elsevier Ltd. All rights reserved.
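As background to the robustness measure mentioned above, the following is a minimal sketch of the standard Taguchi dynamic signal-to-noise (SN) ratio, SN = 10 log10(beta^2 / MSE), where beta is the slope of the zero-intercept ideal function y = beta*M fitted by least squares. The data values and function name are illustrative assumptions, not taken from the article.

```python
import numpy as np

def dynamic_sn_ratio(M, y):
    """Taguchi dynamic SN ratio for signal levels M and responses y.

    beta is the least-squares slope of y = beta*M (no intercept);
    MSE is the mean squared deviation of y from the fitted line.
    """
    M = np.asarray(M, dtype=float)
    y = np.asarray(y, dtype=float)
    beta = (M @ y) / (M @ M)             # slope through the origin
    mse = np.mean((y - beta * M) ** 2)   # error-variance estimate
    return 10.0 * np.log10(beta ** 2 / mse), beta

# Illustrative data: two noise replicates at each of three signal levels.
M = [1.0, 1.0, 2.0, 2.0, 3.0, 3.0]   # signal-factor levels
y = [1.1, 0.9, 2.2, 1.9, 3.1, 2.8]   # observed responses under noise
sn, beta = dynamic_sn_ratio(M, y)
```

A larger SN value indicates a response that tracks the signal factor closely relative to the noise-induced scatter, which is the sense in which the article's parameter settings are judged "robust."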