This thesis proposes two acceleration techniques for Evolutionary Algorithms (EA): a hybrid algorithm and Evolution Strategies with Hessian Covariance (ES-HC). The hybrid algorithm combines an evolutionary algorithm with a gradient search technique for optimization over continuous parameters. Inheriting the advantages of both approaches, the new method is fast and capable of global search. Its main structure is similar to that of an EA, except that a special individual called the gradient individual is introduced and the EA individuals are placed symmetrically. The gradient individual is propagated through the generations by the quasi-Newton method. The gradient information required by the quasi-Newton method is computed from the costs of the EA individuals produced by the evolution strategies (ES). The symmetric placement of the individuals about the best individual allows the gradient vector to be computed by the central difference method. For estimating the inverse Hessian matrix, the Symmetric Rank-One (SR1) update shows better performance than the Broyden-Fletcher-Goldfarb-Shanno (BFGS) and Davidon-Fletcher-Powell (DFP) methods. ES-HC also uses Hessian information: it generates individuals with a covariance obtained by decomposing the inverse Hessian matrix, without using the gradient individual. Numerical tests on various benchmark problems and a practical control design example demonstrate that the proposed methods converge faster than a plain EA, without sacrificing the capability of global search.
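The mechanics summarized above can be sketched in a minimal form: a central-difference gradient estimate from symmetric evaluation points, an SR1 update of the inverse Hessian approximation driving a quasi-Newton step for the gradient individual, and ES-HC-style sampling with a covariance factored from the inverse Hessian. All function names, the step size, and the quadratic test cost are illustrative assumptions, not the thesis's actual implementation; in particular, the thesis derives the gradient from the costs of the symmetrically placed ES individuals, whereas this sketch simply makes fresh symmetric evaluations.

```python
import numpy as np

def central_difference_gradient(f, x, h=1e-3):
    """Estimate the gradient of f at x from symmetric pairs x +/- h*e_i,
    mirroring the symmetric placement of individuals around the best one.
    (Hypothetical helper; step size h is an assumption.)"""
    n = len(x)
    g = np.zeros(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

def sr1_update(H, s, y, tol=1e-8):
    """Symmetric Rank-One update of the inverse Hessian approximation H,
    with s = x_new - x_old and y = g_new - g_old. Skips the update when
    the denominator is too small (a standard SR1 safeguard)."""
    v = s - H @ y
    denom = v @ y
    if abs(denom) <= tol * np.linalg.norm(v) * np.linalg.norm(y):
        return H  # skip an ill-conditioned (or zero) update
    return H + np.outer(v, v) / denom

def sample_eshc(x_best, H, n_samples, rng):
    """ES-HC-style sampling sketch: use a Cholesky factor of the inverse
    Hessian approximation as the mutation covariance around the best point.
    (Illustrative; the thesis's exact decomposition may differ.)"""
    L = np.linalg.cholesky(H)
    return x_best + (L @ rng.standard_normal((len(x_best), n_samples))).T

# Quasi-Newton propagation of a "gradient individual" on a toy quadratic cost.
f = lambda x: 0.5 * x @ np.diag([1.0, 10.0]) @ x
x = np.array([3.0, -2.0])
H = np.eye(2)                       # initial inverse Hessian estimate
g = central_difference_gradient(f, x)
for _ in range(20):
    x_new = x - H @ g               # quasi-Newton step
    g_new = central_difference_gradient(f, x_new)
    H = sr1_update(H, x_new - x, g_new - g)
    x, g = x_new, g_new

samples = sample_eshc(x, H, 5, np.random.default_rng(0))
```

On this quadratic the SR1 update recovers the exact inverse Hessian after a single correction pair, after which the quasi-Newton step jumps to the minimizer; the safeguard in `sr1_update` then skips the degenerate zero updates that follow.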