Many practical optimization problems involve elements of a very diverse
nature, described by mixtures of qualitative and quantitative
information, whose description may moreover be incomplete.
In this work we present an extension of the {\em breeder genetic
algorithm} that represents and manipulates this heterogeneous
information in a natural way. The algorithm is illustrated on a set of
optimization tasks involving the training of different kinds of neural
networks, and an extensive experimental study is presented to show its
potential.