Competitive fitness functions can yield performance superior to that of absolute fitness functions [Angeline and Pollack 1993], [Hillis 1992]. This chapter describes a method by which competition can be implemented when training over a fixed (static) set of examples. Because new training cases cannot be generated by mutation or crossover, competition is instead realized by adapting the probabilities with which individual training cases are selected. We evolve decision trees for the problem of word sense disambiguation. The decision trees contain embedded bit strings, so bit-string crossover is intermingled with subtree swapping. To address the problem of overlearning, we have implemented a fitness penalty function, specialized for decision trees, that depends on the partition of the set of training cases implied by a decision tree. A minimal sketch of competitively adaptive case selection appears below.
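
The following Python sketch illustrates one way the selection probabilities over a static training set could adapt competitively. The specific update rule (shifting sampling weight toward cases the current population misclassifies most often) and all function names are assumptions made for illustration, not the exact scheme used in this chapter.

```python
import random

def adapt_case_weights(weights, error_rates, learning_rate=0.5):
    """Shift sampling probability toward hard training cases.

    weights      -- current selection weight per training case
    error_rates  -- fraction of the population misclassifying each case
    (Hypothetical update rule; the chapter's actual scheme may differ.)
    """
    new = [w * (1.0 - learning_rate) + e * learning_rate
           for w, e in zip(weights, error_rates)]
    total = sum(new) or 1.0
    return [w / total for w in new]          # renormalize to a distribution

def sample_cases(cases, weights, k):
    """Draw k training cases (with replacement) according to the weights."""
    return random.choices(cases, weights=weights, k=k)

# Usage: start from a uniform distribution over the fixed training set,
# then let the distribution adapt as the population's per-case error
# rates change from generation to generation.
cases = ["case_%d" % i for i in range(10)]          # stand-in training cases
weights = [1.0 / len(cases)] * len(cases)
error_rates = [random.random() for _ in cases]      # placeholder error rates
weights = adapt_case_weights(weights, error_rates)
subset = sample_cases(cases, weights, k=5)
```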