Abstract
This paper describes the evolutionary split and merge for expectation maximization (ESM-EM) algorithm and eight of its variants, which use split-and-merge operations to evolve Gaussian mixture models. Asymptotic time complexity analysis shows that the proposed algorithms are competitive with the state-of-the-art genetic-based expectation maximization (GA-EM) algorithm. Experiments performed on 35 data sets showed that ESM-EM can be computationally more efficient than the widely used approach of multiple EM runs (with different numbers of components and initializations). Moreover, a variant of ESM-EM that is free from critical parameters was shown to provide results competitive with those of GA-EM, even when the GA-EM parameters were fine-tuned a priori.