It is well known that standard learning classifier systems, when applied to many different domains, exhibit a number of problems: payoff oscillation, difficulty in regulating the interplay between the reward system and the background genetic algorithm (GA), instability of rule chains, and instability of default hierarchies, among others. ALECSYS is a parallel version of a standard learning classifier system (CS) and, as such, suffers from these same problems. In this paper we propose some novel solutions to some of these problems, introducing the following original features. Mutespec is a new genetic operator used to specialize potentially useful classifiers. Energy is a quantity introduced to measure global convergence, so that the genetic algorithm is applied only when the system is close to a steady state. Dynamic adjustment of the cardinality of the classifier set speeds up the performance phase of the algorithm. We present results of experiments run in a simulated two-dimensional world in which a simple agent learns to follow a light source.
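The energy-gated GA idea can be sketched as follows. This is a minimal illustration under stated assumptions: the energy definition (mean absolute change in classifier strengths between cycles) and the threshold value are hypothetical choices made for this sketch, not the measure actually defined for ALECSYS.

```python
# Illustrative sketch: apply the GA only when the classifier system is
# near a steady state, detected via a scalar "energy" that tracks how
# much classifier strengths are still changing between cycles.
# The energy definition and threshold here are assumptions for illustration.

def energy(old_strengths, new_strengths):
    """Mean absolute change in classifier strengths; near zero at steady state."""
    return sum(abs(n - o) for o, n in zip(old_strengths, new_strengths)) / len(old_strengths)

def should_apply_ga(strengths_history, threshold=0.01):
    """Trigger the GA only when the most recent energy falls below threshold."""
    if len(strengths_history) < 2:
        return False  # not enough cycles to estimate convergence yet
    e = energy(strengths_history[-2], strengths_history[-1])
    return e < threshold
```

The design intent is that running the GA while strengths are still oscillating would destroy useful rules before their payoff estimates stabilize; gating on a convergence measure restricts genetic search to quiescent phases.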