Recently, a new so-called energy complexity measure has been introduced and studied for feedforward perceptron networks. This measure is inspired by the fact that biological neurons require more energy to transmit a spike than not to fire, and that the activity of neurons in the brain is quite sparse, with only about 1% of neurons firing. In this letter, we investigate the energy complexity of recurrent networks, which counts the number of active neurons at any time instant of a computation. We prove that any deterministic finite automaton with m states can be simulated by a neural network of optimal size s = Θ(√m) with a time overhead of O(s/e) per one input bit, using energy O(e), for any e such that e = Ω(log s) and e = O(s), which shows a time-energy trade-off in recurrent networks. In addition, for a time overhead τ satisfying τ^τ = o(s), we obtain a lower bound of s^(c/τ) on the energy of such a simulation, for some constant c > 0 and for infinitely many s.
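The energy measure above can be illustrated with a minimal sketch (an assumption for exposition, not the optimal-size construction from the letter): a recurrent network of threshold units that maintains the current DFA state in a one-hot code, so that exactly one neuron is active at every time instant. The function names, the parity automaton, and the weight scheme below are all hypothetical illustrations.

```python
def simulate_dfa_network(delta, q0, word):
    """Simulate a DFA by a recurrent threshold network with one neuron
    per state (one-hot code), and record the energy, i.e. the number of
    active neurons, at each time instant.

    delta: dict mapping (state, symbol) -> state
    Returns (final_state, per-step energy list)."""
    states = sorted({q for (q, _) in delta} | set(delta.values()) | {q0})
    # one-hot activation vector: x[i] = 1 iff states[i] is the current state
    x = [1 if q == q0 else 0 for q in states]
    energies = [sum(x)]  # energy = number of active neurons now
    for a in word:
        # threshold update with 0/1 weights and threshold 1:
        # neuron j fires iff some active neuron i satisfies delta(i, a) = j
        x = [1 if sum(x[i] for i, qi in enumerate(states)
                      if delta[(qi, a)] == qj) >= 1 else 0
             for qj in states]
        energies.append(sum(x))
    final = states[x.index(1)]
    return final, energies

# parity automaton over {0, 1}: the state flips on reading a 1
delta = {("even", "0"): "even", ("even", "1"): "odd",
         ("odd", "0"): "odd", ("odd", "1"): "even"}
final, energies = simulate_dfa_network(delta, "even", "1101")
```

This naive one-hot scheme uses size m and energy 1 per step; the point of the trade-off stated above is that size can be reduced to the optimal s = Θ(√m) at the price of energy O(e) and time overhead O(s/e) per input bit.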