Virtual humans are increasingly used in VR applications, but animating them remains a challenge, especially when complex tasks must be carried out in interaction with the user. In many applications involving virtual humans, credible virtual characters play a major role in presence. Motion editing techniques assume that natural laws are intrinsically encoded in prerecorded trajectories and that modifications can preserve these laws, leading to credible autonomous actors. However, complete knowledge of all the constraints is required to ensure continuity, or to synchronize and blend the several actions needed to achieve a given task. We propose a framework capable of performing these tasks in an interactive environment that can change at each frame, depending on the user's orders. This framework enables VR applications to animate from dozens of characters in real time under complex constraints, to hundreds of characters if only ground adaptation is performed. It offers the following capabilities: motion synchronization, blending, retargeting, and adaptation, thanks to an enhanced inverse kinetics and kinematics solver. To evaluate this framework, we compared the motor behavior of subjects in real and virtual environments.
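The abstract above mentions an inverse kinematics solver as a core component of motion adaptation. As an illustration only, the following is a minimal sketch of one damped least-squares (Levenberg-Marquardt) iteration, a standard numerical technique for such solvers; the 2-link planar arm and all function names are assumptions for this example, not the framework's actual implementation.

```python
# Illustrative sketch: one damped least-squares IK step on an assumed
# planar 2-link arm (NOT the paper's solver).
import numpy as np

def end_effector(theta, lengths):
    """Position of the arm tip for joint angles theta."""
    l1, l2 = lengths
    t1, t12 = theta[0], theta[0] + theta[1]
    return np.array([l1*np.cos(t1) + l2*np.cos(t12),
                     l1*np.sin(t1) + l2*np.sin(t12)])

def jacobian(theta, lengths):
    """Jacobian of the end-effector position w.r.t. the joint angles."""
    l1, l2 = lengths
    t1, t12 = theta[0], theta[0] + theta[1]
    return np.array([
        [-l1*np.sin(t1) - l2*np.sin(t12), -l2*np.sin(t12)],
        [ l1*np.cos(t1) + l2*np.cos(t12),  l2*np.cos(t12)],
    ])

def dls_ik_step(theta, target, lengths, damping=0.1):
    """One damped least-squares update: (J^T J + lambda^2 I)^-1 J^T e.
    The damping term keeps the step stable near singular poses."""
    J = jacobian(theta, lengths)
    err = target - end_effector(theta, lengths)
    JtJ = J.T @ J + (damping**2) * np.eye(2)
    return theta + np.linalg.solve(JtJ, J.T @ err)

# Iterating the step drives the end effector toward a reachable target.
theta = np.array([0.3, 0.5])
target = np.array([1.2, 0.8])
for _ in range(50):
    theta = dls_ik_step(theta, target, (1.0, 1.0))
print(end_effector(theta, (1.0, 1.0)))  # close to the target position
```

In an interactive setting such as the one described above, a few such iterations per frame are typically enough, since the target moves only slightly between frames.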