This paper presents a body model server (BMS) that provides clients with real-time, networked access to the position and posture of a person's torso, arms, hands, head, and eyes. The BMS is designed as a device-independent data layer between the sensing devices and client applications that require real-time human motion data, such as animation control, and it delivers accurate information at up to 40 Hz. For data collection, the model uses four magnetic position/orientation sensors, two data gloves, and an eye tracker. The BMS combines the data streams from these sensors and transforms them into snapshots of the user's upper-body pose. A geometric model made up of joints and segments structures the input; the posture of the body is represented by joint angles. Two distinctive characteristics of our approach are the use of the implicit geometric constraints of the sensed body to simplify the computation of the unmeasured joint angles, and the use of time-stamped data that allows synchronization with other data streams, e.g., speech input. The paper describes the architecture of the BMS, including the management of multiple input devices, the representation and computation of the position and joint-angle data, and the client-server interface.
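To illustrate the role of time-stamped pose snapshots, the following is a minimal sketch of how a client might represent snapshots of joint angles and align one with an event from another stream (e.g., speech). The `PoseSnapshot` structure, joint names, and `nearest_snapshot` helper are hypothetical illustrations, not the paper's actual data format or API.

```python
from dataclasses import dataclass

@dataclass
class PoseSnapshot:
    # Timestamp in seconds; lets clients synchronize pose data with
    # other time-stamped streams such as speech input.
    timestamp: float
    # Joint name -> joint angle in radians (illustrative naming).
    joint_angles: dict

def nearest_snapshot(snapshots, event_time):
    """Return the snapshot whose timestamp is closest to event_time,
    as a client might do to pair a pose with a speech event."""
    return min(snapshots, key=lambda s: abs(s.timestamp - event_time))

# At 40 Hz, consecutive snapshots are 25 ms apart.
snapshots = [
    PoseSnapshot(timestamp=i * 0.025, joint_angles={"r_elbow": 0.1 * i})
    for i in range(5)
]

# A speech event at t = 0.06 s is matched to the snapshot at t = 0.05 s.
s = nearest_snapshot(snapshots, 0.06)
```

The timestamp-based lookup is what makes the BMS usable as a synchronization point: any client holding its own time-stamped events can merge them with pose data without the sensing devices knowing about each other.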