In this paper, we propose an approach to real-time haptic interaction based on the concept of simulating the constraining properties of space. Research on haptic interaction has been conducted from the points of view of both surface and volume rendering. Most approaches to surface rendering, such as the constraint-based god-object method, the point-based approach, and the virtual proxy approach, have dealt only with interaction with an object surface. In volume rendering approaches, by contrast, algorithms for representing volume data through interactions in space have been investigated. Our approach provides a framework for representing haptic interaction with both surfaces and space. We discretize the space into a tetrahedral cell mesh and associate a constraining property with each cell. The interaction of the haptic interface points with a volume is simulated using the constraining properties of the cells occupied by that volume. We implemented a fast computation algorithm that runs at the haptic update rate. The algorithm is robust in that sudden or rapid user motion does not disrupt the computation, and the computation time per cycle is independent of the overall complexity of the model. To demonstrate the performance of the proposed method, we present experimental results on interaction with models of varying complexity. We also discuss problems that remain to be solved in future work.
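As a rough illustration of the core idea (not the authors' implementation), the sketch below locates a query point in a tetrahedral cell mesh via barycentric coordinates and returns the constraining property stored on the containing cell; all names and the linear-scan lookup are illustrative assumptions, since a haptic-rate implementation would use a spatially coherent (e.g. neighbor-walking) search rather than scanning every cell.

```python
# Hypothetical sketch: per-cell constraining properties on a
# tetrahedral mesh, queried by point location. Not the paper's code.

def det3(a, b, c):
    # Determinant of a 3x3 matrix given as three row vectors.
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
            - a[1] * (b[0] * c[2] - b[2] * c[0])
            + a[2] * (b[0] * c[1] - b[1] * c[0]))

def barycentric(p, tet):
    # Barycentric coordinates of point p in tetrahedron tet = (v0, v1, v2, v3).
    v0, v1, v2, v3 = tet
    e1 = [v1[i] - v0[i] for i in range(3)]
    e2 = [v2[i] - v0[i] for i in range(3)]
    e3 = [v3[i] - v0[i] for i in range(3)]
    d = [p[i] - v0[i] for i in range(3)]
    vol = det3(e1, e2, e3)          # signed volume scale of the cell
    b1 = det3(d, e2, e3) / vol      # Cramer's rule for each coordinate
    b2 = det3(e1, d, e3) / vol
    b3 = det3(e1, e2, d) / vol
    return (1.0 - b1 - b2 - b3, b1, b2, b3)

def locate(p, cells):
    # cells: list of (tetrahedron, constraining_property) pairs.
    # Returns (cell index, property) for the cell containing p, else None.
    # A real haptic loop would walk from the previously occupied cell
    # instead of scanning, keeping per-cycle cost model-size independent.
    for idx, (tet, prop) in enumerate(cells):
        if all(b >= -1e-9 for b in barycentric(p, tet)):
            return idx, prop
    return None
```

For example, with a single unit tetrahedron tagged as free space, `locate((0.25, 0.25, 0.25), cells)` finds the cell, while a point outside every cell yields `None`.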