One solution to this problem would be to register the physical object as it is being scanned by the 3D scanner. Assuming the scanner always creates a mesh in the same coordinate system for each scan, we can preregister the tracker coordinate system to this mesh coordinate system using Besl's algorithm. Then, scanning any new object will automatically register it to the tracking system. However, this approach fails when we combine multiple scans using the zipper software, because the physical object must be moved between scans and so we lose the correspondence between the mesh and the object.
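As a concrete illustration of this pre-registration step, the sketch below runs point-to-point iterative closest point (ICP), the algorithm of Besl and McKay, aligning a set of surface points sampled with the tracker stylus against the scanned mesh vertices. The NumPy/SciPy implementation, the function names, and the convergence threshold are our own illustrative choices, not part of the original system.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst (SVD/Kabsch method)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def icp(tracker_pts, mesh_pts, iters=50, tol=1e-6):
    """Point-to-point ICP: register tracker-space samples to the mesh vertices."""
    tree = cKDTree(mesh_pts)
    pts = tracker_pts.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(iters):
        dists, idx = tree.query(pts)                 # closest mesh vertex per sample
        R, t = best_rigid_transform(pts, mesh_pts[idx])
        pts = pts @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        err = dists.mean()
        if abs(prev_err - err) < tol:                # stop once the fit stops improving
            break
        prev_err = err
    return R_total, t_total                          # maps tracker frame -> mesh frame
```

Once (R, t) has been computed for one scan, every subsequent stylus reading can be mapped into mesh coordinates with the same transform, which is why scanning a new object in place registers it automatically.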
Ensuring that the object does not move once it has been registered can make painting awkward and unnatural. Allowing the object to be moved would let the user paint more comfortably. One way to permit such object movement would be to attach a second space tracker sensor to the object and then track the movement of the object in addition to the movement of the brush.
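A minimal sketch of how such a second sensor could be used: with both the brush sensor and the object sensor reporting poses in the tracker's world frame, the brush tip can be re-expressed in object-local coordinates each frame, so the registration survives object motion. The pose representation and function names below are hypothetical, chosen only for illustration.

```python
import numpy as np

def pose_to_matrix(R, t):
    """Build a 4x4 homogeneous transform from a sensor's rotation and position."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def brush_in_object_frame(brush_pose, object_pose):
    """Express the brush pose in object-local coordinates.

    brush_pose, object_pose: (R, t) pairs reported by the two tracker
    sensors, both in the tracker's world frame. Paint is applied in the
    mesh's (object-local) frame, so moving the object no longer breaks
    the correspondence between mesh and physical object.
    """
    T_brush = pose_to_matrix(*brush_pose)
    T_object = pose_to_matrix(*object_pose)
    return np.linalg.inv(T_object) @ T_brush     # object^-1 * brush

# Usage sketch: each frame, re-express the brush before applying paint.
#   T_local = brush_in_object_frame((R_brush, t_brush), (R_obj, t_obj))
#   brush_tip_local = T_local[:3, 3]
```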
A disadvantage of our approach is that we can only paint meshes for which we have a corresponding physical object. Thus, we cannot directly paint a mesh created with a modeling or CAD program, for example. However, several new rapid prototyping technologies have recently been developed for synthesizing 3D objects directly from computer models [7, 12]. Although it would be a considerable expense, with such a prototyping system we could create a physical object representing almost any mesh and then use it as a guide for painting on the mesh.
Another problem is that the user moves the sensor along the physical object while paint is applied only to the mesh on the monitor. Thus, the user must look in two places at once to see where the paint is being applied. This problem is reduced by placing the physical object in front of the monitor while painting.
One of the problems with polygon meshes is that they are hard to animate. Many animators are used to manipulating the control points of curved surface patches, not the vertices of an irregular mesh. Furthermore, they want to manipulate only a few control points, not the hundreds of thousands of vertices in our typical mesh. One solution we are investigating is to fit NURBS patches to our meshes. The boundaries of these patches would be specified by tracing them with our system. In this case we would replace our space-filling brushes with an algorithm that chains together mesh vertices lying along the path traced out by the stylus, as sketched below.
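One plausible form of that chaining step, shown below, greedily walks the mesh's vertex adjacency graph, stepping to whichever neighbor best approaches each successive stylus sample. The data layout (`vertices`, `adjacency`) and the greedy strategy are illustrative assumptions, not the algorithm we have implemented.

```python
import numpy as np

def chain_vertices(stylus_path, vertices, adjacency):
    """Chain mesh vertices lying along a traced stylus path (greedy sketch).

    stylus_path: (k, 3) sampled stylus positions, already in mesh coordinates.
    vertices:    (n, 3) mesh vertex positions.
    adjacency:   list of neighbor index lists, one per vertex.
    Returns an ordered list of vertex indices approximating the traced curve.
    """
    # Start at the vertex nearest the first stylus sample.
    current = int(np.argmin(np.linalg.norm(vertices - stylus_path[0], axis=1)))
    chain = [current]
    for target in stylus_path[1:]:
        # Step along edges toward the next path sample, so the chain stays
        # on the mesh surface while following the stroke.
        while adjacency[current]:
            dists = [np.linalg.norm(vertices[n] - target) for n in adjacency[current]]
            best = adjacency[current][int(np.argmin(dists))]
            if np.linalg.norm(vertices[best] - target) >= \
               np.linalg.norm(vertices[current] - target):
                break                    # no neighbor improves; advance to next sample
            current = best
            chain.append(current)
    return chain
```

Each step strictly decreases the distance to the current path sample, so the walk terminates; the resulting vertex chain could then serve as a candidate patch boundary for the NURBS fit.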