Another effect we implemented was to modulate the application of color using 3D solid textures and 2D image textures. To apply solid textures, we use the vertex location as an index into a texture map and apply the corresponding texture color. For 2D textures we define a plane on which the texture resides and perform an orthogonal projection of the unparameterized 3D mesh points onto the texture plane. This gives a mapping from the mesh points to the texture. The user can control the position, orientation and scale of the 2D texture plane through a mouse-driven interface.
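A minimal sketch of the two texture lookups is given below. The vector, color, and texture types, as well as the plane parameters (origin $o$, axes $u$ and $v$, scale $s$), are illustrative stand-ins rather than the actual data structures of our system.

\begin{verbatim}
#include <functional>

struct Vec3  { float x, y, z; };
struct Color { float r, g, b; };

// Stand-ins for the system's texture lookups (hypothetical interfaces).
using Texture3D = std::function<Color(float, float, float)>;
using Texture2D = std::function<Color(float, float)>;

// 3D solid texture: the vertex position itself indexes the texture volume.
Color sampleSolidTexture(const Texture3D& tex, const Vec3& p) {
    return tex(p.x, p.y, p.z);
}

// 2D image texture: orthogonally project the vertex onto the user-placed
// texture plane (origin o, orthonormal axes u and v, uniform scale s) and
// use the resulting plane coordinates as texture coordinates.
Color sampleProjectedTexture(const Texture2D& tex, const Vec3& p,
                             const Vec3& o, const Vec3& u, const Vec3& v,
                             float s) {
    Vec3 d = { p.x - o.x, p.y - o.y, p.z - o.z };
    float tu = (d.x * u.x + d.y * u.y + d.z * u.z) / s;   // dot(d, u) / scale
    float tv = (d.x * v.x + d.y * v.y + d.z * v.z) / s;   // dot(d, v) / scale
    return tex(tu, tv);
}
\end{verbatim}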
We have also implemented several compositing filters that are applied to the paint as it is laid down on the surface. The simplest filter is the ``over'' filter. Using this filter, the paint from the brush replaces the paint at each affected vertex. The ``blend'' filter has a slider-selectable parameter $\alpha$ and performs standard alpha blending between the old mesh color and the new paint color. The ``distance'' filter is a special case of the blend filter for which $\alpha$ is proportional to the distance of each affected vertex from the tip of the brush.
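The three filters amount to the following per-vertex color updates, shown here as a sketch. The color type, the blending convention, and the normalization of the distance term by a brush radius are assumptions made for illustration.

\begin{verbatim}
#include <algorithm>
#include <cmath>

struct Color { float r, g, b; };
struct Vec3  { float x, y, z; };

// "over": the brush paint simply replaces the vertex color.
Color overFilter(const Color& /*oldColor*/, const Color& paint) {
    return paint;
}

// "blend": standard alpha blending between the old vertex color and the
// new paint, with alpha taken from a slider in [0, 1].
Color blendFilter(const Color& oldColor, const Color& paint, float alpha) {
    return { alpha * paint.r + (1.0f - alpha) * oldColor.r,
             alpha * paint.g + (1.0f - alpha) * oldColor.g,
             alpha * paint.b + (1.0f - alpha) * oldColor.b };
}

// "distance": a blend whose alpha is proportional to the vertex's distance
// from the brush tip; normalizing by a brush radius is an assumption here.
Color distanceFilter(const Color& oldColor, const Color& paint,
                     const Vec3& v, const Vec3& tip, float brushRadius) {
    float dx = v.x - tip.x, dy = v.y - tip.y, dz = v.z - tip.z;
    float dist  = std::sqrt(dx * dx + dy * dy + dz * dz);
    float alpha = std::min(dist / brushRadius, 1.0f);
    return blendFilter(oldColor, paint, alpha);
}
\end{verbatim}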
Each of the brushes we have described so far affects only the surface characteristics of the mesh. We can also change the geometry of the mesh using a displacement brush. Our displacement brush pulls mesh vertices that lie within the brush geometry in the direction of the brush. Although this is an effective way to change the surface geometry, it undermines the use of the physical object as a painting guide. In practice, however, we have found that if we apply small displacements, the physical object can still be used as a guide. A problem with the current implementation is that it is possible to produce objectionably long, thin triangles as we pull the surface. We could alleviate this problem by re-polygonalizing the triangles as we elongate them during the displacement.
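A sketch of the displacement brush's vertex update appears below. Since only the pull direction and the containment test are specified above, the spherical approximation of the brush geometry, the linear falloff, and the strength parameter are assumptions made for illustration.

\begin{verbatim}
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

// Pull every vertex inside the brush volume along the brush direction.
// The spherical brushRadius test, the linear falloff, and strength are
// assumed for this sketch.
void applyDisplacementBrush(std::vector<Vec3>& vertices,
                            const Vec3& brushTip, const Vec3& brushDir,
                            float brushRadius, float strength) {
    for (Vec3& v : vertices) {
        float dx = v.x - brushTip.x;
        float dy = v.y - brushTip.y;
        float dz = v.z - brushTip.z;
        float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
        if (dist > brushRadius) continue;                  // outside the brush geometry
        float w = strength * (1.0f - dist / brushRadius);  // falloff toward the rim (assumed)
        v.x += w * brushDir.x;
        v.y += w * brushDir.y;
        v.z += w * brushDir.z;
    }
}
\end{verbatim}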