VRML Interactive Tutorial
Dragging Sensors
Dragging sensors are a special kind of sensor: they not only track the user's
motion but also move the objects within the same group as the sensor. There
are three types of dragging sensors:
PlaneSensor lets the user move objects in the XY plane.
CylinderSensor maps the movement onto the surface of a conceptual cylinder.
SphereSensor maps the movement onto the surface of a conceptual sphere.
The above sensors all share the following fields:
enabled defines whether the sensor is active.
offset indicates the initial position of the shapes within the
group. A zero offset means that dragging starts at the shapes'
original position, whereas a non-zero offset means that dragging
starts at the original position plus the specified offset. If
autoOffset is TRUE the browser overwrites offset at the end of each
drag, so the value you set only affects the first drag. Note that the
type of the offset field varies with the type of the sensor.
autoOffset specifies whether the browser should track the current
position or perform all dragging operations relative to the original position.
It is only relevant for the second and subsequent drags: if autoOffset
is TRUE then the second drag starts where the first one ended;
if FALSE then the shapes return to their original position each time
a new dragging operation begins.
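The shared fields above can be sketched in a small scene fragment. The geometry and field values here are illustrative assumptions; note that the offset type differs per sensor (SFVec3f for PlaneSensor, SFFloat for CylinderSensor, SFRotation for SphereSensor).

```vrml
#VRML V2.0 utf8
# Sketch of the shared dragging fields on a PlaneSensor.
Group {
  children [
    PlaneSensor {
      enabled TRUE      # the sensor responds to the pointing device
      autoOffset TRUE   # each new drag resumes where the last one ended
      offset 1 0 0      # SFVec3f: the first drag starts 1 unit along X
    }
    # A sibling shape in the same group activates the sensor,
    # but without a ROUTE to a Transform nothing moves yet.
    Shape { geometry Box {} }
  ]
}
```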
The following events are common to all the sensors:
isActive indicates whether a dragging operation is in progress.
isActive outputs TRUE when the user presses the mouse button over a shape
within the same group as the sensor, and FALSE when the button is released.
trackPoint_changed provides the actual coordinates of the pointer on the
surface defined by the sensor.
rotation_changed (SphereSensor and CylinderSensor) and translation_changed
(PlaneSensor) provide the relative orientation or translation being made.
In order to actually move the shapes you should place them inside a Transform
node. The Transform node should be in the same group as the sensor.
You then route these events to the corresponding fields of the Transform
node. See the examples provided for each sensor.
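The routing pattern just described can be sketched as follows; the DEF names DRAGGER and MOVER are arbitrary choices for this example.

```vrml
#VRML V2.0 utf8
# The sensor and the draggable shape share a group; the sensor's
# translation_changed eventOut drives the Transform holding the shape.
Group {
  children [
    DEF DRAGGER PlaneSensor {}
    DEF MOVER Transform {
      children Shape {
        appearance Appearance { material Material {} }
        geometry Box { size 1 1 1 }
      }
    }
  ]
}
# Dragging the box now updates the Transform, so the box follows the pointer.
ROUTE DRAGGER.translation_changed TO MOVER.set_translation
```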
If using multiple sensors in the same group, it is up to you to specify
which does what: they will all generate events when any of the shapes within
the group is affected.
If using multiple drag sensors in nested groups, the inner group's sensors
grab the user's action and the outer group's sensors ignore it.
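As a sketch of that nesting rule, the fragment below (with assumed geometry and DEF names) places a PlaneSensor in an inner group and a SphereSensor in the outer one: drags that start on the box are grabbed by the inner sensor, while drags that start on the sphere still reach the outer sensor.

```vrml
#VRML V2.0 utf8
Group {
  children [
    DEF OUTER SphereSensor {}
    DEF OUTER_T Transform {
      children Shape { geometry Sphere { radius 0.5 } }
    }
    # Inner group: its sensor takes precedence for shapes in this group.
    Group {
      children [
        DEF INNER PlaneSensor {}
        DEF INNER_T Transform {
          translation 2 0 0
          children Shape { geometry Box {} }
        }
      ]
    }
  ]
}
ROUTE INNER.translation_changed TO INNER_T.set_translation
ROUTE OUTER.rotation_changed TO OUTER_T.set_rotation
```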