PC-based Telerehabilitation System with Force Feedback

Report
Electrical and Computer Engineering Dept.
VR PROGRAMMING
VR Toolkits
System architecture
VR Programming Toolkits
- Are extensible libraries of object-oriented functions designed to help the VR developer;
- Support the common I/O devices used in VR (so drivers need not be written by the developer);
- Allow import of CAD models (saves time), editing of shapes, specification of object hierarchies, collision detection, multiple levels of detail, shading and texturing, and run-time management;
- Have built-in networking functions for multi-user interactions, etc.

VR Toolkits can be classified by:
- Whether they use text-based or graphical programming;
- The type of language used and the library size;
- The type of I/O devices supported;
- The type of rendering supported;
- Whether they are general-purpose or application-specific;
- Whether they are proprietary (more functionality, better documented) or public domain (free, but with less documentation and functionality).
VR Toolkits in the Early '90s
RenderWare (Canon), VRT3/Superscape (Dimension Ltd.), Cyberspace Developer Kit (Autodesk), Cosmo Authoring Tool (SGI/Platinum/CA), Rend386 and others;
- They allowed either text-based programming (RenderWare, CDK and Rend386) or graphical programming (Superscape and Cosmo);
- They were platform-independent and generally did not require graphics acceleration hardware;
- As a result they tended to use "low-end" I/O devices (the mouse) and to support flat shading to maintain fast rendering.

Rend386 scene
VR Toolkits discussed in this chapter

Name | Application | Prgm mode | Proprietary | Language
--- | --- | --- | --- | ---
Java3D (Sun Micro) | General purpose | text | no | Implemented in C, programming in Java
Vizard Toolkit and PeoplePak (WorldViz) | General purpose | text/graph | yes | OpenGL-based Python scripting language
GHOST (SensAble Technologies) | Haptics for PHANToM | text | yes | C++
H3D | Haptics/graphics | text | no | C++
PeopleShop (Boston Dynamics) | Military/civilian | graph | yes | C/C++
Unity 3D | Game engine | text/graph | yes | JavaScript, C#, and Python
The scene graph:
- Is a hierarchical organization of the objects (visible or not) in the virtual world (or "universe"), together with the view of that world;
- Scene graphs are represented by a tree structure, with nodes connected by branches;
- Visible objects are represented by external nodes, called leaves (they have no children); example: nodes F, G, H, I;
- Internal nodes represent transformations (which apply to all their children).
[Figure: example scene-graph tree. Root node A; internal nodes B, C, D, E, J; external (leaf) nodes F, G, H, I.]
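The parent-child transform rule above can be sketched with a minimal tree. This is an illustrative Python sketch, not any toolkit's actual API; every class and method name here is hypothetical, and a single translation stands in for a full 4x4 transform.

```python
# Minimal scene-graph sketch: internal nodes carry transforms that apply
# to all of their children; visible geometry lives in external (leaf) nodes.

class Node:
    def __init__(self, name, translation=(0.0, 0.0, 0.0)):
        self.name = name
        self.translation = translation      # stand-in for a full transform
        self.children = []
        self.parent = None

    def add_child(self, child):
        if child.parent is not None:        # a node has exactly one parent
            child.parent.children.remove(child)
        child.parent = self
        self.children.append(child)

    def world_position(self):
        # Accumulate translations from the root down to this node.
        tx, ty, tz = self.translation
        if self.parent is not None:
            px, py, pz = self.parent.world_position()
            tx, ty, tz = tx + px, ty + py, tz + pz
        return (tx, ty, tz)

    def is_leaf(self):
        return not self.children

# Build: scene -> palm, and a free-standing ball.
scene = Node("scene")
palm = Node("palm", translation=(0.0, 1.0, 0.0))
ball = Node("ball", translation=(0.5, 0.0, 0.0))
scene.add_child(palm)
scene.add_child(ball)          # ball is a child of the scene: it stays put
print(ball.world_position())   # (0.5, 0.0, 0.0)

# Re-parent the ball under the palm (grasping): the palm's transform
# now applies to the ball as well.
palm.add_child(ball)
print(ball.world_position())   # (0.5, 1.0, 0.0)
```

The re-parenting at the end mirrors the grasping example that follows: moving the ball under the palm makes it inherit the palm's transform.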
Scene graphs are not static. Initially the scene graph shows that the ball is a child of "scene"; after grasping, the graph is modified so that the ball is now a child of the palm (VC 6.1 on book CD).
[Figure: control-panel scene graph. Initially the scene has two children, the palm (with thumb, index, middle, ring and pinkie fingers) and the panel (with a button and knobs 1-4). After grasping, Knob 1 has been re-parented from the panel to the palm.]
Authoring (modeling) stages:
Model geometry → Define scene graph → Define and link sensors → Define action functions → Define networking → Start simulation.

Run-time loop (repeats every frame):
Read sensor data → Update objects (from sensors and intelligent actions) → Render scene (graphics, audio, haptics) → Exit simulation.
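The authoring/run-time split above can be sketched as a skeleton loop. This is a hypothetical Python sketch; all function names are made up, and a real toolkit would wire these stages to its own device and rendering calls.

```python
# Skeleton of the per-frame run-time loop: read sensors, update objects,
# render. The authoring stages produce the initial state.

def read_sensors():
    # Placeholder: a real loop would poll trackers, gloves, etc.
    return {"hand_x": 0.01}

def update_objects(state, sensor_data):
    # Apply sensor input and any scripted "intelligent actions".
    state["hand_x"] += sensor_data["hand_x"]
    return state

def render(state):
    # Graphics, audio, and haptic rendering would happen here.
    pass

def simulate(num_frames):
    state = {"hand_x": 0.0}      # result of the authoring stages
    for _ in range(num_frames):  # repeats every frame
        data = read_sensors()
        state = update_objects(state, data)
        render(state)
    return state

final = simulate(100)
print(round(final["hand_x"], 2))   # 1.0
```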
Java and Java 3D
Java:
- object-oriented programming language;
- developed for network applications;
- platform-independent;
- slower than C/C++.
Java 3D:
- a Java hierarchy of classes that serves as an interface to 3D graphics rendering and sound rendering systems;
- perfectly integrated with Java;
- strong object-oriented architecture;
- powerful 3D graphics API.
Java 3D initiation stages: Model geometry → Define scene graph → Setup sensors → Define behaviors → Networking.
Java 3D geometry:
- Geometry can be imported from various file formats (e.g. 3DS, DXF, LWS, NFF, OBJ, VRT, VTK, WRL):
  loader.load("Hand.wrl")
- Geometry can be created as a primitive (e.g. sphere, cone, cylinder, ...):
  new Sphere(radius)
- Custom geometry is created by specifying the vertices, edges, normals and texture coordinates using specially defined classes:
  new GeometryArray(...)
  new LineArray(...)
  new QuadArray(...)
  new TriangleArray(...)
Java 3D object appearance:
- The appearance of a geometry is specified using an Appearance object;
- An Appearance-class object stores information about the material (diffuse, specular, shininess, opacity, ...) and texture:

Mat = new Material();
Mat.setDiffuseColor(r, g, b);
Mat.setAmbientColor(r, g, b);
Mat.setSpecularColor(r, g, b);
TexLd = new TextureLoader("checkered.jpg", ...);
Tex = TexLd.getTexture();
Appr = new Appearance();
Appr.setMaterial(Mat);
Appr.setTexture(Tex);
Geom.setAppearance(Appr);
Java3D node types:
- Group nodes:
  - BranchGroup: compilable sub-graph;
  - TransformGroup: transform + child nodes;
  - Switch: selects which of the children are visible (useful for LOD).
- Leaf nodes:
  - Background: universe background; can be a color or an image;
  - Behavior: actions to be performed by the simulation;
  - Fog: fog node;
  - Light: light node; special derived classes AmbientLight, PointLight, DirectionalLight;
  - Shape3D: geometry + appearance + bounding box.
Loading objects from files
- By default Java3D offers support for Lightwave and Wavefront model files;
- Loaders for other file formats can be downloaded for free from the web (http://www.j3d.org/utilities/loaders.html);
- Loaders add the content of the read file to the scene graph as a single object; however, they provide functions to access its subparts individually.

[Figure: scene graph after loading: Universe → Root → Cube, Sphere, Hand; the Hand has Thumb, Index, Middle, Ring and Small subparts.]
Java3D model loading
Adding the model to the scene graph:
Scene Sc = loader.load("Hand.wrl");
BranchGroup Bg = Sc.getSceneGroup();
RootNode.addChild(Bg);
Accessing subparts of the loaded model:
Scene Sc = loader.load("Hand.wrl");
BranchGroup Bg = Sc.getSceneGroup();
Thumb = Bg.getChild(0);
Index = Bg.getChild(1);
Middle = Bg.getChild(2);
Ring = Bg.getChild(3);
Small = Bg.getChild(4);
Java3D virtual hand loading:
Palm = loader.load("Palm.wrl").getSceneGroup();
ThumbProximal = loader.load("ThumbProximal.wrl").getSceneGroup();
ThumbDistal = loader.load("ThumbDistal.wrl").getSceneGroup();
IndexProximal = loader.load("IndexProximal.wrl").getSceneGroup();
IndexMiddle = loader.load("IndexMiddle.wrl").getSceneGroup();
IndexDistal = loader.load("IndexDistal.wrl").getSceneGroup();
MiddleProximal = loader.load("MiddleProximal.wrl").getSceneGroup();
MiddleMiddle = loader.load("MiddleMiddle.wrl").getSceneGroup();
MiddleDistal = loader.load("MiddleDistal.wrl").getSceneGroup();
RingProximal = loader.load("RingProximal.wrl").getSceneGroup();
RingMiddle = loader.load("RingMiddle.wrl").getSceneGroup();
RingDistal = loader.load("RingDistal.wrl").getSceneGroup();
SmallProximal = loader.load("SmallProximal.wrl").getSceneGroup();
SmallMiddle = loader.load("SmallMiddle.wrl").getSceneGroup();
SmallDistal = loader.load("SmallDistal.wrl").getSceneGroup();
Java3D virtual hand hierarchy:
Palm.addChild(ThumbProximal);
ThumbProximal.addChild(ThumbDistal);
Palm.addChild(IndexProximal);
IndexProximal.addChild(IndexMiddle);
IndexMiddle.addChild(IndexDistal);
Palm.addChild(MiddleProximal);
MiddleProximal.addChild(MiddleMiddle);
MiddleMiddle.addChild(MiddleDistal);
Palm.addChild(RingProximal);
RingProximal.addChild(RingMiddle);
RingMiddle.addChild(RingDistal);
Palm.addChild(SmallProximal);
SmallProximal.addChild(SmallMiddle);
SmallMiddle.addChild(SmallDistal);
Input devices in Java3D
- The only input devices supported by Java3D are the mouse and the keyboard;
- The integration of the input devices currently used in VR applications (position sensors, trackballs, joysticks, sensing gloves, ...) relies entirely on the developer;
- Usually the drivers are written in C/C++. One needs either to rewrite the driver in Java or to use JNI (the Java Native Interface) to call the C/C++ version of the driver. The latter solution is more desirable;
- Java3D provides a general-purpose input device interface that can be used to integrate sensors; however, developers often prefer custom-made approaches.
Java3D general-purpose sensor interface
- class PhysicalEnvironment: stores information about all the input devices and sensors involved in the simulation;
- class InputDevice: interface for an input device driver;
- class Sensor: class for objects that provide real-time data;
- One input device can provide one or more sensors;
- A Sensor object need not be associated with an input device (VRML-style sensors).
The relation is: PhysicalEnvironment → InputDevices → Sensors.
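The PhysicalEnvironment/InputDevice/Sensor relation can be sketched as follows. These are hypothetical Python stand-ins, not Java3D's actual interfaces: one device can expose several sensors, and the environment keeps a registry of both.

```python
# Sketch of the device/sensor split: a driver feeds one or more Sensor
# objects, and a PhysicalEnvironment registers every device and sensor.

class Sensor:
    def __init__(self, name):
        self.name = name
        self.last_read = None

    def set_read(self, value):
        self.last_read = value

class InputDevice:
    """A driver that feeds one or more Sensor objects."""
    def __init__(self, name, sensor_names):
        self.name = name
        self.sensors = [Sensor(n) for n in sensor_names]

    def poll(self, raw_values):
        # A real driver would call native code here (e.g. via JNI).
        for sensor, value in zip(self.sensors, raw_values):
            sensor.set_read(value)

class PhysicalEnvironment:
    """Registry of every device and sensor in the simulation."""
    def __init__(self):
        self.devices = []
        self.sensors = []

    def add_device(self, device):
        self.devices.append(device)
        self.sensors.extend(device.sensors)

env = PhysicalEnvironment()
glove = InputDevice("5DT glove", ["thumb_flex", "index_flex"])
env.add_device(glove)
glove.poll([0.2, 0.8])
print([s.last_read for s in env.sensors])   # [0.2, 0.8]
```

A sensor without a backing device (VRML-style) would simply be a `Sensor` created outside any `InputDevice`.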
Java3D - Animating the simulation
- Java3D offers Behavior objects for controlling the simulation;
- A Behavior object contains a set of actions performed when the object receives a stimulus;
- A stimulus is sent by a WakeupCondition object;
- Some wakeup classes:
WakeupOnCollisionEntry
WakeupOnCollisionExit
WakeupOnCollisionMovement
WakeupOnElapsedFrames
WakeupOnElapsedTime
WakeupOnSensorEntry
WakeupOnSensorExit
WakeupOnViewPlatformEntry
WakeupOnViewPlatformExit
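The Behavior/WakeupCondition pattern can be sketched like this. These are hypothetical Python stand-ins for the Java3D classes; only the triggering logic is illustrated.

```python
# Sketch of the stimulus pattern: a behavior's action runs only when its
# wakeup condition fires for the current frame.

class WakeupOnElapsedFrames:
    def __init__(self, interval):
        self.interval = interval     # 0 = wake on every frame

    def triggered(self, frame):
        return frame % (self.interval + 1) == 0

class Behavior:
    def __init__(self, action):
        self.action = action
        self.wakeup = None

    def wakeup_on(self, condition):
        self.wakeup = condition

# A behavior that spins a sphere by 1 degree per stimulus.
sphere = {"angle": 0}

def spin():
    sphere["angle"] = (sphere["angle"] + 1) % 360

bhv = Behavior(spin)
bhv.wakeup_on(WakeupOnElapsedFrames(0))   # wake every frame

for frame in range(90):                   # simulation loop
    if bhv.wakeup.triggered(frame):
        bhv.action()

print(sphere["angle"])   # 90
```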
Java3D - Behavior usage
- We define a behavior Bhv that rotates the sphere by 1 degree;
- We want this behavior to be called each frame, so that the sphere keeps spinning:
WakeupOnElapsedFrames Wup = new WakeupOnElapsedFrames(0);
Bhv.wakeupOn(Wup);
VC 6.3 on book CD
VC 6.4 on book CD
The Java 3D View object describes the graphics display used in the simulation, as well as the user's position relative to that display (given by the tracker);
- The View model provides a separation between the virtual world, represented by the ViewPlatform node, and the real I/O devices used in the interaction; this separation helps portability;
- Several tracked users can be mapped to the same location in the virtual world; this corresponds to several Views and a single ViewPlatform;
- Conversely, a single user can control several ViewPlatforms; this corresponds to several Views, since each ViewPlatform has its own View. Thus a single user can have several Views of a virtual world, and can "teleport" between them.
Java3D - Networking
- Java3D does not provide a built-in solution for networked virtual environments;
- However, its tight integration with the Java language allows the developer to use the powerful networking features offered by Java;
- Java3D applications can run as stand-alone applications or as applets in a web browser.

[Figure: a server connecting several Java3D simulations, each running either as a Java application or as a Java applet.]
Java3D and VRML
- VRML provides facilities both for defining the objects in a virtual world and for animating them;
- Graphics APIs such as Java3D load from a VRML file only the static information, ignoring the sensors, routes, scripts, etc.;
- The Java3D structure is general enough to make the import of sensors and routes possible, but we are currently not aware of any loader that does it;
- One of the most popular libraries of Java3D loaders is the NCSA Portfolio (http://www.ncsa.uiuc.edu/~srp/Java3D/portfolio/).
NCSA Portfolio
Offers loaders for several model file formats:
- 3D Studio (3DS)
- TrueSpace (COB)
- Java 3D Digital Elevation Map (DEM)
- AutoCAD (DXF)
- Imagine (IOB)
- Lightwave (LWS)
- Wavefront (OBJ)
- Protein Data Bank (PDB)
- Visualization Toolkit (VTK)
- VRML97

Loads the following parts of VRML97 files:
- Appearance
- Box
- Coordinate
- Collision (for grouping only)
- Group
- IndexedFaceSet
- IndexedLineSet
- Material
- Normal
- Shape
- Sphere
- Transform
Java3D on-line resources
- http://java.sun.com/products/java-media/3D/index.html
- http://www.j3d.org
- http://www.ncsa.uiuc.edu/~srp/Java3D/portfolio/
Comparison between Java3D and WTK
- A comparative study was done at Rutgers between Java3D (Version 1.3 Beta 1) and WTK (Release 9);
- The simulation ran on a dual Pentium III 933 MHz PC (Dell) with 512 MB RAM and a Wildcat 4110 graphics accelerator with 64 MB RAM;
- The I/O interfaces were a Polhemus InsideTrak or the Rutgers Master II force feedback glove;
- The scene consisted of several 420-polygon spheres and a virtual hand with 2,270 polygons;
- The spheres rotated constantly around an arbitrary axis, while the hand was either rotating or driven by the user.
Java3D-WTK Comparison
[Figure: graphics scene used in the experiments.]
Comparison between Java3D and WTK
The simulation variables used to judge performance were:
- graphics mode (monoscopic, stereoscopic);
- rendering mode (wireframe, Gouraud, textured);
- scene complexity (number of polygons, 5,000-50,000);
- lighting (number of light sources: 1, 5, 10);
- interactivity (no interaction, hand input, force feedback).
Java3D-WTK Comparison
- Java3D is faster on average than WTK, but has higher variability;
- Java3D Version 1.3 Beta 1 has lower system latencies than WTK Release 9;
- However, Java3D has more variability in the scene rendering time, while WTK does not have spikes in the scene rendering time.
Vizard characteristics:
- Uses Python, which is scalable and cross-platform;
- Is object-oriented and simple to integrate with C/C++;
- Runs on Unix, Windows, Mac and other platforms;
- Uses a 4-window "workbench" which allows programmers to write and execute code, inspect 3D models, drag-and-drop objects, and issue commands while the scene is being rendered:
  - Resource window: text list of world assets;
  - 3D window: explore individual objects;
  - Stack of scripts: errors are highlighted as you type;
  - Interactive window: input commands.
Workbench use:
[Figure: icon menu; scene exploration with the mouse; importing objects.]
Vizard virtual hand:
import viz
import hand
viz.go()
#Identify the 5DT glove's port.
PORT_5DT_USB = 0
#Add the 5DT sensor
sensor = viz.add('5dt.dls')
#Create a hand object from the data glove
glove = hand.add(sensor,hand.GLOVE_5DT)
#Place the hand in front of the user
glove.translate(0,1,5)
glove.rotate(180,-90,180)
# now when you run the script the glove should be moving
Vizard multi-texturing:
import viz
viz.go()
logo = viz.add('logo.wrl') #add the Vizard logo and place it in front of the user
logo.translate(0,2,4)
#add two textures that will then be applied to the logo
tex1 = viz.add('gb_noise.jpg')
tex2 = viz.add('brick.jpg')
logo.texture(tex1) #apply the first texture
logo.texture(tex2,'',1) #apply the second texture to the logo
blend = viz.add('multitexblend.fp') #indicate how to blend the two textures
logo.apply(blend)
Vizard Simulation Servers:
- More than one user can inhabit the same environment. Each user needs to run Vizard;
- After the world is set up, each user has to set up two "mailboxes";
- One mailbox receives information from the other user after it is given that user's network name. Messages come in sequence: [0] who sent it, [1] what is sent, [2] and larger the actual data.

[Figure: two users exchanging messages; "position" is the property information, "ball" is the object acted upon.]
Vizard networking example:
import viz
viz.go()
#add the world that will be displayed on your computer
ball = viz.add('ball.wrl') #create a ball object that is controlled by the other user
#Use a prompt to ask the other user the network name of his computer.
target_machine = viz.input('Enter the name of the other machine').upper()
#Add a mailbox from which to send messages to the other user. This is your outbox.
target_mailbox = viz.add(viz.NETWORK, target_machine)
#Add an id for the timer.
BROADCAST = 1
#Add the timer.
def mytimer(num):
    if num == BROADCAST:
        #Retrieve your current position.
        position = viz.get(viz.HEAD_POS)
        #Send the data to the target mailbox. The recipient will get your x, y and z coordinates.
        target_mailbox.send(position[0], position[1], position[2])
Vizard networking example:
#This function will deal with incoming messages.
def mynetwork(message):
    #message[0] is who sent the message, message[1] is a description of what was
    #sent, and message[2] and greater are the messages themselves.
    x = message[2]
    y = message[3]
    z = message[4]
    ball.translate(x,y,z)
#Callback the network function to await incoming messages.
viz.callback(viz.NETWORK_EVENT, mynetwork)
#Callback the timer.
viz.callback(viz.TIMER_EVENT, mytimer)
#Start the timer.
viz.starttimer(BROADCAST, 0.01, -1)
GHOST Toolkit for the PHANToM
- Provides realistic haptic interaction;
- Provides an intuitive interface to haptics;
- Provides a haptic scene graph aligned with 3D graphics APIs;
- Provides an extensible environment for extending haptic interaction technologies;
- Point haptic interaction with the PHANToM;
- Geometry based on user-defined force models;
- Geometry moves dynamically in response to forces.

[Figure: PHANToM Desktop model, with its coordinate system (x, y, z axes; origin at (0,0,0)).]
GHOST - Application interaction
The application process (scene creation, scene rendering, haptic state update, clean-up) runs at graphics rates (about 30 fps), while the haptic process (haptic rendering, with collision detection and collision response) runs as a 1000 Hz servo loop; state is exchanged between the two processes at an intermediate rate (about 100 Hz). (Adapted from GHOST SDK Programmer's Guide, version 3.1.)
The GHOST haptics scene graph
Haptic scene graph
- Provides a structured way to construct a haptic scene, including geometry and dynamics;
- Is traversed only from top to bottom (unlike WTK);
- Each node is reachable by only one (unique) traversal from the root (a child node has only one parent node);
- Each node has its own transform (there are no separate transform nodes);
- Cannot use pointers to repeat instances of the same object, since similar objects have different haptic interactions;
- Separator nodes create a hierarchy; transforms on a Separator affect its sub-tree.
GHOST node classes
[Class-hierarchy diagram (class names repaired). All classes derive from gstNode. The diagram groups them into four categories:
- 3D support: gstTransform, gstShape and its shapes (gstCone, gstCube, gstCylinder, gstSphere, gstTorus, gstPlane, gstTriPolyMeshHaptic), gstSeparator, gstBoundary (gstBoundaryCube), gstForceField (gstConstantForceField), gstBoundedHapticObj;
- Static nodes: gstPHANToM, gstPHANToM_SCP, gstPHANToMDynamic (gstPHANToMTranslation, gstPHANToMRotation);
- Dynamic nodes: gstDynamic (gstDial, gstButton, gstSlider, gstRigidBody), gstEffect (gstInertiaEffect, gstBuzzEffect), gstManipulator (gstTranslateManip, gstScaleManip, gstRotateManip), gstConstraint;
- Utility classes: gstPoint, gstVector, gstTransformMatrix, gstSpatialObject, gstSpatialPartition, gstBinTree, gstTriPolyBase, gstTriPoly, gstTriPolyMeshBase, gstTriPolyMesh.]
GHOST nine node classes
Scene graph example
[Figure: a human body model as a haptic scene graph (body → torso → head, left and right shoulders → elbows), each node with its own x, y, z coordinate frame. A static scene graph contains only separators, with geometry nodes as leaves.]
GHOST code example:
#include <stdlib.h>
#include <gstBasic.h>
#include <gstSphere.h>
#include <gstPHANToM.h>
#include <gstSeparator.h>
#include <gstScene.h>

main()
{
    gstScene *scene = new gstScene;
    gstSeparator *root = new gstSeparator;
    gstSphere *sphere = new gstSphere;
    sphere->setRadius(20);
    gstPHANToM *phantom = new gstPHANToM("PHANToM name");
    root->addChild(phantom);
    root->addChild(sphere);
    scene->setRoot(root);
    scene->startServoLoop();
    while (!scene->getDoneServoLoop())
        ;   // end the application by calling scene->stopServoLoop();
}
Force calculation and dynamics
Collision detection and response
- The scene graph contains at least one representation of the haptic interface, through a gstPHANToM node. There can be up to four such nodes (four haptic interfaces in one haptic scene);
- Collisions are detected between this node and the geometry nodes through the gstShape node, which goes from "untouched" to "touched";
- When a collision exists, a gstPHANToM_SCP (surface contact point) node is added to the scene graph. This node should be added under the same parent as the gstPHANToM node.
Collision detection and response
- Forces are calculated following collision;
- Collision response is through dynamic effects (movable nodes, solid-body dynamics);
- The application is informed if needed (user-defined);
- The normal force depends on the spring and damper coefficients, while the friction force depends on the static and dynamic friction coefficients.
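A penalty-style sketch of these force terms follows. This is illustrative 1-D Python with made-up coefficients, not GHOST's actual solver: the normal force grows with penetration depth (spring) and penetration velocity (damper), and friction is proportional to the normal force.

```python
# Sketch of spring-damper contact forces at the surface contact point.

def normal_force(depth, depth_velocity, k_spring=0.8, k_damping=0.05):
    """Force pushing the haptic point out of the surface (1-D sketch)."""
    if depth <= 0.0:
        return 0.0                     # no penetration, no contact force
    force = k_spring * depth + k_damping * depth_velocity
    return max(force, 0.0)             # the surface can push but not pull

def friction_force(normal, moving, mu_static=0.4, mu_dynamic=0.3):
    """Tangential force opposing motion along the surface."""
    mu = mu_dynamic if moving else mu_static
    return mu * normal

n = normal_force(depth=2.0, depth_velocity=1.0)
print(round(n, 3))                               # 1.65
print(round(friction_force(n, moving=True), 3))  # 0.495
```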
Dynamic nodes
- The gstDynamic node adds movement ability to the geometry nodes beneath it. A subtree under a gstDynamic node represents one physically dynamic object;
- Forces generated by a gstPHANToM node colliding with one of the geometries of such an object are added to the state of the gstDynamic node;
- Transformations (rotations, translations) are always applied to the gstDynamic node, not its children;
- It has four derived classes: gstDial, gstButton, gstSlider and gstRigidBody.
Dynamic nodes (continued)
- When a gstDynamic node changes state, an event is generated which calls a user-defined callback function;
- Example: the application may quit if a gstButton changes state from pressed to released.

[Figure: gstButton behavioral example.]
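The state-change callback mechanism can be sketched like this. The Python class below is hypothetical (not the gstButton API); it only illustrates that a user-defined callback fires on each state transition.

```python
# Sketch of a dynamic-node state-change event: the callback receives the
# old and new states whenever the button's state actually changes.

class Button:
    def __init__(self):
        self.pressed = False
        self.callback = None       # user-defined function(old, new)

    def set_callback(self, fn):
        self.callback = fn

    def set_pressed(self, pressed):
        if pressed != self.pressed:
            old, self.pressed = self.pressed, pressed
            if self.callback:
                self.callback(old, pressed)

events = []

def on_button(old, new):
    # e.g. the application may quit when the button is released
    events.append("released" if old and not new else "pressed")

btn = Button()
btn.set_callback(on_button)
btn.set_pressed(True)    # press (haptic collision pushes the button in)
btn.set_pressed(False)   # release
print(events)            # ['pressed', 'released']
```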
Updating the application
Graphics and event callbacks
- The user selects which nodes have callback functions, and what information needs to be sent back to the application;
- Callbacks pass new state information of the haptic scene nodes from the GHOST haptics process to the application process;
- The application calls updateGraphics to have graphics information updated. Nodes that have a graphics callback defined, and whose state has changed since the last call to updateGraphics, copy their current state to a defined data structure;
- For example, the user can create a callback for the graphics representation of the position of the gstPHANToM node. This should change to a callback on gstPHANToM_SCP after collision, so the user can see the location of the contact point on the object.
Mapping the user to the haptic scene
[Figure: the user's workspace mapped to the PHANToM workspace.]
Camera mapping to the PHANToM workspace
[Figure: the camera is attached to the phantomSep separator through the transforms M_rotation, M_camera and M_zaxisOffset along the z-axis. From GHOST SDK Programmer's Guide (version 3.1).]
Scaling camera and PHANToM workspaces
[Figure: the camera workspace can be too large, too small, or appropriately matched to the PHANToM workspace.]
Scaling camera and PHANToM workspaces
- The scale factor depends on the distance D_xmax from the focal point to the frustum;
- The distance D_phantom_xmax from the non-scaled PHANToM workspace center to the side limit must also be determined;
- The scale factor is then S_frustum = D_xmax / D_phantom_xmax.
Scaling camera and PHANToM workspaces
- To maintain haptic fidelity, the gstShape node physical properties (compliance and damping) need to be scaled too:
  SurfaceKspring_new = SurfaceKspring_current / S_frustum
  SurfaceKdamping_new = SurfaceKdamping_current / S_frustum
where SurfaceKspring and SurfaceKdamping are the gstShape compliance and damping coefficients.
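The scaling arithmetic can be worked through with illustrative numbers (the distances and coefficients below are made up, not values from the SDK):

```python
# Sketch of workspace scaling: the frustum is scaled to the PHANToM
# workspace, and surface stiffness/damping are divided by the same
# factor to keep the haptic feel consistent.

def frustum_scale(d_xmax, d_phantom_xmax):
    """S_frustum = D_xmax / D_phantom_xmax."""
    return d_xmax / d_phantom_xmax

def scale_surface(k_spring, k_damping, s_frustum):
    """Rescale gstShape spring and damping coefficients."""
    return k_spring / s_frustum, k_damping / s_frustum

s = frustum_scale(d_xmax=400.0, d_phantom_xmax=80.0)
print(s)                               # 5.0
print(scale_surface(1.0, 0.05, s))     # (0.2, 0.01)
```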
SenseGraphics
- Founded in 2004 in Stockholm;
- SenseGraphics represents over twenty years of experience in the haptics and graphics industry;
- SenseGraphics provides a high-performance application development platform which enables integration of haptics and 3D stereo visualization into multimodal software applications.
What is H3D API?
- A product of SenseGraphics;
- A software development platform for multi-sensory applications;
- Uses the open standards X3D and OpenGL together with SenseGraphics haptics in a unified scene graph that takes care of both the haptic and graphic rendering.
What it does
- Combines graphics and haptics into one platform;
- Adds haptics to existing 3D models;
- Enables rapid programming of haptic applications using X3D and Python;
- Is easily extended with custom graphics-haptics features using C++;
- Supports SensAble, Novint and MOOG FCS haptic devices;
- Supports most 3D stereo display systems;
- Runs on Windows, Linux and Mac.
H3DAPI Architecture
[Figure: H3DAPI architecture diagram.]
Some H3D nodes
[Figure: example H3D nodes, example code, and the rendered result.]
Applications
- Computer Assisted Radiology & Surgery, Switzerland (CARCAS);
- University of Virginia (http://www.youtube.com/watch?v=mrMsb71ZJ1I).
Other Applications and Projects
Why H3DAPI over GHOST?
- H3D is compatible with many scene-graph and 3D-environment-generating platforms (VRML, X3D, Java3D, OpenGL);
- Uses C++ and the Python scripting language;
- Supports haptics devices from several manufacturers;
- H3D provides graphic rendering, while GHOST needs another program (e.g. Cortona 3D).
References
- http://www.sensegraphics.com/index.php
- http://www.h3dapi.org/
- http://www.devmaster.net/forums/showthread.php?t=2236
- http://www.carcas.ch/
- http://www.vrac.iastate.edu/~charding/Research/Haptics.html
- http://www.sys.virginia.edu/ggerling/facilities.htm
VR Toolkits discussed in this chapter

Name | Application | Proprietary | Library size / language
--- | --- | --- | ---
Java3D (Sun Microsystems) | General purpose | no | Implemented in C, programming in Java; 19 packages, 275 classes
Vizard and PeoplePak (WorldViz) | General purpose, avatar extension | yes | OpenGL-based Python scripting language
GHOST (SensAble Technologies) | Haptics for PHANToM | yes | C++
PeopleShop (Boston Dynamics) | Military/civilian | yes | C/C++
3DGame Studio | Game engine | yes | C++
BDI PeopleShop/DI-Guy characteristics
- Provides a realistic way to simulate human characters in real-time scenes without using tracking suits;
- Is a task-level programming environment combined with a menu-based GUI;
- Tasks are mapped to pre-defined (stored) joint motions which are interpolated in real time;
- Well suited for Distributed Interactive Simulation (DIS) due to low bandwidth requirements and live reckoning;
- Initially designed for the military, now extended to civilian applications, such as accident reenactment, architectural walk-throughs, driving simulators, police training, etc.
BDI PeopleShop/DI-Guy characteristics - continued
- Linkable object library that runs on SGI, Intel PCs, and other platforms;
- The library has modules for run-time motion engines, graphics display, motion data, 3D graphics models, textures, and network interfaces for DIS;
- Runs under OpenGL, Direct3D, MaK Stealth and other packages;
- Recommended hardware is an Intel CPU above 200 MHz, 64 MB RAM, and a graphics accelerator (for OpenGL, OpenGVS or Direct3D).
PeopleShop initiation stages: Scene geometry → Define character path → Define sensors → Define behavior → Define networking.
PeopleShop Characters
- Are articulated polygonal structures with 54 DOF and 11 links;
- Vehicles are also treated as characters;
- Different types of characters have different acceptable actions;
- Each type of character has different user-selectable appearances (e.g., a vehicle character can be a tank or a police car, etc.);
- Characters are textured to increase realism and reduce polygon count.
Character selection
[Figure: character type determines acceptable actions (menu-selectable); character appearance is also menu-selectable (e.g. Bob_shorts, Joe_blue, Bridget_skirt, Diane_teen).]
BDI Toolkits
Character level-of-detail segmentation (from 2,500 polygons down to 38 polygons) based on distance to the virtual camera improves real-time performance; up to about 100 characters can be in a scene (supplemental bdi.DIGuy-LOD.mpg).
PeopleShop path specification
[Figure: a path is specified through waypoints and slope adjusters; the initial path can be extended by adding a waypoint.]
BDI Toolkits
[Figure: actions are edited by placing action beads (including stacked action beads, a last action bead, and an end bead) along the path. See VC 6.5 and VC 6.6 on the book CD.]
BDI Toolkits - PeopleShop sensors
- Sensors are user-defined volumes in space that detect when a character enters them (PeopleShop User's Manual);
- Example: when soldier A enters the sensor volume, the system is notified; this triggers soldier B's shooting of A (supplemental bdi.farmhouse.mpg).
PeopleShop Behaviors
 Behaviors can be reflexive (triggered by signals received from sensors);
 Behaviors can also be specified with decision beads;
 Decision beads can be placed on the character’s path (colored red);
 The two parameters characterizing a decision bead are
distance and length;
 Distance specifies how far from the start of the path the decision
bead is placed;
 Length indicates the distance from the beginning of the decision
region over which the decision bead is active;
 Decisions can be converted to script:
BDI Toolkits
Decision clauses (IF/THEN/ELSE)
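A minimal sketch of the two bead parameters and an IF/THEN/ELSE decision clause, assuming a scalar position along the path (function names and the `enemy_visible` condition are hypothetical, not PeopleShop script syntax):

```python
def bead_active(path_pos, distance, length):
    """A decision bead placed `distance` from the path start is active
    while the character is within `length` of that point."""
    return distance <= path_pos < distance + length

def decide(path_pos, bead, enemy_visible):
    """Evaluate an IF/THEN/ELSE decision clause, but only inside
    the bead's active region; otherwise the character continues."""
    distance, length = bead
    if not bead_active(path_pos, distance, length):
        return "continue"
    return "take_cover" if enemy_visible else "advance"

bead = (10.0, 5.0)  # placed 10 m along the path, active for 5 m
action_before = decide(2.0, bead, True)    # outside the region
action_inside = decide(12.0, bead, True)   # inside, enemy seen
```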
PeopleShop Run-time
PeopleShop initiation is followed by one of three run-time modes, each serving one or more users: scenario visualization, interactive training, and immersive training.
BDI PeopleShop Toolkit
VC 6.7
The user interacts in real time with the simulation using a joystick or a mouse
and menu. Control and immersion are limited. Natural speeds should not be
exceeded.
(from Koechling et al., 1997)
BDI PeopleShop – Run-time modes
Sensorized weapon
Omni-directional treadmill
The user interacts in real time with the simulation using trackers and sensors.
Control is at the joint level and immersion is increased.
(BDI, 1997)
PeopleShop Networking
 Updating human figures in DIS is much more bandwidth-expensive
than updating vehicles;
 Vehicles have few degrees of freedom, while a human
figure with 40 joints updated at 20 Hz requires 800
packets/sec;
 Instead of updating every joint, PeopleShop updates only at
the task level (action, position, velocity). It requires about two
packets/sec to produce a smooth simulation;
 This works well for large numbers of participants, such as in
dismounted infantry training;
 Uses “live reckoning” vs. the dead reckoning used previously
for vehicles.
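The bandwidth figures above can be checked with a quick calculation (assuming one packet per joint per update, as the slide implies):

```python
# Joint-level updates: one packet per joint per update cycle
joints = 40
update_rate_hz = 20
joint_level_packets_per_sec = joints * update_rate_hz   # per the slide: 800

# Task-level updates (action, position, velocity) sent only on change
task_level_packets_per_sec = 2

# Reduction factor achieved by updating at the task level instead
reduction = joint_level_packets_per_sec / task_level_packets_per_sec
```

A 400-fold reduction per character is what makes scenes with many networked human figures feasible.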
BDI Toolkits
Classical DIS using dead reckoning
(from Koechling et al., 1997)
BDI Toolkits
DIS using live reckoning and human-in-the-loop
DI-Guy model
DI-Guy model
Task-level change
(action, orientation, velocity)
(from Koechling et al., 1997)
PeopleShop “Top Gun”
(courtesy Boston Dynamics Inc.)
VR Toolkits discussed in this chapter

| Name                                    | Application         | Prgm mode  | Proprietary | Language                               |
| Java3D (Sun Micro)                      | General purpose     | text       | no          | Implemented in C, programming in Java  |
| Vizard Toolkit and PeoplePak (WorldViz) | General purpose     | text/graph | yes         | OpenGL-based Python scripting language |
| GHOST (SensAble Technologies)           | Haptics for Phantom | text       | yes         | C++                                    |
| H3D                                     | Haptics/graphics    | text       | no          | C++                                    |
| PeopleShop (Boston Dynamics)            | Military/civilian   | graph      | yes         | C/C++                                  |
| Unity 3D                                | Game engine         | text/graph | yes         | JavaScript, C#, and Python             |
Game Engine Comparison

|                                        | Unity                               | UDK/Unreal                    | DX Studio   | Java3D         | jMonkeyEngine  |
| Price                                  | Free / $1500                        | Free / $$$                    | Free / $800 | Free           | Free           |
| Graphical editing                      | Yes                                 | Yes                           | Yes         | No             | Minimal        |
| Plugin required?                       | Web only                            | No                            | Yes         | JVM            | JVM            |
| Language support                       | Mono (C#), JavaScript, Boo (Python) | UnrealScript                  | JavaScript  | Java           | Java           |
| External library support               | Yes                                 | Yes                           | Yes         | Yes            | Yes            |
| Home computer deployment               | PC, Mac                             | PC                            | PC          | PC, Mac, Linux | PC, Mac, Linux |
| Web deployment                         | Yes                                 | No                            | Yes         | WebStart       | WebStart       |
| Console deployment (licenses required) | XBox360+Arcade, Wii+WiiWare, PS3    | XBox360+Arcade, PS3, Sony NGP | No          | No             | No             |
| Mobile deployment                      | iOS / Android                       | iOS / Android                 | Android     | No             | No             |
Why not Java3D?
Advantages
 Open-source, cross-platform development;
 Low-level control of the scene graph and objects;
 Can be used with other Java and native libraries.
Disadvantages
 Scene manipulation is done strictly through source code, which
leads to slow turnaround;
 Higher-level control is up to the programmer;
 3D sound is very buggy;
 Community support only; no longer any commercial
support.
Why Unity?
•Free edition offers a robust development
environment, and educational licenses are
available.
•Supports multiple programming languages to design and manipulate the scene.
•External library and .Net support allows seamless communication with additional hardware devices.
•Easy-to-use graphical interface allows live scene editing for efficient development and testing.
•Quick turnaround times.
•Works on almost all available platforms.
Unity - Physics Engine
 Unity uses NVIDIA’s PhysX engine.
 Streamlined physics modeling for rigid bodies, cars,
character ragdolls, soft bodies, joints, and cloth.
 By simply attaching a rigid body to a game object and
adding forces, realistic physical interactions can be
created.
 Objects with rigid bodies attached will interact with
each other.
 Colliders are used to control these object interactions
and trigger collision events.
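The force-on-a-rigid-body idea can be sketched outside Unity as a minimal explicit-Euler point-mass integrator (the `RigidBody` class, `add_force`, and `step` are hypothetical stand-ins; PhysX does far more, including collision response and constraint solving):

```python
class RigidBody:
    """Minimal point-mass rigid body with explicit Euler integration."""
    def __init__(self, mass=1.0):
        self.mass = mass
        self.velocity = [0.0, 0.0, 0.0]
        self.position = [0.0, 0.0, 0.0]
        self.force = [0.0, 0.0, 0.0]   # forces accumulated this frame

    def add_force(self, fx, fy, fz):
        self.force = [self.force[0] + fx,
                      self.force[1] + fy,
                      self.force[2] + fz]

    def step(self, dt):
        # a = F/m; integrate velocity, then position; clear forces
        for i in range(3):
            a = self.force[i] / self.mass
            self.velocity[i] += a * dt
            self.position[i] += self.velocity[i] * dt
        self.force = [0.0, 0.0, 0.0]

body = RigidBody(mass=2.0)
body.add_force(4.0, 0.0, 0.0)  # push on the first frame only
body.step(0.5)                 # accelerates: v = 1.0, x = 0.5
body.step(0.5)                 # coasts: v stays 1.0, x = 1.0
```

In Unity the engine performs this integration for every object that has a rigid body attached; the developer only adds forces.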
Unity Gallery
Unity - GUI
Scene Hierarchy
Project Panel
Inspector
Unity – Project Panel
 This panel shows all of the available game assets in the
current project directory.
 Game assets can include scripts, prefabs, 3D models,
texture images, sound files, fonts, etc.
 New assets can be added by simply dragging them into
the project panel or placing them in the project
directory.
 These files can be edited and saved using external
programs, and the scene will be updated automatically.
 Unity supports a number of 3D model formats and
converts them to the Autodesk FBX format when they are
added to the project.
Unity - Scene Hierarchy
 Provides a means of adding new game objects and
managing existing ones.
 Game objects can contain a combination of transforms,
graphics objects, physics colliders, materials, sounds,
lights, and scripts.
 Each game object in the hierarchical tree represents a node
in the scene graph.
 As in VRML and Java3D, the scene graph
represents the relative spatial relationships of objects.
Example: a finger connected to a hand will translate
accordingly when the hand is moved.
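The finger-follows-hand behavior comes from composing transforms down the tree. A minimal sketch, assuming pure translations so the matrix multiply reduces to vector addition (the `Node` class is illustrative, not Unity's API):

```python
def compose(parent, local):
    """Compose two translations (stand-in for a 4x4 matrix multiply)."""
    return tuple(p + l for p, l in zip(parent, local))

class Node:
    """A scene-graph node: its world transform is the composition of all
    ancestor transforms with its own local transform."""
    def __init__(self, local=(0.0, 0.0, 0.0)):
        self.local = local
        self.parent = None
        self.children = []

    def add(self, child):
        child.parent = self
        self.children.append(child)
        return child

    def world(self):
        if self.parent is None:
            return self.local
        return compose(self.parent.world(), self.local)

hand = Node(local=(1.0, 0.0, 0.0))
finger = hand.add(Node(local=(0.5, 0.0, 0.0)))  # offset relative to hand
# Moving only the hand moves the finger automatically:
hand.local = (2.0, 0.0, 0.0)
```

Because the finger stores only its offset relative to the hand, no per-child bookkeeping is needed when a parent moves.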
Unity - Simple Hierarchy Example
Unity - Inspector
 Shows the components attached to the currently selected
game object and their properties.
 Manual control over an object’s transform allows precise
placement of objects.
 Variables exposed by scripts attached to the object can be
viewed and set through this panel, allowing parameters to
be changed without the need to edit source code.
 These changes can be made while the project is live, and
the scene will update immediately.
Unity - Simple Game Object
 Transform: defines spatial properties
(transformation matrix).
 Rigid body: controls physics and
physical interactions.
 Graphics mesh: this is what
you will actually see.
 Script: destroys the object after
N collisions or after elapsed
time; contains a particle
emitter for an explosion effect.
 Sound associated with this
object.
Unity - GUI
Scene Editor
Console
Game Preview
Unity - Scene Editor
 Allows graphical monitoring and manipulation of scene
objects.
 Switch between various orthogonal and perspective
views.
 Objects can be translated, rotated, and scaled graphically
with the mouse.
 When live, the editor works like a sandbox in which you
can play around with objects without actually modifying
the scene.
 Shows “Gizmos” for invisible scene objects, such as light
sources, colliders, cameras, and sound effects.
Unity - Simple JavaScript Example
 Public variables are exposed to the editor, allowing monitoring and editing of
the live scene. This also allows for communication between objects.
 The Update() method is called at every frame.
 In this example, every time the left-mouse button is clicked (1), a copy of the
input object is created and added to the scene in front of the camera (2), the
cube counter is increased (3), a randomly colored material is applied (4), and a
force is applied (5). This gives the appearance that the object is being
launched away from you.
Unity - Complex Scene
Unity - Asset Store
A
marketplace to buy
and sell assets used
within Unity.
 This includes 3D
models, textures, scripts,
etc…
 Can be used to
drastically reduce
development time, or
sell assets you have
created.
Unity - Union Marketplace
 Similar to Apple’s App Store, this is a
marketplace in which games can be sold for
various platforms.
 Allows developers to reach markets that
would otherwise be inaccessible.
 70% of profits go to the developer, while 30%
goes to Union.
Unity - VR Applications
 Unity is able to use .Net libraries and external
shared libraries.
 This enables the use of nearly any hardware
device within Unity applications.
 Cameras can be used to create augmented reality.
Unity - AR on iPhone
Unity on iPhone