
Continuously-Adaptive Haptic Rendering


Jihad El-Sana¹ and Amitabh Varshney²

¹ Department of Computer Science, Ben-Gurion University, Beer-Sheva 84105, Israel
jihad@cs.bgu.ac.il

² Department of Computer Science, University of Maryland at College Park,
College Park, MD 20742, USA

Abstract. Haptic display with force feedback is often necessary in several virtual environments. To enable haptic rendering of large datasets we introduce Continuously-Adaptive Haptic Rendering, a novel approach to reduce the complexity of the rendered dataset. We construct a continuous, multiresolution hierarchy of the model during pre-processing, and then at run time we use a high-detail representation for regions around the probe pointer and a coarser representation farther away. We achieve this by using a bell-shaped filter centered at the position of the probe pointer. Using our algorithm we are able to haptically render datasets one to two orders of magnitude larger than otherwise possible. Our approach is orthogonal to previous work on accelerating haptic rendering and thus can be used in conjunction with it.

1 Introduction

Haptic displays with force and tactile feedback are essential to realism in virtual environments and can be used in various applications such as medicine (virtual surgery for medical training, molecular docking for drug design), entertainment (video games), education (studying nano-scale, macro-scale, or astronomical-scale natural and scientific phenomena), and virtual design and prototyping (nanomanipulation, integrating haptics into CAD systems). Humans can sense touch in two ways: tactile and kinesthetic. Tactile refers to the sensation caused by stimulating the skin nerves, such as by vibration, pressure, and temperature. Kinesthetic refers to the sensation from motion and forces, which trigger the nerve receptors in the muscles, joints, and tendons.

Computer haptics is concerned with generating and rendering haptic stimuli to a computer user, just as computer graphics deals with visual stimuli. In haptic interface interaction, the user conveys a desired motor action by physically manipulating the interface, which in turn provides tactual sensory feedback to the user by appropriately stimulating her/his tactile and kinesthetic sensory systems. Figure 1 shows the basic process of haptically rendering objects in a virtual environment. As the user manipulates the generic probe of the haptic device, the haptic system keeps track of the position and the orientation of the probe. When the probe collides with an object, the mechanistic model calculates the reaction force that is then applied to the user through the probe.

[Figure 1: block diagram with components Geometry Information, Forward Kinematics, Inverse Kinematics, Force Mapping, Servo Loop, Collision Detection, Generic Probe, Modified Force, Applied Force, Contact Data, Human-Machine Contact, and Touch Effects.]

Fig. 1. The haptic rendering processes

Several haptic techniques have been developed to haptically render 3D objects, which can have either a surface-based or a volume-based representation. Haptic interaction in a virtual environment could be point-based or ray-based. In point-based haptic interaction only the end-point of the haptic device, known as the Haptic Interface Point (HIP), interacts with the objects. In ray-based haptic interaction, the generic probe of the haptic device is modeled as a finite ray. The collision is detected between this ray and the object, and the orientation of the ray is used in computing the haptic force feedback. In this approach the force reflection is calculated using a linear spring law, F = kx.
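For illustration, a minimal sketch of the linear spring law is given below: the reaction force is directed along the surface normal with a magnitude proportional to the penetration depth x. The vector type, the stiffness parameter k, and the tangent-plane penetration test are assumptions made for this sketch, not code from the paper.

// Minimal sketch (not from the paper): reaction force from the linear
// spring law F = kx, where x is the penetration depth of the probe
// below the contacted surface and k is the surface stiffness.
struct Vec3 { double x, y, z; };

Vec3 springForce(const Vec3& probe, const Vec3& surfacePoint,
                 const Vec3& surfaceNormal, double k) {
    // Penetration depth: how far the probe lies below the tangent plane.
    double depth = (surfacePoint.x - probe.x) * surfaceNormal.x +
                   (surfacePoint.y - probe.y) * surfaceNormal.y +
                   (surfacePoint.z - probe.z) * surfaceNormal.z;
    if (depth <= 0.0)                        // no contact, no force
        return Vec3{0.0, 0.0, 0.0};
    // Push back along the surface normal with magnitude k * depth.
    return Vec3{k * depth * surfaceNormal.x,
                k * depth * surfaceNormal.y,
                k * depth * surfaceNormal.z};
}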

In visual rendering several techniques are used to enhance the interactivity and improve the realism of the rendered objects; smooth shading and texture mapping are good examples of these techniques. Similar algorithms are used in haptic rendering to convey the tactual feeling of the inspected objects. Some of these approaches have been adapted from graphics rendering while others have been developed exclusively for haptic rendering.

2 Previous Work

In this section we overview some of the work done in haptic rendering for virtual environment applications. Haptic rendering has been found to be particularly useful in molecular docking [3] and nanomanipulation [19].

Randolph et al. [13] have developed an approach for point-based haptic rendering of a surface by using an intermediate representation. A local planar approximation to the surface is computed at the collision point for each cycle of the force loop. The reaction force vector is computed with respect to the tangent plane. This approach has one major drawback: undesirable force discontinuities may appear if the generic probe of the haptic device is moved over large distances before the new tangent plane is updated. An improvement to this method has been presented by Salisbury and Tarr [17].

Basdogan et al. [2] have developed a ray-based rendering approach. The generic probe of the haptic device is modeled as a line segment. They update the simulation by detecting collisions between this segment and the virtual objects in three progressively nested checks: (a) the bounding boxes of the virtual objects, (b) the bounding boxes of appropriate triangular elements, and (c) the appropriate triangular elements themselves. They estimate the reaction force by using a linear spring law model.

Ruspini et al. [16] introduce the notion of a proxy in the haptic system as a massless sphere that moves among the objects in the environment. They assumed that all the obstacles in the environment could be divided into a finite set of convex components. During the update process, the proxy attempts to move to the goal configuration using direct linear motion.

Gregory et al. [8] have developed an efficient system, H-Collide, for computing contact(s) between the probe of the force-feedback device and objects in the virtual environment. Their system uses spatial decomposition and a bounding volume hierarchy, and exploits frame-to-frame coherence to achieve a speed improvement by a factor of 3 to 20.

Polygonal or polyhedral descriptions are often used to represent objects in virtual environments. The straightforward haptic rendering of these objects often does not convey the desired shape to the user. Morgenbesser and Srinivasan [15] have developed force shading, in which the force vector is interpolated over the polygonal surfaces. Haptic rendering has also been successfully pursued for volumetric datasets [1] and for NURBS surfaces [4]. The sensations of touch have been conveyed to the human tactile system using textures generated by force perturbation and displacement mapping. Force perturbation refers to the technique of modifying the direction and magnitude of the force vector to generate surface effects such as roughness [14]. In displacement mapping the actual geometry of the object is modified to display the surface details. To improve the realism of haptic interactions such as the push of a button or the turn of a switch, friction effects have been introduced. Friction can be simulated by applying static and dynamic forces in a direction tangential to the surface, in proportion to the normal force.

2.1 Haptic Rendering

The haptic rendering process involves the following three steps (a minimal sketch of this loop is given after the list):

– Initializing the haptic device interface and transferring the dataset representation from the user data buffers to the haptic device drivers or API buffers. This step may require translating the data from the user representation to match the haptic API representation.
– Collision detection between the elements representing the virtual objects and the probe of the haptic device. Such detection becomes much more complex when the probe has multiple dynamic fingers.
– Estimating the force that the haptic device needs to apply to the user's hand or finger. This force is fed to the generic probe.
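The following is a minimal C++ sketch of this three-step loop. The device, collision, and force routines are hypothetical placeholders, not the GHOST API used later in the paper.

// Minimal sketch of the three-step haptic rendering loop: initialization,
// collision detection, and force estimation fed back to the generic probe.
struct Probe { double pos[3]; double orient[3]; };
struct Force { double f[3]; };

bool  initHapticDevice()            { return true;    }  // placeholder: step 1
Probe queryProbe()                  { return Probe{}; }  // placeholder
bool  detectCollision(const Probe&) { return false;   }  // placeholder: step 2
Force estimateForce(const Probe&)   { return Force{}; }  // placeholder: step 3 (e.g. spring law)
void  applyForce(const Force&)      {                 }  // send force to the generic probe

int main() {
    if (!initHapticDevice())                       // step 1: initialization / data transfer
        return 1;
    for (int cycle = 0; cycle < 1000; ++cycle) {   // servo loop (>= 1000 Hz in practice)
        Probe p = queryProbe();
        if (detectCollision(p)) {                  // step 2: collision detection
            Force f = estimateForce(p);            // step 3: force estimation
            applyForce(f);
        }
    }
    return 0;
}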

We would like to reduce the overhead of the above three steps. Different approaches are possible. One approach is to spatially subdivide the dataset into cells during pre-processing. Then at run time the cells which are within some threshold distance from the probe pointer are considered in the collision detection and force feedback estimation. This approach has two drawbacks. First, the selected cells may eliminate part of the force field that affects the user. For example, when haptically rendering a surface as in Figure 2, the user may sense an incorrect force when using spatial subdivision. Second, if the user moves the probe pointer too fast for the application to update the cells, the user could perceive rough (and incorrect) force feedback. Another approach to reduce the above overhead could be to reduce the complexity of the dataset through simplification. Several different levels of detail could then be constructed off-line. At run time, an appropriate level is selected for each object. However, switching between the different levels of detail at run time may lead to noticeable changes in the force feedback, which is distracting. Also, if the objects being studied are very large, this method will provide only one level of detail across the entire object.

[Figure 2: labels in the figure are Probe Cursor, Selected Region, and Surface.]

Fig. 2. The use of spatial subdivision may result in an incorrect sense of the force field

In this paper we introduce Continuously-Adaptive Haptic Rendering, a novel approach to reduce the complexity of the rendered dataset, which is based on the View-Dependence Tree introduced by El-Sana and Varshney [6]. We use the same off-line constructed tree, and at run time we use a different policy to determine the various levels of detail at the different regions of the surface.

2.2 View-Dependent Rendering

View-dependent simplifications using the edge-collapse/vertex-split primitives include work by Xia et al. [20], Hoppe [10], Gueziec et al. [9], and El-Sana and Varshney [6]. View-dependent simplifications by Luebke and Erikson [12] and De Floriani et al. [5] do not rely on the edge-collapse primitive. Klein et al. [11] have developed an illumination-dependent refinement algorithm for multiresolution meshes. Schilling and Klein [18] have introduced a refinement algorithm that is texture dependent. Gieng et al. [7] produce a hierarchy of triangle meshes that can be used for adaptive rendering. The view-dependence tree [6] is a multiresolution data structure that supports view-dependent rendering. In fact, for a given input dataset, the view-dependence tree construction often leads to a forest (a set of trees) since not all the nodes can be merged together to form one tree.

The view-dependence trees are able to adapt to various levels of detail. Coarse details are associated with nodes that are close to the top of the tree (roots) and high details are associated with the nodes that are close to the bottom of the tree (leaves). The reconstruction of a real-time adaptive mesh requires the determination of the list of vertices of this adaptive mesh and the list of triangles that connect these vertices. Following [6], we refer to these lists as the list of active nodes and the list of active triangles.
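For concreteness, a possible layout of a view-dependence tree node and of the two runtime lists is sketched below. The field names and container choices are assumptions made for illustration, not the exact data structure of [6].

// A minimal sketch (assumed layout) of a view-dependence tree node and the
// two runtime lists described above: active nodes and active triangles.
#include <vector>

struct Vec3 { double x, y, z; };

struct Node {
    Vec3   position;        // vertex position represented by this node
    double switchValue;     // simplification-metric value at which its children collapsed
    Node*  parent;          // nullptr for a root of the forest
    Node*  left;            // the two children created by a vertex split
    Node*  right;           // (nullptr for leaves, i.e. original vertices)
    std::vector<int> adjacentTriangles;  // triangles incident on this node
};

// The adaptive mesh at any frame is described by:
//   - the list of active nodes (the currently displayed vertices), and
//   - the list of active triangles connecting those vertices.
struct ActiveMesh {
    std::vector<Node*> activeNodes;
    std::vector<int>   activeTriangles;
};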

3 Our Approach

We have integrated view-dependent simplification with haptic rendering to allow faster and more efficient force feedback. We refer to this as continuously-adaptive haptic rendering. Similar to graphics rendering, continuously-adaptive haptic rendering speeds up the overall performance of haptic rendering by reducing the number of triangles representing the dataset. In our approach we do not need to send the complete surface to the haptic system. Instead, we send a surface with high detail in the region close to the generic probe pointer and a coarser representation as the region gets farther from the generic probe.

3.1 Localizing the View-Dependence Tree

The construction of view-dependence trees results in dependencies between the nodes of the tree. These dependencies are used to avoid foldovers at run time by preventing the collapse or merge of nodes before others. Therefore, these dependencies may restrict the refinement of nodes that might have otherwise been refined to comply with the visual fidelity or error metric. In order to reduce such restrictions we reduce the dependencies between the nodes of the tree. We can reduce the dependencies by localizing the tree, which refers to constructing the tree so as to minimize the distance between the nodes of the tree.

We define the radius of a subtree as the maximum distance between the root of the subtree and any of its children. We are currently using the Euclidean distance metric to measure the distance. We can localize a view-dependence tree by minimizing the radius of each subtree. Since we construct the tree bottom-up, the algorithm starts by initializing each subtree radius to zero (each subtree has only one node). Each collapse operation results in a merge of two subtrees. We collapse a node to the neighbor which results in the minimum radius. It is important to note that our algorithm does not guarantee an optimal radius for the resulting subtrees.
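As an illustration of this greedy localization step, the sketch below picks, among the candidate collapse neighbors, the one that minimizes the radius of the merged subtree. The Subtree representation (each subtree storing the vertices already merged into it, including its root) is an assumption made for this sketch, not the paper's data structure.

// Minimal sketch: choose the collapse neighbor that yields the smallest
// merged-subtree radius, i.e. the maximum Euclidean distance from the new
// root to any vertex already contained in either subtree.
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <limits>
#include <vector>

struct Vec3 { double x, y, z; };

static double dist(const Vec3& a, const Vec3& b) {
    return std::sqrt((a.x-b.x)*(a.x-b.x) + (a.y-b.y)*(a.y-b.y) + (a.z-b.z)*(a.z-b.z));
}

struct Subtree {
    Vec3 root;                    // position of the subtree's root vertex
    std::vector<Vec3> vertices;   // vertices merged into this subtree (including its root)
};

// Radius of the subtree obtained by collapsing 'a' into neighbor 'b'.
static double mergedRadius(const Subtree& a, const Subtree& b) {
    double r = 0.0;
    for (const Vec3& v : a.vertices) r = std::max(r, dist(b.root, v));
    for (const Vec3& v : b.vertices) r = std::max(r, dist(b.root, v));
    return r;
}

// Returns the index of the neighbor giving the minimum merged radius.
std::size_t bestCollapseTarget(const Subtree& node, const std::vector<Subtree>& neighbors) {
    std::size_t best = 0;
    double bestR = std::numeric_limits<double>::max();
    for (std::size_t i = 0; i < neighbors.size(); ++i) {
        double r = mergedRadius(node, neighbors[i]);
        if (r < bestR) { bestR = r; best = i; }
    }
    return best;
}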

When haptically rendering a polygonal dataset we need to detect collisions between the probe and the dataset and compute the force that the probe supplies to the user at very high rates (more than 1000 Hz). The triangles close to the probe contribute more to the force feedback and have a higher probability of collision with the probe. The triangles far from the probe have little effect on the force feedback and have a smaller probability of collision with the probe.

In our approach we use a high-detail representation for regions near the probe and a coarser representation farther away. We achieve this by using a bell-shaped filter as in Figure 3(a). In our filter, the distance from the haptic probe pointer dictates the level of detail of each region. This filter can be seen as a mapping of distance from the probe pointer to the switch value (the switch value is the value of the simplification metric at which two vertices were collapsed at tree construction time). The surface close to the probe should be displayed at its highest possible resolution in order to convey the best estimation of the force feedback. In addition, regions far enough from the probe cannot be displayed at less than the coarsest level. We were able to achieve a further speed-up by changing the shape of our filter from bell-shaped to a multiple-frustum shape. This reduces the time to compute the switch value of an active node, which needs to be executed for each node at each frame. Figure 3(b) shows the shape of the optimized filter. This change replaces the computation of the distance from the probe and of a cubic function (which we use to approximate the bell-shaped filter) with finding the maximum difference along any of the three axes x, y, and z. We also allow the user to change some of the filter attributes that determine the relation between the level of detail and the distance from the probe pointer.

Fig. 3. Ideal (a) versus optimized (b) filter
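The following sketch illustrates one possible realization of this distance-to-switch-value mapping: a cubic ramp approximating the bell-shaped filter, and an L-infinity (maximum coordinate difference) variant corresponding to the multiple-frustum optimization. The parameter names and the exact ramp are assumptions, not the paper's formulas.

// Minimal sketch: map the probe distance to a switch-value threshold.  A node
// on the active list is refined (split) while the filter value at its distance
// from the probe is smaller than the node's stored switch value.
#include <algorithm>
#include <cmath>

struct FilterParams {
    double innerRadius;   // within this distance: highest detail (filter value ~ 0)
    double outerRadius;   // beyond this distance: coarsest level
    double maxSwitch;     // switch value corresponding to the coarsest level
};

// Bell-shaped variant: smooth cubic ramp of the switch threshold with distance.
double bellFilter(double distance, const FilterParams& p) {
    if (distance <= p.innerRadius) return 0.0;
    if (distance >= p.outerRadius) return p.maxSwitch;
    double t = (distance - p.innerRadius) / (p.outerRadius - p.innerRadius);
    return p.maxSwitch * t * t * (3.0 - 2.0 * t);   // cubic "smoothstep" ramp
}

// Optimized multiple-frustum variant: replace the Euclidean distance and the
// cubic by the maximum coordinate difference (L-infinity distance) and a ramp.
double frustumFilter(double dx, double dy, double dz, const FilterParams& p) {
    double d = std::max({std::fabs(dx), std::fabs(dy), std::fabs(dz)});
    if (d <= p.innerRadius) return 0.0;
    if (d >= p.outerRadius) return p.maxSwitch;
    return p.maxSwitch * (d - p.innerRadius) / (p.outerRadius - p.innerRadius);
}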

At run time we load the view-dependence trees and initialize the roots as the active vertices. Then at each frame we repeat the following steps. First, we query the position and the orientation of the probe. Then we scan the list of active vertices. For each vertex we compute the distance from the probe position and map it through the filter. The node splits if the computed value is less than the switch value and the node satisfies the implicit dependencies for split. The node merges with its sibling if the computed value is larger than the switch value stored at the parent of this node and the node satisfies the implicit dependencies for merge.

After each split, we remove the node from the active-nodes list and insert its two children into the active-nodes list. Then we update the adjacent-triangles list to match the change, and insert the PAT triangles into the adjacent lists of the newly inserted nodes. The merge operation is carried out in two steps: first we remove the two merged nodes from the active-nodes list, and then we insert the parent node into the active-nodes list. Finally, we update the adjacent-triangles list by removing the PAT triangles and merging the triangles of the two merged nodes (the interested reader may refer to the details in [6]). The resulting set of active triangles is sent to the haptic interface.
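A minimal sketch of these list updates is given below. The node layout and the treatment of the PAT triangles as a per-node list are assumptions made for illustration, not the implementation of [6].

// Minimal sketch: updating the active-node and active-triangle lists after a
// split and after a merge of two sibling nodes.
#include <algorithm>
#include <vector>

struct Node {
    double switchValue;
    Node  *parent, *left, *right;
    std::vector<int> adjacentTriangles;  // triangles incident on this node
    std::vector<int> patTriangles;       // triangles introduced when this node splits (the "PAT" triangles)
};

void splitNode(Node* n, std::vector<Node*>& activeNodes, std::vector<int>& activeTriangles) {
    // Remove the node and insert its two children.
    activeNodes.erase(std::remove(activeNodes.begin(), activeNodes.end(), n), activeNodes.end());
    activeNodes.push_back(n->left);
    activeNodes.push_back(n->right);
    // Insert the PAT triangles of the newly inserted nodes.
    activeTriangles.insert(activeTriangles.end(), n->patTriangles.begin(), n->patTriangles.end());
}

void mergeNode(Node* parent, std::vector<Node*>& activeNodes, std::vector<int>& activeTriangles) {
    // Remove the two merged children and insert the parent.
    activeNodes.erase(std::remove(activeNodes.begin(), activeNodes.end(), parent->left),  activeNodes.end());
    activeNodes.erase(std::remove(activeNodes.begin(), activeNodes.end(), parent->right), activeNodes.end());
    activeNodes.push_back(parent);
    // Remove the PAT triangles that were introduced by the corresponding split.
    for (int t : parent->patTriangles)
        activeTriangles.erase(std::remove(activeTriangles.begin(), activeTriangles.end(), t), activeTriangles.end());
}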

4 Further Optimizations

We were able to achieve further speedups by fine-tuning specific sections of our implementation. For instance, when updating the active-nodes list and the active-triangles list after each step, we replace pointers instead of removing and inserting them. For example, after a split we replace the node with one of its children (the left one) and insert the second child. Similarly, in a merge we replace the left child with its parent and remove the other child. Even though the active lists are lists of pointers to the actual nodes of the tree, their allocation and deallocation still requires more time because it relies on the operating system.
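The sketch below illustrates this in-place replacement on a std::vector of node pointers; the container choice and index bookkeeping are assumptions for illustration, not the authors' implementation.

// Minimal sketch: instead of erasing and inserting list entries, a split
// overwrites the node's slot with its left child and appends the right child;
// a merge overwrites the left child's slot with the parent and swap-removes
// the right child, avoiding per-frame allocation and deallocation.
#include <cstddef>
#include <utility>
#include <vector>

struct Node { Node *parent, *left, *right; };

void splitInPlace(std::vector<Node*>& active, std::size_t i) {
    Node* n = active[i];
    active[i] = n->left;          // reuse the existing slot for the left child
    active.push_back(n->right);   // only one insertion at the end
}

void mergeInPlace(std::vector<Node*>& active, std::size_t leftIdx, std::size_t rightIdx) {
    Node* parent = active[leftIdx]->parent;
    active[leftIdx] = parent;                 // reuse the left child's slot for the parent
    std::swap(active[rightIdx], active.back());
    active.pop_back();                        // remove the right child without shifting
}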

The haptic and graphics buffers are updated in an incremental fashion. Since the change between consecutive frames tends to be small, this results in small changes in the haptic and graphics buffers. Therefore, we replace the vertices and triangles that do not need to be rendered in the next frame with the newly added vertices and triangles. This requires a very small update time that is not noticeable by the user.

Since the graphics rendering and the haptic rendering run at different frequencies, we have decided to maintain them through different processes (which run on different processors on a multi-processor machine). The display runs at low update rates of about 20 Hz, while the haptic process runs at higher rates of about 1000 Hz. We also use another process to maintain the active lists and the view-dependence tree structure. At a 20 Hz frequency we query the haptic probe for its position and orientation, then update the active lists to reflect the change of the probe pointer. Finally, we update the graphics and the haptic buffers. In this scenario, the graphics component is updated at 20 Hz and runs at this rate, while the haptic component runs at 1000 Hz but is updated at only 20 Hz. To better approximate the level of detail when the user is moving the probe fast, we use the estimated motion trajectory of the probe pointer and the distance it has traveled since the previous frame to perform look-ahead estimation of the level of detail.
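The sketch below illustrates this multi-rate organization with two loops at roughly 20 Hz and 1000 Hz sharing the probe state. The paper runs these as separate processes on a multi-processor SGI; the sketch uses std::thread and placeholder routines purely for illustration.

// Minimal sketch: a ~1000 Hz haptic servo loop alongside a ~20 Hz loop that
// updates the active lists and the graphics display.  All routine names are
// placeholders, not the paper's implementation.
#include <atomic>
#include <chrono>
#include <thread>

std::atomic<bool> running{true};

void updateProbeAndActiveLists() {}   // query probe, update active lists, refill buffers
void renderGraphicsFrame()       {}   // ~20 Hz visual display
void hapticServoCycle()          {}   // ~1000 Hz collision detection and force output

int main() {
    std::thread haptic([] {
        while (running) {                              // ~1000 Hz servo loop
            hapticServoCycle();
            std::this_thread::sleep_for(std::chrono::microseconds(1000));
        }
    });
    for (int frame = 0; frame < 200 && running; ++frame) {  // ~20 Hz update/display loop
        updateProbeAndActiveLists();
        renderGraphicsFrame();
        std::this_thread::sleep_for(std::chrono::milliseconds(50));
    }
    running = false;
    haptic.join();
    return 0;
}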

5 Implementation and Results

We have implemented our algorithm in C++ on an SGI Onyx2 with InfiniteReality graphics. For haptic rendering we have used the PHANToM haptic device from SensAble Technologies, with six degrees of freedom of input and three degrees of freedom of output. The haptic interface is handled through the GHOST API library (from SensAble Technologies). This haptic device fails (the servo loop breaks) when it is pushed to run at less than a 1000 Hz frequency.

                          CHR OFF                    CHR ON
Dataset    Triangles   Avg. Frame    Avg.        Avg. Frame    Avg.
                       Rate (Hz)     Quality     Rate (Hz)     Quality
CAD-Obj1      2K          1500        good          1500        good
CAD-Obj2      8K           600        bad           1200        good
Molecule     20K            --        breaks        1000        good
Terrain      92K            --        breaks        1000        good

Table 1. Results of our approach

We have conducted several tests on various datasets and have obtained encouraging results. Table 1 shows some of our results. It shows results of haptic rendering with (CHR ON) and without (CHR OFF) the use of continuously-adaptive haptic rendering. For medium-size datasets the haptic device works for some time and then fails. When the datasets become larger, the haptic device fails almost immediately because it is not able to run at the minimum required frequency. This failure can be the result of failing to finish the collision detection process or the force field estimation process in time. Reducing the dataset size using our algorithm enables successful haptic rendering of these datasets. Figure 4 shows the system configuration we used in our testing. In our system the mouse and the haptic probe pointer are used simultaneously to change and update the viewing position of the dataset. Figure 5 shows a high level of detail around the probe pointer (shown as a bright sphere in the center).

Fig. 4. Our system configuration

Fig. 5. Haptic rendering of a terrain dataset; the yellow sphere is the haptic probe pointer

6 Conclusions

We have presented the continuously-adaptive haptic rendering algorithm, which enables haptic rendering of datasets that are beyond the capability of current haptic systems. Our approach is based upon dynamic, frame-to-frame changes in the geometry of the surface and thus can be used with any of the prior schemes, such as bounding volume hierarchies, to achieve superior acceleration of haptic rendering. Haptic interfaces are being used in several real-life applications such as molecular docking, nanomanipulation, virtual design and prototyping, virtual surgery, and medical training. We anticipate that the work presented in this paper will enable accelerated haptic rendering for all of these applications.

Acknowledgements

This work has been supported in part by the NSF grants DMI-9800690 and ACR-9812572, and a DURIP award N00014970362. Jihad El-Sana has been supported in part by the Fulbright/Israeli Arab Scholarship Program and the Catacosinos Fellowship for Excellence in Computer Science. We would like to thank the reviewers for their insightful comments, which led to several improvements in the presentation of this paper. We would also like to thank our colleagues at the Center for Visual Computing at Stony Brook for their encouragement and suggestions related to this paper.

References

1. R. S. Avila and L. M. Sobierajski. A haptic interaction method for volume visualization. In Proceedings, IEEE Visualization, pages 197–204, Los Alamitos, October 27 – November 1, 1996. IEEE.

2. C. Basdogan, C. Ho, and M. Srinivasan. A ray-based haptic rendering technique for displaying shape and texture of 3-D objects in virtual environments. In ASME Dynamic Systems and Control Division, 1997.
3. F. P. Brooks, Jr., M. Ouh-Young, J. J. Batter, and P. J. Kilpatrick. Project GROPE – haptic displays for scientific visualization. In Computer Graphics (SIGGRAPH '90 Proceedings), volume 24(4), pages 177–185, August 1990.

4. F. Dachille IX, H. Qin, A. Kaufman, and J. El-Sana. Haptic sculpting of dynamic surfaces. In Stephen N. Spencer, editor, Proceedings of the 1999 Symposium on Interactive 3D Graphics, pages 103–110, New York, April 26–28, 1999. ACM Press.
5. L. De Floriani, P. Magillo, and E. Puppo. Efficient implementation of multi-triangulation. In H. Rushmeier, D. Elbert, and H. Hagen, editors, Proceedings of Visualization '98, pages 43–50, October 1998.
6. J. El-Sana and A. Varshney. Generalized view-dependent simplification. In Computer Graphics Forum, volume 18, pages C83–C94. Eurographics Association and Blackwell Publishers Ltd, 1999.
7. T. Gieng, B. Hamann, K. Joy, G. Schussman, and I. Trotts. Constructing hierarchies for triangle meshes. IEEE Transactions on Visualization and Computer Graphics, 4(2):145–161, 1998.
8. A. Gregory, M. Lin, S. Gottschalk, and R. Taylor. H-COLLIDE: A framework for fast and accurate collision detection for haptic interaction. Technical Report TR98-032, Department of Computer Science, University of North Carolina at Chapel Hill, November 1998.
9. A. Gueziec, G. Taubin, B. Horn, and F. Lazarus. A framework for streaming geometry in VRML. IEEE Computer Graphics and Applications, 19(2):68–78, 1999.
10. H. Hoppe. View-dependent refinement of progressive meshes. In Proceedings of SIGGRAPH '97 (Los Angeles, CA), pages 189–197. ACM Press, August 1997.
11. R. Klein, A. Schilling, and W. Straßer. Illumination dependent refinement of multiresolution meshes. In Computer Graphics International, pages 680–687, June 1998.
12. D. Luebke and C. Erikson. View-dependent simplification of arbitrary polygonal environments. In Proceedings of SIGGRAPH '97 (Los Angeles, CA), pages 198–208. ACM SIGGRAPH, ACM Press, August 1997.
13. W. Mark, S. Randolph, M. Finch, J. Van Verth, and R. Taylor II. Adding force feedback to graphics systems: Issues and solutions. In Proceedings of SIGGRAPH '96 (New Orleans, LA, August 4–9, 1996), pages 447–452. ACM Press, 1996.
14. M. Minsky, M. Ouh-Young, O. Steele, F. P. Brooks, Jr., and M. Behensky. Feeling and seeing: Issues in force display. In 1990 Symposium on Interactive 3D Graphics, pages 235–243, March 1990.
15. H. Morgenbesser and M. Srinivasan. Force shading for haptic shape perception. In ASME Dynamic Systems and Control Division, volume 58, pages 407–412, 1996.
16. D. Ruspini, K. Kolarov, and O. Khatib. The haptic display of complex graphical environments. In Proceedings of SIGGRAPH '97 (Los Angeles, CA), pages 345–352. ACM SIGGRAPH, ACM Press, August 1997.
17. J. Salisbury and C. Tarr. Haptic rendering of surfaces defined by implicit functions. In ASME Dynamic Systems and Control Division, November 1997.
18. A. Schilling and R. Klein. Graphics in/for digital libraries – rendering of multiresolution models with texture. Computers and Graphics, 22(6):667–679, 1998.
19. R. Taylor, W. Robinett, V. Chi, F. Brooks, Jr., W. Wright, R. Williams, and E. Snyder. The nanomanipulator: A virtual-reality interface for a scanning tunnelling microscope. In Proceedings of SIGGRAPH 93, pages 127–134, 1993.
20. J. Xia, J. El-Sana, and A. Varshney. Adaptive real-time level-of-detail-based rendering for polygonal models. IEEE Transactions on Visualization and Computer Graphics, pages 171–183, June 1997.
