
Davelib: A VR Framework

4.2.1 Motivation and Goals

Simple Porting of Existing Applications to the DAVE. Porting an existing 3D application to the DAVE can be very simple. To demonstrate the lean concept of the Davelib, a minimal code example for a GLUT or OpenGLEAN program is shown in Figure 4.2. The same program is executed on the server as well as on the clients, each with different parameters. The additional lines that are necessary to port the application to the DAVE are highlighted in red.

#include <GL/openglean.h>
#include "Dave.h"

DAVE::Dave dave;

struct UserSyncStruct { // synchronized to the clients
    float cameraPosition[3];
};

void display() {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    dave.setProjectionMatrix();
    glLoadIdentity();

    UserSyncStruct* usd = (UserSyncStruct*) dave.getSyncBufferUserDataPtr();
    glTranslatef(usd->cameraPosition[0], usd->cameraPosition[1], usd->cameraPosition[2]);
    glutSolidTeapot(1.0);
    glutSwapBuffers();
}

void update() {
    // handle input devices (pseudo code)
    UserSyncStruct* usd = (UserSyncStruct*) dave.getSyncBufferUserDataPtr();
    if (joystickButton[0].isPressed()) usd->cameraPosition[2] += 0.1f;

    // send/receive UserSyncStruct data
    dave.update();

    glutPostRedisplay();
}

int main(int argc, char* argv[]) {
    dave.parseCmdLine(argc, argv);
    dave.init(sizeof(UserSyncStruct));

    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutCreateWindow("Davelib 4");
    // ignore Davelib settings and always go into fullscreen
    glutFullScreen();
    glutSetCursor(GLUT_CURSOR_NONE);
    glutIdleFunc(update);
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}

Figure 4.2: A minimal program with sample modifications for a CAVE setup (in red) using the Davelib.

The Davelib has evolved over the past nine years of use and is employed mainly in our DAVE. The main difference to many other VR frameworks is its lean design, especially suited to being integrated into existing 3D applications. With the Davelib, only a few function calls are necessary. Little time is required to learn how to use it, and only one to two external dependencies exist (OpenGL and optionally Lua), simplifying integration into existing applications. In particular, the small number of external libraries simplifies maintenance for developers of Davelib applications, as compiler versions and subsequently compatible library versions change over the years and the code using the libraries may need adaptation to keep working. It is an open source, cross platform (only Windows and Linux are tested) C++ library for OpenGL applications, but it should be easily portable to Java or adaptable for use with DirectX. A wide range of setups is supported via configuration files and Lua scripting. This means that the application itself does not need to be recompiled and the same executable can be used. The last and only partially implemented step towards this goal is the IO device library, a part of the Davelib that was developed primarily to support setup-specific input devices, but it can also be used to control output devices.

The Davelib was initially written to provide the minimal necessary set of functions to run an application in our DAVE. It does not provide support for a GUI, sound, physics simulation, scene graphs, haptics, etc., but rather coexists with libraries written for these purposes.

The Davelib is a lightweight set of basic functions that are necessary for OpenGL applications to work in a VR environment. While the focus stays on graphics functions for correct viewing, IO device scripting has been added for easy adaptation to a wide range of setups, including CAVE-like setups, tiled displays and ordinary desktop PCs.

Figure 4.3: A simplified overview showing an example setup with the involved parts of the Davelib: the components DaveConfig, DaveNetwork, DaveProjection, DaveWindowManager, DaveTracking and DaveIODevice connect the master and client applications with the configuration sources (config file, command line parameters, environment variables), the input devices (tracking, mouse, keyboard, joystick), sound output, a status Lua script and system control via PDA and web server.


4.2.2 Davelib Core Functions

The Davelib consists of a carefully designed set of functions. There is no requirement to use any particular Davelib function, but the more of them an application implements, the more setups are supported. First, the basic components are described that are likely to be necessary for most VR setups. Afterwards, additional extensions are presented.

4.2.2.1 Screen Geometry and View Matrix Setup

The central technical aspect of a VE is the different view of the scene for each individual screen. In a conventional program for desktop setups, the OpenGL projection matrix is used to set up a perspective view looking into the virtual world. With the Davelib this matrix can be replaced by another matrix

\[ P_{\mathrm{OpenGL}} = P \cdot R \cdot T \tag{4.2.1} \]

in order to give a correct view on a stereoscopic, head tracked VR display. The OpenGL modelview matrix can be used as usual, e.g. to change the camera position and orientation in the 3D scene and to transform the specified vertices to local coordinate systems. An orthographic projection is currently not supported, as it only makes sense on 2D displays. When writing an application specifically for a 2D tiled display, this may be a useful future extension which can easily be added.

Figure 4.4: The asymmetric projection frustum for fixed screens and head tracking. A_w, B_w, C_w are corners of the rectangular screen and E_w is the eye position, all specified in the world coordinate system X_w, Y_w, Z_w. The screen defines a local coordinate system X_s, Y_s, Z_s; l_s, r_s, t_s and b_s are the distances from the eye's orthogonal projection onto the screen plane to the left, right, top and bottom screen edges, and d_s is the distance from the eye to the screen plane.

Static Screens. For projection screens, the camera is set up with a perspective frustum given by

\[
P =
\begin{pmatrix}
\frac{2 d_s}{l_s + r_s} & 0 & \frac{r_s - l_s}{l_s + r_s} & 0 \\
0 & \frac{2 d_s}{b_s + t_s} & \frac{t_s - b_s}{b_s + t_s} & 0 \\
0 & 0 & -\frac{z_{far} + z_{near}}{z_{far} - z_{near}} & -\frac{2\, z_{far}\, z_{near}}{z_{far} - z_{near}} \\
0 & 0 & -1 & 0
\end{pmatrix}.
\tag{4.2.2}
\]

Note that in general the frustum for screens in a VE is asymmetric and depends on the eye position that is estimated from head tracking [CNSD93].

The frustum may also be rotated to account for the rotation of the screen, e.g. for the side walls of a CAVE, which is realized with the rotation matrix

\[
R =
\begin{pmatrix}
\vec{X}_s^{\,T} & 0 \\
\vec{Y}_s^{\,T} & 0 \\
\vec{Z}_s^{\,T} & 0 \\
\vec{0}^{\,T} & 1
\end{pmatrix},
\tag{4.2.3}
\]

whose rows are the unit right, up and normal vectors of the screen in world coordinates. Note that this rotation is independent of the rotation of the camera or navigation. Finally, a translation with the negative eye position is applied:

\[
T =
\begin{pmatrix}
1 & 0 & 0 & -E_{w,x} \\
0 & 1 & 0 & -E_{w,y} \\
0 & 0 & 1 & -E_{w,z} \\
0 & 0 & 0 & 1
\end{pmatrix}.
\tag{4.2.4}
\]

To allow a simple configuration, only three corners A_w, B_w, C_w of the rectangular screen in the world coordinate system X_w, Y_w, Z_w and the distances z_near and z_far of the near and far clipping planes must be specified. From these values, the necessary frustum parameters can be computed.
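One standard way to derive them, assuming (as in the configuration file of Figure 4.5) that A_w is the lower left, B_w the lower right and C_w the upper left corner, is the following; the exact formulation used inside the Davelib may differ in details:

\[
\begin{aligned}
\vec{X}_s &= \frac{B_w - A_w}{\lVert B_w - A_w \rVert}, \qquad
\vec{Y}_s = \frac{C_w - A_w}{\lVert C_w - A_w \rVert}, \qquad
\vec{Z}_s = \vec{X}_s \times \vec{Y}_s, \\
d_s &= \vec{Z}_s \cdot (E_w - A_w), \\
l_s &= \vec{X}_s \cdot (E_w - A_w), \qquad
r_s = \lVert B_w - A_w \rVert - l_s, \\
b_s &= \vec{Y}_s \cdot (E_w - A_w), \qquad
t_s = \lVert C_w - A_w \rVert - b_s.
\end{aligned}
\]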

These equations assume a rectangular screen, but only a part of the rectangle may actually be used, so a planar screen with an arbitrarily shaped border, like a circle, is also possible.
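For illustration, a minimal sketch of how such an off-axis projection could be set up with fixed-function OpenGL, combining P, R and T as in Equation 4.2.1. This is not the Davelib API (which presumably performs these steps inside dave.setProjectionMatrix()); setScreenProjection and the small vector helpers are hypothetical.

// Sketch: off-axis projection for a static screen, given the three screen
// corners and the tracked eye position in world coordinates (not Davelib code).
#include <GL/gl.h>
#include <cmath>

struct Vec3 { float x, y, z; };
static Vec3  sub(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  cross(Vec3 a, Vec3 b) { return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x }; }
static float len(Vec3 a) { return std::sqrt(dot(a, a)); }
static Vec3  normalize(Vec3 a) { float l = len(a); return { a.x/l, a.y/l, a.z/l }; }

// A = lower left, B = lower right, C = upper left corner, E = eye position.
void setScreenProjection(Vec3 A, Vec3 B, Vec3 C, Vec3 E, float zNear, float zFar) {
    Vec3 Xs = normalize(sub(B, A));      // screen right vector
    Vec3 Ys = normalize(sub(C, A));      // screen up vector
    Vec3 Zs = cross(Xs, Ys);             // screen normal, pointing towards the viewer
    Vec3 v  = sub(E, A);                 // eye relative to the lower left corner

    float ds = dot(Zs, v);               // eye-to-screen distance
    float ls = dot(Xs, v);               // distance to the left edge
    float rs = len(sub(B, A)) - ls;      // distance to the right edge
    float bs = dot(Ys, v);               // distance to the bottom edge
    float ts = len(sub(C, A)) - bs;      // distance to the top edge

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    // P: asymmetric frustum, extents scaled from the screen plane to the near plane
    glFrustum(-ls*zNear/ds, rs*zNear/ds, -bs*zNear/ds, ts*zNear/ds, zNear, zFar);
    // R: rotate world coordinates into the screen coordinate system (column-major)
    float R[16] = { Xs.x, Ys.x, Zs.x, 0,
                    Xs.y, Ys.y, Zs.y, 0,
                    Xs.z, Ys.z, Zs.z, 0,
                    0,    0,    0,    1 };
    glMultMatrixf(R);
    // T: translate by the negative eye position
    glTranslatef(-E.x, -E.y, -E.z);
}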

Head Mounted Display. For HMDs, P is set to a symmetric frustum with

\[
l_s = r_s = \tan\!\left(\frac{\alpha \pi}{360}\right), \qquad
b_s = t_s = \frac{l_s}{a},
\]

where α is the horizontal field of view in degrees and a is the screen aspect ratio; these extents are given at unit distance from the eye, i.e. d_s = 1.
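As a quick numerical check of these formulas: for a horizontal field of view of α = 90° and an aspect ratio of a = 4:3, they give

\[
l_s = r_s = \tan\!\left(\frac{90 \cdot \pi}{360}\right) = \tan 45^{\circ} = 1, \qquad
b_s = t_s = \frac{1}{4/3} = 0.75 .
\]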

The rotation matrix R is set directly to the orientation matrix obtained from head tracking. As for static screens, a translation with the negative eye position is then applied as in Equation 4.2.4.

4.2.2.2 Synchronization Across Multiple Computers

As mentioned above, there are different approaches to drive multiple displays. One approach is to compute the images on a single PC that either outputs them directly, potentially by using multiple graphics cards, or sends the images to other PCs via the network. This method does not achieve high performance and, in the former case, does not scale with the number of displays. For high rendering performance, a better approach is to send only rendering commands or synchronization information over the network and distribute the rendering on a cluster, which both reduces network transfer and parallelizes the rendering. The Davelib uses the latter option, synchronization information, with an instance of the same executable running on each PC.

An example is a user flying through a virtual world. The position and orientation of the navigation are copied to a synchronization buffer in the master application. After the network transfer, they are read from the buffer in the client applications. The Davelib internally also transmits the eye positions of a single head tracked user. Support for multiple head tracked users can be added easily.
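A minimal sketch of this pattern on the master side, reusing the UserSyncStruct and the Davelib calls from Figure 4.2. That the master writes its navigation state through the same getSyncBufferUserDataPtr() buffer is an assumption here, and navX, navY, navZ are placeholders for the application's navigation state.

// Master side (sketch): write the navigation state into the synchronization
// buffer so that dave.update() transmits it to the clients.
void masterUpdate() {
    UserSyncStruct* usd = (UserSyncStruct*) dave.getSyncBufferUserDataPtr();
    usd->cameraPosition[0] = navX;   // navigation state, e.g. updated from joystick input
    usd->cameraPosition[1] = navY;
    usd->cameraPosition[2] = navZ;
    dave.update();                   // sends on the master, receives on the clients
    glutPostRedisplay();
}
// Client side: display() in Figure 4.2 simply reads usd->cameraPosition.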

The Davelib provides two types of network transfer. The first is intended for the information mentioned above, such as status updates, and is sent via UDP, preferably to a multicast address. It is fast but unreliable, and only the last message is kept if multiple messages are received at a time. In the multicast case, the clients only need to know the address of the server. The second type of network message is used for events and is implemented with TCP for a reliable connection.

Currently, rendering is not explicitly synchronized. While applications can be written to synchronize the buffer swapping, waiting until image rendering on all other clients is complete, it is hard not to waste time or introduce latency through synchronization overhead.

Also, especially in a CAVE, the workload and framerate of the different machines may differ greatly, depending on object visibility on the different screens. In our VR setups we currently only use software that renders at high framerates. Fast rendering is very important with head tracking, as head motion otherwise leads to disturbing jumping of the content. At these high framerates, we do not notice problems without explicit synchronization. However, the latency may increase, as the rendering of a frame does not start instantly when new information is available; see Figure 3.8 for a graphical representation of the timings and latency.

4.2.2.3 Configuration

A Davelib application can render on a wide range of setups without adapting the program. A simple configuration file allows the same executable to be used, e.g. on the DAVE, the HEyeWall or a normal desktop PC. The configuration mainly contains the settings for each display, like the positions of the display corners in space. Further parameters such as network addresses or the file names of the calibration matrix or blend mask image can be specified, too.

The default configuration file name can be read from an environment variable, so on a configured PC an application can simply be started and automatically uses the correct parameters. Command line parameters can also be used to temporarily define or override individual settings of the configuration. Key-value string pairs are used, with helper functions for reading integer or double values. We also implemented a hierarchical definition. As an example, a value that should be the same for all computers can omit the machine name, acting as a default that is used when no value is defined at the deeper, machine-specific level of the hierarchy.
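For illustration, a hypothetical excerpt of such a hierarchical definition; the key names follow Figure 4.5, but whether unprefixed defaults use exactly this syntax is an assumption:

// default for all machines (no machine name given)
windowWidth = 1024
windowHeight = 768

// machine-specific overrides at the deeper level of the hierarchy
master.windowWidth = 400
master.windowHeight = 300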

Unfortunately, there are no commonly accepted cross-framework standards for setup configuration, e.g. display configuration or projector calibration. When a Davelib program is to run for the first time on a new setup, like a CAVE that uses display calibration, either the calibration has to be redone or an existing calibration from a different framework has to be transferred. Conversely, the configuration or calibration may be translated automatically to enable the use of the same information in other software packages. As an example, after calibration of the DAVE, the configuration for Davelib applications is generated as well as the engine definition for instantreality.

multicastAddress = 224.245.132.123
multicastPort = 33001

// optical tracking
trackingServer.address = 10.52.10.1
trackingServer.port = 33002

/////// window on master ///////
master.windowPosX = 10
master.windowPosY = 10
master.windowWidth = 400
master.windowHeight = 300
master.hmdHFov = 60

/////// LEFT WALL ///////
dave1.lowerLeftX = -1.65
dave1.lowerLeftY = -1.65
dave1.lowerLeftZ = 0
dave1.lowerRightX = -1.65
dave1.lowerRightY = 1.65
dave1.lowerRightZ = 0
dave1.upperLeftX = -1.65
dave1.upperLeftY = -1.65
dave1.upperLeftZ = 2.47
dave1.windowMode = FULLSCREEN
dave1.cameraStereoOffset = LEFT_EYE
dave1.calibMatrix = config/LEFT.cal

dave2.lowerLeftX = -1.65
dave2.lowerLeftY = -1.65
dave2.lowerLeftZ = 0
...

Figure 4.5: An excerpt of the DAVE configuration file for programs using the Davelib. To describe the properties for a screen, only a few lines are necessary.


4.2.2.4 Window Management

The window manager has the task of opening one or more windowed or fullscreen OpenGL contexts, as specified by the configuration data.

Since this operation largely depends on the way the windows are opened, it has to be implemented for each framework. We mainly use OpenGLEAN as a GLUT replacement. We slightly modified it to support quad-buffer contexts for the clone mode with Quadro graphics cards and borderless windows, e.g. for a side-by-side configuration. A less complete window manager for the Simple DirectMedia Layer (SDL) is also implemented.

Another example is external software that opens a fullscreen window with native system calls. If this part of the code is left untouched, the values of the window configuration are ignored, reducing portability to other setups. Such a program runs correctly in the DAVE, where a single fullscreen window is used for each PC, but does not work on the HEyeWall without further changes.

Often, 3D applications are written to render only to a single window. On a setup with multiple displays connected to one computer, the application can be started several times with different parameters.

However, it is more memory efficient and faster to render to multiple contexts from the same application. As an example, a large 3D model then only has to be loaded from disk and stored in main memory once. This may require some more effort when porting existing applications, since some initializations like texture and shader definitions need to be done per window, keeping in mind that the windows may be rendered by different graphics cards.
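A minimal sketch of such per-window initialization with GLUT-style calls; loadModelFromDisk, loadTexture and compileShaders are hypothetical helpers, and the window ids are assumed to come from the window manager:

// Per-context initialization: every OpenGL context needs its own texture and
// shader objects, while the model data itself is loaded from disk only once.
Model model = loadModelFromDisk("large_model.obj");   // shared in main memory
for (int i = 0; i < numWindows; ++i) {
    glutSetWindow(windowId[i]);                        // make this window's context current
    textureId[i] = loadTexture(model);                 // per-context GPU resources
    shaderId[i]  = compileShaders("scene.vert", "scene.frag");
}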

When multiple windows are rendered on the same graphics card, textures and vertex buffers can be reused and some intermediate results like shadow maps only have to be computed once per frame. It may make sense to exploit these advantages and start one instance per graphics card.

4.2.3 Davelib Extensions

Useful add-ons are described below; some of them require additional effort during implementation. We suggest also supporting them within a Davelib program for flexible usage on many different setups.

4.2.3.1 Linear Geometric Projector Calibration

Figure 4.6: The projection slightly overlaps the screen area visible from the inside of the DAVE, indicated by a red dotted line. The current calibration is also visualized by the system background image, which is recomputed accordingly after each calibration.

To avoid a tedious and time consuming mechanical calibration of the projectors, they can be only roughly aligned, with the image slightly overlapping the screen. An additional calibration is applied during rendering to predistort the images so that the image content is geometrically aligned as if the projector were aligned perfectly.

Raskar et al. propose a linear projector calibration [Ras00] by modifying the OpenGL projection matrix. We independently developed an almost identical solution that is used for all applications in the DAVE.

The modification is realized by a simple matrix multiplication, so that Equation 4.2.1 extends to

\[ P_{\mathrm{OpenGL}} = C \cdot P \cdot R \cdot T . \]
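In fixed-function OpenGL, this amounts to loading C onto the projection matrix stack before P, R and T are set up. In a Davelib program this is presumably handled inside dave.setProjectionMatrix(), so the following lines only illustrate the multiplication order; loadCalibrationMatrix is a hypothetical helper reading the 16 values from the calibration file:

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
float C[16];
loadCalibrationMatrix("config/LEFT.cal", C);   // 16 values, column-major
glMultMatrixf(C);                              // leftmost factor in P_OpenGL = C*P*R*T
// ... then set up P (e.g. via glFrustum), R and T as in Section 4.2.2.1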

Assuming perfect projector optics and a planar projection surface, a simple homography is sufficient to correct each of the 3D vertex positions. As straight lines stay straight after the transformation, all 3D content is corrected in this way. For at least four reference points q_{s,i}, the user specifies corrected points r_{s,i}. The calibration matrix

\[
C =
\begin{pmatrix}
a & b & 0 & d \\
e & f & 0 & h \\
i & j & 1 & 0 \\
m & n & 0 & 1
\end{pmatrix}
\]

is calculated such that all q_{s,i} are projected as close as possible to the target positions r_{s,i}, minimizing the sum of the squared distances. This matrix is a combination of the matrices shown in Figure 4.7.

A semi-automatic calibration can be achieved by projecting a mouse cursor onto the screen and clicking on physical reference positions like the corners of the target screen. To get precise physical locations of the reference points, we use the corners of the projection. As a normal mouse cursor is not visible at every corner and only allows pixel-precise acquisition, multiple lines at different angles are used for each corner. Their intersection defines the corner position accurately.

The lines can be moved with a wireless joystick from within the DAVE.

We have improved this calibration so that it is possible to interactively move a corner and instantly see the calibration result, also showing markers on each edge that should match the respective markers on a neighboring screen.

Figure 4.7: The 4×4 matrix transformations show which matrix values must be changed in order to correct the respective effects.

The values a, b, d, e, f, h, m and n are numerically optimized with the steepest descent algorithm so that the sum of squared distances from the projected points to their target locations is minimal. Unfortunately, correcting keystoning or trapezoid distortion influences the parameters for the perspective division, leading to different depth results after the division and thus, most notably, to different and skewed clipping planes. We partially counteract possible resulting artifacts by setting the values i and j to m and n respectively, so that the near clipping plane stays the same but the far clipping plane is distorted more strongly. This is motivated by the typical DAVE applications with close objects, where clipping at the near plane occurs and would otherwise lead to inconsistent clipping at the screen borders. Raskar et al. address this problem differently [Ras00]. They set i = j = 0 and k = 1 − |m| − |n| and thus extend the frustum to include the complete original frustum. However, the resolution of the depth buffer may not be used very well in that case, and the near clipping planes are skewed.
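A sketch of such a fit, not the Davelib implementation: steepest descent on the eight free parameters with a numerical gradient, minimizing the summed squared distances between the mapped reference points and their targets. The fixed step size, iteration count and helper names are arbitrary choices for illustration.

// Fit a, b, d, e, f, h, m, n of the calibration matrix C by steepest descent.
#include <cmath>
#include <vector>

struct Point2 { double x, y; };

// Apply C to a 2D point; the third column of C is irrelevant here since z
// does not influence x', y' or w'. Parameters p = {a, b, d, e, f, h, m, n}.
static Point2 applyC(const double p[8], Point2 q) {
    double w = p[6] * q.x + p[7] * q.y + 1.0;          // m x + n y + 1
    return { (p[0] * q.x + p[1] * q.y + p[2]) / w,     // (a x + b y + d) / w
             (p[3] * q.x + p[4] * q.y + p[5]) / w };   // (e x + f y + h) / w
}

static double fitError(const double p[8], const std::vector<Point2>& q,
                       const std::vector<Point2>& r) {
    double e = 0;
    for (size_t i = 0; i < q.size(); ++i) {
        Point2 s = applyC(p, q[i]);
        e += (s.x - r[i].x) * (s.x - r[i].x) + (s.y - r[i].y) * (s.y - r[i].y);
    }
    return e;
}

// Steepest descent with a forward-difference gradient, starting at the identity.
void fitCalibration(double p[8], const std::vector<Point2>& q,
                    const std::vector<Point2>& r) {
    const double init[8] = { 1, 0, 0, 0, 1, 0, 0, 0 }; // a = f = 1, rest 0
    for (int i = 0; i < 8; ++i) p[i] = init[i];
    const double h = 1e-6, step = 1e-3;
    for (int iter = 0; iter < 10000; ++iter) {
        double e0 = fitError(p, q, r), grad[8];
        for (int i = 0; i < 8; ++i) {
            double tmp[8];
            for (int j = 0; j < 8; ++j) tmp[j] = p[j];
            tmp[i] += h;
            grad[i] = (fitError(tmp, q, r) - e0) / h;
        }
        for (int i = 0; i < 8; ++i) p[i] -= step * grad[i];
    }
}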

Figure 4.8: Software calibration of projectors with minimal overhead. Left: before calibration, right: after calibration.

The 16 matrix values are stored in an ASCII file for each screen. In OpenSG the same file is read; in instantreality


these values are stored in a MatrixViewModifier in the engine definition, so that currently the engine definition must be rebuilt. This is done automatically after each calibration.

Discussion. It is helpful to place the projectors in a way that keeps the changes introduced by the calibration small, especially to avoid keystoning and thus to reduce the potential problems with the depth calculations and clipping. Note that bitmap operations are not transformed by the OpenGL projection matrix and are therefore not handled correctly; some applications may draw a head-up display in that way. Also, screen space fragment shaders that work with information from neighboring pixels may produce slightly different results. Unfortunately, our mirrors and rear projection screens are not exactly planar. When the calibration is made to fit at the corners, other parts may be off by a few pixels, which is visible especially at the edges. With the manual calibration, this can be partially compensated.

The main advantages of this method are obvious: only minimal code changes are necessary to apply the calibration, and no extra computation time is required during rendering. For more details, please refer to our publication [LOUF06]. If any of the mentioned issues are a problem in practice, the following approach can be used.

4.2.3.2 Non-Linear Geometric Projector Calibration

For smoothly curved screens, a mesh for a two-pass approach can be generated with the 2D freeform deformation tools developed in Section 6.2.2.1. As curved screens, especially for rear projection, are not easy to build, and common projector optics are made for planar screens, we have not tested a calibration of such a setup yet.

4.2.3.3 Photometric and Colorimetric Calibration

Lamp differences and more importantly non-Lambertian projection surfaces and diffuse interreflections lead to visible edges in the DAVE.

With multiple calibrations from different camera positions and with head tracking, this problem may be mostly compensated for. However, this is a large effort and only works for a single user in our setups.

At the moment, a very rough approximation is used in the DAVE.

Only the brightness in the DAVE is regulated manually, using the lamp power setting of the projectors. Interestingly, many visitors have problems locating the physical limits of the screens when using the DAVE for the first time. Interesting, high-frequency content seems to lower the perception of the screen edges significantly.

4.2.3.4 Edge Blending

Automatic edge blending by software can be achieved in the Davelib
