sharing zbuffers

Have jReality programming problems or questions? Post them here.
kejace
Posts: 6
Joined: Thu 30. Nov 2006, 15:03
Location: TU Berlin


Post by kejace » Thu 14. Dec 2006, 13:18

It is unclear to me in which order objects are rendered (or, equivalently, how they use the z-buffer). In the icosahedra example, different icosahedra "shine through" others, but in a different way depending on whether it is a face, a point, etc. Yet when one renders the scene in Sunflow, it renders "correctly". My question is: how can I render multiple SceneGraphComponents with the same z-buffer, so that one doesn't see through the objects? Or, put differently: which renderer renders correctly?

Best,
Kristoffer

gunn
Posts: 323
Joined: Thu 14. Dec 2006, 09:56
Location: TU Berlin
Contact:

Post by gunn » Fri 15. Dec 2006, 12:16

By "icosahedra example" I assume you mean de.jreality.tutorial.Icosahedra.

The Icosahedra example demonstrates the use of transparency in jReality. The fact that different parts "shine through" other parts is the nature of transparency, so in itself that doesn't indicate a problem. In this example the points and lines are set to be opaque (via the "opaqueTubesAndSpheres" flag in the RenderingHintsShader of the world node), so only the faces are transparent.

It's true that the different backends yield different pictures when transparency is involved. The first thing to understand is that OpenGL, in general, does not do transparency correctly. Turning on transparency in the JOGL backend does two things: it turns on merging of fragments using the alpha channel, and it turns off the z-buffer depth test.

This means that fragments arrive at the z-buffer in arbitrary order and are merged with the existing contents on the fly. Proper transparency requires that the fragments be processed in sorted order; without sorting, OpenGL yields incorrect results. Rearranging the children of the scene graph will therefore produce different images, since that changes the order in which fragments arrive at the z-buffer.
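To see why order matters, here is a toy illustration (plain Java, not jReality API): compositing two 50%-transparent fragments over a background with the standard "over" blend gives a different result depending on which fragment arrives first.

```java
/**
 * Toy illustration (not jReality API): alpha blending is order-dependent.
 * Two 50%-transparent fragments blended in different orders give
 * different final colors, which is why unsorted fragments look wrong.
 */
public class BlendOrderDemo {
    // Standard "src over dst" alpha blending for a single color channel.
    static double over(double src, double srcAlpha, double dst) {
        return src * srcAlpha + dst * (1 - srcAlpha);
    }

    public static void main(String[] args) {
        double background = 0.0;        // black background
        double red = 1.0, green = 0.5;  // single-channel stand-ins for two faces
        double alpha = 0.5;

        // Face A (red) drawn first, then face B (green):
        double ab = over(green, alpha, over(red, alpha, background));
        // Face B drawn first, then face A:
        double ba = over(red, alpha, over(green, alpha, background));

        System.out.println(ab); // 0.5
        System.out.println(ba); // 0.625 -- same scene, different picture
    }
}
```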

You can get slightly better (or perhaps "different" is the correct term) results with the JOGL backend by setting the attribute "ZBufferEnabled" to true in the "RenderingHintsShader" of the root Appearance. Then only fragments which lie in front of the already-rendered fragments are merged. This means, for example, that if the first part of your object to be rendered is the front, and it's transparent, you won't see any of the objects behind it.
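The effect of leaving the depth test on during blending can be sketched with another toy simulation (plain Java, not jReality API): a transparent front face that happens to arrive first writes the z-buffer, so the back face fails the depth test and never gets blended in.

```java
/**
 * Toy sketch (not jReality API): blending with the depth test still
 * enabled. A fragment is only merged if it lies in front of what is
 * already stored, so a transparent front face drawn first hides
 * everything behind it.
 */
public class DepthTestBlendDemo {
    static double color = 0.0;                       // framebuffer (one channel)
    static double depth = Double.POSITIVE_INFINITY;  // z-buffer (smaller = nearer)

    static void blendFragment(double c, double alpha, double z) {
        if (z < depth) {                 // depth test passes: blend and update
            color = c * alpha + color * (1 - alpha);
            depth = z;
        }                                // fragments behind are discarded
    }

    public static void main(String[] args) {
        // The transparent front face arrives first...
        blendFragment(1.0, 0.5, 1.0);
        // ...so the back face fails the depth test and is never blended in.
        blendFragment(0.5, 0.5, 2.0);
        System.out.println(color); // 0.5: the back face does not show through
    }
}
```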

This setting, although it yields incorrect results, nonetheless gives what I consider to be reasonable ones. (Comparing with the software viewer shows that the pictures don't look very different.)
Other backends (software, renderman, sunflow) do transparency correctly, since they either work directly with ray casting or sort the fragments first.

A final word on transparency in JOGL: there is another trick which allows "correct" transparency for fully transparent objects (alpha == 0). This is a hardware flag that basically discards fragments with alpha == 0. It's available via the RenderingHintsShader as "ignoreAlpha0". It's particularly useful with texture maps: see de.jreality.tutorial.TextureExample, where the "holes" in the object come from the texture map, not the underlying geometry.
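The idea behind that flag can also be sketched in a few lines (plain Java, not jReality API): fragments with alpha == 0 are discarded before they can touch the color or depth buffers, so geometry behind a texture "hole" still renders.

```java
/**
 * Toy sketch (not jReality API): discarding fragments with alpha == 0
 * keeps texture "holes" from writing the z-buffer, so geometry behind
 * them stays visible.
 */
public class IgnoreAlpha0Demo {
    static double color = 0.0;
    static double depth = Double.POSITIVE_INFINITY;

    static void fragment(double c, double alpha, double z, boolean ignoreAlpha0) {
        if (ignoreAlpha0 && alpha == 0.0) return;  // a hole: leave both buffers untouched
        if (z < depth) { color = c; depth = z; }   // otherwise render as usual
    }

    public static void main(String[] args) {
        // A fully transparent texel of the front object...
        fragment(0.3, 0.0, 1.0, true);
        // ...lets the object behind it render normally.
        fragment(0.8, 1.0, 2.0, true);
        System.out.println(color); // 0.8: the back object shows through the hole
    }
}
```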

Hope this helps.

Charles
