I'm currently thinking of stereo for the software viewer. We really should get this working for correct transparency (GFZ etc.). And as soon as this works, we can extend the renderman backend to write two RIB files, trigger rendering of both pictures, and then display these images in the same way the two software-rendered images are displayed.
Since quad-buffered stereo works now in the jogl backend, we can write a simple jogl stereo image viewer. This of course makes the stereo software viewer dependent on jogl. Or is there a way to render quad-buffered from java2d?
Steffen.
Stereo for soft viewer + renderman?
Stereo for the software viewer is of course possible. The most straightforward way would be to split the image the sw viewer renders in half and to raster each triangle twice. Even simpler would be to flip the perspective projections every other frame, but one would have to do the whole traversal/transformation twice each stereo frame, and rendering time is the bottleneck anyway.
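For illustration, the two per-eye camera setups differ only by a horizontal offset; a minimal sketch of that offset computation (hypothetical names, not jReality's actual API; the ~7 cm separation is just an assumed default):

```java
// Hypothetical sketch, not jReality code: each eye's camera is the
// mid-eye camera translated by +/- half the eye separation along the
// camera-space x axis; the perspective projection itself is shared.
public class EyeTransforms {
    // Returns the x offsets {left, right} to apply to the camera
    // before rasterizing the frame for each eye.
    static double[] eyeOffsets(double eyeSeparation) {
        double half = eyeSeparation / 2.0;
        return new double[] { -half, +half };
    }

    public static void main(String[] args) {
        double[] offsets = eyeOffsets(0.07); // assumed interocular distance
        System.out.println("left " + offsets[0] + ", right " + offsets[1]);
    }
}
```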
The new software viewer (currently under de.jreality.softveiwer) has a more or less working polygon intersection for PS export, which in principle would make it possible to handle transparency completely correctly (the softviewer still cannot render intersecting transparent triangles correctly). However, that intersection algorithm is still way too slow and has some numerical issues that need to be solved before it can become useful.
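The usual shortcut for non-intersecting transparent triangles is a back-to-front centroid sort; a sketch of that approximation (assumed code, not the softviewer's; it assumes a camera looking down the negative z axis):

```java
import java.util.Arrays;
import java.util.Comparator;

// Assumed illustration, not the softviewer's code: approximate
// transparency by sorting triangles back-to-front on centroid depth.
// With a camera looking down -z, a more negative z is farther away,
// so an ascending sort on z draws the farthest triangles first.
public class DepthSort {
    // Triangle layout: 9 doubles, (x, y, z) per vertex.
    static double centroidDepth(double[] tri) {
        return (tri[2] + tri[5] + tri[8]) / 3.0;
    }

    static void sortBackToFront(double[][] tris) {
        Arrays.sort(tris, Comparator.comparingDouble(DepthSort::centroidDepth));
    }
}
```

This is exactly what breaks for triangles that pierce each other: each partly occludes the other, so no ordering of whole triangles blends correctly, which is why exact polygon intersection (splitting the triangles at the intersection line) would be needed.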
Stereo for renderman should be straightforward as well: just getting the camera transformations for both eyes and setting them instead of the usual projection matrix should do the job.
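In RIB terms this would amount to emitting the scene twice with a per-eye Translate in the camera prelude; a hedged sketch of generating that prelude (illustrative string building only, not the actual jReality renderman backend code):

```java
// Illustrative only - not the jReality renderman backend. Builds the
// per-eye camera prelude of a RIB stream: a shared perspective
// projection plus a sideways Translate that moves the scene opposite
// to the (signed) half-eye-separation camera offset.
public class RibStereo {
    static String cameraBlock(double eyeOffsetX, double fov) {
        StringBuilder sb = new StringBuilder();
        sb.append("Projection \"perspective\" \"fov\" [").append(fov).append("]\n");
        sb.append("Translate ").append(-eyeOffsetX).append(" 0 0\n");
        return sb.toString();
    }

    public static void main(String[] args) {
        // One RIB per eye: left eye at -sep/2, right eye at +sep/2.
        System.out.print(cameraBlock(-0.035, 45.0));
        System.out.print(cameraBlock(+0.035, 45.0));
    }
}
```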
Quad-buffered stereo means you render into two offscreen buffers and then flip both with their two onscreen counterparts?
Tim
> Even simpler would be to flip the perspective projections every other frame, but one would have to do the whole traversal/transformation twice each stereo frame, and rendering time is the bottleneck anyway.

This is how the jogl backend renders stereo, but it sounds like this might cause synchronization problems - different scenes for both eyes. The best way would be to render both eyes in the same traversal.
> Stereo for renderman should be straightforward as well: just getting the camera transformations for both eyes and setting them instead of the usual projection matrix should do the job.

Charles already built stereo support into the renderman backend. I thought of some sort of interactive renderman viewer, so that you can switch to some DisplayRenderman Viewer in the viewerApp. The viewer could be more or less the same as the software viewer once we separate rendering into one (or two) BufferedImages from displaying these images.
> Quad-buffered stereo means you render into two offscreen buffers and then flip both with their two onscreen counterparts?

Right - this is how one should render stereo if possible. It allows showing stereo not only in fullscreen mode but also in a frame with a GUI.
Steffen.
> The best way would be to render both eyes in the same traversal.

True. Better and easier to do. I will look into that.
> Charles already built stereo support into the renderman backend. I thought of some sort of interactive renderman viewer, so that you can switch to some DisplayRenderman Viewer in the viewerApp. The viewer could be more or less the same as the software viewer once we separate rendering into one (or two) BufferedImages from displaying these images.

Good. An interactive renderman viewer is of course not feasible, so it would only be simple stereo image viewing. Stereo or not: a renderman view in the viewerApp would be nice. Aqsis has an XML- and socket-based mechanism to feed the preview windows for its render tasks. So it should be possible, and not too difficult, to write a Java client for that, which could receive the finished tiles and display them in the viewerApp window. This would be Aqsis-specific though.
Tim