Please change POINT_ATTENUATION_SIZE to false!

STRESS
Posts: 141
Joined: Mon 19. Jan 2009, 12:10

Please change POINT_ATTENUATION_SIZE to false!

Post by STRESS » Fri 12. Mar 2010, 18:11

Would it please, please be possible to change the DEFAULT of that attribute to false? At the moment it seems to always be true unless you explicitly call Appearance.setAttributes(CommonAttributes.POINT_ATTENUATION_SIZE, false). GL_POINT_ATTENUATION is a known source of horribly wrong or inconsistent behaviour across the different IHVs and drivers.

Also, it took me hours to find out that this was overwriting my point size values set in DefaultPointShader.setPointSize(), which was driving me nuts.
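
For anyone hitting the same problem, here is a minimal sketch of switching the attribute off on the component that holds the points, so an explicitly set point size is actually used. It assumes the Appearance.setAttribute API and the constant names as referred to in this thread (POINT_ATTENUATION_SIZE, and POINT_SIZE as the key the default point shader reads); the exact field names and the helper method are illustrative and may differ in your jReality version.

Code:

import de.jreality.scene.Appearance;
import de.jreality.scene.SceneGraphComponent;
import de.jreality.shader.CommonAttributes;

public class PointAttenuationExample {
    // Attach an appearance to the component that carries the point set and
    // turn the attenuation off there, so the point size below is respected.
    static SceneGraphComponent makePointsComponent() {
        SceneGraphComponent pointsComponent = new SceneGraphComponent();
        Appearance appearance = new Appearance();
        // Constant name as used in this thread; check CommonAttributes in your version.
        appearance.setAttribute(CommonAttributes.POINT_ATTENUATION_SIZE, false);
        // Assumed key for the point size read by the default point shader.
        appearance.setAttribute(CommonAttributes.POINT_SIZE, 3.0);
        pointsComponent.setAppearance(appearance);
        return pointsComponent;
    }
}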

Thank you!

gunn
Posts: 323
Joined: Thu 14. Dec 2006, 09:56
Location: TU Berlin

Re: Please change POINT_ATTENUATION_SIZE to false!

Post by gunn » Fri 12. Mar 2010, 20:20

Your request is understandable. However, changing this default at this stage will have undesirable side-effects on existing code, which expects the default to be true. For example, the use of sprites for sphere drawing (drawSpheres = false) depends on point attenuation being active, and that is a central feature of the jReality default point shader. I don't see that we're prepared to reverse that default at this stage of development.

Would it be possible for your applications to set the value to false whenever you create a viewer (by getting the scene root and writing into its appearance)? At least for the short term, and perhaps for the long term, that might be the best solution.
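
In code, that workaround could look roughly like the sketch below. It assumes Viewer.getSceneRoot(), SceneGraphComponent.getAppearance()/setAppearance(), Appearance.setAttribute(), and the constant name POINT_ATTENUATION_SIZE as used in this thread; the method name is only illustrative.

Code:

import de.jreality.scene.Appearance;
import de.jreality.scene.SceneGraphComponent;
import de.jreality.scene.Viewer;
import de.jreality.shader.CommonAttributes;

public class ViewerSetup {
    // Write the attribute into the scene root's appearance right after the
    // viewer is created, so every point shader below the root inherits false.
    public static void disablePointAttenuation(Viewer viewer) {
        SceneGraphComponent root = viewer.getSceneRoot();
        Appearance rootAppearance = root.getAppearance();
        if (rootAppearance == null) {
            rootAppearance = new Appearance();
            root.setAppearance(rootAppearance);
        }
        // Constant name as used in this thread; check CommonAttributes in your version.
        rootAppearance.setAttribute(CommonAttributes.POINT_ATTENUATION_SIZE, false);
    }
}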
jReality core developer

STRESS
Posts: 141
Joined: Mon 19. Jan 2009, 12:10

Re: Please change POINT_ATTENUATION_SIZE to false!

Post by STRESS » Mon 15. Mar 2010, 11:16

gunn wrote:
Your request is understandable. However, changing this default at this stage will have undesirable side-effects on existing code, which expects the default to be true. For example, the use of sprites for sphere drawing (drawSpheres = false) depends on point attenuation being active, and that is a central feature of the jReality default point shader.
Hmm, I understand; that is a problem of sorts, and breaking behaviour is never a good thing.

But relying on point sprites is already a problem in itself, since they tend not to work very reliably across a variety of hardware platforms either. In this case I can tell you that the POINT_ATTENUATION doesn't work on probably most Intel IGPs, which account for more than 50% of all installed graphics cards globally.
gunn wrote:
Would it be possible for your applications to set the value to false whenever you create a viewer (by getting the scene root and writing into its appearance)? At least for the short term, and perhaps for the long term, that might be the best solution.
Since I am now aware of this problem, I have made some minor changes to my installed jReality codebase and also set the attribute to FALSE wherever I use points. I just thought other people might stumble across this problem sooner or later.
