[osg-users] Help: does the time difference between two successive update callback of one osg::node imply FPS?

2012-10-15 Thread wang shuiying

Hello,

I have a very naive question. But I really need to make sure about it.

 In one osg::Viewer thread, if the time difference between two 
successive update callbacks (or draw callbacks) of one osg::Node is 
T seconds, can I then say that the corresponding frame rate (FPS) is 1/T?



Thank you in advance!

Shuiying
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org


Re: [osg-users] Help:why does renderbin affect the display of textures ?

2012-10-08 Thread wang shuiying

Is nobody interested in this question? :-(

Hello,

I have two drawable nodes A and B under the same main group node. I
mapped a different texture onto each drawable. The two drawables do not
overlap in any screen pixels.
Drawable A is built with the OpenSceneGraph library, while drawable B
is written in raw OpenGL calls. Depth testing is disabled for drawable A
and enabled for drawable B. Texturing on A uses a shader while B
does not.

When the render bin of drawable A is set to -1 and that of drawable B
is 1, the two textures cannot be shown at the same time. That is to say,
when I toggle on both drawable A and drawable B and make them both visible
on the screen, the texture on drawable B doesn't show, although the color of
B is correct. Under these circumstances, when I manipulate the view in a
way such that drawable A moves out of sight, the texture on drawable B
appears.

If I swap the render bins of the two drawables, then both textures
can show at the same time.

I just cannot figure out why this happens.

 If it is due to the depth test, why does the conflict happen at all, since
the drawables share no screen pixels? And even if it is due to the depth
test, why is B's color still correct?

Would someone please be kind enough to explain this to me?

Thanks a million in advance!

Best regards
Shuiying






[osg-users] Help:How many frames at most can be achieved within OSG framework?

2012-10-08 Thread wang shuiying

Hello,

I am wondering what maximum frame rate can be achieved when a scene is 
rendered within the OSG framework. Is it limited by GPU performance, by 
the operating system, or by the OSG framework itself?


Thank you very much in advance!


Best regards

Shuiying



[osg-users] Help:why does renderbin affect the display of textures ?

2012-10-05 Thread wang shuiying

Hello,

I have two drawable nodes A and B under the same main group node. I 
mapped a different texture onto each drawable. The two drawables do not 
overlap in any screen pixels. 
Drawable A is built with the OpenSceneGraph library, while drawable B 
is written in raw OpenGL calls. Depth testing is disabled for drawable A 
and enabled for drawable B. Texturing on A uses a shader while B 
does not.


When the render bin of drawable A is set to -1 and that of drawable B 
is 1, the two textures cannot be shown at the same time. That is to say, 
when I toggle on both drawable A and drawable B and make them both visible 
on the screen, the texture on drawable B doesn't show, although the color of 
B is correct. Under these circumstances, when I manipulate the view in a 
way such that drawable A moves out of sight, the texture on drawable B 
appears.


If I swap the render bins of the two drawables, then both textures 
can show at the same time.


I just cannot figure out why this happens.

 If it is due to the depth test, why does the conflict happen at all, since 
the drawables share no screen pixels? And even if it is due to the depth 
test, why is B's color still correct?


Would someone please be kind enough to explain this to me?

Thanks a million in advance!

Best regards
Shuiying



[osg-users] Help: is prerender camera regarded as slave camera during rendering traversal?

2012-08-30 Thread wang shuiying

Hello everyone,

I would like to know where a pre-render camera is rendered in OSG.

Is a pre-render camera treated as a slave camera during the rendering traversal 
of the viewer?


Is there a way to render a pre-render camera directly at any time I want, 
i.e. not locked to a cadence of 60 frames per second?



Thank you in advance!

Best regards

Shuiying


[osg-users] Help: Is there a way to hide a node from a camera?

2012-07-02 Thread wang shuiying

Hello,

The scene graph structure is as follows:

Nodes B and C are children of transform node A. Node A is attached to 
two camera nodes, CamA and CamB (pre-render).


I want CamA to render both B and C, but CamB to render only B. Is there a 
way to achieve that without using a fragment shader with if/discard?



Thank you very much in advance.

Best wishes

Shuiying


Re: [osg-users] osg-users Digest, Vol 61, Issue 2

2012-07-02 Thread wang shuiying

Hello, Chris

NodeMask works well. Thank you very much!



cheers

Shuiying

On 07/02/2012 09:03 PM, osg-users-requ...@lists.openscenegraph.org wrote:
Message: 15 Date: Mon, 2 Jul 2012 11:06:01 -0600 From: Chris Hanson 
xe...@alphapixel.com To: OpenSceneGraph Users 
osg-users@lists.openscenegraph.org Subject: Re: [osg-users] Help: Is 
there a way to hide a node from a camera? Message-ID: 
cagoufmyryvhpqfuxbxvsgq1h1sesuedw4cdribrunnytgvt...@mail.gmail.com 
Content-Type: text/plain; charset=windows-1252 On Mon, Jul 2, 2012 
at 10:51 AM, wang shuiying shuiying.w...@fu-berlin.de wrote:

  Hello,

  The scene graph structure is as following:
  Nodes B and C are children of Transform Node A. Node A is attached to two
  camera nodes CamA and CamB(pre render).
  I want CamA to be able to render B and C, but CamB only B.  Is there a way
  to achieve that without using fragment shader with  if discard ?


   This is what NodeMasks are for.




  
http://www.openscenegraph.org/projects/osg/wiki/Support/Tutorials/NodeMaskDemo

or if that's still not responding, try the Google cache version:

http://webcache.googleusercontent.com/search?q=cache:IpBLdKB5Zq4J:www.openscenegraph.org/projects/osg/wiki/Support/Tutorials/NodeMaskDemo+cd=1hl=enct=clnkgl=us

-- Chris 'Xenon' Hanson, omo sanza lettere. xe...@alphapixel.com 
http://www.alphapixel.com/ Training / Consulting / Contracting: 3D, 
Scene Graphs (OpenSceneGraph/OSG), OpenGL 2/3/4, GLSL, OpenGL ES 1/2, 
OpenCL, Digital Imaging, GIS, GPS, Telemetry, Cryptography, Digital 
Audio, LIDAR, Kinect, Embedded, Mobile, iPhone/iPad/iOS, Android





[osg-users] Help:Problem with LineSegmentIntersector. Maybe a bug?

2012-05-31 Thread wang shuiying

Hello,

In osgUtil::LineSegmentIntersector there is a function enter() which 
invokes intersects(node.getBound()). That serves to check whether 
the line segment intersects the bounding sphere of the 
node.


However, if I put the LineSegmentIntersector into an IntersectionVisitor and 
push a view matrix onto the IntersectionVisitor, the line segment and the 
traversed node are no longer defined in the same coordinate system; the 
transformation between them is the view matrix. Then a problem occurs: 
the intersection test can fail at the enter() stage, 
because intersects(node.getBound()) treats the line segment and the 
traversed node as if they were in the same coordinate 
system.


In my program the node is defined in global coordinates, but its 
drawables are far away from the origin. The line segments are defined in 
another coordinate system, starting at (0,0,0) with a length of 200. 
Only if I also place a drawable at the origin of the global system are the 
drawables far from the origin detected by the line segment; otherwise 
nothing is detected, even though it should be.


Is there something wrong with my application, or is this a bug in 
LineSegmentIntersector? I use OSG version 3.0.0-2ubuntu1.


Thank you very much in advance!

Best regards

Shuiying


Re: [osg-users] Help: intersectionVisitor in a separate working thread cannot work when it is to visit a view node

2012-05-29 Thread wang shuiying

Hello, Robert

Sorry for the late response. I still cannot figure out why the 
thread calling viewNode->accept(intersectionVisitor) simply stops there 
without going any further. I took a look at the source code of the 
accept() function as well as the viewer's frame() function, but I 
didn't find a mutex or anything similar that would protect the node from 
being visited by two node visitors at the same time. In my application, 
the IntersectionVisitor does not change the scene data; it only gathers 
intersections. I am doing this because I need the IntersectionVisitor 
to update its matrix and perform intersections at a higher frequency than the 
main frame rate of the scene-data traversal. Is there any 
way to make this work?



Thank you very much in advance!

Best regards

Shuiying

On 04/01/2012 09:05 PM, osg-users-requ...@lists.openscenegraph.org wrote:

Message: 11
Date: Sun, 1 Apr 2012 18:44:28 +0100
From: Robert Osfieldrobert.osfi...@gmail.com
To: OpenSceneGraph Usersosg-users@lists.openscenegraph.org
Subject: Re: [osg-users] Help: intersectionVisitor in a separate
working thread cannot work when it is to visit a view node
Message-ID:
cafn7y+xyjhv90c26hcrng9avdkh0gdujyuo9k6+oh+jjobj...@mail.gmail.com
Content-Type: text/plain; charset=ISO-8859-1

Hi Shuiying,

It's not safe to read from a data structure that is written to by
another thread.  If your scene graph is not changing then it'll be safe
to traverse multi-threaded without problem, but if it's changing you'll
need to serialize the reading from the writing threads in some manner.
The normal frame loop that the OSG provides, with the update and event
traversals, does its updates single-threaded and before the
cull and draw traversals start.

Robert.

On 1 April 2012 09:46, wang shuiying shuiying.w...@fu-berlin.de wrote:

  Hello,

  In my programme, I have among others a view node and a thread. The
  geometries under the view node changes every frame. The thread controls the
  frequency at which an IntersectionVisitor visits the view node.

  However it turns out it doesn't work. When the intersectionVisitor is ready
  to visit the view node, it stops there, without going further.

  If the visit of the intersectionVisitor is made to be a part of the
  updatecallback of view node, then it works.

  I wonder if the intersectionVisitor can work separately from the
  updatecallback of the node that it visits


  Thank you very much in advance for any advice!

  Shuiying


Re: [osg-users] osg-users Digest, Vol 59, Issue 36

2012-05-28 Thread wang shuiying

Hello, Sergey

Thank you very much for your reply. It helps a lot!

Cheers,

Shuiying



On 05/28/2012 09:03 PM, osg-users-requ...@lists.openscenegraph.org wrote:

Message: 2
Date: Mon, 28 May 2012 14:02:43 +0400
From: Sergey Polischukpol...@yandex.ru
To: OpenSceneGraph Usersosg-users@lists.openscenegraph.org
Subject: Re: [osg-users] Help:Mapping between vertices in eye
coordinate and  pixels on rendered image
Message-ID:144081338199...@web14h.yandex.ru
Content-Type: text/plain; charset=koi8-r

Accidentally sent an incomplete message :(

Let's assume your eye coord is osg::Vec4 eyeCoord:

eyeCoord = eyeCoord * osgProjectionMatrix; // get the projection matrix from the camera
if (eyeCoord.w() < 0)
{
  // vertex behind camera
}
else
{
  eyeCoord /= eyeCoord.w();
  eyeCoord.x() *= 0.5;
  eyeCoord.y() *= 0.5;
  eyeCoord.x() += 0.5;
  eyeCoord.y() += 0.5;
  eyeCoord.x() *= width;
  eyeCoord.y() *= height;
}

Now eyeCoord.xy contains screen coordinates in pixels.

Cheers,
Sergey

28.05.2012, 14:00, Sergey Polischuk pol...@yandex.ru:

  Hi, Shuiying

  Let's assume your eye coord is osg::Vec4 eyeCoord:

  eyeCoord = eyeCoord * osgProjectionMatrix;
  if (eyeCoord.w() < 0)
  {
    // vertex behind camera
  }
  else
  {
    eyeCoord /= eyeCoord.w();
  }

26.05.2012, 19:58, wang shuiying shuiying.w...@fu-berlin.de:


  Hallo,

  I simulate a real camera with an OSG camera node, and for debugging I have
  some problem with the pixel location. I would like to know whether the
  following statement is correct or false:

  If the coordinate of a vertex in eye coordinates is (xe, ye, ze), and we assume
  that xe > 0, ye < 0, ze < 0, namely the vertex should appear near the bottom
  right corner of the image (origin of the image is top left, x runs to the
  right, y runs downward),

  then can its corresponding pixel location (px, py) on the image be
  calculated as follows? (The x resolution of the image is xRes, the y
  resolution is yRes, the horizontal field-of-view angle of the camera is ViewYaw,
  the vertical field-of-view angle is ViewPitch, and the near plane is 1.)

  px = xe/abs(ze) * (xRes/2/tan(ViewYaw/2)) + xRes/2

  py = abs(ye)/abs(ze) * (yRes/2/tan(ViewPitch/2)) + yRes/2

  I get the coordinate of the vertex in eye coordinates as the product of the
  vertex's global coordinate and camera->getViewMatrix(), and then calculate the
  corresponding pixel location with the formulas above. But the location does
  not match what appears on the image; the error is about 50 pixels in the
  x coordinate.

  Can anybody give me some tips on that?

  Thank you very much in advance!

  Best regards

  Shuiying



[osg-users] Help:Mapping between vertices in eye coordinate and pixels on rendered image

2012-05-26 Thread wang shuiying

Hallo,

I simulate a real camera with an OSG camera node, and for debugging I have 
some problem with the pixel location. I would like to know whether the 
following statement is correct or false:


If the coordinate of a vertex in eye coordinates is (xe, ye, ze), and we assume 
that xe > 0, ye < 0, ze < 0, namely the vertex should appear near the bottom 
right corner of the image (origin of the image is top left, x runs to the 
right, y runs downward),


then can its corresponding pixel location (px, py) on the image be 
calculated as follows? (The x resolution of the image is xRes, the y 
resolution is yRes, the horizontal field-of-view angle of the camera is 
ViewYaw, the vertical field-of-view angle is ViewPitch, and the near plane is 1.)


px = xe/abs(ze) * (xRes/2/tan(ViewYaw/2)) + xRes/2

py = abs(ye)/abs(ze) * (yRes/2/tan(ViewPitch/2)) + yRes/2


I get the coordinate of the vertex in eye coordinates as the product of the 
vertex's global coordinate and camera->getViewMatrix(), and then calculate the 
corresponding pixel location with the formulas above. But the location does 
not match what appears on the image; the error is about 50 pixels in the 
x coordinate.



Can anybody give me some tips on that?


Thank you very much in advance!


Best regards

Shuiying






[osg-users] Help: how does lightSource with referenceFrame of osg::LightSource::ABSOLUTE_RF work?

2012-05-18 Thread wang shuiying

Hello,

I have 2 questions about the lightSource node.

(1) Does a light source affect all the geode nodes in the scene data, or 
only the ones that are under it in the scene graph structure?
(2) In my application, a light source is attached to the main scene node, 
and this light source has an osg::Group node (NodeA) as its child which 
contains several 3D models. There is also a pre-render camera attached 
to the light source, and that pre-render camera also has NodeA as a child. 
When I set 
lightSource->setReferenceFrame(osg::LightSource::ABSOLUTE_RF), the color 
of the texture from the pre-render camera varies as I use the mouse to change 
the orientation of the master camera displaying the 3D view. I don't know 
what is happening here. In my opinion, since the position and orientation 
of the pre-render camera relative to the static 3D objects stay the 
same, the texture color should stay the same too.


When RELATIVE_RF is assigned, the texture stays the same no matter how I 
move the master camera.


Can anyone give me some hints on that?


Thanks a lot in advance!

Best regards

Shuiying


Re: [osg-users] Help: intersectionVisitor in a separate working thread cannot work when it is to visit a view node

2012-04-07 Thread wang shuiying

Hello, Robert

As you said, it is not safe to read from a data structure that is written to by 
another thread. But if something goes wrong, the program usually says something 
at runtime. In my case, however, the thread just stops somewhere during the 
execution of IntersectionVisitor::apply(sceneNode).

In my program, the osg::Viewer works as a Qt widget of a Qt application. A QTimer 
controls the frequency at which the viewer calls frame(). The 
IntersectionVisitor is accepted by the scene node (viewer) in 
another thread. When I set the timer interval to 0, the viewer renders 
normally, whereas the IntersectionVisitor stops somewhere during the 
execution of IntersectionVisitor::apply(sceneNode) and the thread doesn't 
go any further. So I thought maybe the Qt application holds the 
data structure all the time. I then changed the timer interval to 5000 (i.e. 5 
seconds), thinking that the data structure might then not be held 
all the time by the Qt application, which might give the thread a chance. 
However, the result turns out to be the same as with a timer interval of 
0. The only difference is that the viewer renders at a much lower frame rate.

I just don't know why the thread stops there. There is no mutual exclusion like a 
QMutex between the traversal of the scene node and the helper functions of 
Qt. So how does the IntersectionVisitor stop functioning?

I use the binary OSG packages, so I cannot step through the OSG source code in a debugger.

I am really confused.

Thank you for any hints on that.



Best regards and happy Easter day.

Shuiying


Message: 11
Date: Sun, 1 Apr 2012 18:44:28 +0100
From: Robert Osfieldrobert.osfi...@gmail.com
To: OpenSceneGraph Usersosg-users@lists.openscenegraph.org
Subject: Re: [osg-users] Help: intersectionVisitor in a separate
working thread cannot work when it is to visit a view node
Message-ID:
cafn7y+xyjhv90c26hcrng9avdkh0gdujyuo9k6+oh+jjobj...@mail.gmail.com
Content-Type: text/plain; charset=ISO-8859-1

Hi Shuiying,

It's not safe to read from a data structure that is written to by
another thread.  If your scene graph is not changing then it'll be safe
to traverse multi-threaded without problem, but if it's changing you'll
need to serialize the reading from the writing threads in some manner.
The normal frame loop that the OSG provides, with the update and event
traversals, does its updates single-threaded and before the
cull and draw traversals start.

Robert.

On 1 April 2012 09:46, wang shuiying shuiying.w...@fu-berlin.de wrote:


Hello,

In my programme, I have among others a view node and a thread. The
geometries under the view node changes every frame. The thread controls the
frequency at which an IntersectionVisitor visits the view node.

However it turns out it doesn't work. When the intersectionVisitor is ready
to visit the view node, it stops there, without going further.

If the visit of the intersectionVisitor is made to be a part of the
updatecallback of view node, then it works.

I wonder if the intersectionVisitor can work separately from the
updatecallback of the node that it visits


Thank you very much in advance for any advice!

Shuiying


[osg-users] Help: can cameras in different graphics contexts visit a changing view node in a multi-threaded way?

2012-04-05 Thread wang shuiying

Hello,

In my application there are several pre-render cameras and a main 
camera. In order not to slow down the frame rate of the main camera, I 
would like to try to move the pre-render cameras out of the graphics 
context of the main camera and into another graphics context. The view 
node that all the cameras render changes all the time. So I wonder 
whether cameras in different graphics contexts (i.e. on different GPUs) can 
render a changing view node in a multi-threaded (or multi-graphics-context) way, 
while the updates of the view node happen only in the main camera's context?



Thank you very much for any advice in advance!

Shuiying


[osg-users] Help: intersectionVisitor in a separate working thread cannot work when it is to visit a view node

2012-04-01 Thread wang shuiying

Hello,

In my program I have, among other things, a view node and a thread. The 
geometries under the view node change every frame. The thread controls 
the frequency at which an IntersectionVisitor visits the view node.


However, it turns out this doesn't work. When the IntersectionVisitor is 
ready to visit the view node, it stops there, without going further.


If the visit of the IntersectionVisitor is made a part of the 
update callback of the view node, then it works.


I wonder whether the IntersectionVisitor can work separately from the 
update callback of the node that it visits.



Thank you very much in advance for any advice!

Shuiying


[osg-users] Help: problems of intersectorVisitor with Qtmutex

2012-02-22 Thread wang shuiying


Hello,

My program has an osg::Viewer as a view widget of a Qt window. An 
IntersectionVisitor object is initialised at the start of the program. 
The osg::Viewer updates its child nodes as a normal viewer does, and 
updates the transform matrix of the IntersectionVisitor at the same time. I 
use another thread to call node->accept(intersectorVisitor), 
where the node is the root node of the osg::Viewer. In case the two 
processes conflict with each other, I lock with a QMutex both when the 
osg::Viewer updates and when node->accept(intersectorVisitor) is 
called in the other thread. But the result turns out to be wrong. When I 
let the two processes run freely without locking them, the result 
is right.


I am wondering what happens here. That is to say, what does the 
IntersectionVisitor do when it itself is locked, or when the nodes that it 
visits are locked?


Thank you very much in advance!

Shuiying




Re: [osg-users] Help: why node update rate so low? update interval=300ms?

2012-02-06 Thread wang shuiying

Hello, Sebastian

I have found something new. After I remove the pre-render cameras (which have 
customized shaders), the FPS becomes more or less 60, even with the models.


So I guess the problem lies in these cameras.

When I make the shaders do nothing, the FPS doesn't change: still 3 FPS.
When I disable those pre-render cameras, the FPS goes up to a normal 
level, i.e. 60.


There are altogether 36+22+1 pre-render cameras. So perhaps it is 
because the cameras render one by one that the FPS gets so low.



I wonder if there is a way to make those pre-render cameras render 
in parallel?


Thank you very much for any advice!

Cheers

Shuiying







On 02/05/2012 12:11 AM, wang shuiying wrote:

Hello, Sebastian

Of course, I can share my models. They are all from free model website.

attached are one house,7M, and 1 car.

after I change the timer interval to 1, the time interval for the osg
update drops to 170 ms; better than before, but still bad performance.

Thank you very much for your advice!

Shuiying


On 02/04/2012 11:33 PM, Sebastian Messerschmidt wrote:

Hello Shuiying,


Hello, Sebastian

my model is simple 3D model like car (1MB), house(2~7MB).

That still says sort of nothing about the model ...


This application is a car simulator within the framework of OROCOS,
which is a multi-task real time toolkit. OSG is used to display the
simulator.

I took a look at the root code just now (I only here take over
the half done job of someone else:-( ).

it shows that  there is:

a Qlist of   View3D, which is derived from  QOsgWidget and  osgViewer.

whenever there is a new subscenegraph (osg::group node), it will be
added to a view3D(because it has a osg::group as a member);

at the same time  the  cameraNode of this subscenegraph will be
collected by a cameraCollector; (so that all camera will be
controlled  to act coherent)

Then this view3D(as a QOsgWidget) will be added to   QMainFrame.

I think this following  line of source code may help explain the
problem:

connect(&mTimer, SIGNAL(timeout()), this, SLOT(updateGL()));
mTimer.setInterval(30);
mTimer.start();

So the scene update rate is not controlled by GPU or something, it
is controlled by QT  timer, that is disappointing.:-(

You're right.
The problem is that timer will fire every 30ms which should give you
around 33Hz maximum (effectively it will fire around twice the time
your OS schedulers minimum thread time).
Anyways, for the maximum framerate try setInterval 0 or 1.


Am I right here?

So is there a way to get higher FPS based on this condition?

That depends. You reported 300ms per update, which is 3 FPS, so
increasing the frame rate is not purely based on your timer.
First of all: Is your project running in debug or release mode? If it
is debug I strongly suggest measuring any timings in release only, as
the debug overhead is huge.
If you're already running release the problem is most likely in your
models. There is just no general suggestion how to improve rendering
time, as it mainly depends on the scenegraph's structure.
Maybe running an optimizer on the model will fix some of the major
problems.
In order to see if there is a framework induced penalty you should
try to load a model with the osgviewer and check the framerate there
(just press 's' to see the framerate and other statistics). If the
framerate there is much higher than in your application, there most
likely is a problem in your application. If not, your model is quite
heavy.
If you wont mind sharing one of the models (you can also send it to
my private address if you don't want to make it public) I can see
what are my results here with my renderer.

cheers
Sebastian


Thanks a lot in advance!

Shuiying





On 02/04/2012 09:47 PM, Sebastian Messerschmidt wrote:

Hello Shuiying

You didn't tell us what kind of models. If I would add 51 of my
terrain models i'd be happy to get 3 frames ;-)
Also you didn't tell us what your application is ... more or less
only a osgviewer?
In case you provide a little more detail you probably get some
answers/help

cheer
Sebastian

Hello,

This is  somehow frustrating, if the rendering rate is the same
with update rate.

I check the update interval of the sceneNode in my application, it
turns out to be 300ms.

I only attach 51 3D models in my application.

when there is only 5 3D models, the interval is 70ms.

I just wonder whether it is normal or not?

And update rate is the same with rendering rate?

Thank  you very much for any advice!

shuiying




[osg-users] Help: why node update rate so low? update interval=300ms?

2012-02-04 Thread wang shuiying

Hello,

It is somewhat frustrating if the rendering rate is the same as the 
update rate.


I checked the update interval of the scene node in my application; it turns 
out to be 300 ms.


I attach only 51 3D models in my application.

When there are only 5 3D models, the interval is 70 ms.

I just wonder whether this is normal or not?

And is the update rate the same as the rendering rate?

Thank  you very much for any advice!

shuiying




Re: [osg-users] Help: why node update rate so low? update interval=300ms?

2012-02-04 Thread wang shuiying

Hello, Sebastian

my models are simple 3D models such as a car (1 MB) and houses (2-7 MB).

This application is a car simulator within the framework of OROCOS, 
which is a multi-task real-time toolkit. OSG is used to display the 
simulator.


I took a look at the source code just now (I have only taken over the 
half-done job of someone else :-( ).


It shows that there is:

a QList of View3D, which is derived from QOsgWidget and osgViewer.

Whenever there is a new sub-scenegraph (an osg::Group node), it is 
added to a View3D (because View3D has an osg::Group as a member);


at the same time the camera node of this sub-scenegraph is 
collected by a camera collector (so that all cameras can be controlled 
to act coherently).


Then this View3D (as a QOsgWidget) is added to the QMainFrame.

I think the following lines of source code may help explain the problem:

connect(&mTimer, SIGNAL(timeout()), this, SLOT(updateGL()));
mTimer.setInterval(30);
mTimer.start();

So the scene update rate is not controlled by the GPU or anything like that; it is 
controlled by a Qt timer. That is disappointing. :-(


Am I right here?

So is there a way to get a higher FPS under these conditions?

Thanks a lot in advance!

Shuiying





On 02/04/2012 09:47 PM, Sebastian Messerschmidt wrote:

Hello Shuiying

You didn't tell us what kind of models. If I would add 51 of my 
terrain models i'd be happy to get 3 frames ;-)
Also you didn't tell us what your application is ... more or less only 
a osgviewer?
In case you provide a little more detail you probably get some 
answers/help


cheer
Sebastian

Hello,

This is somewhat frustrating, if the rendering rate is the same as the 
update rate.


I checked the update interval of the scene node in my application; it 
turns out to be 300 ms.


I only attach 51 3D models in my application.

When there are only 5 3D models, the interval is 70 ms.

I just wonder whether this is normal or not.

And is the update rate the same as the rendering rate?

Thank  you very much for any advice!

shuiying




[osg-users] Help: many cameras are rendering at the same time or one by one?

2012-01-30 Thread wang shuiying

Hello,

My question might be silly, but I really want to know about that.

I wonder: if there are many cameras (one main camera + many pre-render 
cameras), will OSG (or the graphics hardware) make these cameras render 
one by one, or at the same time?


I have a program that relies on GPU parallelism, and I include many 
pre-render cameras in it. So I would like to know how these cameras are 
managed.


Thank you in advance!

Shuiying




[osg-users] Help: how to getmore than four elements via fragment shader?

2012-01-29 Thread wang shuiying

Hello,

I want to record more than four elements by writing to out variables in 
the fragment shader.

I notice that an FBO can connect to many buffers, so I define two images 
which are attached to a color buffer and a depth buffer respectively.


The OSG source code is as follows:

osg::Image *imageDepth = new osg::Image();
imageDepth->allocateImage((int)subXRes, (int)subYRes, 1, GL_RGBA,
    GL_FLOAT);

camera->attach(osg::CameraNode::DEPTH_BUFFER, imageDepth);

osg::Image *image = new osg::Image();
image->allocateImage((int)subXRes, (int)subYRes, 1, GL_RGBA, GL_FLOAT);
camera->attach(osg::CameraNode::COLOR_BUFFER, image);

fragment shader is as follows:

  gl_FragColor = vec4(1,0,0,1);

gl_FragDepth=0.3;

But it doesn't work; the application output is:
Warning: detected OpenGL error 'invalid operation' at after 
RenderBin::draw(..)


Then I changed the internal format of imageDepth to GL_DEPTH. It still 
doesn't work, and the system says the framebuffer is attached to an 
empty image.


So how can I get more than four elements via fragment shader?

Thank you very much in advance!

Shuiying





Re: [osg-users] Help: wrong ouput from frag shader

2012-01-19 Thread wang shuiying

Hi, J.P.

Thank you very much, it works under your suggestion!

Best regards
Shuiying



On 01/19/2012 07:39 AM, J.P. Delport wrote:

Hi,

On 18/01/2012 20:24, wang shuiying wrote:

Hi, J.P.

the camera isn't rendering to a texture.
Yes it is. It is just made for you automatically. FBO renders to a 
texture and the image is read from this texture when you attach it to 
the camera.


I suggest attaching a texture (to the same buffer component as the 
image), with the exact formats you require to the camera as well. E.g.


_OutputTexture = new osg::TextureRectangle;
_OutputTexture->setTextureSize(_TextureWidth, _TextureHeight);
_OutputTexture->setInternalFormat(GL_RGBA32F_ARB);
_OutputTexture->setSourceFormat(GL_RGBA);
_OutputTexture->setSourceType(GL_FLOAT);

camera->attach(osg::CameraNode::COLOR_BUFFER, _OutputTexture.get());
camera->attach(osg::CameraNode::COLOR_BUFFER, image);

rgds
jp


In my programme, camera is set
to be prerender, render target is FRAME_BUFFER_OBJECT, I attach the
camera to an image, in order to get the bufferdata via image. the image
data type is allocated as GL_FLOAT.

Related setup details are listed below.

// renderTarget
osg::Camera::RenderTargetImplementation renderTargetImplementation;
renderTargetImplementation = osg::CameraNode::FRAME_BUFFER_OBJECT;


// shader program
osg::Program * program = new osg::Program();
program->addShader(osg::Shader::readShaderFile(osg::Shader::FRAGMENT,
    fragmentShaderPath));
program->addShader(osg::Shader::readShaderFile(osg::Shader::VERTEX,
    vertexShaderPath));


// image
osg::Image * image = new osg::Image();
image->allocateImage((int)XRes, (int)YRes, 1, GL_RGBA, GL_FLOAT);

//camera
osg::ref_ptr<osg::CameraNode> camera(new osg::CameraNode());
camera->setProjectionMatrixAsFrustum(-tan(YawView * math::D2R * 0.5),
    tan(YawView * math::D2R * 0.5), -tan(PitchView * math::D2R * 0.5),
    tan(PitchView * math::D2R * 0.5), zNear, zFar);
camera->setComputeNearFarMode(osg::CullSettings::DO_NOT_COMPUTE_NEAR_FAR);

camera->setViewport(0, 0, XRes, YRes);
camera->setClearColor(osg::Vec4(1000.0f, 1000.0f, 1000.0f, 1000.0f));
camera->setClearMask(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
camera->setReferenceFrame(osg::Transform::ABSOLUTE_RF);
camera->setViewMatrix(osg::Matrix::lookAt(osg::Vec3d(0, 0, 1.0f),
    osg::Vec3d(-10.0f, 0.0f, 0), osg::Vec3d(0, 0, 1)));
camera->setRenderOrder(osg::CameraNode::PRE_RENDER);
camera->setRenderTargetImplementation(renderTargetImplementation);
camera->attach(osg::CameraNode::COLOR_BUFFER, image);


//read out data
osg::Vec4f * rgbaData = (osg::Vec4f *) (image->data());
for (int h = 0; h < yRes; h++) {
    for (int w = 0; w < xRes; w++) {
        osg::Vec4f cur = rgbaData[xRes * h + w];
        std::cout << "cur[0]=" << cur[0] << ",cur[1]=" << cur[1]
                  << ",cur[2]=" << cur[2] << ",cur[3]=" << cur[3] << std::endl;
    }
}

Thank you for any advice!

Shuiying

Message: 3
Date: Wed, 18 Jan 2012 09:11:15 +0200
From: J.P. Delport <jpdelp...@csir.co.za>
To: OpenSceneGraph Users <osg-users@lists.openscenegraph.org>
Subject: Re: [osg-users] Help: wrong ouput from frag shader.
Message-ID: <4f167093.8090...@csir.co.za>
Content-Type: text/plain; charset=ISO-8859-1; format=flowed

Hi,

what format is the texture you are rendering to? If it is UNSIGNED CHAR
then your data would be discretised.

e.g. 45/255 = 0.176471

jp

On 17/01/2012 21:11, wang shuiying wrote:


Hello,

when I write gl_FragColor = vec4(0.174977,0,0,1) in Frag shader,
then I get through a related image: float Vec4: (0.176471,0,0,1)

so why 0.174977 changed into 0.176471?

I think there should be no process after the per-fragment operations
that can change this element in the rendering pipeline. Blend, 
alpha test

and so on are all disabled.

Thank you in advance!

Shuiying


Re: [osg-users] Help: wrong ouput from frag shader.

2012-01-18 Thread wang shuiying

Hello, Sebastian

Thank you for your reply!

In my program, the camera is set to pre-render and the render target is 
FRAME_BUFFER_OBJECT; I attach an image to the camera in order to get 
the buffer data via the image.


(1)  the fragment shader is as following:

out vec4 Frag_Color;
void main()
{
Frag_Color = vec4(a,b,c,d);
}

a, b, c, d can be replaced by any number between 0 and 1.

Examples of the inaccuracy of the data output are as follows:

set in Frag shader -> data read via image
0.5 -> 0.556863

0.174 -> 0.172549

0.17 -> 0.168627
0.99 -> 0.988235

Frag_Color = vec4(0.2,0.3,0.4,0.5) -> 
cur[0]=0.2, cur[1]=0.298039, cur[2]=0.4, cur[3]=0.498039
Frag_Color = vec4(0.1,0.6,0.7,0.8) -> 
cur[0]=0.0980392, cur[1]=0.6, cur[2]=0.698039, cur[3]=0.8
Frag_Color = vec4(0.9,0.66,0.77,0.88) -> 
cur[0]=0.898039, cur[1]=0.658824, cur[2]=0.768627, cur[3]=0.878431



(2)The camera setup details are listed below.

// renderTarget
osg::Camera::RenderTargetImplementation renderTargetImplementation;
 renderTargetImplementation = osg::CameraNode::FRAME_BUFFER_OBJECT;


// shader program
osg::Program * program = new osg::Program();
program->addShader(osg::Shader::readShaderFile(osg::Shader::FRAGMENT,
    fragmentShaderPath));
program->addShader(osg::Shader::readShaderFile(osg::Shader::VERTEX,
    vertexShaderPath));


// image
osg::Image * image = new osg::Image();
image->allocateImage((int)XRes, (int)YRes, 1, GL_RGBA, GL_FLOAT);

//camera
osg::ref_ptr<osg::CameraNode> camera(new osg::CameraNode());
camera->setProjectionMatrixAsFrustum(-tan(YawView * math::D2R * 0.5),
    tan(YawView * math::D2R * 0.5), -tan(PitchView * math::D2R * 0.5),
    tan(PitchView * math::D2R * 0.5), zNear, zFar);

camera->setComputeNearFarMode(osg::CullSettings::DO_NOT_COMPUTE_NEAR_FAR);
camera->setViewport(0, 0, XRes, YRes);
camera->setClearColor(osg::Vec4(1000.0f, 1000.0f, 1000.0f, 1000.0f));
camera->setClearMask(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
camera->setReferenceFrame(osg::Transform::ABSOLUTE_RF);
camera->setViewMatrix(osg::Matrix::lookAt(osg::Vec3d(0, 0, 1.0f),
    osg::Vec3d(-10.0f, 0.0f, 0), osg::Vec3d(0, 0, 1)));

camera->setRenderOrder(osg::CameraNode::PRE_RENDER);
camera->setRenderTargetImplementation(renderTargetImplementation);
camera->attach(osg::CameraNode::COLOR_BUFFER, image);


//read out data
osg::Vec4f * rgbaData = (osg::Vec4f *) (image->data());
for (int h = 0; h < yRes; h++) {
    for (int w = 0; w < xRes; w++) {
        osg::Vec4f cur = rgbaData[xRes * h + w];
        std::cout << "cur[0]=" << cur[0] << ",cur[1]=" << cur[1]
                  << ",cur[2]=" << cur[2] << ",cur[3]=" << cur[3] << std::endl;
    }
}

Thank you very much for any advice!

Shuiying


On 01/18/2012 08:11 AM, Sebastian Messerschmidt wrote:

I suppose you are using a normal framebuffer.
In this case it is pretty normal that your original value in the image 
(you unfortunately didn't tell us if the value comes from a texture) 
doesn't exactly match the value in the framebuffer. Usually the 
framebuffer has 8 bits per color channel, which makes the value 
0.174977 not representable.
If you need the exact value to be in the framebuffer, you'll have to 
render it to a floating-point buffer via RTT.
Also, you didn't tell us how you got the result value. There is also 
the possibility that you read an interpolated/filtered value.


If you provide a bit more context someone might be able to help you 
with your problem

Hello,

when I write  gl_FragColor = vec4(0.174977,0,0,1) in Frag shader,
 then I get through a related image: float Vec4: (0.176471,0,0,1)

so why   0.174977 changed into 0.176471?

I think there should be no process after the per-fragment 
operations that can change this element in the rendering pipeline. 
Blend, alpha test and so on are all disabled.


Thank you in advance!

Shuiying


Re: [osg-users] Help:what is openGl shader version 140?

2012-01-18 Thread wang shuiying

Hello Sebastian,

You are right. I have other errors in the shader when such warnings 
happened. After I correct them, it works normal!


Thank you very much for the explanation!

Shuiying

On 01/18/2012 08:04 AM, Sebastian Messerschmidt wrote:

Hello shuiying,

either your card doesn't support glsl version 1.40 or you might have 
some other error in the shader.
GLSL comes in different versions that are (AFAIK) defined by the GPU 
Architecture/Driver and Base OpenGL version.
I had such warning sometimes if there was another problem in the 
shader, so maybe post the entire shader here or on opengl.org.


cheers
Sebastian


Hello,

my shader compile stage throws out such warnings:

0(8) : warning C7532: global type sampler2DRect requires #version 
140 or later
0(8) : warning C: ... or #extension GL_ARB_texture_rectangle : 
enable


and Then I add

#version 140
#extension GL_ARB_texture_rectangle : enable

at the very head of the shader file, but the warnings remain.

So how can I check whether my program is able to use

#version 140
#extension GL_ARB_texture_rectangle : enable
?

Thank you very much in advance.


Re: [osg-users] Help: wrong ouput from frag shader.

2012-01-18 Thread wang shuiying

Hello, Sebastian

There is also another strange problem . When

Vertex shader is as follows:

varying out vec3 pos;
varying out float dist2;

void main()
{
    vec3 pos1 = (gl_ModelViewMatrix * (gl_Vertex -
        vec4(0.0,0.0,0.0,0.0))).xyz;

    dist2 = length(pos1);

    gl_Position = ftransform();
}


Frag shader is as follows:


varying in vec3 pos;
varying in float dist2;
out vec4 Frag_Color;

void main()
{
    Frag_Color = vec4(abs(dist2/100),0,0,1);
}

Then when Frag_Color is read via the image, the output is:

0,0,0,1

while in my program dist2 should equal 0.18 for some fragments, 0.56 
for others, and for still others be unavailable because no geometry 
covers those fragments.


But when the shader is written like this:

vertex shader:

varying out vec3 pos;

void main()
{
    pos = (gl_ModelViewMatrix * (gl_Vertex -
        vec4(0.0,0.0,0.0,0.0))).xyz;

    gl_Position = ftransform();
}


Fragment shader:


varying in vec3 pos;
out vec4 Frag_Color;

void main()
{
    float dist2 = length(pos);
    Frag_Color = vec4(abs(dist2/100),0,0,1);
}

Then when Frag_color is read via Image, the output is:


   0.188235,0,0,1
   0.560784,0,0,1

so this time it works!

I really cannot figure out why this happens!

Thank you very much in advance for any advice!

Shuiying

On 01/18/2012 08:11 AM, Sebastian Messerschmidt wrote:

I suppose you are using a normal framebuffer.
In this case it is pretty normal that your original value in the image 
(you unfortunately didn't tell us if the value comes from a texture) 
doesn't exactly match the value in the framebuffer. Usually the 
framebuffer has 8 bits per color channel, which makes the value 
0.174977 not representable.
If you need the exact value to be in the framebuffer, you'll have to 
render it to a floating-point buffer via RTT.
Also, you didn't tell us how you got the result value. There is also 
the possibility that you read an interpolated/filtered value.


If you provide a bit more context someone might be able to help you 
with your problem

Hello,

when I write  gl_FragColor = vec4(0.174977,0,0,1) in Frag shader,
 then I get through a related image: float Vec4: (0.176471,0,0,1)

so why   0.174977 changed into 0.176471?

I think there should be no process after the per-fragment 
operations that can change this element in the rendering pipeline. 
Blend, alpha test and so on are all disabled.


Thank you in advance!

Shuiying


Re: [osg-users] Help: wrong ouput from frag shader

2012-01-18 Thread wang shuiying

Hi, J.P.

The camera isn't rendering to a texture. In my program, the camera is 
set to pre-render and the render target is FRAME_BUFFER_OBJECT; I 
attach an image to the camera in order to get the buffer data via the 
image. The image data type is allocated as GL_FLOAT.


Related setup details are listed below.

// renderTarget
osg::Camera::RenderTargetImplementation renderTargetImplementation;
 renderTargetImplementation = osg::CameraNode::FRAME_BUFFER_OBJECT;


// shader program
osg::Program * program = new osg::Program();
program->addShader(osg::Shader::readShaderFile(osg::Shader::FRAGMENT,
    fragmentShaderPath));
program->addShader(osg::Shader::readShaderFile(osg::Shader::VERTEX,
    vertexShaderPath));


// image
osg::Image * image = new osg::Image();
image->allocateImage((int)XRes, (int)YRes, 1, GL_RGBA, GL_FLOAT);

//camera
osg::ref_ptr<osg::CameraNode> camera(new osg::CameraNode());
camera->setProjectionMatrixAsFrustum(-tan(YawView * math::D2R * 0.5),
    tan(YawView * math::D2R * 0.5), -tan(PitchView * math::D2R * 0.5),
    tan(PitchView * math::D2R * 0.5), zNear, zFar);

camera->setComputeNearFarMode(osg::CullSettings::DO_NOT_COMPUTE_NEAR_FAR);
camera->setViewport(0, 0, XRes, YRes);
camera->setClearColor(osg::Vec4(1000.0f, 1000.0f, 1000.0f, 1000.0f));
camera->setClearMask(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
camera->setReferenceFrame(osg::Transform::ABSOLUTE_RF);
camera->setViewMatrix(osg::Matrix::lookAt(osg::Vec3d(0, 0, 1.0f),
    osg::Vec3d(-10.0f, 0.0f, 0), osg::Vec3d(0, 0, 1)));

camera->setRenderOrder(osg::CameraNode::PRE_RENDER);
camera->setRenderTargetImplementation(renderTargetImplementation);
camera->attach(osg::CameraNode::COLOR_BUFFER, image);


//read out data
osg::Vec4f * rgbaData = (osg::Vec4f *) (image->data());
for (int h = 0; h < yRes; h++) {
    for (int w = 0; w < xRes; w++) {
        osg::Vec4f cur = rgbaData[xRes * h + w];
        std::cout << "cur[0]=" << cur[0] << ",cur[1]=" << cur[1]
                  << ",cur[2]=" << cur[2] << ",cur[3]=" << cur[3] << std::endl;
    }
}

Thank you  for any advice!

Shuiying

Message: 3
Date: Wed, 18 Jan 2012 09:11:15 +0200
From: J.P. Delport <jpdelp...@csir.co.za>
To: OpenSceneGraph Users <osg-users@lists.openscenegraph.org>
Subject: Re: [osg-users] Help: wrong ouput from frag shader.
Message-ID: <4f167093.8090...@csir.co.za>
Content-Type: text/plain; charset=ISO-8859-1; format=flowed

Hi,

what format is the texture you are rendering to? If it is UNSIGNED CHAR
then your data would be discretised.

e.g. 45/255 = 0.176471

jp

On 17/01/2012 21:11, wang shuiying wrote:


Hello,

when I write gl_FragColor = vec4(0.174977,0,0,1) in Frag shader,
then I get through a related image: float Vec4: (0.176471,0,0,1)

so why 0.174977 changed into 0.176471?

I think there should be no process after the per-fragment operations
that can change this element in the rendering pipeline. Blend, alpha test
and so on are all disabled.

Thank you in advance!

Shuiying


[osg-users] Help:what is openGl shader version 140?

2012-01-17 Thread wang shuiying

Hello,

my shader compile stage throws out such warnings:

0(8) : warning C7532: global type sampler2DRect requires #version 140 
or later

0(8) : warning C: ... or #extension GL_ARB_texture_rectangle : enable

and Then I add

#version 140
#extension GL_ARB_texture_rectangle : enable

at the very head of the shader file, but the warnings remain.

So how can I check whether my program is able to use

#version 140
#extension GL_ARB_texture_rectangle : enable
?

Thank you very much in advance.


[osg-users] Help: wrong ouput from frag shader.

2012-01-17 Thread wang shuiying

Hello,

when I write  gl_FragColor = vec4(0.174977,0,0,1) in Frag shader,
 then I get through a related image: float Vec4: (0.176471,0,0,1)

so why   0.174977 changed into 0.176471?

I think there should be no process after the per-fragment operations 
that can change this element in the rendering pipeline. Blend, alpha test 
and so on are all disabled.


Thank you in advance!

Shuiying


[osg-users] Help: LineSegmentIntersector operates in GPU or CPU?

2012-01-16 Thread wang shuiying

Hello,

I am really sorry to trouble you again with my naive questions. :-)
(1) Where does LineSegmentIntersector carry out its intersection, on 
the GPU or the CPU?
(2) How can I know which processes happen on the GPU and which on the 
CPU? (Perhaps only the draw traversal happens on the GPU?)
(3) How can I check the rendering rate of my OSG program? Does it vary 
during run time, or does it remain constant?


Thank you very much in advance!

Shuiying


[osg-users] Help:Is there a way for a single fragment to know its index during per-fragment operation?

2012-01-14 Thread wang shuiying

Hello,

In the fragment shader, I would like to change the fragment colour 
according to the fragment's index. Here index means the integer pair 
(x, y), where 0 < x < xRes and 0 < y < yRes, provided that the 
viewport has xRes * yRes pixels.


Is there a way to achieve that?

Thank you in advance!


Shuiying


[osg-users] Help: Is there an osg example with ray tracing?

2012-01-13 Thread wang shuiying

Hello,

I simulate a laser sensor with a shader using OSG, and I want to compare 
it with a laser sensor simulated using ray tracing. So is there an OSG 
ray-tracing example for such an application?


Thank you in advance!


Shuiying


Re: [osg-users] Help:How to control rasterization stage through osg?

2012-01-12 Thread wang shuiying

Hi,

Thank you all a lot for your help. That really helps me a lot.

Now I know that polygon triangulation and rasterization algorithms have 
something to do with the imperfections in my program.


I have some other questions:

(1)
According to the triangle rasterization function (3.9), p. 185 in

http://www.opengl.org/registry/doc/glspec42.core.20110808.pdf

if the eye direction of the OSG camera is collinear with the polygon's 
normal vector, there would be less distortion in the generated fragment 
attributes (in my case, the eye-coordinate position corresponding to 
each fragment) than if they are not collinear.

Is the conclusion above right?

(2)
in

http://www.opengl.org/registry/doc/glspec42.core.20110808.pdf

on p. 7, there is a geometry processor, which is noted as a programmable unit. 
Is it possible to use this programmable unit in the way the vertex and 
fragment shaders are used in OpenSceneGraph?

(3)
also in

http://www.opengl.org/registry/doc/glspec42.core.20110808.pdf
on p. 19, it is said that the variable types include double. But when I change 
float into double in my shader, the program doesn't work. Why? Is it not 
possible to use the double type in OpenSceneGraph?


Thank you in advance!

Shuiying



[osg-users] Help:How to control rasterization stage through osg?

2012-01-10 Thread wang shuiying

Hello,

we know that in the OpenGL rendering pipeline there is a stage called 
rasterization, after the per-vertex operations and before the 
per-fragment operations. In the rasterization stage, OpenGL generates 
certain properties (e.g. texture coordinates) for each fragment as a 
linear function of the eye-space, object-space, or window-space 
coordinates. My question is: how do I check which coordinate space is 
used in OSG implementations? And how do I change the coordinate space 
that is used?


Thank you in advance.

Best regards

Shuiying


Re: [osg-users] Help:How to control rasterization stage through osg?

2012-01-10 Thread wang shuiying

Hi,

To Jason Daly,

Thank you for your reply.

In my case, I record the positions of the vertices in eye coordinates in 
the vertex shader, and I want rasterization to perform its linear 
interpolation in eye coordinates so that I can get the interpolated 
position value for each fragment.

In a nutshell, I want to change the way OpenGL interpolates attributes 
such as those defined by varying variables in the vertex shader.

So from your reply, only texture coordinates can be generated in 
different ways; as to general attributes, there is no way to change the 
interpolation method performed on them?

Can I at least find out, if I cannot change it, which interpolation 
method OpenGL uses in my implementation?

Thank you.

Shuiying





Message: 16
Date: Tue, 10 Jan 2012 15:00:12 -0500
From: Jason Daly <jd...@ist.ucf.edu>
To: OpenSceneGraph Users <osg-users@lists.openscenegraph.org>
Subject: Re: [osg-users] Help:How to control rasterization stage
through osg?
Message-ID: <4f0c98cc.8010...@ist.ucf.edu>
Content-Type: text/plain; charset=ISO-8859-1; format=flowed

On 01/10/2012 02:55 PM, wang shuiying wrote:


Hello,

we know that in openGL rendering pipeline, there is a stage called
rasterization after per-vertex operation and before per-fragment
operation. In rasterization stage, openGl generates certain properties
(e.g. texture coordinates)for each fragment as a linear function of the
eye-space or object-space or windows space coordinates.  My question is
, how to check which coordinate is used in osg implementations? And how
to change the coordinate that is used?


   The per-fragment attributes you're talking about are just interpolated
   from the same attributes that are assigned to the surrounding vertices.
   AFAIK, there's no way to change how these are computed, even with
   shaders (they're just simple linear interpolations).

   It sounds like you might actually be talking about TexGen (texture
   coordinate generation), which in the fixed-function world can be set to
   OBJECT_LINEAR, EYE_LINEAR, SPHERE_MAP, NORMAL_MAP, or REFLECTION_MAP.
   TexGen will automatically compute the texture coordinates on the
   vertices (which will then be interpolated for each fragment, as
   mentioned above).  In OSG, the TexGen StateAttribute controls texture
   coordinate generation.  If you're using shaders, it's up to you to do
   the texture coordinate generation math yourself.

   --J




Re: [osg-users] Help: can the fourth element of gl_FragColor be changed by shader?

2012-01-04 Thread wang shuiying

Hi,  J.P.

I am really confused now. I just want to get pre-rendering information 
from a camera through an image which is attached to the camera, but I 
don't want to draw the image onto a texture. The rendering process goes 
through a programmable shader, and I need to write four different 
variables into the four elements of gl_FragColor in the fragment 
shader. But the fourth element of gl_FragColor remains 1 when I read it 
through the related image.


So what is the difference between FRAME_BUFFER_OBJECT and FRAME_BUFFER? 
And in what way do the RenderTargetImplementation (e.g. 
FRAME_BUFFER_OBJECT and FRAME_BUFFER) and the BufferComponent (e.g. 
DEPTH_BUFFER, COLOR_BUFFER) affect my program?


How can I get the details of how the renderTargetImplementation works? 
Following your suggestion, where can I access the code in 
RenderStage::runCameraSetUp?


Plus, is there something wrong with the OSG website? When I try to look 
at the prerender example, it takes a long time to load, and the webpage 
turns out to be something like this:


Traceback (most recent call last):
  File "/usr/lib/python2.5/site-packages/trac/web/api.py", line 339, in 
send_error
    'text/html')
  File "/usr/lib/python2.5/site-packages/trac/web/chrome.py", line 684, in 
render_template
    data = self.populate_data(req, data)


Best regards

Shuiying







On 01/04/2012 07:43 AM, J.P. Delport wrote:

Hi,

On 03/01/2012 23:39, wang shuiying wrote:

Hello,

(1)to J.P.

the related source code is as following:

// camera and image setup
osg::Camera::RenderTargetImplementation renderTargetImplementation;
renderTargetImplementation = osg::CameraNode::FRAME_BUFFER;


^ If you want FBO, you probably want FRAME_BUFFER_OBJECT here.



osg::Program * program = new osg::Program();
program->addShader(osg::Shader::readShaderFile(osg::Shader::FRAGMENT,
    fragmentShaderPath));
program->addShader(osg::Shader::readShaderFile(osg::Shader::VERTEX,
    vertexShaderPath));


osg::Image * image = new osg::Image();
image->allocateImage((int)XRes, (int)YRes, 1, GL_RGBA, GL_FLOAT);

osg::ref_ptr<osg::CameraNode> camera(new osg::CameraNode());
camera->setClearColor(osg::Vec4(1000.0f, 1000.0f, 1000.0f, 1000.0f));
camera->setClearMask(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
camera->setRenderOrder(osg::CameraNode::PRE_RENDER);
camera->setRenderTargetImplementation(renderTargetImplementation);
camera->attach(osg::CameraNode::COLOR_BUFFER, image);


OpenGL would happily convert from one render target pixel format on 
the GPU to another one on the CPU when transferring data. So you must 
make sure that the formats are what you expect. The easiest way is to 
attach both a texture and an osg::Image to the camera and explicitly 
set the internal texture format and data type. See the osgprerender 
example. You can also inspect the code in RenderStage::runCameraSetUp 
to see exactly how OSG determines the format of the render target.


Also, if you want values outside of [0.0, 1.0] you must attach to 
osg::CameraNode::COLOR_BUFFER0 and use gl_FragData in your shaders.


camera->getOrCreateStateSet()->setAttribute(program,
    osg::StateAttribute::ON);


// access data  (only for instance)
osg::Vec4f * fragments = (osg::Vec4f *)(image->data());
osg::Vec4f color = fragments[10];
float color_a = color[3];

As can be seen from the source code, I attach an image to the camera, 
and access data from the image,

I am rendering to an FBO.


I don't think you are rendering to an FBO with the above code.



So would you please give me some advice further?

Thank you very much!


*(2) to Paul*

Refer to (1) of this message: does osg::CameraNode::COLOR_BUFFER imply 
an RGB or an RGBA buffer? Or does it not matter?


It does not imply anything.

rgds
jp



Thank you very much !

Shuiying






Re: [osg-users] Help: can the fourth element of gl_FragColor be changed by shader?

2012-01-04 Thread wang shuiying

Hi,

To J.P. and Paul:

Thank you both very much !

After I changed the renderTargetImplementation from FRAME_BUFFER to 
FRAME_BUFFER_OBJECT, it works! That is, I can get the fourth element of 
gl_FragColor, as modified in the fragment shader, through the image. 
Also, the camera in question is a pre-render camera, not the main scene 
camera.


Thank you very much again and Cheers

Shuiying


On 01/05/2012 07:48 AM, J.P. Delport wrote:

Hi,

On 04/01/2012 17:17, wang shuiying wrote:

Hi, J.P.

I am really confused now. I just want to get pre-rendering information
from a camera through an image, which is attached to the camera.
I assume this camera is a different camera from your main scene 
camera? If so, I'd recommend using this camera to render to an FBO 
with the pixel format you want. You can read back the data to an image.



But I
don't want to draw the image onto a texture.
Why not? The data needs to go into GPU memory somewhere anyway. An 
image is always a copy from GPU memory to CPU memory. An FBO is flexible.



The rendering process goes
through programmable shader and I need to write four different variables
into the four elements of gl_fragcolor in fragment shader. But the
fourth element of gl_fragcolor remains to be 1 when I read it through
the related image.
Like Paul said before, if the target you are rendering to does not 
have (which is common for frame buffer targets) an alpha (fourth) 
channel, then the value you get back would be 1.0 always.




so what is the difference between FRAME_BUFFER_OBJECT and FRAME_BUFFER?
This question requires a very long answer and you should probably 
check the OpenGL spec or a book, but in short: the frame buffer is 
what gets drawn onto the screen, an FBO is an internal OpenGL object 
that normally has a texture/textures associated with it that is 
specifically there for doing off-screen rendering.



And in what way do RenderTargetImplementation ( e.g. FRAME_BUFFER_OBJECT
and FRAME_BUFFER) and BufferComponent(e.g.DEPTH_BUFFER, COLOR_BUFFER,)
affect my program?
It determines where you render to, whether frame buffer/FBO and in the 
case of FBO you can attach your own textures to receive the colour and 
depth information from the camera rendering.




how can I get the details as to how the renderTargetImplementation
works? According to what you suggest, where can I access the code in
RenderStage::runCameraSetUp?
Download the OSG source code. Use a subversion client and point it to 
this url:


https://www.openscenegraph.org/svn/osg/OpenSceneGraph/trunk

Then check in src/osgUtil for RenderStage.cpp



Plus, is there something wrong with the osg website?

Yes, there is currently some problem.

Please look at the osgprerender example as soon as you have the source 
code.


regards
jp


when I want to take a look at the prerender example, it takes a long time 
to get onto the website and the webpage turns out to be something like this:

Traceback (most recent call last):
   File /usr/lib/python2.5/site-packages/trac/web/api.py, line 339, 
in send_error

 'text/html')
   File /usr/lib/python2.5/site-packages/trac/web/chrome.py, line 
684, in render_template

 data = self.populate_data(req, data)


Best regards

Shuiying







On 01/04/2012 07:43 AM, J.P. Delport wrote:

Hi,

On 03/01/2012 23:39, wang shuiying wrote:

Hello,

(1)to J.P.

the related source code is as following:

// camera and image setup
osg::Camera::RenderTargetImplementation renderTargetImplementation;
renderTargetImplementation = osg::CameraNode::FRAME_BUFFER;


^ If you want FBO, you probably want FRAME_BUFFER_OBJECT here.



osg::Program * program = new osg::Program();
program->addShader(osg::Shader::readShaderFile(osg::Shader::FRAGMENT,
fragmentShaderPath));
program->addShader(osg::Shader::readShaderFile(osg::Shader::VERTEX,
vertexShaderPath));

osg::Image * image = new osg::Image();
image->allocateImage((int)XRes, (int)YRes, 1, GL_RGBA, GL_FLOAT);

osg::ref_ptr<osg::CameraNode> camera(new osg::CameraNode());
camera->setClearColor(osg::Vec4(1000.0f, 1000.0f, 1000.0f, 1000.0f));
camera->setClearMask(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
camera->setRenderOrder(osg::CameraNode::PRE_RENDER);
camera->setRenderTargetImplementation(renderTargetImplementation);
camera->attach(osg::CameraNode::COLOR_BUFFER, image);


OpenGL would happily convert from one render target pixel format on
the GPU to another one on the CPU when transferring data. So you must
make sure that the formats are what you expect. The easiest way is to
attach both a texture and an osg::Image to the camera and explicitly
set the internal texture format and data type. See the osgprerender
example. You can also inspect the code in RenderStage::runCameraSetUp
to see exactly how OSG determines the format of the render target.

Also, if you want values outside of [0.0, 1.0] you must attach to
osg::CameraNode::COLOR_BUFFER0 and use gl_FragData in your shaders.
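To illustrate the point above, a minimal fragment shader sketch (the variable names and the material index value are made up for illustration, and it assumes the camera's target is attached to COLOR_BUFFER0 with a float internal format such as GL_RGBA32F_ARB):

```glsl
// Writes to the first color attachment via gl_FragData instead of
// gl_FragColor.  With a float FBO attachment, the fourth component is
// stored as written, not clamped to [0.0, 1.0] or forced to 1.0.
varying float x, y, z;   // view-space position passed from the vertex shader

void main()
{
    // The fourth component carries a material index rather than alpha;
    // it can then be read back from the attached osg::Image.
    gl_FragData[0] = vec4(x, y, z, 7.0);
}
```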


camera->getOrCreateStateSet()->setAttribute(program, osg::StateAttribute::ON);

[osg-users] Help: can the fourth element of gl_FragColor be changed by shader?

2012-01-03 Thread wang shuiying

Hello,

In my program, the fourth element of gl_FragColor is set by the fragment 
shader to a random number between 0 and 1. But when I access this element 
through the corresponding image, it is always 1. I understand that in the 
rendering pipeline, the fragment operations are followed by the alpha test, 
blending and so on. When I turn all those off, the fourth element of 
gl_FragColor, which corresponds to the alpha value, should not be affected 
by those tests, right? But it remains 1. Can anybody give me some advice on 
that? I want to use the fourth element to record the material index of the 
objects rendered by the camera in question.



Thank you  in advance!

Shuiying


Re: [osg-users] Help: can the fourth element of gl_FragColor be changed by shader?

2012-01-03 Thread wang shuiying

Hello,

(1)to J.P.

the related source code is as following:

// camera and image setup
osg::Camera::RenderTargetImplementation renderTargetImplementation;
renderTargetImplementation = osg::CameraNode::FRAME_BUFFER;

osg::Program * program = new osg::Program();
program->addShader(osg::Shader::readShaderFile(osg::Shader::FRAGMENT,
fragmentShaderPath));
program->addShader(osg::Shader::readShaderFile(osg::Shader::VERTEX,
vertexShaderPath));

osg::Image * image = new osg::Image();
image->allocateImage((int)XRes, (int)YRes, 1, GL_RGBA, GL_FLOAT);

osg::ref_ptr<osg::CameraNode> camera(new osg::CameraNode());
camera->setClearColor(osg::Vec4(1000.0f, 1000.0f, 1000.0f, 1000.0f));
camera->setClearMask(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
camera->setRenderOrder(osg::CameraNode::PRE_RENDER);
camera->setRenderTargetImplementation(renderTargetImplementation);
camera->attach(osg::CameraNode::COLOR_BUFFER, image);
camera->getOrCreateStateSet()->setAttribute(program, osg::StateAttribute::ON);

// access data  (only for instance)
osg::Vec4f * fragments = (osg::Vec4f *)(image->data());
osg::Vec4f  color= distances[10];
float color_a= distances[3];

As can be seen from the source code, I attach an image to the camera and 
access the data from that image; I am rendering to an FBO.

So would you please give me some advice further?

Thank you very much!


*(2) to Paul*

Referring to (1) of this message, does osg::CameraNode::COLOR_BUFFER imply 
an RGB or an RGBA buffer? Or does it not matter?

Thank you very much !

Shuiying




Re: [osg-users] Help: can the fourth element of gl_FragColor be changed by shader?

2012-01-03 Thread wang shuiying

Hello,

Sorry, the third line of the 'access data' part should be:

float color_a= color[3];



Shuiying



[osg-users] Help: Data accuracy from Frame buffer is not very good

2011-11-29 Thread wang shuiying

Hallo,

I want to simulate a laser scanner using an osg camera. In the fragment 
shader, position information is encoded into the color output; the position 
information is then read back and used to draw points in a view. But the 
points are not very accurate, as shown in the attached picture, which is 
composed of 11 cameras. The important code is:


osg::Program * program = new osg::Program();
program->addShader(osg::Shader::readShaderFile(osg::Shader::FRAGMENT,
fragmentShaderPath));
program->addShader(osg::Shader::readShaderFile(osg::Shader::VERTEX,
vertexShaderPath));

osg::Image * image = new osg::Image();
image->allocateImage(xRes, yRes, 1, GL_RGBA, GL_FLOAT);

osg::ref_ptr<osg::CameraNode> camera(new osg::CameraNode());
camera->setComputeNearFarMode(osg::Camera::DO_NOT_COMPUTE_NEAR_FAR);
camera->setViewport(0, 0, xRes, yRes);
camera->setClearColor(osg::Vec4(1000.0f, 1000.0f, 1000.0f, 1000.0f));
camera->setClearMask(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
camera->setReferenceFrame(osg::Transform::ABSOLUTE_RF);
camera->setProjectionMatrixAsFrustum(-tan(viewYaw), tan(viewYaw),
-tan(viewPitch), tan(viewPitch), zNear, zFar);
camera->setViewMatrix(osg::Matrix::lookAt(osg::Vec3d(0, 0, 1.0f),
osg::Vec3d(-10.0f, 0.0f, 0), osg::Vec3d(0, 0, 1)));
camera->setRenderOrder(osg::CameraNode::PRE_RENDER);
camera->setRenderTargetImplementation(renderTargetImplementation);
camera->attach(osg::CameraNode::COLOR_BUFFER, image);
camera->getOrCreateStateSet()->setAttribute(program, osg::StateAttribute::ON);


In the vertex shader, I write:

varying float x, y, z;

void main()
{
    x = (gl_ModelViewMatrix * gl_Vertex).x;
    y = (gl_ModelViewMatrix * gl_Vertex).y;
    z = (gl_ModelViewMatrix * gl_Vertex).z;
    gl_Position = ftransform();
}

In the fragment shader, I write:

gl_FragColor = vec4(abs(z/100),abs(x/17),abs(y/4),0.2);

When I read the position information back, I multiply it by the 
corresponding factors.


However, the points are not that accurate. It seems that the fragment 
shader does not compute each point individually, but assigns the same 
value to a whole group of points.


Can anybody give some advice?

Thanks in advance!

Shuiying

attachment: osg.jpeg


Re: [osg-users] Help: Data accuracy from Frame buffer is not very good

2011-11-29 Thread wang shuiying

Hello, J.P. Delport

I followed your advice but it still doesn't work well.

I also changed the internal format of the image to GL_RGBA32F_ARB, but 
the points remain the same.

What is the name of the osgprerender example with the --hdr switch?

Thank you very much !

Shuiying


Re: [osg-users] how to make gl_Color.a changeble through programmable fragment shader?

2011-10-27 Thread wang shuiying

Hi, Robert

Thank you for your reply.

Sorry, I don't think I expressed my problem clearly.

I want to simulate a radar using an osg camera. The important code is:

osg::Program * program = new osg::Program();
program->addShader(osg::Shader::readShaderFile(osg::Shader::FRAGMENT,
fragmentShaderPath));
program->addShader(osg::Shader::readShaderFile(osg::Shader::VERTEX,
vertexShaderPath));

osg::Image * image = new osg::Image();
image->allocateImage(xRes, yRes, 1, GL_RGBA, GL_FLOAT);

osg::ref_ptr<osg::CameraNode> camera(new osg::CameraNode());
camera->setComputeNearFarMode(osg::Camera::DO_NOT_COMPUTE_NEAR_FAR);
camera->setViewport(0, 0, xRes, yRes);
camera->setClearColor(osg::Vec4(1000.0f, 1000.0f, 1000.0f, 1000.0f));
camera->setClearMask(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
camera->setReferenceFrame(osg::Transform::ABSOLUTE_RF);
camera->setProjectionMatrixAsFrustum(-tan(viewYaw), tan(viewYaw),
-tan(viewPitch), tan(viewPitch), zNear, zFar);
camera->setViewMatrix(osg::Matrix::lookAt(osg::Vec3d(0, 0, 1.0f),
osg::Vec3d(-10.0f, 0.0f, 0), osg::Vec3d(0, 0, 1)));
camera->setRenderOrder(osg::CameraNode::PRE_RENDER);
camera->setRenderTargetImplementation(renderTargetImplementation);
camera->attach(osg::CameraNode::COLOR_BUFFER, image);
camera->getOrCreateStateSet()->setAttribute(program, osg::StateAttribute::ON);



In the fragment shader, I write, e.g.:

gl_FragColor = vec4(0.1,0.2,0.3,0.5);

When I read the four elements back through the image attached to the 
camera, the first three elements are 0.1, 0.2 and 0.3 as expected, but 
the fourth element is 1. No matter what value I assign to the fourth 
element, it stays 1.


I cannot figure out why this happens.

So does it still have something to do with the graphics context when it 
comes to a pre-render camera?


Thank you very much!

Shuiying


[osg-users] Is the displayProblem of osgtext already fixed?

2011-10-27 Thread wang shuiying

Hallo,

The osgText in my program worked fine with an older OSG version, but now 
the text doesn't show up with osg 3.0.0. I notice that this problem was 
already discussed in July, but it seemed there was no definitive fix. 
What should I do now? Can anybody give any advice?



Thank you very much in advance!

Shuiying


[osg-users] how to make gl_Color.a changeble through programmable fragment shader?

2011-10-26 Thread wang shuiying

Hello,

I am trying to write information other than color into gl_FragColor 
through a programmable fragment shader.

e.g. gl_FragColor = vec4(0.1,0.2,0.3,0.2);

However, whatever value I set gl_Color.a (the fourth element above) to 
in the fragment shader, I get the value 1 when I access the frame 
buffer.

I tried to work around this problem by adding the following code:

osg::ColorMask* cm = new osg::ColorMask(true, true, true, true);

NodeOfRenderingObject->getOrCreateStateSet()->setAttributeAndModes(cm, osg::StateAttribute::ON);

NodeOfRenderingCamera->getOrCreateStateSet()->setAttributeAndModes(cm, osg::StateAttribute::ON);


But the corresponding value of gl_Color.a in the frame buffer remains 1.


Could anybody give any advice?

Thank you very much in advance!

Best regards

Shuiying