Hello Gerrit,
sure, that is fine with me. Can you give a description of the parts of that system and what roles they play? I really would like to see only one instance in there. Basically they all follow the VRML/X3D model: you have your time sensor and interpolator elements and connect them via field connectors (aka routes). Let me take this part out of the CSM dir where it lives right now and repackage it into a separate contrib dir to untangle things a little.
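For reference, the time sensor / interpolator / route model can be sketched roughly like this (illustrative C++ only, not the actual OpenSG/CSM classes; a route would just feed the sensor's fraction into the interpolator):

```cpp
#include <cmath>
#include <vector>

// Sketch of the VRML/X3D-style building blocks: a TimeSensor maps
// wall-clock time to a 0..1 fraction, an interpolator maps that
// fraction to a keyframed value. Names are illustrative.
struct TimeSensor {
    double startTime     = 0.0;
    double cycleInterval = 1.0;   // seconds per cycle
    bool   loop          = true;

    double fraction(double now) const {
        double t = (now - startTime) / cycleInterval;
        if (loop)
            t -= std::floor(t);   // wrap into [0,1)
        else if (t > 1.0)
            t = 1.0;              // clamp after one cycle
        return t;
    }
};

struct ScalarInterpolator {
    std::vector<double> key;      // keyframe times in [0,1]
    std::vector<double> keyValue; // one value per key

    double evaluate(double frac) const {
        if (frac <= key.front()) return keyValue.front();
        if (frac >= key.back())  return keyValue.back();
        for (size_t i = 1; i < key.size(); ++i) {
            if (frac <= key[i]) {
                double u = (frac - key[i-1]) / (key[i] - key[i-1]);
                return keyValue[i-1] + u * (keyValue[i] - keyValue[i-1]);
            }
        }
        return keyValue.back();
    }
};
```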
hm, the interpolators unfortunately combine the constant data for the animation with the changing data for playback. That means if I need to play the same animation (started at different times) for two different characters, I have to duplicate the whole keyframe data, or am I missing something? That is why there is the distinction between an AnimationTemplate (with ATracks) and an Animation (with AChannels): one just stores data, the other is a "cursor" into that data, similar to how Cal3d splits things with its Core and non-Core types.
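The template/cursor split described above could look roughly like this (illustrative names and a 1D value for brevity, not the actual AnimationTemplate/Animation interfaces): the keyframe data lives once, and each playing instance only carries its own start time.

```cpp
#include <memory>
#include <vector>

// Shared, immutable keyframe data (the "template" side).
struct AnimationTemplate {
    std::vector<float> keys;     // keyframe times, seconds
    std::vector<float> values;   // one value per key
};

// Per-character "cursor" into a template (the playback side).
struct Animation {
    std::shared_ptr<const AnimationTemplate> tmpl;
    double startTime = 0.0;

    float sample(double now) const {
        double t = now - startTime;
        const auto &k = tmpl->keys;
        const auto &v = tmpl->values;
        if (t <= k.front()) return v.front();
        if (t >= k.back())  return v.back();
        size_t i = 1;
        while (k[i] < t) ++i;
        float u = float((t - k[i-1]) / (k[i] - k[i-1]));
        return v[i-1] + u * (v[i] - v[i-1]);
    }
};
```

Two characters starting the same clip at different times then share one template instead of duplicating the keyframes.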
We specifically need to handle key frame animation for vertex skinning for characters. Any hints on how existing parts are best used/extended to support that are also very welcome. If you can work with the std VRML/X3D interpolators (IIRC pos/scalar/ori/coordinate), which I hope, as skin+bones are in both, the basics should all be there.
yes, a bone is essentially just a coordinate system. One thing to consider though: there is often more than one animation applied to a skeleton. For that case you need to accumulate all input for one bone in some way, either by keeping track of whether this is the first change to a bone in this frame (making that one absolute and all subsequent ones relative), or by accumulating all changes into a temporary and then setting the bone once after all animations are applied.
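The second variant (accumulate into a temporary, write once) could be sketched like this; the types are purely illustrative stand-ins, a real implementation would accumulate matrices or quaternions rather than a single float:

```cpp
// Stand-in for a full bone transform.
struct Bone {
    float angle = 0.0f;
};

// Scratch accumulator, one per bone: every animation adds its
// contribution for this frame, and the bone is written exactly once
// after all animations have run.
struct BoneAccumulator {
    float delta   = 0.0f;
    bool  touched = false;

    void add(float d) { delta += d; touched = true; }

    void applyTo(Bone &b) {
        if (touched)
            b.angle = delta;   // single absolute set per frame
        delta   = 0.0f;        // reset for the next frame
        touched = false;
    }
};
```

The `touched` flag also keeps an idle bone untouched, so animations that do not drive it never clobber its pose.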
The two tricky bits left are the global frame handler, which updates the time, and something that makes sure the skin+bone stuff is evaluated only once per frame. For the frame handler I have to see if we can handle it like the VRML loader, which can be extended so it can live outside the osg core (I don't want to have fileIO too dependent on a contrib lib), or if we have to push this into the core.
agreed on the dependency. Why does the frame handler need to be extensible (perhaps it must be; I just don't understand the reason yet)? Grepping for framehandler only turned up the call from the CSMGLUTWindow to CSM::frame, which seems to only update time and trigger the SensorTask/TimeSensors. For time I think there needs to be a way for the user to supply it, in case there is other stuff in the application that has to run off the same clock (maybe that is already possible?). For example, in a VRJuggler clustered app we'd like to feed time from a device into the animation, since it is guaranteed to be in sync on all nodes.
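A user-suppliable time source along those lines could be as small as the following sketch (a hypothetical interface, not existing OpenSG API): the frame handler asks a replaceable callback for "now", so a clustered app can plug in device time instead of the wall clock.

```cpp
#include <functional>
#include <utility>

// Minimal injectable clock sketch: the per-frame update pulls time
// from a user-settable source; without one it falls back to a fixed
// 60 Hz step.
class FrameClock {
public:
    using TimeFn = std::function<double()>;

    void setTimeSource(TimeFn fn) { _source = std::move(fn); }

    double frame() {                       // called once per rendered frame
        _now = _source ? _source() : _now + 1.0 / 60.0;
        return _now;
    }

    double now() const { return _now; }

private:
    TimeFn _source;                        // empty => fallback stepping
    double _now = 0.0;
};
```

In the VRJuggler case the callback would simply read the synchronized device time, which all cluster nodes then agree on.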
Short question: what is the grouping (for example AnimationTemplate) for? Just to deal with a complex animation through one object?
yes, primarily. Given that a human character model has 20-30 bones, I consider it somewhat essential that I can start the "walk" animation with just one call. Animation (the playback object for an ATemplate) is also the level where the time scale, the playback mode (once, loop, swing) and the direction (fwd, bwd) are set.
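The playback modes and direction named above boil down to mapping elapsed time onto a cycle-local position; a possible sketch (illustrative, not the actual Animation code):

```cpp
#include <cmath>

enum class PlayMode { Once, Loop, Swing };

// Maps elapsed seconds to a position in [0,1] within one animation
// cycle of the given length; dir is +1 for forward, -1 for backward.
double localPos(double elapsed, double length, PlayMode mode, int dir) {
    double t = elapsed / length;
    switch (mode) {
    case PlayMode::Once:
        t = t < 0.0 ? 0.0 : (t > 1.0 ? 1.0 : t);    // clamp
        break;
    case PlayMode::Loop:
        t -= std::floor(t);                          // wrap into [0,1)
        break;
    case PlayMode::Swing: {
        double c = t - 2.0 * std::floor(t / 2.0);    // position in [0,2)
        t = (c <= 1.0) ? c : 2.0 - c;                // ping-pong
        break;
    }
    }
    return dir >= 0 ? t : 1.0 - t;
}
```

A time scale would simply multiply `elapsed` before this mapping.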
Cheers,
Carsten
PS: I've attached a tarball with the anim classes in case you want to
look at it. Don't bother with compiling it though; there is probably some
other stuff missing.
Attachment: anim.tar.bz2 (application/bzip)
_______________________________________________
Opensg-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/opensg-users
