Hi Dominik,
Warning... long mail ahead...
On Tue, 2004-01-27 at 22:05, Dominik Rau wrote:
> Hi Rasmus.
>
> Rasmus Agerholm wrote:
>
> >I don't know for sure, what would be the best way from Max to OpenSG,
> >but the Max VRML exporter is pretty good.
> >
> >
> Yes, I thought so, too.
>
> >I use Gerrit's loader and haven't really touched the "original". My
> >VRMLAnimation lib is based on Gerrit's loader. So to use that you need
> >to enable this loader by --enable-gv-beta.
> >
> >
> After fighting a bit with some include paths and the files in the
> Gerrit/Base directory it compiles and the tutorials are working again.
Sorry, I forgot to tell you that after configure you should run
Common/prep_gv_beta :)
> >My enhanced version of Gerrit's loader simply loads all VRML routes
> >(animations) specified in the file. And since the only sensor I have
> >implemented yet is the TimeSensor, the default behavior is to play all
> >animations from frame 1, but you can take control of the execution
> >sequences yourself. It should be pretty easy.
> >
> >
> I compiled your library without any problems and compiled the
> animExample.cpp from your website. Unfortunately, it seems not to
> work: The chopper is black and does not move at all. Any idea what the
> problem could be?
The blackness (total lack of textures I would guess) is probably caused
by the loader's inability to find the textures. If this is the case, it
can be solved by executing the example in the chopper directory (the
texture paths are relative).
Why the chopper isn't moving... hmmm... the example made a log file
(log.txt) which may have some answers; can you send that to me off the
list? (Of course you're welcome to have a look in there as well ;-) )
> Another question: Do I understand you right, that you suggest me to
> save all animations in one file (for example: walk frame 0-20, jump
> 21-40...) and to switch between animations by jumping to another frame?
No, not quite. Well, IIRC the VRML loader doesn't support inline files,
so you have to save everything in one file. But all animations in VRML
can be independent of each other and only depend on the same sensor,
e.g. a time sensor. When our artist exports stuff from Max to VRML,
everything tends to be controlled by a single time sensor, but I guess
the artist can control whether there should be more sensors.
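To illustrate what that looks like in the exported file, here is a
hand-written VRML sketch (not actual Max exporter output; the node
names are made up) of one TimeSensor driving one interpolator via
routes:
------------------------------------------------------
DEF WalkTimer TimeSensor { cycleInterval 2 loop TRUE }
DEF WalkPos PositionInterpolator {
  key      [ 0, 1 ]
  keyValue [ 0 0 0, 1 0 0 ]
}
ROUTE WalkTimer.fraction_changed TO WalkPos.set_fraction
ROUTE WalkPos.value_changed      TO Body.set_translation
------------------------------------------------------
A second, independent animation would simply have its own TimeSensor
and its own set of routes.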
> Could you give me some examples / code snippets / a full-featured
> documentation including tutorials ;) how I have to do this? (if the
> animExample.cpp works... )
Here goes (one possible way):
What needs to be done is baking all animations into keyframe animations
and then exporting them to VRML. Then you have one file with possibly
multiple animations and a number of sensors (at least one).
Before loading the file you register all sensor types:
------------------------------------------------------
// register all sensor types, so we get a callback for every
// sensor the loader creates
OSG::FieldContainerType *fct;
for(OSG::UInt32 i = 0; i < OSG::TypeFactory::the()->getNumTypes(); ++i)
{
    fct = OSG::FieldContainerFactory::the()->findType(i);
    if(fct &&
       fct->isDerivedFrom(OSG::Sensor::getClassType()))
    {
        _aspect->registerCreated(
            *fct,
            OSG::osgTypedMethodFunctor2ObjPtrCPtrRef
                <bool, YourClass,
                 OSG::FieldContainerPtr, OSG::RemoteAspect *
                >(this, &YourClass::createdFunction));
    }
}
------------------------------------------------------
where YourClass::createdFunction looks something like this:
------------------------------------------------------
bool YourClass::createdFunction(OSG::FieldContainerPtr &fcp,
                                OSG::RemoteAspect *)
{
    // the callback is only registered for sensor types (see above),
    // so the cast is safe
    OSG::SensorPtr pSensor = OSG::SensorPtr::dcast(fcp);
    _sensors.push_back(pSensor);
    return true;
}
------------------------------------------------------
where _sensors is declared in YourClass like this:
------------------------------------------------------
typedef std::vector<OSG::SensorPtr> SensorCoreVec;
SensorCoreVec _sensors;
------------------------------------------------------
Then load the file:
------------------------------------------------------
OSG::NodePtr file = OSG::NullFC;
OSG::VRMLToOSGAction aVRMLToOSG;
OSG::VRMLLoader *pFile = new OSG::VRMLLoader();
OSG::VRMLAnimator *pVRMLAnimator = new OSG::VRMLAnimator(pFile);

// parse the VRML file and convert it to an OpenSG scene graph
pFile->scanFile(filename.c_str());
aVRMLToOSG.setNameNodeMap(pFile->getNameNodeMap());
aVRMLToOSG.setDataTransferMode(OSG::VRMLToOSGAction::SwapData);
aVRMLToOSG.apply(pFile->getFileTree());

file = aVRMLToOSG.getRoot();
file->updateVolume();
------------------------------------------------------
And last, you initialize the VRML routes:
------------------------------------------------------
bool has_animation = pVRMLAnimator->init(file);
------------------------------------------------------
Now _sensors contains all sensors in the scene and has_animation
indicates whether the file contains VRML animations or not. If the file
is exported so that each individual animation (which may involve many
interpolators) is controlled by an independent sensor, you're more or
less there: call evaluate() on the sensor controlling the animation you
want to run... well, maybe you need to fiddle a little bit with the
time, loop etc. values of the sensor to make sure that the animation
starts from the right keyframe.
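To get a feel for what the time/loop fiddling means: a VRML TimeSensor
maps wall-clock time onto a 0..1 fraction that drives the interpolators.
Here is a self-contained sketch of that spec behaviour (this is NOT the
actual OSGTimeSensor code, just the idea behind it):

```cpp
#include <cassert>
#include <cmath>

// Rough sketch of VRML TimeSensor semantics: map absolute time onto
// the 0..1 fraction that is routed to the interpolators.
// startTime and cycleInterval are the usual VRML TimeSensor fields.
double timeSensorFraction(double now, double startTime,
                          double cycleInterval, bool loop)
{
    double elapsed = now - startTime;
    if(elapsed <= 0.0)
        return 0.0;                  // not started yet
    if(!loop && elapsed >= cycleInterval)
        return 1.0;                  // one-shot: clamp at the end
    return std::fmod(elapsed, cycleInterval) / cycleInterval;
}
```

So "starting from the right keyframe" mostly means setting the start
time so the fraction begins at 0 when you kick the animation off.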
If you have only one sensor, you can get the target interpolators that
the sensor evaluates (check evaluate() in OSGTimeSensor.cpp), use some
naming convention together with OSG::getName(FC) to find out which
interpolators belong together, and then 1) create a sensor for each
group or 2) evaluate the groups yourself... maybe 1) is easiest?
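The grouping-by-naming-convention part might look something like this
(pure std C++; the "animname_part" naming scheme is just an assumption,
adapt it to whatever names your exporter actually produces):

```cpp
#include <map>
#include <string>
#include <vector>

// Group interpolator names by the part before the first '_',
// e.g. "walk_pos" and "walk_rot" both end up in group "walk".
// The names would come from OSG::getName() on the interpolators.
std::map<std::string, std::vector<std::string> >
groupByPrefix(const std::vector<std::string> &names)
{
    std::map<std::string, std::vector<std::string> > groups;
    for(std::vector<std::string>::const_iterator it = names.begin();
        it != names.end(); ++it)
    {
        std::string::size_type pos = it->find('_');
        std::string prefix =
            (pos == std::string::npos) ? *it : it->substr(0, pos);
        groups[prefix].push_back(*it);
    }
    return groups;
}
```

Each resulting group is then one logical animation that you can hook up
to its own sensor (option 1) or step through yourself (option 2).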
Hope it helps
/Rasmus
--
Rasmus Agerholm
Research Assistant
VR Media Lab (+45) 9635 8792
Niels Jernes Vej 14
DK-9220 Aalborg
Phone: (+45) 9635 8784
Fax: (+45) 9815 2444
http://www.vrmedialab.dk/~rasta
_______________________________________________
Opensg-users mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/opensg-users