I'll digress from my "delusions of grandeur" idea (I can code it
myself), but this sounds like a thread worth putting my two cents into.

> What is the Metaverse? (Matrix, Cyberspace, YouNameIt)
> In my opinion, it is not a protocol/file-format/system, it is a feature.

~Well-played! =D

> <background style=hidden> ;) </background>
> The Metaverse/Cyberspace/Matrix in SF literature has the same quality as the 
> WWW - it is an integrator for a number of different spatial content and 
> applications. This is something we don't have today, regardless of what LL or 
> anybody claims.  VRML, Collada, Flux, even VOS... interesting building blocks 
> already out there, but no Metaverse by themself.

~As I recall, the reason VRML didn't make it wasn't that the language
was hard to understand, but that most of the applications using it at
the time were built on proprietary frameworks, and the language itself
didn't support the kind of advanced features you see in HTML and other
hypertext-rendering languages. [You couldn't "post a link to another
world" to a "search engine/directory world" and expect people to show
up in your world, no matter how expansive or popular it was; thus,
developers couldn't really collaborate to build a single cohesive 3D
universe.] Besides that, there was no real VRML browser worth its
weight, and controls in most VRML applications were clunky. At least,
that's how it seemed to me.

> I don't believe that a single protocol, data model, or system could ever 
> solve all problems and accomplish every goal. Like in the traditional 
> Internet, the requirements differ too much among 3D applications. When you 
> chat with friends in a virtual pub, you need a different protocol than when 
> you go to the arcade to play an FPS game. When you do virtual flight 
> training, you need a different data model than when you study surgery at 
> Virtual Medical School. And so on...

~Ah, how true! Different games have different rules. First-person
shooters require WASD/mouse interaction, while other applications
prefer an entirely different control set. Some apps demand a greater
level of detail, and some prefer wider aspect ratios.
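The point about per-application control sets could be sketched as a small abstraction: each world or application supplies its own key bindings, and the client swaps the active scheme when the user moves between them. This is purely illustrative; none of these names come from VOS or any real client.

```python
# Hypothetical sketch: each application provides its own control scheme,
# and a metaverse client swaps bindings when you hop between worlds.
# All class and binding names here are illustrative.

class ControlScheme:
    def __init__(self, name, bindings):
        self.name = name
        self.bindings = dict(bindings)  # key -> action name

    def action_for(self, key):
        # Unbound keys are simply ignored rather than raising an error.
        return self.bindings.get(key, "ignore")

# An FPS arcade wants WASD movement...
fps = ControlScheme("fps", {"w": "move_forward", "a": "strafe_left",
                            "s": "move_back", "d": "strafe_right"})
# ...while a virtual pub mostly needs walking and chatting.
pub = ControlScheme("pub", {"w": "walk_forward", "t": "open_chat"})

def switch_application(new_scheme):
    # A client hopping between worlds would swap the active scheme
    # instead of hard-coding one input model for everything.
    return new_scheme

active = switch_application(pub)
print(active.action_for("t"))  # -> open_chat
```

The same key can mean different things in different worlds, which is exactly why a single fixed protocol or control model doesn't fit every application.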

> In my opinion, if we want a Metaverse (do we?), we need something that can 
> get us from application A to B without too much of a hiccup. It likely has to 
> support multiple protocols and data models for this purpose. It will not 
> deliver 90% right from the start, and I'd settle for a lot less. After all, 
> you cannot really compare Mosaic to Mozilla.

~I understand that TerAngreal is simply one example of a client, and
nothing we're really "tied to" in the way we are tied to VOS and to
the concept of "Interreality" as our Metaverse.

> The question to ask of every protocol/data model/tool thus is, how well does 
> it serve integration? Or in other words: How metaverse is it?

The TerAngreal browser, I feel, doesn't serve as an integrative
metaversal tool, because it uses proprietary OS GUI system calls to
generate the 'chat area', the 'user list', and even the 'menus'.

In order to make a browser that could be called truly integrative, one
would have to recode against the VOS/Interreality specifications,
adding procedural UI elements that are rendered by the 3D engine
itself: drawn per-user (at the camera level, composited above each
world's 3D content) and invisible to other users. Even the interface
for picking a username would need to be rendered within the GL
workspace. In addition, the workspace should default to software
rendering, scaled down for lower-end platforms, until the user tells
it that a capable graphics card is available, or until we auto-detect
one. At that point, the user should be able, through the 3D interface
itself, to choose his or her control scheme, aspect ratio, level of
detail, depth of field, optional shading, shadows, texturing, particle
effects, and all those other fancy things. Thus, the only thing
showing up in any window created by our team would be the 3D-rendered
area, and the only OS-specific function calls we would need would be
the ones that restart the program window with the new settings.
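The fallback-and-restart flow above could be sketched roughly like this: start with conservative software-rendered settings, upgrade when the user confirms (or we detect) a capable card, and make the window restart the single OS-specific step. All function and setting names here are hypothetical stand-ins, not part of VOS or TerAngreal.

```python
# Hypothetical sketch of the startup logic described above.
# Names and settings are illustrative only.

DEFAULT_SETTINGS = {
    "renderer": "software",   # conservative default for low-end machines
    "aspect_ratio": "4:3",
    "level_of_detail": "low",
    "shadows": False,
    "particles": False,
}

def detect_capable_card():
    # Stand-in for a real GPU probe; assume the worst by default,
    # matching the "software-rendered until told otherwise" idea.
    return False

def choose_settings(user_says_capable=False):
    # Settings are chosen through the 3D interface itself; here we just
    # model the outcome of that choice as a flag.
    settings = dict(DEFAULT_SETTINGS)
    if user_says_capable or detect_capable_card():
        settings.update(renderer="hardware", level_of_detail="high",
                        shadows=True, particles=True)
    return settings

def restart_window(settings):
    # The one OS-specific call the design allows: tear down and
    # recreate the program window with the new settings.
    return "window restarted with %s renderer" % settings["renderer"]

print(restart_window(choose_settings(user_says_capable=True)))
```

The design choice this models is that everything else (menus, chat, username entry) lives inside the rendered workspace, so the host OS is touched only at window creation time.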

IMHO.

-Steve

_______________________________________________
vos-d mailing list
vos-d@interreality.org
http://www.interreality.org/cgi-bin/mailman/listinfo/vos-d
