The software could run in a CAVE environment or in a multi-projection
system.
The user would control one Jmol instance with the mouse, and the other
instances would stay synchronized with the first. For now, don't worry
about the synchronization; it is already working!
The code changes were made in Jmol's Viewer.java.
Do these transformation matrices already exist, or do they have to be
created? Where?
As explained previously, the problem is not limited to displaying a
molecule from a different angle. One projection complements the others;
i.e., a molecule can have one part displayed in the left projection,
nothing in the central one, and the final part on the right. A single atom
can't appear in two projections simultaneously.
I'll try to explain it another way:
Imagine a water molecule. Initially I see all three atoms in the central
projection and nothing in the lateral projections. As I zoom in, one
hydrogen atom is displayed in the left projection, the other in the right,
and the oxygen in the central one. If I keep zooming, the oxygen will
occupy the entire central projection and begin to invade the lateral
projections (the hydrogens are already behind the user and therefore are
not displayed).
Am I clear?
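The geometry above can be sketched numerically. This is a minimal illustration only, not Jmol code: a hypothetical `wallFor` helper that classifies which projection an atom falls in from its horizontal angle relative to the observer, assuming the observer looks down -z and the front wall spans ±45°:

```java
// Sketch (assumption, not Jmol code): classify which CAVE wall an atom
// projects onto, from its horizontal angle relative to the observer.
public class WallClassifier {
    // Observer at the origin looking toward -z; x is right.
    // Front wall covers azimuths within ±45°, the side walls the adjacent
    // 90° sectors; anything behind the user is not drawn.
    static String wallFor(double x, double z) {
        double azimuth = Math.toDegrees(Math.atan2(x, -z)); // 0° = straight ahead
        if (azimuth > -45 && azimuth <= 45) return "front";
        if (azimuth > 45 && azimuth <= 135) return "right";
        if (azimuth <= -45 && azimuth > -135) return "left";
        return "behind"; // culled, not displayed
    }

    public static void main(String[] args) {
        // Water molecule far in front: all three atoms on the front wall.
        System.out.println(wallFor(0.0, -10.0));  // oxygen, straight ahead -> front
        System.out.println(wallFor(0.8, -10.0));  // hydrogen, small angle -> front
        // Observer moves toward the molecule: the hydrogens now subtend
        // large angles and migrate onto the side walls.
        System.out.println(wallFor(0.8, -0.2));   // -> right
        System.out.println(wallFor(-0.8, -0.2));  // -> left
        // Keep moving: atoms behind the observer are culled.
        System.out.println(wallFor(0.1, 0.5));    // -> behind
    }
}
```

The point of the sketch: which wall an atom lands on is purely a function of its direction from the observer, which is exactly what a per-wall view transform encodes.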
Are the transformation matrices the solution to this problem? How?
[]s
Moacyr
From: Robert Hanson [mailto:[email protected]]
Sent: Friday, July 23, 2010 00:22
To: [email protected]
Subject: Re: [Jmol-developers] RES: RES: Virtual Reality
Well, I think it's a simple matter. It would be just like stereo: we create
two buffers with a rotation between them. In this case you need three
buffers and a 4x4 transformation matrix, not just a 3x3 rotation. It should
be easy to implement. What I wasn't sure about was how you wanted to
deliver it.
Is this a real-time virtual reality cube/cave? Or is it something else?
How does the user's position get fed into the system?
What technical issues are you running into?
How much have you tweaked Jmol already?
more comments....
On Thu, Jul 22, 2010 at 9:52 PM, Moacyr Francischetti Corrêa
<[email protected]> wrote:
Moacyr:
I read http://chemapps.stolaf.edu/jmol/docs/misc/navigation.pdf and
concluded that there is no control of the camera position. Am I right?
Please tell me in which part of the code I could make changes to reposition
the camera (the observer's position) relative to the molecule. Would it be
in TransformManager? Or in TransformManager10? What are the variables
involved?
Do nothing with TransformManager10; it's history. Make sure you are using
Jmol 12.0 and either work in TransformManager11 or, better, perhaps overlay
that with a small TransformManager12 that includes your options.
You will basically want an enhanced navigation mode. It gives you the
realistic "in place" walk-through perspective you are looking for.
Camera positions -- this is, of course, just an illusion. I've recently
added a more standard camera-parameter calculation to TransformManager,
which I needed for the U3D business -- getCameraFactors. The parameters are
pretty standard: the camera position is basically just a position, a
distance from a reference point, and a quaternion (or 3x3 matrix) that
describes the orientation. Jmol calculates all of these and computes from
them a 4x4 transformation matrix that takes you from Cartesian to screen
coordinates.
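A minimal sketch of that decomposition (with assumed conventions, not the actual getCameraFactors code): a pose given as a position plus a unit quaternion, expanded into the 4x4 matrix that carries Cartesian model coordinates into eye coordinates:

```java
// Sketch (assumed conventions, not Jmol's getCameraFactors): build the 4x4
// view matrix from a camera position and a unit quaternion orientation.
public class CameraMatrix {
    // 3x3 rotation matrix from a unit quaternion (w, x, y, z).
    static double[][] quatToMatrix(double w, double x, double y, double z) {
        return new double[][] {
            {1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)},
            {2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)},
            {2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)}
        };
    }

    // View matrix M = [R^T | -R^T p]: rotate by the inverse orientation,
    // then translate so the camera position maps to the eye-space origin.
    static double[][] viewMatrix(double[] pos, double w, double x, double y, double z) {
        double[][] r = quatToMatrix(w, x, y, z);
        double[][] m = new double[4][4];
        for (int i = 0; i < 3; i++) {
            for (int j = 0; j < 3; j++) m[i][j] = r[j][i];        // R^T
            for (int j = 0; j < 3; j++) m[i][3] -= r[j][i] * pos[j];
        }
        m[3][3] = 1;
        return m;
    }

    // Apply the affine part of a 4x4 matrix to a 3D point.
    static double[] apply(double[][] m, double[] p) {
        double[] out = new double[3];
        for (int i = 0; i < 3; i++)
            out[i] = m[i][0]*p[0] + m[i][1]*p[1] + m[i][2]*p[2] + m[i][3];
        return out;
    }

    public static void main(String[] args) {
        // Identity orientation, camera at (0, 0, 10): a point at the model
        // origin ends up 10 units along -z in eye coordinates.
        double[][] m = viewMatrix(new double[]{0, 0, 10}, 1, 0, 0, 0);
        double[] eye = apply(m, new double[]{0, 0, 0});
        System.out.println(eye[0] + " " + eye[1] + " " + eye[2]); // prints 0.0 0.0 -10.0
    }
}
```

A perspective projection applied after this view matrix would complete the Cartesian-to-screen path described above.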
I think your process would just mimic what is being done in Viewer for
stereo. Take a look at how those images are created. Basically that's just
-- render the first image
-- rotate
-- render the second image
I think you will just do -- or already have done --
-- render the front view image (same as current)
-- rotate (navigate) 90 Y
-- render the left image
-- rotate (navigate) 180 Y
-- render the right image
Very simple. Does that sound about right?
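That render sequence can be checked with plain rotation matrices (a sketch, not Jmol's renderer): +90° about Y from the front view gives the left wall, and a further 180° lands on the right wall, the same as a single -90° turn:

```java
// Sketch: the three-wall pass as successive Y rotations of the model view,
// mimicking the stereo double-render.
public class ThreeViews {
    // Rotation about the Y axis by the given angle in degrees.
    static double[][] rotY(double deg) {
        double a = Math.toRadians(deg), c = Math.cos(a), s = Math.sin(a);
        return new double[][] {{c, 0, s}, {0, 1, 0}, {-s, 0, c}};
    }

    // 3x3 matrix product a * b.
    static double[][] mul(double[][] a, double[][] b) {
        double[][] m = new double[3][3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                for (int k = 0; k < 3; k++) m[i][j] += a[i][k] * b[k][j];
        return m;
    }

    public static void main(String[] args) {
        double[][] front = rotY(0);               // render the front image here
        double[][] left  = mul(rotY(90), front);  // rotate (navigate) 90 Y, render left
        double[][] right = mul(rotY(180), left);  // rotate (navigate) 180 Y, render right
        // The right view equals a single -90° turn from the front view:
        double[][] expected = rotY(-90);
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                if (Math.abs(right[i][j] - expected[i][j]) > 1e-9)
                    throw new AssertionError("mismatch at " + i + "," + j);
        System.out.println("right view = front view rotated -90 about Y");
    }
}
```

In other words, the two incremental rotations reproduce the left and right wall orientations exactly; only the order of rendering changes relative to the stereo case.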
Where do the images go to?
Bob
From: Robert Hanson [mailto:[email protected]]
Sent: Thursday, July 22, 2010 19:57
To: [email protected]
Subject: Re: [Jmol-developers] RES: Virtual Reality
Can we start this over? I lost the sense of the thread. What exactly would
you like to be able to do? Suggest some command options.
On Thu, Jul 22, 2010 at 12:18 PM, Moacyr Francischetti Corrêa
<[email protected]> wrote:
Bob,
I read the pdf you indicated and concluded that there is
no control of the camera position. Am I right?
Please tell me in which part of the code I could make changes to reposition
the camera (the observer's position) relative to the molecule.
What are the variables involved?
Moacyr
From: Robert Hanson [mailto:[email protected]]
Sent: Thursday, March 18, 2010 11:21
To: [email protected]
Subject: Re: [Jmol-developers] Virtual Reality
Moacyr,
By the way, the way we would do this, I think, is just the same as we do
stereo: re-render x times, capture the screen image each time, then put
those together for delivery. Be aware that Java has some memory-size
limitations that could put a cap on buffer size. What sort of screen pixel
counts are we talking about here?
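For a rough sense of scale (the resolution here is an assumed example, not a figure from the thread): five 1024x1024 walls with 32-bit ARGB pixels come to about 20 MB of raw image buffers before any other allocations:

```java
// Back-of-the-envelope check for the buffer-size caution above,
// using an assumed per-wall resolution.
public class BufferMemory {
    // Total raw buffer size in megabytes.
    static long megabytes(long width, long height, long bytesPerPixel, long walls) {
        return width * height * bytesPerPixel * walls / (1024 * 1024);
    }

    public static void main(String[] args) {
        // Five full walls at 1024x1024, 4 bytes (ARGB) per pixel.
        System.out.println(megabytes(1024, 1024, 4, 5) + " MB"); // prints 20 MB
    }
}
```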
Bob
On Tue, Mar 9, 2010 at 12:40 PM, Moacyr Francischetti Corrêa
<[email protected]> wrote:
Is it possible to run Jmol in a virtual-reality environment, such as a CAVE?
It would require the software to generate five different views of the
molecule, one for each wall of the CAVE.
Any suggestions?
Moacyr
------------------------------------------------------------------------------
Download Intel® Parallel Studio Eval
Try the new software tools for yourself. Speed compiling, find bugs
proactively, and fine-tune applications for parallel performance.
See why Intel Parallel Studio got high marks during beta.
http://p.sf.net/sfu/intel-sw-dev
_______________________________________________
Jmol-developers mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/jmol-developers
--
Robert M. Hanson
Professor of Chemistry
St. Olaf College
1520 St. Olaf Ave.
Northfield, MN 55057
http://www.stolaf.edu/people/hansonr
phone: 507-786-3107
If nature does not answer first what we want,
it is better to take what answer we get.
-- Josiah Willard Gibbs, Lecture XXX, Monday, February 5, 1900