David, that's really odd. I see my reply above 
yours on the blog view and it looks fine. Super weird.

This is my original message:

Henk,

Unfortunately, I don't have great answers to your
questions; however, I am quite interested in the
workflow you describe.

I've been tasked with a similar project, providing
a B-Format mix for a 360 animation. How were you
able to sync the head tracking and play state
between Kolor Eyes and Reaper?

I can get playback to sync via a simple audio
timecode track, but I'm having trouble getting
head tracking working via the Oculus Rift DK2.
I've considered using the DIY head-tracker that
Matthias Kronlachner uses for his OSC pd patch here:

http://www.matthiaskronlachner.com/?p=2091

but I'd rather just use the DK2 if possible.
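
For what it's worth, this is the general shape of what I have in
mind on the receiving end: a minimal Python sketch (using the
python-osc package) that listens for yaw/pitch/roll from a
head-tracker over OSC. The /head/ypr address and the port are just
placeholders; I don't know the exact messages Matthias's patch
actually sends, so they would need to be adjusted.

    # Minimal sketch: receive yaw/pitch/roll over OSC from a head-tracker.
    # NOTE: /head/ypr is a placeholder address and the handler assumes the
    # message carries three floats -- adjust to match what the tracker sends.
    # Requires the python-osc package (pip install python-osc).
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    def on_head(address, yaw, pitch, roll):
        # Here the angles would be forwarded to whatever rotates the B-Format.
        print(f"yaw={yaw:.1f} pitch={pitch:.1f} roll={roll:.1f}")

    dispatcher = Dispatcher()
    dispatcher.map("/head/ypr", on_head)

    server = BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher)
    server.serve_forever()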

HID in Pure Data recognizes the Oculus as a device,
but I'm unable to access any data from it in pd
at this time.

I understand if you are reluctant to share your
workflow solutions, but I've been banging my head
against this for a while now. ;) I'm a sound
designer and audio mixer, not a programmer, and
any help would be much appreciated.

Thanks!

Evan

EDIT:

Actually, since writing this, I noticed the UDP control
function in Kolor Eyes, which can send camera position
and video playback position via UDP commands in JSON
format. I assume this is how Henk is controlling
Reaper's playback and the B-Format rotation. My question
would still be how to get that data into Reaper.
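
In case it helps clarify what I'm asking, here is a rough sketch of
the kind of bridge I'm imagining: a script that listens for the
Kolor Eyes UDP/JSON messages and re-sends the values to Reaper as
OSC (Reaper can be set up to receive OSC as a control surface). I
haven't confirmed the actual JSON field names Kolor Eyes uses, and
the OSC addresses below depend entirely on how Reaper's OSC pattern
config is set up, so treat all of those as placeholders (Python,
using the python-osc package):

    # Rough sketch of a Kolor Eyes UDP/JSON -> Reaper OSC bridge.
    # The JSON field names ("yaw", "position") are guesses, and the OSC
    # addresses are placeholders -- map them to whatever the Reaper OSC
    # pattern config / rotator plugin expects.
    import json
    import socket
    from pythonosc.udp_client import SimpleUDPClient

    KOLOR_EYES_PORT = 8000            # port Kolor Eyes is told to send UDP to
    REAPER_OSC = ("127.0.0.1", 8001)  # port Reaper's OSC control surface listens on

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", KOLOR_EYES_PORT))
    osc = SimpleUDPClient(*REAPER_OSC)

    while True:
        data, _ = sock.recvfrom(4096)
        try:
            msg = json.loads(data.decode("utf-8"))
        except ValueError:
            continue  # ignore anything that isn't valid JSON
        if "yaw" in msg:
            # e.g. drive a B-Format rotator plugin parameter
            osc.send_message("/head/yaw", float(msg["yaw"]))
        if "position" in msg:
            # e.g. chase the video playback position in Reaper
            osc.send_message("/video/position", float(msg["position"]))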

-Evan



