Over the last year I have been interested in listening to TetraMic FOA 
recordings over headphones with head tracking, and in that context I have been 
thinking about listening to other recordings with head tracking, so I want to comment.

> To me this demo is really cool since the auditory objects are nicely 
> externalized, even in the field of vision.
I think the formulation should be: the auditory objects are in the field of 
vision and therefore nicely externalized; in addition, the soundscape stays 
fixed when the head direction changes.

Any chance of you sharing the Demo 3 files, and a way of viewing/listening to 
the demo for those of us who do not have the opportunity to attend the AES 
conferences or travel to Helsingfors/Helsinki?
Have you considered releasing a demo version of your DirAC software for head 
tracking?
  
I do not have an Oculus Rift, but I, and a few others on this list, can listen 
to head-tracked Ambisonics, and the video part could possibly be viewed in 
web-based VR viewers.

Just out of curiosity, have you joined the video and sound into an MKV or Ogg 
container?


Bo-Erik

-----Original Message-----
From: Sursound [mailto:sursound-boun...@music.vt.edu] On Behalf Of Pulkki Ville
Sent: den 17 november 2014 08:33
To: sursound@music.vt.edu
Subject: Re: [Sursound] Oculus Rift Visual Demo + Ambisonic Audio Available?


Hello, 

Sampo mentioned that he heard our demo at Aalto. Here are the title and 
abstract of the demo, which we first showed at the AES 55th International 
Conference on spatial audio. 


Demo 3: Head-mounted head-tracked audiovisual reproduction.

Olli Santala, Mikko-Ville Laitinen, Ville Pulkki and Olli Rummukainen.
Aalto University, Department of Signal Processing and Acoustics

Audiovisual scenes are reproduced with headphones and a head-mounted display in 
this demonstration. The sound has been recorded with a real A-format 
microphone, and it is reproduced using binaural DirAC, which utilizes DirAC 
processing, virtual loudspeakers and head tracking. The video has been recorded 
with the Ladybug3 camera, and it is displayed using the Oculus Rift. The 
listeners are allowed to turn their heads in all directions. The auditory and 
the visual objects should match and be stable in the world coordinate system. 
Moreover, the reverberation should be perceived to be natural. The reproduced 
scenes include music, traffic, and sports recorded both indoors and outdoors.
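To make the "virtual loudspeakers and head tracking" part of the abstract more concrete, here is a minimal sketch: counter-rotate the first-order B-format field by the tracked head yaw, then decode to a ring of virtual loudspeakers (each feed would subsequently be convolved with the HRTF pair for its direction). This is not the actual binaural DirAC implementation, which additionally performs parametric direction/diffuseness processing per time-frequency tile; the axis convention and the basic decode gains below are illustrative assumptions.

```python
import numpy as np

def rotate_foa_yaw(w, x, y, z, yaw_rad):
    """Counter-rotate one first-order B-format sample by the head's yaw.
    Assumed convention: x points forward, y points left, positive yaw = head
    turning left. W and Z are invariant under a pure yaw rotation."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    x_r = c * x + s * y
    y_r = -s * x + c * y
    return w, x_r, y_r, z

def decode_to_virtual_speakers(w, x, y, z, azimuths_rad):
    """Basic (non-parametric) first-order decode to horizontal virtual
    loudspeakers at the given azimuths. Each returned feed is the signal
    that would drive one virtual speaker before HRTF convolution."""
    return [0.5 * (w + x * np.cos(az) + y * np.sin(az)) for az in azimuths_rad]

# Example: a plane wave from straight ahead, head turned 90 degrees left.
# After counter-rotation the source should land on the right-hand speaker.
w, x, y, z = 1.0, 1.0, 0.0, 0.0                       # assumed front encoding
w, x, y, z = rotate_foa_yaw(w, x, y, z, np.pi / 2)
feeds = decode_to_virtual_speakers(w, x, y, z,
                                   [0.0, np.pi / 2, np.pi, -np.pi / 2])
# the feed at azimuth -pi/2 (the listener's right) now dominates
```

In a real renderer this rotation and decode would run per audio block, with the yaw value refreshed from the head tracker.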



See description of the audio rendering technique here:
Laitinen, M.-V., and V. Pulkki, "Binaural reproduction for directional audio 
coding," in Proc. IEEE Workshop on Applications of Signal Processing to Audio 
and Acoustics (WASPAA), 2009.



To me this demo is really cool since the auditory objects are nicely 
externalized, even in the field of vision. The trick could be that when the 
subject perceives the space visually, he quickly adapts to the HRTFs used in 
the system. We also update the head position at a rate of about 100 Hz, and 
correspondingly update the video and audio. This prevents nausea, and also 
helps with the externalization of headphone audio.
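The ~100 Hz update described above can be pictured as a fixed-rate polling loop: read the latest head orientation, push it to the audio and video renderers, then sleep off the remainder of the period. The sketch below is only a schematic of that timing structure, not the demo's actual code; `read_head_yaw` and `apply_rotation` are placeholder callbacks standing in for the real tracker and renderers.

```python
import time

UPDATE_RATE_HZ = 100              # roughly the rate mentioned in the post
PERIOD = 1.0 / UPDATE_RATE_HZ

def run_tracking_loop(read_head_yaw, apply_rotation, n_updates):
    """Poll the head tracker at a fixed rate and forward the latest yaw to
    the renderers. Keeping this loop fast and regular is what limits the
    motion-to-sound latency."""
    for _ in range(n_updates):
        start = time.perf_counter()
        apply_rotation(read_head_yaw())          # update audio/video pose
        elapsed = time.perf_counter() - start
        if elapsed < PERIOD:                     # hold ~100 Hz
            time.sleep(PERIOD - elapsed)
```

A lower update rate, or a jittery one, would make the rendered scene lag the head motion, which is exactly what tends to cause the nausea and weakened externalization Ville mentions.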


-Ville



_______________________________________________
Sursound mailing list
Sursound@music.vt.edu
https://mail.music.vt.edu/mailman/listinfo/sursound - unsubscribe here, edit 
account or options, view archives and so on.