Hi, and thank you for your reply. Your approach is basically the same as mine, with a few differences. Because the Android *MediaRecorder* is not very flexible, I decided to use OpenCV (more specifically JavaCV, which comes bundled with ffmpeg) for frame recording. Keeping all the sensor data in memory is not an option for me, because I have to deal with a much higher sampling rate than 1/sec, which produces far more data over time. I also take into account the *delay* between hitting the *start recording* button and the timestamp at which recording actually begins.
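For what it's worth, the delay compensation I mean could be sketched roughly like this. This is a minimal, hypothetical example: the class name and the numbers are made up, and in real code the first-frame time would come from whenever the JavaCV recorder actually writes its first frame, not from a constant.

```java
// Hypothetical sketch: map sensor-sample wall-clock timestamps onto the
// video timeline by anchoring them to the moment the first frame is
// actually written, not the moment the record button is pressed.
public class SensorVideoClock {

    // Wall-clock time (ms) at which the first video frame hit the file.
    private final long firstFrameWallClockMs;

    public SensorVideoClock(long firstFrameWallClockMs) {
        this.firstFrameWallClockMs = firstFrameWallClockMs;
    }

    // Position (ms) in the recorded video that a sensor sample taken at
    // sampleWallClockMs corresponds to. A negative result means the sample
    // arrived before any frame was written.
    public long toVideoPositionMs(long sampleWallClockMs) {
        return sampleWallClockMs - firstFrameWallClockMs;
    }

    public static void main(String[] args) {
        // Button pressed earlier; first frame actually written at t = 3000 ms.
        SensorVideoClock clock = new SensorVideoClock(3000);
        System.out.println(clock.toVideoPositionMs(3010)); // prints 10
    }
}
```

Each sensor sample is then logged to file together with its video-relative timestamp, so nothing needs to be kept in memory.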
"There is obviously more advanced ways of doing this" was exactly my question: I think your/our approach works well when the sampling rate isn't that high, but with higher sampling rates I am afraid it will become too imprecise.

On Thursday, November 20, 2014 8:50:24 AM UTC+1, gjs wrote:
>
> Hi,
>
> I do the same thing in one of my apps: record video and at the same time
> record sensor data to file. I start the sensor logging to file at the same
> time I start the video recording; when the video recording is finished I
> stop the sensor logging. I am only logging sensor data at 1-second
> intervals.
>
> For playback, I load the sensor data from file into memory and overlay the
> sensor data in a view on top of the video playback view, updating the
> sensor data in the view (asynchronously) once each second.
>
> I generally find that there is a delay of about 2 seconds from the time
> video recording is started to the time when video frames begin to get
> written to file (record a video of a clock to work this timing out for
> yourself). I then compensate for this delay during playback by advancing
> the sensor data playback position accordingly. I also allow the user to
> seek during playback by querying the current video playback (time) position
> and then also 'seek' the sensor data similarly, so that they remain in 'sync'.
>
> It's crude, but it suits my purposes. There are obviously more advanced ways
> of doing this, such as using FFMPEG to burn your sensor data into the video
> frames as an overlay, but I have not gone that far as yet.
>
> Regards
>
> On Wednesday, November 19, 2014 5:52:41 PM UTC+11, crem wrote:
>>
>> Hi,
>>
>> In my application I need to capture/record video and simultaneously log
>> sensor data, which I receive via Bluetooth (about every 10 ms), to a file.
>> This works fine.
>> Now I need to "*synchronize*" the video data and the received sensor
>> data, i.e.
>> I want to link the sensor data, with a millisecond timestamp, to the
>> captured video, so that I know (more or less exactly) what sensor data
>> belongs to what time in the video.
>> In the best case I would have a connection VideoFrame <--> sensor data, but I
>> know that this is hard to achieve, since the frame rate of the video is
>> not constant, etc.
>>
>> Can you give some hints on how I can implement this?
>> Right now I am using a simple approach where I set a *timestamp (t1)* right
>> after I start the recording of the video and before I start to receive
>> sensor data, and a *timestamp (t2)* when the first sensor data arrives.
>> Then I calculate *t = t2 - t1* to know the time in the recorded video.
>>
>> Are there any more advanced approaches you can think of?
>>
>> Kind regards.

--
You received this message because you are subscribed to the Google Groups "Android Developers" group.
To post to this group, send email to android-developers@googlegroups.com
To unsubscribe from this group, send email to android-developers+unsubscr...@googlegroups.com
For more options, visit this group at http://groups.google.com/group/android-developers?hl=en
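As an illustration of the seek/sync idea discussed in the thread, here is a minimal, hypothetical Java sketch (the class name and data are my own, not from any poster's code): sensor timestamps are stored relative to the video timeline, and on a user seek we binary-search for the sample at or before the current playback position.

```java
import java.util.Arrays;

// Hypothetical sketch: given sensor-sample timestamps already converted to
// video-relative milliseconds (e.g. via t = t2 - t1 plus the recording-start
// delay), find the sample matching an arbitrary playback position, so a
// seek in the video can be mirrored by a 'seek' in the sensor log.
public class SensorSeek {

    // Sorted, video-relative timestamps (ms) of the logged sensor samples.
    private final long[] sampleTimesMs;

    public SensorSeek(long[] sampleTimesMs) {
        this.sampleTimesMs = sampleTimesMs;
    }

    // Index of the last sample at or before videoPositionMs, or -1 if the
    // position lies before the first sample.
    public int indexAt(long videoPositionMs) {
        int i = Arrays.binarySearch(sampleTimesMs, videoPositionMs);
        // Exact hit: use it. Otherwise binarySearch returns
        // -(insertionPoint) - 1, so the preceding sample is insertionPoint - 1.
        return i >= 0 ? i : -i - 2;
    }

    public static void main(String[] args) {
        SensorSeek seek = new SensorSeek(new long[] {0, 10, 20, 30});
        System.out.println(seek.indexAt(25)); // prints 2 (sample at 20 ms)
    }
}
```

The same lookup works for the asynchronous overlay update during normal playback: poll the player's current position and render the sample at the returned index.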