Hello, everyone:

I am working with the Parrot AR.Drone 2.0. I don't know if anyone is familiar with that device, but to make a long story short, it transmits H.264-encoded video from an on-board camera over TCP. I am trying to write a program that continuously receives the TCP data from the drone and decodes it in order to get the images sent from the camera.

Could anyone point me in the right direction toward accomplishing this goal? I tried to do it based on code like what can be found here http://libav.org/doxygen/master/libavcodec_2api-example_8c-example.html or here http://dranger.com/ffmpeg/tutorial01.html. However, I am getting nowhere. I guess getting frame data from network packets is not like getting it from an actual video file.

I am a complete novice when it comes to video, codecs, and streaming video over a network. If someone could help me figure out where to even begin I would be so appreciative.

Thanks!
_______________________________________________
Libav-user mailing list
Libav-user@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/libav-user
