Cory Papenfuss wrote:

TV-out "solutions" are a wildcard with any card. Most are poor, and how they operate internally is anyone's guess.

I just recently found out that the TV-out of SiS chipsets (which many, many HTPC barebones use) is not capable of outputting interlaced material. Quite an important thing if you want a perfect picture, I'd say. I am trying to get the modeline right

... with a homebrew circuit to modulate RGB from the VGA, at the proper frequency, into NTSC Y/C and composite. It is *NOT* a rate/scanline
converter, which is what almost all TV-out cards use. If you use that modeline on a VGA monitor, the monitor won't like it.


I'm not sure what you mean by this.  What interlacing?  All I want
mythtv to do is record a signal (interlaced, since it's standard NTSC)
and play it back exactly as it would have been sent to the TV directly;
that means recorded interlaced and played back interlaced.  Now that
may (and most likely does) mean packing the two interlaced fields into a
frame, but the video card should still display the fields interlaced,
one after the other.

Which it can, but unless you are using production-quality hardware with video genlocking, what you record will not be exactly synced with what you play back.

I am trying to get the modeline right for my TV set so that I can start using my homebrew VGA-to-RGB converter. I made it because the CRT-1 output (i.e. the VGA connector) IS capable of outputting interlaced material. It seems that all you nVidia owners forget that your vsync is controlled by OpenGL, which is not supported on every video card in Linux.
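As a sanity check while tuning such a modeline, it helps to verify that the candidate timings actually land on NTSC's ~15.734 kHz line rate and ~59.94 Hz field rate. A minimal sketch (the numbers below are the standard 720x480i/BT.601 timings, not necessarily the exact modeline the poster ends up with):

```python
# Sanity-check a candidate interlaced NTSC modeline before feeding it to a TV.
# Timings here are the standard 720x480i (BT.601) numbers; a real set may need tweaks.

def modeline_rates(dotclock_mhz, htotal, vtotal, interlaced=True):
    """Return (hsync_khz, field_rate_hz) for an X11-style modeline."""
    hsync_khz = dotclock_mhz * 1000.0 / htotal   # line (horizontal) rate
    frame_rate = hsync_khz * 1000.0 / vtotal     # full-frame rate
    # With "interlace" set, each frame is two fields, so the field rate doubles.
    field_rate = frame_rate * 2 if interlaced else frame_rate
    return hsync_khz, field_rate

# Modeline "720x480i"  13.5  720 739 801 858  480 488 494 525  interlace
hsync, vrefresh = modeline_rates(13.5, 858, 525, interlaced=True)
print(f"hsync = {hsync:.3f} kHz, field rate = {vrefresh:.2f} Hz")
# An NTSC TV wants roughly 15.734 kHz horizontal and 59.94 Hz fields.
```

If the computed line rate strays far from 15.734 kHz, a real TV (unlike a multisync VGA monitor) simply cannot lock onto the signal.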


Oh, and deinterlacing with bob (bobdeint) sucks donkey balls without vsync.

_______________________________________________
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users
