Re: [mythtv-users] liveTV using 100% CPU

2006-01-08 Thread Brian Wood


Brian Wood
[EMAIL PROTECTED]

Not sure if this is related, but:

Are you perchance using an nVidia card with the nVidia drivers? I  
have the same CPU usage problem, and I've read that the latest nVidia  
binary drivers cause it. Of course, trying to regress back  
to an older driver results in a "won't build" problem here on a  
Gentoo system, but I am still working on it.


On Jan 8, 2006, at 9:01 AM, Marcel Janssen wrote:


Hi,

I upgraded my system via ATrpms to the latest MythTV RPMs.

Unfortunately, the CPU usage now goes up to 100% when I watch live TV.
I have the XV controls switched on.

When I select output via XvMC, the CPU usage becomes 50% (still way
too high) and the sound has lots of dropouts.

Does anyone know how I can fix this, or which options affect this
performance?

Before the upgrade everything was perfect and live TV hardly used any
CPU. I'm using ivtv with a PVR-350.



___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] liveTV using 100% CPU

2006-01-08 Thread Marcel Janssen
Hello Brian,

 Are you perchance using an nVidia card with the nVidia drivers? I
 have the same CPU usage problem, and I've read that the latest nVidia
 binary drivers cause it. Of course, trying to regress back
 to an older driver results in a "won't build" problem here on a
 Gentoo system, but I am still working on it.

In that case I think the problem is within MythTV itself, because other 
applications (like xine) that use XV have no issues.
Anyway, I will try booting an older kernel and older nVidia drivers just to 
see if that works. If so, I'll simply stick with that for now.

Thanks,
Marcel


Re: [mythtv-users] liveTV using 100% CPU

2006-01-08 Thread Marcel Janssen
On Sunday 08 January 2006 19:02, Marcel Janssen wrote:
 Hello Brian,

  Are you perchance using an nVidia card with the nVidia drivers? I
  have the same CPU usage problem, and I've read that the latest nVidia
  binary drivers cause it. Of course, trying to regress back
  to an older driver results in a "won't build" problem here on a
  Gentoo system, but I am still working on it.

 In that case I think the problem is within MythTV itself, because other
 applications (like xine) that use XV have no issues.
 Anyway, I will try booting an older kernel and older nVidia drivers just
 to see if that works. If so, I'll simply stick with that for now.

I tried the old kernel + nVidia 7676 driver and the 100% CPU issue is gone.
I'm just not sure whether it's the nVidia driver that causes it or the kernel 
(a different audio driver, perhaps, because I've seen some audio-buffer-related 
warnings in mythfrontend earlier).

Thanks for the hint.

regards,
Marcel


Re: [mythtv-users] liveTV using 100% CPU

2006-01-08 Thread Michael T. Dean

Marcel Janssen wrote:


 Hello Brian,

  Are you perchance using an nVidia card with the nVidia drivers? I
  have the same CPU usage problem, and I've read that the latest nVidia
  binary drivers cause it. Of course, trying to regress back
  to an older driver results in a "won't build" problem here on a
  Gentoo system, but I am still working on it.

 In that case I think the problem is within MythTV itself, because other
 applications (like xine) that use XV have no issues.

Are you using OpenGL vsync in xine?  I doubt it...

Mike


Re: [mythtv-users] liveTV using 100% CPU

2006-01-08 Thread Brian Wood
Is there perhaps some way to prevent MythTV from using this  
feature? This is in fact what seems to be firewalling (in the  
automobile sense) the CPU.


I have issues that are preventing me from easily regressing to an  
older nVidia driver, and it would be nice to get the load down if I can.



On Jan 8, 2006, at 1:16 PM, Michael T. Dean wrote:


Marcel Janssen wrote:



Are you using OpenGL vsync in xine?  I doubt it...

Mike


Re: [mythtv-users] liveTV using 100% CPU

2006-01-08 Thread Michael T. Dean

Brian Wood wrote:


On Jan 8, 2006, at 1:16 PM, Michael T. Dean wrote:


Are you using OpenGL vsync in xine?  I doubt it...


Is there perhaps some way to prevent MythTV from using this  
feature? This is in fact what seems to be firewalling (in the  
automobile sense) the CPU.



I have issues that are preventing me from easily regressing to an  
older nVidia driver, and it would be nice to get the load down if I can.
 


In SVN's frontend settings (Playback Settings, General playback (page 1)):

Enable OpenGL vertical sync for timing
If it is supported by your hardware/drivers, MythTV will use OpenGL 
vertical syncing for video timing, reducing frame jitter.


In MythTV 0.18 and 0.18.1, you have to recompile Myth without OpenGL 
vsync.  In MythTV 0.18-fixes SVN, you have the same option as SVN.  
(Note that if you have one you think is 0.18.2, it's probably 0.18-fixes 
SVN, so it probably has the option.)


Mike


Re: [mythtv-users] liveTV using 100% CPU

2006-01-08 Thread Brian Wood

Hmm

I'm running 0.18.1 on a Gentoo system, and compiling outside the  
portage system usually leads to trouble. I'm not sure which USE  
flag enabled vsync; the nvidia one probably did it.


I'm not sure which is going to be more of a pain, re-compiling MythTV  
or getting an older version of nVidia's drivers to work, seems the  
kernel API has changed and broken older video drivers.


I do not understand all this mucking around with a production  
kernel. These guys who insist it must be up to date are the same  
ones who keep checking for the fdiv bug, even on Xeon and Opteron  
CPUs :-)




In SVN's frontend settings (Playback Settings, General playback  
(page 1)):


Enable OpenGL vertical sync for timing
If it is supported by your hardware/drivers, MythTV will use OpenGL  
vertical syncing for video timing, reducing frame jitter.


In MythTV 0.18 and 0.18.1, you have to recompile Myth without  
OpenGL vsync.  In MythTV 0.18-fixes SVN, you have the same option  
as SVN.  (Note that if you have one you think is 0.18.2, it's  
probably 0.18-fixes SVN, so it probably has the option.)




Re: [mythtv-users] liveTV using 100% CPU

2006-01-08 Thread Michael T. Dean

Brian Wood wrote:

In SVN's frontend settings (Playback Settings, General playback  
(page 1)):


Enable OpenGL vertical sync for timing
If it is supported by your hardware/drivers, MythTV will use OpenGL  
vertical syncing for video timing, reducing frame jitter.


In MythTV 0.18 and 0.18.1, you have to recompile Myth without  OpenGL 
vsync.  In MythTV 0.18-fixes SVN, you have the same option  as SVN.  
(Note that if you have one you think is 0.18.2, it's  probably 
0.18-fixes SVN, so it probably has the option.)



Hmm

I'm running 0.18.1 on a Gentoo system, and compiling outside the  
portage system usually leads to trouble. I'm not sure what USE  
flag indicated the use of vsync, nVidia probably did it.


I'm not sure which is going to be more of a pain, re-compiling MythTV  
or getting an older version of nVidia's drivers to work, seems the  
kernel API has changed and broken older video drivers.
 

What about using 0.18-fixes SVN?  Don't know Gentoo, but 0.18-fixes SVN 
is very close to 0.18.1, so I'd guess other than getting different 
source code, everything would be identical.  And, with the -fixes 
branch, you can enable OpenGL vsync for the build and disable it at 
runtime with the setting.
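For anyone wanting to try this, the checkout-and-build sequence would look roughly like the sketch below. The branch path is my best guess at where the 0.18-fixes branch lives in the project's Subversion repository, so verify it first; the configure/make steps are the usual ones for building Myth from source:

```shell
# Branch path is an assumption -- list the branches first to confirm:
#   svn ls http://svn.mythtv.org/svn/branches/
svn checkout http://svn.mythtv.org/svn/branches/release-0-18-fixes/mythtv mythtv-0.18-fixes
cd mythtv-0.18-fixes

# Build with OpenGL vsync compiled in; with -fixes you can then turn it
# off at runtime via the frontend's Playback Settings instead of rebuilding.
./configure
make
make install
```

On Gentoo this of course bypasses portage, so it carries the usual risk of stepping on the package manager's toes.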


Mike


Re: [mythtv-users] liveTV using 100% CPU

2006-01-08 Thread Brian Wood
Probably a good idea, I think I'll see what the Gentoo folks say, but  
my options are to either build it outside Gentoo's  portage system  
or hack that system to build something other than what it thinks I  
should. Both of these actions have got me into serious trouble in the  
past.


Probably about the same amount of trouble either way.

The easy answers are to use the video out on my PVR-350, or revert to  
more primitive video H/W, but both are lousy answers.


As I said, if they'd leave the bloody kernel APIs alone I could just  
build the older nVidia driver. Whatever happened to the  
odd-numbered kernels being for tinkering?



On Jan 8, 2006, at 2:47 PM, Michael T. Dean wrote:



What about using 0.18-fixes SVN?  Don't know Gentoo, but 0.18-fixes  
SVN is very close to 0.18.1, so I'd guess other than getting  
different source code, everything would be identical.  And, with  
the -fixes branch, you can enable OpenGL vsync for the build and  
disable it at runtime with the setting.




Re: [mythtv-users] liveTV using 100% CPU

2006-01-08 Thread Michael T. Dean

Brian Wood wrote:

As I said, if they'd leave the bloody kernel APIs alone I could just  
build the older nVidia driver. Whatever happened to the  
odd-numbered kernels being for tinkering?


Well, you asked, so here's a great overview:  
http://kerneltrap.org/node/5040


For the details, read the links in the post.

Mike
___
mythtv-users mailing list
mythtv-users@mythtv.org
http://mythtv.org/cgi-bin/mailman/listinfo/mythtv-users


Re: [mythtv-users] liveTV using 100% CPU

2006-01-08 Thread William Pettersson
I'm running 0.18.1 on Gentoo as well, also on nVidia.  I recently tried
upgrading to 1.0.8178 from 1.0.6629.  That also brought me up to 100%
CPU usage when watching pre-recorded TV, standard-definition digital.  I
downgraded, and all my problems are solved, MythTV-wise.  I only
upgraded to see what difference it made, anyway.  Also, it doubled my
frame rate in WoW, so I'm sticking with these.
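If you want to confirm which driver version is actually loaded before and after a downgrade, the kernel module reports it in /proc. The snippet below shows the check plus a way to pull out just the version number; the sample string mimics the usual first line of that file, but check the real format on your own box:

```shell
# On a running system with the binary driver loaded:
#   cat /proc/driver/nvidia/version
# The first line normally looks like the sample below; extract the
# "1.0-NNNN" version token from it with grep:
sample='NVRM version: NVIDIA UNIX x86_64 Kernel Module  1.0-6629  Wed Nov  3 13:12:51 PST 2004'
version=$(printf '%s\n' "$sample" | grep -o '1\.0-[0-9]*')
echo "$version"   # -> 1.0-6629
```

The X server log (/var/log/Xorg.0.log) also records the driver version it loaded, which is handy when you're not sure which module actually got picked up.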

On Sun, 2006-01-08 at 14:01 -0700, Brian Wood wrote:
 Hmm
 
 I'm running 0.18.1 on a Gentoo system, and compiling outside the  
 portage system usually leads to trouble. I'm not sure what USE  
 flag indicated the use of vsync, nVidia probably did it.
 
 I'm not sure which is going to be more of a pain, re-compiling MythTV  
 or getting an older version of nVidia's drivers to work, seems the  
 kernel API has changed and broken older video drivers.
 
 I do not understand all this mucking around with a production  
 kernel. These guys who insist it must be up to date are the same  
 ones who keep checking for the fdiv bug, even on Xeon and Opteron  
 CPUs :-)
 
 
  In SVN's frontend settings (Playback Settings, General playback  
  (page 1)):
 
  Enable OpenGL vertical sync for timing
  If it is supported by your hardware/drivers, MythTV will use OpenGL  
  vertical syncing for video timing, reducing frame jitter.
 
  In MythTV 0.18 and 0.18.1, you have to recompile Myth without  
  OpenGL vsync.  In MythTV 0.18-fixes SVN, you have the same option  
  as SVN.  (Note that if you have one you think is 0.18.2, it's  
  probably 0.18-fixes SVN, so it probably has the option.)
 




Re: [mythtv-users] liveTV using 100% CPU

2006-01-08 Thread Brian Wood
Well for the time being I've solved the problem by simply not loading  
GLX in my X-config file. Crude but effective. My CPU load is down  
to around 5% from the 95% that it was.
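For reference, a sketch of what that change looks like in the X config ("xorg.conf" or "XF86Config" depending on your X version); commenting out the glx line in the "Module" section keeps the GLX extension from loading at all. The other Load lines are just typical examples, not a required set:

```
Section "Module"
    # Load "glx"       # commented out: disables OpenGL/GLX entirely,
                       # which also takes MythTV's OpenGL vsync with it
    Load "extmod"
    Load "dbe"
EndSection
```

The obvious trade-off is that anything else needing OpenGL on that box stops working too.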


Still grousing about the kernel thing. I guess one could use a 2.4  
series, but Gentoo does not support them on the amd64 arch.


I'm also not too crazy about being forced to use udev; this has caused  
me a lot of trouble, as a lot of apps don't get along with it very  
well.


Oh Well, still beats Windoze :-)

Brian Wood
[EMAIL PROTECTED]



On Jan 8, 2006, at 3:08 PM, Michael T. Dean wrote:


Brian Wood wrote:

As I said, if they'd leave the bloody kernel APIs alone I could  
just build the older nVidia driver. Whatever happened to the  
odd-numbered kernels being for tinkering?


Well, you asked, so here's a great overview:  
http://kerneltrap.org/node/5040


For the details, read the links in the post.

Mike