Re: Input device problem

2008-12-15 Thread Tony Houghton
On Mon, 15 Dec 2008 13:26:57 +1000
Peter Hutterer peter.hutte...@who-t.net wrote:

 On Sun, Dec 14, 2008 at 10:25:29PM +, Tony Houghton wrote:
   On Sat, Dec 13, 2008 at 07:51:50PM +, Tony Houghton wrote:
I've got a DVB card (saa7146 chip) with a remote control which appears
as a Linux input device. The trouble is certain keys which correspond to
keys on an ordinary keyboard also appear on /dev/console when pressed,
which I don't want.
   
   try upgrading to xserver 1.5.2.
  
  This is happening even on the console though, not just in X.
 
 well, if it pretends to be a keyboard then it's only natural that the keys end
 up on the console. The only way to get around that is to grab the event device
 (google for EVIOCGRAB).

I thought I could prevent it by configuring HAL to remove the
properties that say it's a keyboard, but I failed.

  The trouble is X would probably crash or whatever it's doing before I
  got to see the result of evtest, although I suppose I could run it
  in a console or over ssh with DISPLAY set.
 
 evtest is a console program.

Oh, it logs console events, not X events?

-- 
TH * http://www.realh.co.uk
___
xorg mailing list
xorg@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/xorg


Re: Samsung Syncmaster 741MP problem

2008-12-15 Thread Paul Menzel
Dear Mattias,


Am Samstag, den 13.12.2008, 19:03 +0100 schrieb Mattias:

 I have a problem with my Linux Box (Mint 5) regarding the screen monitor 
 (so I thought it is a problem of the XOrg Video). When i try to use full 
 screen applications, such as games, my screen gets black and says 
 (Resolution not optimal. Suggested: 1280x1024 60Hz). I am using the 
 NVIDIA Binary drivers (should be 177.xx).

I think you will not get any answers on this list, because you are using
the NVIDIA proprietary drivers, which cannot be supported because no
source code is available.


Thanks,

Paul



Re: xrandr not really changing resolution

2008-12-15 Thread Alex Deucher
On Mon, Dec 15, 2008 at 9:34 AM, Tony Houghton h...@realh.co.uk wrote:
 I've noticed that if I change resolution with xrandr the physical output
 appears to stay at the original resolution and the graphics card scales
 up the smaller resolution. Is this a quirk of NVidia's proprietary
 driver or a standard xrandr feature? Is there a way to force it to
 change the physical output, perhaps using VidModeExtension instead of
 xrandr? I want to be able to select a refresh rate as well as the
 resolution.

If you are using an LCD, it only has one fixed native mode and either
the monitor or the video card scales the image to the panel's native
mode.

Alex


Re: xrandr not really changing resolution

2008-12-15 Thread strawks

On Mon, 15 Dec 2008 10:37:17 -0500, Alex Deucher alexdeuc...@gmail.com
wrote:
 On Mon, Dec 15, 2008 at 9:34 AM, Tony Houghton h...@realh.co.uk wrote:
 I've noticed that if I change resolution with xrandr the physical output
 appears to stay at the original resolution and the graphics card scales
 up the smaller resolution. Is this a quirk of NVidia's proprietary
 driver or a standard xrandr feature? Is there a way to force it to
 change the physical output, perhaps using VidModeExtension instead of
 xrandr? I want to be able to select a refresh rate as well as the
 resolution.
 
 If you are using an LCD, it only has one fixed native mode and either
 the monitor or the video card scales the image to the panel's native
 mode.

Maybe you can take a look at the FlatPanelProperties options of the
NVidia driver.

-- 
strawks



Re: very bad, and very weird, scrolling text performance xorg 1.5.3 intel 2.5.1 GM965

2008-12-15 Thread Lukas Hejtmanek
On Sun, Dec 14, 2008 at 06:48:52PM -0500, Thomas Jaeger wrote:
 You really need the glyph cache in the X server to get decent text
 performance out of the 2.5 intel driver.  The patches are pretty
 straightforward to backport, but it is my understanding that a 1.6
 server will be uploaded to jaunty soon, so you might want to wait for that.

I wonder whether the Intel driver can get at least close to the Nvidia driver.

I ran x11perf on an Nvidia system and an X server *without* the glyph cache. Compared
to these numbers, the Intel driver seems a bit comical at 56k glyphs/sec
with EXA or 215k glyphs/sec with UXA.

x11perf -aa10text 

Sync time adjustment is 0.0603 msecs.

4000 reps @   0.0001 msec (699.0/sec): Char in 80-char aa line (Charter 10)
4000 reps @   0.0001 msec (700.0/sec): Char in 80-char aa line (Charter 10)
4000 reps @   0.0001 msec (694.0/sec): Char in 80-char aa line (Charter 10)
4000 reps @   0.0001 msec (700.0/sec): Char in 80-char aa line (Charter 10)
4000 reps @   0.0001 msec (699.0/sec): Char in 80-char aa line (Charter 10)
2 trep @   0.0001 msec (699.0/sec): Char in 80-char aa line (Charter 10)

-- 
Lukáš Hejtmánek


nvidia's blob cluebar Was: xrandr not really changing resolution

2008-12-15 Thread Rui Tiago Cação Matos
2008/12/15 Tony Houghton h...@realh.co.uk:
 I've noticed that if I change resolution with xrandr the physical output
 appears to stay at the original resolution and the graphics card scales
 up the smaller resolution. Is this a quirk of NVidia's proprietary
 driver or a standard xrandr feature? Is there a way to force it to
 change the physical output, perhaps using VidModeExtension instead of
 xrandr? I want to be able to select a refresh rate as well as the
 resolution.

RTFM: ftp://download.nvidia.com/XFree86/Linux-x86/180.16/README/appendix-b.html

In particular, grep for FlatPanelProperties.

Someone should add a prominent cluebar to
http://lists.freedesktop.org/mailman/listinfo/xorg so that people
using nvidia's blob go ask at
http://www.nvnews.net/vbulletin/forumdisplay.php?f=14 instead.


Re: Input device problem

2008-12-15 Thread Matthew Garrett
On Mon, Dec 15, 2008 at 02:26:28PM +, Tony Houghton wrote:

 I thought I could prevent it by configuring HAL to remove the
 properties that say it's a keyboard, but I failed.

The Linux input layer will deliver the events to /dev/console unless the 
device is grabbed (which will disable just that device) or the console 
is put in raw mode (which will disable all devices). When you're using 
the kbd driver, the kernel is actually delivering a multiplexed stream 
of all keyboard-type devices. By removing the entries from hal you're 
preventing evdev from automatically picking up that device, but not 
doing anything to prevent them from being delivered through the console 
device. The kernel pays no attention to any information from hal.

-- 
Matthew Garrett | mj...@srcf.ucam.org


Re: very bad, and very weird, scrolling text performance xorg 1.5.3 intel 2.5.1 GM965

2008-12-15 Thread Michel Dänzer
On Mon, 2008-12-15 at 10:57 +0100, Lukas Hejtmanek wrote:
 On Sun, Dec 14, 2008 at 06:48:52PM -0500, Thomas Jaeger wrote:
  You really need the glyph cache in the X server to get decent text
  performance out of the 2.5 intel driver.  The patches are pretty
  straightforward to backport, but it is my understanding that a 1.6
  server will be uploaded to jaunty soon, so you might want to wait for that.
 
 I wonder whether the Intel driver can get at least close to Nvidia driver.
 
 I ran x11perf on Nvidia system and Xserver *without* the glyph cache. Compared
 to these numbers, the Intel driver seems to be a bit comic with 56k glyphs/sec
 with EXA or 215k glyphs/sec with UXA.

Note that the glyph cache is specific to EXA, which the nvidia driver
doesn't use. UXA on the other hand may have the glyph cache in its copy
of the EXA code.

I'm not sure what the bottleneck is for the intel driver at this point;
the radeon driver easily gets 500k/s and beyond (which is still a far
cry from the numbers you measured, but already seems to mostly avoid
being any practical limitation). Some of it may just be down to the
hardware (memory bandwidth, shader units etc.).


-- 
Earthling Michel Dänzer   |  http://tungstengraphics.com
Libre software enthusiast |  Debian, X and DRI developer


Re: S-video with Intel Graphics Card

2008-12-15 Thread R. G. Newbury
I am trying to clone my desktop to a television using an S-video cable. I
am using an 82852/855GM Intel graphics card. I have tried editing my
xorg.conf several times and it has never worked. The best I can get from my
television is a flicker, and editing my xorg.conf usually messes up my
computer monitor as well. My goal is to at least get some kind of picture on
the TV screen. I wouldn't really mind if it was black and white or didn't
use the whole screen, so long as it works. Can I get some help with this
please?

To start, make sure that your BIOS enables dual output through VGA and 
S-video ( often called TV-OUT). Test by booting with only one of the 
pair (VGA/S-video) plugged in. You *should* (cross fingers, throw salt 
over your left shoulder etc. etc.) get output during boot, even if it 
then gives a blank screen. THEN you can get into setting up xorg.conf 
for a dual output. Man intel and man xorg.conf will be your friends.

Geoff





Re: XInput2 MD without SD again

2008-12-15 Thread Christian Beier
On Mon, 15 Dec 2008 13:39:54 +1000
Peter Hutterer peter.hutte...@who-t.net wrote:

 On Mon, Dec 15, 2008 at 12:11:04AM +0100, Christian Beier wrote:
  I have (again) a question related to the MD/SD devices introduced with
  XI2. So far i was able to feed input into pointer MDs (without SDs
  attached) via XTestFakeDevice*, but now I'm stuck with the keyboard
  MDs. These accept keycodes via XTest, but are somehow stateless, as I
  cannot press Shift and enter capital letters. I remember that is so
  intentionally and that MDs adopt the state of the last connected SD.

 not quite, they aren't stateless, but their state will be overwritten whenever
 the SD sends an event. If you don't have SDs, you should be able to change
 them.


Okay. How would I do that? The thing is, I can send keys via XTest, but
my keyboard MD (without an attached SD) seems to not remember modifiers
like Shift or Ctrl. I send a Shift down, then the keycode for, say, 'k',
but I get 'k' instead of 'K' as output.

  1) Seems quite ugly to me. Is there a cleaner way to solve this? Like
 copying device capabilities?

 I think long term the best option would be to create a virtual SD and then
 control this virtual SD through XTest and restrict the MDs to only ever react
 to events from SDs, but not to actually do anything themselves. Would that
 make sense?

I think so. If we view MDs as the interface to the user and SDs as
representing some real or virtual hardware, that would make sense.


  2) What's worse, when I now send input to my SD-less MD, it somehow
 re-attaches my real keyboards SD to itself again. Is this
 intentionally so?

 I don't understand what you mean in 2). Can you rephrase this please?

Sure. I was sending input via XTest to a keyboard MD without an attached
SD (let's call this one k_md), and got the problem I described above. I
then remembered reading that MDs adopt the capabilities of attached
SDs, so I thought k_md was lacking some and attached my real physical
keyboard's SD to k_md, typed something including modifiers and voilà,
afterwards I was able to send input via XTest to k_md and the problem
with the modifiers seemed solved. _But_, and this is 2) now: I then
re-attached my physical keyboard's SD back to the VCK, but when I sent
input via XTest to k_md again (which is supposed to _not_ have an
attached SD by now), k_md in fact _has_ my physical keyboard's SD
attached again. k_md somehow stole it from the VCK. And I'm wondering
if this is the intended behaviour.

Cheers,
   Christian

 Cheers,
   Peter


--
what is, is;
what is not is possible.



Re: very bad, and very weird, scrolling text performance xorg 1.5.3 intel 2.5.1 GM965

2008-12-15 Thread Jeffrey Baker
On Mon, Dec 15, 2008 at 2:26 AM, Michel Dänzer
mic...@tungstengraphics.com wrote:
 On Mon, 2008-12-15 at 10:57 +0100, Lukas Hejtmanek wrote:
 On Sun, Dec 14, 2008 at 06:48:52PM -0500, Thomas Jaeger wrote:
  You really need the glyph cache in the X server to get decent text
  performance out of the 2.5 intel driver.  The patches are pretty
  straightforward to backport, but it is my understanding that a 1.6
  server will be uploaded to jaunty soon, so you might want to wait for that.

 I wonder whether the Intel driver can get at least close to Nvidia driver.

 I ran x11perf on Nvidia system and Xserver *without* the glyph cache. 
 Compared
 to these numbers, the Intel driver seems to be a bit comic with 56k 
 glyphs/sec
 with EXA or 215k glyphs/sec with UXA.

 Note that the glyph cache is specific to EXA, which the nvidia driver
 doesn't use. UXA on the other hand may have the glyph cache in its copy
 of the EXA code.

 I'm not sure what the bottleneck is for the intel driver at this point;
 the radeon driver easily gets 500k/s and beyond (which is still a far
 cry from the numbers you measured, but already seems to mostly avoid
 being any practical limitation). Some of it may just be down to the
 hardware (memory bandwidth, shader units etc.).

I seriously doubt it, since performance was far better last week, last
month, and a year ago with the same hardware.  I think we can safely
pin this current problem on the software.

Even if it were limited to 50k glyphs/s, the 7M character output would
still come out in 140 s, not 400 s.
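Jeffrey's arithmetic is easy to verify (a trivial check of the figures quoted in the thread, not part of the original mail):

```python
glyphs = 7_000_000   # characters scrolled in the test case
rate = 50_000        # pessimistic glyphs/sec figure from the thread
seconds = glyphs / rate
print(seconds)       # 140.0 seconds, far short of the ~400 s observed
```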

-jwb


Re: [ANNOUNCE] radeonhd 1.2.4 Release

2008-12-15 Thread Matthias Hopf
On Dec 12, 08 18:52:55 +0100, Luca Tettamanti wrote:
 On Fri, Dec 12, 2008 at 5:53 PM, Gene Heskett gene.hesk...@verizon.net 
 wrote:
  On Friday 12 December 2008, Matthias Hopf wrote:
 Announcing the 1.2.4 of the xf86-video-radeonhd driver.
 
  I still don't have drm, so glxgears is only marginally faster (850 fps vs 
  800)
  than before on an RV610 (Diamond HD2400-Pro), and things like tvtime still
  cannot run.
 
 Still no DRM nor XV (which would use DRM anyway I believe) on R600 and newer.

And that's not even related to the X11 driver. As soon as we have an R6xx
DRI module for Mesa, it will probably just work (TM) with the current
X11 driver.

Fingers crossed, that is.

XV will happen in about the same timeframe. Again, fingers crossed.

And, PLEASE, don't keep xorg-announce included in the mail recipients
when replying to an announcement mail.

Matthias

-- 
Matthias Hopf mh...@suse.de  ____   __
Maxfeldstr. 5 / 90409 Nuernberg   (_   | |  (_   |__  m...@mshopf.de
Phone +49-911-74053-715   __)  |_|  __)  |__  R  D   www.mshopf.de


Re: very bad, and very weird, scrolling text performance xorg 1.5.3 intel 2.5.1 GM965

2008-12-15 Thread Johannes Truschnigg
On Monday 15 December 2008, Jeffrey Baker wrote:
 I seriously doubt it, since performance was far better last week, last
 month, and a year ago with the same hardware.  I think we can safely
 pin this current problem on the software.

 Even if it were limited to 50k glyphs/s, the 7M character output would
 still come out in 140 s, not 400 s.

 -jwb

I must say I'm also very unhappy with the intel driver's degrading 
performance. I don't have any hard numbers, but the snappiness of 
scrolling text especially has gone nothing but downward over the last months.

It's nice to see/read developers experimenting with new acceleration 
architectures and all, but if there is no net gain over XAA, what is it worth 
anyway? I certainly would not mind one of my host CPU's four cores at 50% 
load if my everyday desktop tasks' performance were better than what it 
is.

I don't really want to believe it's the hardware that's too slow for what X is 
doing; last I checked, Windows GDI performance was top notch with Intel 
hardware.

I really love how Intel and its fine development team support a free software 
stack even on the desktop side of things, and how these efforts benefit even 
users of hardware from different manufacturers. Yet on the other hand, I'd 
really appreciate more accurate/up-to-date docs about the good stuff that's 
in the making. It sucks to iterate over some half a dozen developers' blogs 
every second month or so just to grok why EXA is so much slower than XAA if 
$option is (not) set. There should be some kind of tweaking guide on 
intellinuxgraphics.org IMHO, or at least some other definitive resource with 
info about what's recommended and what's not. I'm tired of fiddling with my 
xorg.conf anew for every minor release that gets pushed out, just to have my 
system revert to 70-90% of the last driver revision's performance level.

Sorry for ranting, esp. at this length... I hope you don't take it the wrong 
way. I really appreciate all your work, I just wish it would be more visible 
for us, the end users.

(I've got a GMA X3100 in my desktop, and it's seemingly been growing ever slower 
for months now, as well as a GMA X4500HD in my ThinkPad x200 which hardlocks 
the system when resuming from disk with the intel module loaded, so please 
bear with me. Btw, don't you think a fix for the latter problem, which I do 
think exists, would warrant a minor bugfix release? Please? 
Pretty-please? :))

-- 
with best regards:
- Johannes Truschnigg ( johannes.truschn...@gmx.at )

www: http://johannes.truschnigg.info/
phone: +43 650 2 17
jabber: johannes.truschn...@gmail.com

Please do not bother me with HTML-eMail or attachments. Thank you.



[ANNOUNCE] xrandr 1.2.99.3 Release

2008-12-15 Thread Matthias Hopf
Version 1.2.99.3 improvements over 1.2.3:

- Panning support
- Transformation support
- Various fixes


Git Shortlog:

Adam Jackson (1):
  Accept --props synonym for --prop

Alan Coopersmith (1):
  Man page typo fix

Egbert Eich (1):
  Fix for 64bit: feed a pointer to the right size variable to scanf().

Eric Piel (1):
  update the manpage

Hong Liu (1):
  Move outputs among crtcs as necessary. Fixes 14570

Julien Cristau (4):
  Manpage typo fixes
  Merge branch 'transform-proposal' of git.freedesktop.org:/git/xorg/app/xrandr
  Fix build outside of the source dir
  Require libXrandr 1.2.91

Keith Packard (15):
  Add output scaling using the 1.3 transform requests
  Manage transform filters. Use bilinear for non-identity scale.
  Transform mode bounds when computing sizes.
  Eliminate inverse matrix from randr transform protocol
  Add --transform to pass arbitrary transforms to the server
  Make screen undersize a warning instead of an error
  Add keystone.5c program to help compute transforms.
  Track toolkit name change (chrome -> nichrome)
  Build and install xkeystone program from keystone.5c
  add --transform none to reset to identity
  Execute xrandr to set keystone correction
  Fix up xkeystone to use current screen/output settings
  Exit when select output is not available
  Check return value from XRRGetCrtcTransform
  Add --scale and --transform to --help output

Matthias Hopf (8):
  Add panning support.
  Bump to 1.2.99.2, RandR requirements to 1.2.99.2
  Add manpage entry.
  Only set transforms if actually changed.
  Panning tracking areas describe full screen if set to 0. Use it as default.
  Don't trash panning area, except if --panning or --fb is given.
  Add keystone.5c to EXTRA_DIST
  Bump to 1.2.99.3

Matthieu Herrb (1):
  Don't use GNU make only constructs.


http://xorg.freedesktop.org/releases/individual/app/xrandr-1.2.99.3.tar.gz
MD5:  50f0c7bfcd02a32f595d4cb81cdd5e5e  xrandr-1.2.99.3.tar.gz
SHA1: 555c0979ca199c6f127806895d10f2b9a8ee7287  xrandr-1.2.99.3.tar.gz

http://xorg.freedesktop.org/releases/individual/app/xrandr-1.2.99.3.tar.bz2
MD5:  f1a591364d915e6c924b373721a72b4a  xrandr-1.2.99.3.tar.bz2
SHA1: bc5ee2b8e35b02ce04afdef7b513fec7442a64a1  xrandr-1.2.99.3.tar.bz2


Matthias

-- 
Matthias Hopf mh...@suse.de  ____   __
Maxfeldstr. 5 / 90409 Nuernberg   (_   | |  (_   |__  m...@mshopf.de
Phone +49-911-74053-715   __)  |_|  __)  |__  R  D   www.mshopf.de



Problem configuring LCD with 965GME

2008-12-15 Thread Marc Ferland
Hi,

I am currently trying to configure an LCD screen that has a native 1024x600 
resolution (OSD102TN43). This LCD doesn't return any EDID information.

Hardware spec:
(II) intel(0): Integrated Graphics Chipset: Intel(R) 965GME/GLE

Software spec:
X.Org X Server 1.5.3
Linux 2.6.27.4
intel driver 2.4.2

Running Xorg -configure results in an xorg.conf file that automatically falls 
back to 1024x768.

I tried to force the 1024x600 mode by generating a modeline using the LCD spec 
sheet. This gives me something like:

 Section "Monitor"
 Identifier   "Monitor0"
 VendorName   "Monitor Vendor"
 ModelName    "Monitor Model"
 DisplaySize  221.79 131.52
 UseModes     "1024x600_g3"
 EndSection

 Section "Modes"
 Identifier "1024x600_g3"
 Modeline "1024x600_g3"   51   1024 1184 1185 1344   600 612 613 635  -hsync
 EndSection

 Section "Device"
 Identifier  "Card0"
 VendorName  "Unknown Vendor"
 Driver      "intel"
 BoardName   "Unknown Board"
 BusID       "PCI:0:2:0"
 Option      "monitor-LVDS" "Monitor0"
 EndSection

 Section "Screen"
 Identifier "Screen0"
 Device "Card0"
 Monitor "Monitor0"
 DefaultDepth 24
 SubSection "Display"
 Viewport   0 0
 Depth 24
 Modes "1024x600_g3"
 Virtual 1024 768
 EndSubSection
 EndSection


Using this configuration, xrandr -q reports the following:
 Screen 0: minimum 320 x 200, current 1024 x 600, maximum 1024 x 768
 VGA disconnected (normal left inverted right x axis y axis)
 LVDS connected 1024x600+0+0 (normal left inverted right x axis y axis) 0mm x 0mm
    1024x600_g3    59.8*+
    1024x768       60.0 +   85.0   75.0   70.1   60.0
    832x624        74.6
    800x600        85.1   72.2   75.0   60.3   56.2
    640x480        85.0   72.8   75.0   59.9
    720x400        85.0
    640x400        85.1
    640x350        85.1
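As a quick sanity check (not from the original mail): the refresh rate implied by the modeline's pixel clock and its horizontal/vertical totals matches the 59.8 Hz that xrandr reports, so the mode data itself is at least internally consistent:

```python
# Modeline "1024x600_g3" 51 1024 1184 1185 1344 600 612 613 635 -hsync
dotclock_hz = 51e6          # pixel clock: 51 MHz
htotal, vtotal = 1344, 635  # last horizontal and vertical timing values
refresh = dotclock_hz / (htotal * vtotal)
print(round(refresh, 2))    # ~59.76 Hz, which xrandr rounds to 59.8
```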

I also noted the following in Xorg.log:
 (II) intel(0): initializing int10
 (WW) intel(0): Bad V_BIOS checksum
 (II) intel(0): Primary V_BIOS segment is: 0xc000
 (II) intel(0): VESA BIOS detected
 (II) intel(0): VESA VBE Version 3.0
 (II) intel(0): VESA VBE Total Mem: 448 kB
 (II) intel(0): VESA VBE OEM: Intel(r)GM965/PM965/GL960 Graphics Chip Accelerated VGA BIOS
 (II) intel(0): VESA VBE OEM Software Rev: 1.0
 (II) intel(0): VESA VBE OEM Vendor: Intel Corporation
 (II) intel(0): VESA VBE OEM Product: Intel(r)GM965/PM965/GL960 Graphics Controller
 (II) intel(0): VESA VBE OEM Product Rev: Hardware Version 0.0
 (II) Loading sub module ddc
 (II) LoadModule: ddc
 (II) Module ddc already built-in
 (II) Loading sub module i2c
 (II) LoadModule: i2c
 (II) Module i2c already built-in
 (II) intel(0): Output VGA using monitor section Monitor0
 (II) intel(0): I2C bus CRTDDC_A initialized.
 (II) intel(0): Output LVDS using monitor section Monitor0
 (II) intel(0): I2C bus LVDSDDC_C initialized.
 (II) intel(0): Attempting to determine panel fixed mode.
 (II) intel(0): I2C device LVDSDDC_C:ddc2 registered at address 0xA0.
 (II) intel(0): I2C device LVDSDDC_C:ddc2 removed.
 (II) intel(0): Output TV using monitor section TV
 (**) intel(0): Option Ignore
 (II) intel(0): I2C device LVDSDDC_C:ddc2 registered at address 0xA0.
 (II) intel(0): I2C device LVDSDDC_C:ddc2 removed.
 (II) intel(0): Output VGA disconnected
 (II) intel(0): Output LVDS connected
 (II) intel(0): Using user preference for initial modes
 (II) intel(0): Output LVDS using initial mode 1024x600_g3
 (II) intel(0): Monitoring connected displays enabled
 (II) intel(0): detected 512 kB GTT.
 (II) intel(0): detected 508 kB stolen memory.
 (==) intel(0): video overlay key set to 0x101fe
 (==) intel(0): Will not try to enable page flipping
 (==) intel(0): Triple buffering disabled
 (==) intel(0): Intel XvMC decoder disabled
 (==) intel(0): Using gamma correction (1.0, 1.0, 1.0)
 (**) intel(0): Display dimensions: (221, 131) mm
 (**) intel(0): DPI set to (117, 148)

So if I'm reading this correctly, the output LVDS should be using the initial 
mode 1024x600_g3, which specifies a 1024x600 resolution. But looking at my 
LCD, I see a stretched, fuzzy-looking picture (it's even worse than the 
default 1024x768).

Has anyone ever configured a 1024x600 LCD using modelines? I'm really 
wondering if the driver is using this mode, or maybe it falls back to a 
default mode when it cannot read the EDID info...

Any help is greatly appreciated!

Regards,
Marc



Re: XInput2 MD without SD again

2008-12-15 Thread Peter Hutterer
On Mon, Dec 15, 2008 at 03:01:36PM +0100, Christian Beier wrote:
 On Mon, 15 Dec 2008 13:39:54 +1000
 Peter Hutterer peter.hutte...@who-t.net wrote:
 
  On Mon, Dec 15, 2008 at 12:11:04AM +0100, Christian Beier wrote:
   I have (again) a question related to the MD/SD devices introduced with
   XI2. So far i was able to feed input into pointer MDs (without SDs
   attached) via XTestFakeDevice*, but now I'm stuck with the keyboard
   MDs. These accept keycodes via XTest, but are somehow stateless, as I
   cannot press Shift and enter capital letters. I remember that is so
   intentionally and that MDs adopt the state of the last connected SD.
 
  not quite, they aren't stateless, but their state will be overwritten 
  whenever
  the SD sends an event. If you don't have SDs, you should be able to change
  them.
 
 
 Okay. How would I do that? The thing is, I can send keys via XTest, but
 my keyboard MD (without an attached SD) seems to not remember modifiers
 like Shift or Ctrl. I send a Shift down, then the keycode for, say, 'k',
 but I get 'k' instead of 'K' as output.

that's a bug, please file it and assign it to me.

 
   2) What's worse, when I now send input to my SD-less MD, it somehow
  re-attaches my real keyboards SD to itself again. Is this
  intentionally so?
 
  I don't understand what you mean in 2). Can you rephrase this please?
 
 Sure. I was sending input via XTest to a keyboard MD without an attached
 SD (let's call this one k_md), and got the problem I described above. I
 then remembered reading that MDs adopt the capabilities of attached
 SDs, so I thought k_md was lacking some and attached my real physical
 keyboard's SD to k_md, typed something including modifiers and voilà,
 afterwards I was able to send input via XTest to k_md and the problem
 with the modifiers seemed solved. _But_, and this is 2) now: I then
 re-attached my physical keyboard's SD back to the VCK, but when I sent
 input via XTest to k_md again (which is supposed to _not_ have an
 attached SD by now), k_md in fact _has_ my physical keyboard's SD
 attached again. k_md somehow stole it from the VCK. And I'm wondering
 if this is the intended behaviour.

that's not intended behaviour at all. when removing the last attached SD, the
MD should restore its original (hardcoded) capabilities. I remember this
working correctly with normal devices, so this bug may actually be in the
XTest code. Same as above - bugzilla, assign to me.

Cheers,
  Peter


Re: S-video with Intel Graphics Card

2008-12-15 Thread S B
I do get output when I restart with it plugged in, but it is only through
the TV and not the computer (one or the other). Furthermore, it's black and
white and the image jumps vertically across the screen repeatedly. Using man
intel and man xorg.conf, I have written my own xorg.conf which *should*
work. The only thing I managed to do with it is keep seeing what's on my
computer monitor, but still nothing with the TV. I have pasted this
xorg.conf here: http://pastebin.com/m5330834b. It's possible that I got
several things wrong and/or this configuration simply will not work. I am
currently using a normal S-video cord (which according to xrandr seems to be
detected as VGA too?), but I have ordered a VGA to S-video/RCA adapter, and
am planning to give it a try with that as well.


-

 Message: 1
 Date: Mon, 15 Dec 2008 09:51:27 -0500
 From: R. G. Newbury newb...@mandamus.org
 Subject: Re:  S-video with Intel Graphics Card
 To: xorg@lists.freedesktop.org
 Message-ID: 49466eef.4020...@mandamus.org
 Content-Type: text/plain; charset=us-ascii; format=flowed

 I am trying to clone my desktop to a television using an S-video cable.
 I'm
 am using an 82852/855GM Intel graphics card. I have tried editing my
 xorg.conf several times and it has never worked. The best I can get from my
 television is a flicker, and editing my xorg.conf usually messes up my
 computer monitor as well. My goal is to at least get some kind of picture
 on
 the TV screen. I wouldn't really mind if it was black and white or didn't
 use the whole screen, so long as it works. Can I get some help with this
 please?

 To start, make sure that your BIOS enables dual output through VGA and
 S-video ( often called TV-OUT). Test by booting with only one of the
 pair (VGA/S-video) plugged in. You *should* (cross fingers, throw salt
 over your left shoulder etc. etc.) get output during boot, even if it
 then gives a blank screen. THEN you can get into setting up xorg.conf
 for a dual output. Man intel and man xorg.conf will be your friend.

 Geoff


Re: Fwd: X.org PCI changes breaks support for Silicon Motion SM720 Lynx3DM card?

2008-12-15 Thread Francisco Jerez
Richard Schwarting aquari...@gmail.com writes:

 Suspending without X has the same effect as suspending with X.  A
 blank screen that slowly bleeds white across it.

 vbetool doesn't seem to help.  I will investigate this issue more
 outside of X.

 Do you still need anything re: the silicon motion driver?  I don't
 suffer corruption under EXA or XAA and until I have suspend and resume
 sorted out, I don't think there are any more issues on my system.

Hi. It looks like the register state gets screwed up when suspending. It
could be a hardware bug. 

Maybe it could be worked around by tweaking some of the memory control
registers; I haven't tried that yet. It seems that vbetool vbestate restore
in text mode after resuming does the trick for me.



[ANNOUNCE] xf86-video-intel 2.5.99.1

2008-12-15 Thread Zhenyu Wang

Here's the xf86-video-intel 2.6 RC1 release. The merges
of DRI2 and XvMC support for Gen4 chips are complete,
along with KMS, RandR 1.3 support, UXA fixes and HDMI
audio support.

Please test this one widely and report
feedback. Thanks!

Subject: [ANNOUNCE] xf86-video-intel 2.5.99.1
To: xorg-annou...@lists.freedesktop.org

Adam Jackson (1):
  Quirk: No LVDS on Dell Studio Hybrid

Bryce Harrington (2):
  PipeA quirk for Quanta/W251U (launchpad bug #244242)
  Pipe-A quirk for HP 2730p (bug #18852)

Carl Worth (13):
  Ignore intel_gtt binary
  Rename gen4_state_t to gen4_static_state_t
  Rename gen4_state_init to gen4_static_state_init
  Rename state_base_offset to static_state_offset in gen4_static_state_init
  Use consistent idiom for obtaining static_state
  Use buffer object for vertex buffer (in new gen4_dynamic_state)
  965: Move composite setup to new _emit_batch_header_for_composite
  Rename gen4_dynamic_state to gen4_vertex_buffer
  Unreference the vertex_buffer_bo in gen4_render_state_cleanup
  Use buffer objects for binding table and surface-state objects.
  i965: Add batch_flush_notify hook to create new vertex-buffer bo
  Don't smash fixed_mode if skip_panel_detect is set.
  Set vertex_buffer_bo to NULL after unreference.

Dave Airlie (2):
  Default kernel mode setting to off, add configure flag to enable
  uxa: don't call composite routines with no buffer.

Eric Anholt (26):
  Don't set up sarea or drm mappings in DRI2 mode.
  UXA: Re-enable non-965 render.
  DRI2: Emit the MI_FLUSH before flushing batch in swapbuffers.
  DRI2: Move pixmap pitch alignment for use with depth to pixmap create.
  Fix build failures that should have been in the previous merge commit.
  Remove the CheckDevices timer.
  Make I830FALLBACK debugging a runtime instead of compile-time option.
  i965: Support render acceleration with pixmaps in BOs.
  Remove DRI_MM defines which are always true now.
  UXA: Add support for tiled front/back/depth by cutting over to the GTT map.
  Re-enable composite accel on 965 with UXA.
  Enable tiling for DRI2 back/depth buffers.
  Move debug code for I965DisplayVideoTextured to separate functions.
  Move I965DisplayVideoTextured surface/sampler setup to separate functions.
  Move I965DisplayVideoTextured unit state setup to separate functions.
  Move i965 video cc state to BOs.
  Move i965 video vs/sf state to BOs.
  Stop allocating unused scratch space for i965 video.
  Move i965 video wm and sampler state to BOs.
  Move remaining i965 video programs to BOs.
  Move i965 video vertex data to BOs.
  Move i965 video surface state and binding table to BOs.
  Emit proper relocations to pixmaps in BOs in i965 video.
  Remove the extra memory allocation for 965 video state now that it's all in BOs.
  uxa: Add in EnableDisableFBAccess handling like examodule.c did.
  uxa: Reject solid/copy to under-8bpp destinations.

Jesse Barnes (3):
  Don't modify render standby if kernel mode setting is active
  Default to FULL_ASPECT panel fitting
  Make sure DRM library paths are included

Julien Cristau (1):
  Don't unconditionally define DRI2

Keith Packard (3):
  Fix mis-merge of DRI2 changes related to pI830-directRenderingType
  Use long crt hotplug activation time on GM45.
  Add RandR 1.3 panning support by supporting the crtc set_origin function

Kristian Høgsberg (6):
  Add DRI2 support.
  Fix broken test for DRI1 in DRI2 conversion.
  Update to DRI2 changes.
  Fix KMS compilation.
  Simplify crtc preinit a bit.
  Make sure DRI/DRI2 can initialize properly with KMS.

Ma Ling (1):
  enable Intel G35 SDVO HDMI audio output

Maxim Levitsky (1):
  Add an option to make the overlay be the first XV adaptor.

Paulo Cesar Pereira de Andrade (2):
  Export libIntelXvMC and libI80XvMC symbols.
  Include X11/Xfuncproto.h prior to including edid.h from the sdk.

Robert Lowery (1):
  TV: add support to set TV margins in xorg.conf

Wu Fengguang (3):
  introduce i830_hdmi_priv.has_hdmi_sink
  enable Intel G45 integrated HDMI audio output
  refresh batch_bo reference after intel_batch_flush()

Zhenyu Wang (24):
  Make IS_GM45 into IS_G4X define
  SDVO: fix wrong order of sdvo version's major/minor
  SDVO: add HDMI audio encrypt change bit for GetInterruptEventSource command
  SDVO: fix sdvo tv format and sdtv resolution request/reply definition
  SDVO: add GetScaledHDTVResolutionSupport command
  SDVO: add command for set monitor power state
  SDVO: fix more command definition errors
  TV: white space cleanup
  TV: fix default contrast and saturation modifier
  TV: save serveral TV_CTL register fields in mode set
  TV: fix timing parameters for PAL, 480p, 1080i
  TV: subcarrier fix for NTSC and PAL
 

Re: Font problems

2008-12-15 Thread Julien Cristau
On Mon, 2008-12-15 at 22:03 +0100, Lukas Hejtmanek wrote:
 I had no problem in 1.5.3. What am I missing?
 
configure --disable-builtin-fonts.
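
That is, when building the server from source, pass the flag at
configure time, roughly like this (the prefix is an assumption):

```
./configure --prefix=/usr --disable-builtin-fonts
make
```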

Cheers,
Julien


Re: [ANNOUNCE] xf86-video-intel 2.5.99.1

2008-12-15 Thread Wang, Zhenyu Z
On 2008.12.15 19:09:24 -0700, Jeffrey Baker wrote:
 On Mon, Dec 15, 2008 at 5:43 PM, Zhenyu Wang zhenyu.z.w...@intel.com wrote:
 
  Here's xf86-video-intel 2.6 rc1 release. The merges
  of DRI2 and XvMC support for Gen4 chips are complete,
  also with KMS, randr 1.3 support, UXA fixes and HDMI
  audio support.
 
 
 Building this seems to require libdrm > 2.4.1, which is not posted at
 http://dri.freedesktop.org/libdrm/.  What's the git revision for the
 required libdrm?
 

Sorry for not mentioning that. drm git master should be fine. libdrm
will have a new release shortly.
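
For the impatient, building libdrm from git master would look roughly
like this (the repository URL and prefix are assumptions; adjust to
your mirror):

```
git clone git://anongit.freedesktop.org/mesa/drm
cd drm
./autogen.sh --prefix=/usr
make && sudo make install
```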

-- 
Open Source Technology Center, Intel ltd.

$gpg --keyserver wwwkeys.pgp.net --recv-keys 4D781827



Upcoming X.org annual election

2008-12-15 Thread Carl Worth
The X.Org Foundation annual elections will begin in January 2009. We
have chosen to schedule the election at the beginning of the calendar
year to avoid some conflicts that resulted with the end-of-the-year
elections held previously.

Here is the schedule for the election:

Mon 2009-01-05 00:00 UTC- Nomination period begins
Sun 2009-01-18 23:59 UTC- Nomination period ends
Mon 2009-01-26  - Slate of candidates published
Fri 2009-01-30 23:59 UTC- Deadline for new membership applications
Mon 2009-02-02 00:00 UTC- Election period begins
Sun 2009-02-15 23:59 UTC- Election period ends
Fri 2009-02-20  - Results published to the membership

Please note that only current members of the foundation can vote. So
everyone planning to vote is required to either renew their membership
or submit a new membership request at:

http://members.x.org

To renew your membership, simply log in to the membership system at the
above URL, click the "Renew Your Membership" button and ensure that
all of your user data is current, then click "Renew".

This same membership system will be used for the actual election, so
please let us know as soon as possible if you have any problems
accessing it.

The Election Committee
X.Org Foundation




Failure to detect TMDS-1 on intel GM45/ICH9

2008-12-15 Thread garrone
Hi,
 When first starting X, the Xorg server never detects the TMDS-1 output channel
on an Intel GM45/I965 dual-CPU ICH9 motherboard, PCI ID 8086:2a42. Instead,
xrandr reports HDMI-1 as the detected device.

Using a debugger, it can be seen that the I2C channel fails on the very first
I2CAddress call to address 0x70. Indeed, with gdb it is possible to
jump back, successfully detect the device, and proceed. To recreate the bug,
it is always necessary to reboot the machine.

To illustrate the error, the following is an extract of the gdb log at the point
where the I2CAddress function fails in the I2CPutByte call. At xf86i2c.c line 250
it returns FALSE, causing the transmission to fail.

My xserver is at commit f1c9b5ab23...
My xf86-video-intel is at commit 30fb0ef53e1...

***
(gdb) n
250 r = FALSE;
(gdb) where
#0  I2CPutByte (d=0x82660e0, data=112 'p') at xf86i2c.c:250
#1  0x081eb93b in I2CAddress (d=0x82660e0, addr=112) at xf86i2c.c:336
#2  0x081eba7e in I2CWriteRead (d=0x82660e0, WriteBuffer=0xbfd27d60 , 
nWrite=1, ReadBuffer=0xbfd27e94 À\\bÀ\\bÀ\\b\r, nRead=1) at 
xf86i2c.c:416
#3  0x081ebb85 in xf86I2CWriteRead (d=0x82660e0, WriteBuffer=0xbfd27d60 , 
nWrite=1, ReadBuffer=0xbfd27e94 À\\bÀ\\bÀ\\b\r, nRead=1) at 
xf86i2c.c:448
#4  0x081ebc0a in xf86I2CReadByte (d=0x82660e0, subaddr=0 '\0', 
pbyte=0xbfd27e94 À\\bÀ\\bÀ\\b\r) at xf86i2c.c:466
#5  0xb7a26903 in i830_sdvo_read_byte_quiet (output=0x8264ee0, addr=0, 
ch=0xbfd27e94 À\\bÀ\\bÀ\\b\r) at i830_sdvo.c:172
#6  0xb7a29d53 in i830_sdvo_init (pScrn=0x82622c0, output_device=397632) at 
i830_sdvo.c:1827
#7  0xb7a09e3c in I830SetupOutputs (pScrn=0x82622c0) at i830_driver.c:916
#8  0xb7a0c372 in I830AccelMethodInit (pScrn=0x82622c0) at i830_driver.c:1595
#9  0xb7a0cc13 in I830PreInit (pScrn=0x82622c0, flags=0) at i830_driver.c:1861
#10 0x080bb715 in InitOutput (pScreenInfo=0x824f1c0, argc=8, argv=0xbfd281c4) 
at xf86Init.c:1007
#11 0x08068d34 in main (argc=8, argv=0xbfd281c4, envp=0xbfd281e8) at main.c:308
(gdb) l
235 b->I2CUDelay(b, b->RiseFallTime);
236 
237 r = I2CRaiseSCL(b, 1, b->HoldTime);
238 
239 if (r) {
240 for (i = d->AcknTimeout; i > 0; i -= b->HoldTime) {
241 b->I2CUDelay(b, b->HoldTime);
242 b->I2CGetBits(b, &scl, &sda);
243 if (sda == 0) break;
244 }
245 
246 if (i <= 0) {
247 I2C_TIMEOUT(ErrorF("[I2CPutByte(%s, 0x%02x, %d, %d, %d) timeout]", 
248b->BusName, data, d->BitTimeout, 
249d->ByteTimeout, d->AcknTimeout));
250 r = FALSE;
251 }
252 
253 I2C_TRACE(ErrorF("W%02x%c ", (int) data, sda ? '-' : '+'));
254 }
255 
256 b->I2CPutBits(b, 0, 1);
257 b->I2CUDelay(b, b->HoldTime);
258 
259 return r;
260 }
261 
261 
***