After a recent upgrade, DVI-1 has gone away:
(==) RADEON(1): Depth 24, (--) framebuffer bpp 32
(II) RADEON(1): Pixel depth = 24 bits stored in 4 bytes (32 bpp pixmaps)
(==) RADEON(1): Default visual is TrueColor
(==) RADEON(1): RGB weight 888
(II) RADEON(1): Using 8 bits per RGB (8 bit DAC)
(--) RADEON(1): Chipset: "ATI Radeon HD 4350" (ChipID = 0x954f)
(II) RADEON(1): PCIE card detected
(WW) RADEON(1): Color tiling is not yet supported on R600/R700
(II) RADEON(1): KMS Color Tiling: disabled
(EE) RADEON(1): reusing fd for second head
(II) RADEON(1): Output DIN using monitor section Monitor2
(II) RADEON(1): EDID for output DIN
(II) RADEON(1): Output DIN disconnected
(WW) RADEON(1): No outputs definitely connected, trying again...
(II) RADEON(1): Output DIN disconnected
(WW) RADEON(1): Unable to find initial modes
(II) RADEON(1): mem size init: gart size :1fdff000 vram size: s:40000000 visible:3f8d4000
(II) RADEON(1): EXA: Driver will allow EXA pixmaps in VRAM
(==) RADEON(1): DPI set to (96, 96)
How do I force X to use DVI-1 for Monitor2? Per man xorg.conf, I tried this
in the Device section:
Option "Monitor-DVI-1" "Monitor2"
without any luck. I tried both the old working kernel and a new one, so I
think this must be the xorg driver itself.
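For completeness, here is a minimal sketch of the sections involved (the
radeon(4) man page documents Option "Monitor-<outputname>"; this assumes the
driver really names the output DVI-1, which xrandr --query should confirm):

    Section "Device"
        Identifier "RadeonCard"
        Driver     "radeon"
        # Bind the DVI-1 output to the Monitor section named "Monitor2"
        Option     "Monitor-DVI-1" "Monitor2"
    EndSection

    Section "Monitor"
        Identifier "Monitor2"
    EndSection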
-Rolf
On 07/23/2010 06:43 PM, Rolf wrote:
Xorg -configure will produce a multihead conf file for me. I can load X
with the radeon OR the intel driver, but not both. When loading with both I
get no Xorg.0.log file at all...
Suggestions?
-Rolf
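For context, the kind of two-device layout Xorg -configure produces looks
roughly like this (a sketch only; the identifiers are made up and the BusID
values are hypothetical placeholders, the real ones come from lspci):

    Section "Device"
        Identifier "RadeonCard"
        Driver     "radeon"
        BusID      "PCI:1:0:0"    # hypothetical; use the address from lspci
    EndSection

    Section "Device"
        Identifier "IntelCard"
        Driver     "intel"
        BusID      "PCI:0:2:0"    # hypothetical; use the address from lspci
    EndSection

    Section "Screen"
        Identifier "RadeonScreen"
        Device     "RadeonCard"
    EndSection

    Section "Screen"
        Identifier "IntelScreen"
        Device     "IntelCard"
    EndSection

    Section "ServerLayout"
        Identifier "Multihead"
        Screen  0  "RadeonScreen"
        Screen  1  "IntelScreen" RightOf "RadeonScreen"
    EndSection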
_______________________________________________
xorg@lists.freedesktop.org: X.Org support
Archives: http://lists.freedesktop.org/archives/xorg
Info: http://lists.freedesktop.org/mailman/listinfo/xorg