I'm working on my 80-year-old mother's machine remotely and cannot see the screen. I have a few questions about this new open-source Radeon driver.
1) I'm seeing both radeon and radeonfb in memory. Is this OK, or does it imply a problem of some sort? Can xorg-server use radeon while the console is possibly using radeonfb, or is the system misconfigured?

[    8.571434] Adding 2048276k swap on /dev/hda6.  Priority:-1 extents:1 across:2048276k
[   20.010361] eth0: setting full-duplex.
[   26.961162] [drm] Initialized radeon 1.29.0 20080528 on minor 0
[   27.327626] agpgart-ati 0000:00:00.0: AGP 3.0 bridge
[   27.327652] agpgart-ati 0000:00:00.0: putting AGP V3 device into 8x mode
[   27.327680] radeonfb 0000:01:05.0: putting AGP V3 device into 8x mode
[   27.470855] [drm] Setting GART location based on new memory map
[   27.470867] [drm] Loading R200 Microcode
[   27.470914] [drm] writeback test succeeded in 1 usecs
[   30.344023] eth0: no IPv6 routers present

DesertFlower ~ # lsmod
Module                  Size  Used by
radeon                115840  2
snd_atiixp             14096  0
snd_ac97_codec         89412  1 snd_atiixp
ac97_bus                1316  1 snd_ac97_codec
radeonfb               55956  0
fb_ddc                  1668  1 radeonfb
DesertFlower ~ #

2) From a distance, I'm wondering what these last few lines of Xorg.0.log are telling me about the state the screen is likely in. This machine has both a VGA connector (in use) and an S-Video connector (not in use). Is the 'Setting screen physical size to 320 x 240' an indication that, if she were to look at the screen right now, it's not running in the normal 1024x768 sort of mode? How can I tell at a distance what the screen resolution might be?
(II) Initializing built-in extension COMPOSITE
(II) Initializing built-in extension DAMAGE
(II) Initializing built-in extension XEVIE
drmOpenDevice: node name is /dev/dri/card0
drmOpenDevice: open result is 10, (OK)
drmOpenByBusid: Searching for BusID pci:0000:01:05.0
drmOpenDevice: node name is /dev/dri/card0
drmOpenDevice: open result is 10, (OK)
drmOpenByBusid: drmOpenMinor returns 10
drmOpenByBusid: drmGetBusid reports pci:0000:01:05.0
(II) AIGLX: enabled GLX_MESA_copy_sub_buffer
(II) AIGLX: enabled GLX_SGI_swap_control and GLX_MESA_swap_control
(II) AIGLX: enabled GLX_texture_from_pixmap with driver support
(II) AIGLX: Loaded and initialized /usr/lib/dri/r200_dri.so
(II) GLX: Initialized DRI GL provider for screen 0
(II) RADEON(0): Setting screen physical size to 320 x 240
(EE) config/hal: couldn't initialise context: (null) ((null))
disable TVDAC

I do see this implying (to me) that maybe it's in a 1024-type mode:

(==) RADEON(0): Using 16 bit depth buffer
(II) RADEON(0): RADEONInitMemoryMap() :
(II) RADEON(0):   mem_size         : 0x04000000
(II) RADEON(0):   MC_FB_LOCATION   : 0x0fff0c00
(II) RADEON(0):   MC_AGP_LOCATION  : 0xffffffc0
(II) RADEON(0): Depth moves disabled by default
(II) RADEON(0): Using 8 MB GART aperture
(II) RADEON(0): Using 1 MB for the ring buffer
(II) RADEON(0): Using 2 MB for vertex/indirect buffers
(II) RADEON(0): Using 5 MB for GART textures
(II) RADEON(0): Memory manager initialized to (0,0) (1024,8191)
(II) RADEON(0): Reserved area from (0,1024) to (1024,1026)
(II) RADEON(0): Largest offscreen area available: 1024 x 7165
(II) RADEON(0): Will use front buffer at offset 0x0
(II) RADEON(0): Will use back buffer at offset 0x800000
(II) RADEON(0): Will use depth buffer at offset 0xa00000
(II) RADEON(0): Will use 53248 kb for textures at offset 0xc00000

Thanks,
Mark
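P.S. In case it helps, here is the little check script I was planning to run on her box over ssh to try to answer 2) myself. The /proc/fb and xdpyinfo lines are from memory, so they may be wrong for her setup, and my guess is that the 'physical size' numbers are the monitor's dimensions in millimetres (from DDC), not a pixel mode, but please correct me if that's not right:

```shell
#!/bin/sh
# Checks I plan to run remotely (commands/paths from memory, possibly wrong):
#
#   cat /proc/fb                           # which fbdev driver(s) own a console fb
#   DISPLAY=:0 xdpyinfo | grep dimensions  # pixels *and* millimetres, if X is up
#
# Helper: pull the two numbers out of the "Setting screen physical size" line
# in Xorg.0.log so I can compare them against what I'd expect for her monitor.
parse_physical_size() {
    echo "$1" | sed -n 's/.*physical size to \([0-9][0-9]*\) x \([0-9][0-9]*\).*/\1 \2/p'
}

line='(II) RADEON(0): Setting screen physical size to 320 x 240'
parse_physical_size "$line"   # prints: 320 240
```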