Re: [arch-general] xrandr with XPS 13" (3840x2160) HiDPI and 30" (2560x1600) LowDPI
Hi Tyler,

I did a lot of experimenting a while back to find something that works well for me. I have a MacBook Pro with a HiDPI screen connected to two 1080p external monitors via Thunderbolt-to-DisplayPort adapters. I place my MacBook Pro to the right of the two external monitors. I also run i3. Here's what I found worked for me:

.Xresources contains:

    Xft.dpi: 220

.xinitrc contains:

    xrdb -merge ~/.Xresources
    export QT_AUTO_SCREEN_SCALE_FACTOR=1
    export GDK_SCALE=1.5
    export GDK_DPI_SCALE=0.75
    exec i3

xrandr command for external monitors:

    xrandr --output DP1 --scale 1.5x1.5 --auto --pos 0x0 --primary \
           --output DP2 --scale 1.5x1.5 --auto --pos 2880x0 \
           --output eDP1 --mode 1920x1200 --pos 5760x0

xrandr command for switching back to the internal laptop screen:

    xrandr --output eDP1 --auto --primary --output DP1 --off --output DP2 --off

I tried playing with panning, but I found it easier to just set similar resolutions on the monitors using whatever mode I want plus the proper --scale option, and then specify the absolute positions with --pos. When the external monitors are enabled, I reduce the resolution of my laptop screen because the laptop sits on a stand further away from me, so I need the text a bit larger.

Good luck. Let us know how it goes.

Thanks,
Eric

On Mon, Jul 30, 2018 at 11:18 PM Tyler wrote:
> Hi,
>
> I am using Arch Linux with i3 on my Dell XPS 13" 9370 (3840x2160) HiDPI
> and everything works great with:
>
> .Xresources:
>
> ! xft fonts
> !--
> Xft.dpi: 220
> Xft.autohint: 0
> Xft.lcdfilter: lcddefault
> Xft.hintstyle: hintfull
> Xft.hinting: 1
> Xft.antialias: 1
> Xft.rgba: rgb
>
> ! urxvt
> !--
> URxvt*font: xft:DejaVu Sans Mono for Powerline:size=12: \
>     minspace=False:antialias=true, \
>     xft:Segoe UI Emoji:size=12:minspace=False:antialias=true
>
> URxvt*boldFont: xft:DejaVu Sans Mono for Powerline:size=12: \
>     minspace=False:antialias=true, \
>     xft:Segoe UI Emoji:size=12:minspace=False:antialias=true
>
> URxvt.letterSpace: -1
>
> and in /etc/profile.d/hidpi.sh:
>
> export GDK_SCALE=2
> export GDK_DPI_SCALE=0.5
> export QT_AUTO_SCREEN_SCALE_FACTOR=0
> export QT_SCREEN_SCALE_FACTORS=2
> export QT_QPA_PLATFORMTHEME=qt5ct
>
> However, I recently bought a Dell DA300 Mobile Adapter and decided I
> wanted to use my external Dell 30" 3000WFP (2560x1600) over DisplayPort.
>
> I found this blog article,
> https://blog.summercat.com/configuring-mixed-dpi-monitors-with-xrandr.html,
> which explains exactly what I am trying to do. The only differences are
> the placement of the external screen (mine is to the left of the laptop)
> and the screen size and resolution.
>
> In that article he uses his HiDPI settings on both screens and then
> scales down on the external screen, which is why he doubles the
> resolution of the external display. He does that so he doesn't have to
> touch .Xresources or fiddle with toolkit scaling options.
>
> Using this command I was able to get it working great with a single
> external monitor, i.e. the Dell 30" 3000WFP (2560x1600):
>
> xrandr --dpi 220 --fb 5120x3200 \
>     --output eDP1 --off \
>     --output DP1 --scale 2x2 --panning 5120x3200
>
> With the dual-monitor setup I'm struggling to understand the panning
> option. I have looked at https://wiki.archlinux.org/index.php/xrandr and
> the man page and still couldn't figure out the "track x" and "track y"
> part. It's not very easy to understand.
>
> So far for the external monitor I have this:
>
> xrandr --dpi 220 --fb 8960x5360 \
>     --output eDP1 --mode 3840x2160 \
>     --output DP1 --scale 2x2 --pos -2560x0 --panning 5120x3200+2560+0
>
> What I want is to be able to access the whole area of both screens. In
> his example he has a laptop at 3200x1800 and an external monitor at
> 1920x1080:
>
> > Dual monitors
> >
> > When I want to use both monitors, this is the command I run:
> >
> > xrandr --dpi 276 --fb 7040x3960 \
> >     --output eDP-1 --mode 3200x1800 \
> >     --output DP-1-2 --scale 2x2 --pos 3200x0 --panning 3840x2160+3200+0
> >
> > Here's an explanation of the options:
> >
> > Global options:
> >
> > --dpi 276 sets the DPI to 276.
> >
> > --fb 7040x3960 creates one screen with resolution 7040x3960. This is
> > the combined resolution of the two monitors. The high DPI monitor has
> > 3200x1800 resolution. The lower DPI monitor has 1920x1080 resolution,
> > but I double it as I scale it by 2 (see below). Combine these like so:
> > 3200+1920*2 x 1800+1080*2 = 7040x3960. Both monitors share this screen.
> >
> > High DPI monitor options (--output eDP-1):
> >
> > --mode 3200x1800 says to use resolution 3200x1800. This is the
> > default, but specifying it is necessary if the monitor is disabled (as
> > it is when using the external monitor by itself) as it enables the
> > monitor.
> >
> > Lower D
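[The --fb arithmetic quoted above can be checked with a quick shell sketch. The variable names are mine and the numbers simply mirror the blog example (3200x1800 panel beside a 2x-scaled 1920x1080 external monitor); nothing here is from the original mails beyond those values.]

```shell
# Combined framebuffer size for a HiDPI panel beside a 2x-scaled external
# monitor, laid out side by side. Values mirror the quoted blog example.
hidpi_w=3200; hidpi_h=1800       # laptop panel, native resolution
ext_w=1920;  ext_h=1080          # external monitor, native resolution
scale=2                          # --scale 2x2 doubles the effective size

fb_w=$(( hidpi_w + ext_w * scale ))   # widths add for a side-by-side layout
fb_h=$(( hidpi_h + ext_h * scale ))   # the article sums the heights the same way
echo "--fb ${fb_w}x${fb_h}"           # prints: --fb 7040x3960
```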
Re: [arch-general] Broke Arch trying to change graphics driver
On Sat, May 5, 2018 at 2:45 AM mar77i via arch-general <arch-general@archlinux.org> wrote:
> > When I tried "sudo modprobe nvidia" I got an "exec format error"
> > message and the nvidia module refused to load.
>
> You need to build your driver for each kernel separately. That's why you
> can get [0] as a package.
>
> cheers!
> mar77i
>
> [0] https://www.archlinux.org/packages/testing/x86_64/nvidia-dkms/

Hmm, I never had to do that before. The nvidia modules get installed in the 4.16 extramodules directory, so they only need to be compiled once for all 4.16 kernels. At least that is my understanding. I was running the nvidia package with 4.16.4 with no problems until I tried to reinstall.

-Eric
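[For reference, mar77i's suggestion corresponds to switching to the DKMS variant of the package, roughly as sketched below. This is my sketch assuming the standard Arch package names, not commands from the thread.]

```shell
# Replace the prebuilt nvidia module with the DKMS variant so the module is
# rebuilt automatically for every installed kernel. Kernel headers matching
# each installed kernel are required for DKMS builds.
pacman -S nvidia-dkms linux-headers

# After a kernel upgrade, verify the module was rebuilt for each kernel:
dkms status
```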
Re: [arch-general] Broke Arch trying to change graphics driver
On Fri, May 4, 2018 at 1:34 PM Junayeed Ahnaf via arch-general <arch-general@archlinux.org> wrote:
> So here's what happened: I was facing an issue with the proprietary
> nVidia driver (screen tearing), so I thought I would switch to the free
> one. I uninstalled nvidia with "pacman -Rs nvidia", then I thought I
> might as well try a different kernel and installed zen. Now when I boot
> into either kernel I see "Starting display manager" and then a black
> screen. I can log in to tty2, but no display comes up.
>
> I want to switch back to the free driver, preferably with the zen
> kernel. What should I do? Thanks in advance.

I just hit a similar issue when trying to switch from nvidia to nouveau. I was trying to switch from i3/Xorg to sway/Wayland, and the proprietary nvidia driver is unsupported with sway. I uninstalled the nvidia modules like you did, disabled the lightdm display manager, and booted into nouveau. It worked but was much too slow to be usable with my 4K display. I was getting major lag; it was like a 1 Hz refresh rate when moving my mouse pointer around.

I switched back to i3/Xorg with the nvidia modules, but when rebooting to test, it failed with a black screen like you got. When I tried "sudo modprobe nvidia" I got an "exec format error" message and the nvidia module refused to load. The nvidia module installed to the extramodules directory, so I expected it to work with the 4.16.4 kernel I'm running even though it was built on 4.16.1, but apparently that is not the case. Uninstalling and reinstalling nvidia and nvidia-utils did not fix things. Eventually I gave up and did a "zfs rollback" to restore the previous system state, and everything was fine after that.

Sorry I don't have a good solution for you, but I wanted to chime in that I encountered the same problem and it was related to the nvidia modules not being loadable.

Regards,
Eric
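[The "exec format error" described here is consistent with a module whose vermagic kernel release differs from the running kernel. A minimal sketch of that check follows; the helper name is hypothetical and the version strings are just the ones mentioned in this thread, not output from a real system.]

```shell
# Hypothetical helper: a kernel module only loads if the kernel release it
# was built for matches the running kernel; otherwise modprobe can fail
# with "exec format error".
kernel_matches() {
    # $1 = running kernel release, $2 = kernel release the module was built for
    [ "$1" = "$2" ]
}

# On a real system you would compare, for example:
#   kernel_matches "$(uname -r)" "$(modinfo -F vermagic nvidia | cut -d' ' -f1)"
# Using the versions mentioned in this thread:
if kernel_matches "4.16.4-1-ARCH" "4.16.1-1-ARCH"; then
    echo "module should load"
else
    echo "mismatch: module needs a rebuild"   # this branch is taken
fi
```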
Re: [arch-general] gnucash [aur]->[community]?
On Tue, Oct 10, 2017 at 4:45 PM, Morten Linderud wrote:
> On Tue, Oct 10, 2017 at 04:34:35PM -0400, Eric Blau wrote:
>> While it is true that webkitgtk2 has security vulnerabilities and
>> should not be used for web browsing, web apps, etc., gnucash merely
>> uses it to generate reports based on your own data. As such, it's
>> likely not vulnerable to the same security issues as other web
>> applications based on it.
>>
>> I know the developers are in the process of migrating away from it,
>> but until that time, I think it should be supported and not dropped
>> for the above reason.
>
> webkitgtk2 would have to be added back to the repos for this to happen,
> and that won't happen. It was a big deal to remove it in the first place.
>
> https://www.archlinux.org/todo/phasing-out-webkitgtk2/

OK, thanks for the response. It's a shame that gnucash is lumped in with other packages that have real attacks possible against them, but I understand why it had to be done. Hopefully gnucash can migrate off webkitgtk2 quickly and make it back into the repos.

-Eric
Re: [arch-general] gnucash [aur]->[community]?
On Tue, Oct 10, 2017 at 3:57 PM, Morten Linderud wrote:
> On Tue, Oct 10, 2017 at 03:49:27PM -0400, Ido Rosen wrote:
>> Gnucash has 44 votes on AUR. It's useful (and very old, stable)
>> accounting/bookkeeping software. Would any TUs be willing to migrate
>> it from AUR to [community]?
>>
>> https://aur.archlinux.org/packages/gnucash/
>
> It was moved from [extra] on the 30th of June because it still depends
> on webkitgtk2, which is flawed and has multiple security issues.

While it is true that webkitgtk2 has security vulnerabilities and should not be used for web browsing, web apps, etc., gnucash merely uses it to generate reports based on your own data. As such, it's likely not vulnerable to the same security issues as other web applications based on it.

I know the developers are in the process of migrating away from it, but until that time, I think it should be supported and not dropped for the above reason.

Regards,
Eric
Re: [arch-general] Tobias Powalowski and his nonsensical maintenance decisions
On Fri, Apr 28, 2017 at 2:19 PM, Carsten Mattner wrote:
> On Fri, Apr 28, 2017 at 5:11 PM, Eric Blau wrote:
>> On Fri, Apr 28, 2017 at 12:29 PM, Carsten Mattner via arch-general
>> wrote:
>>
>> There's a fix that's been submitted to the tip, but no effort has
>> been made to patch the bug in the 4.10.x stable series. It seems the
>> devs don't care about having a stable kernel to use, only about
>> moving the tip forward and staying on the bleeding edge. Shouldn't
>> at least showstopper kernel panics be patched into the "stable"
>> release?
>>
>> I requested a fix on the tip to be patched into the 4.9.x stable
>> series a couple of months ago because I tested the fix myself and
>> verified it "worked for me", but it was subsequently reverted. I'm
>> sure I don't know enough about the i915 driver to make these types
>> of decisions about what should or should not be patched, other than
>> to help with testing, but it would be nice if the i915 dev team made
>> an effort to propagate fixes to stable as well.
>
> It's possible that the fix causes other issues, but I've also seen
> crash fixes take very long to land in a stable release, sometimes
> taking 2 or 3 releases, while refactorings are intertwined with other
> fixes in stable releases. It looks odd.

Yes, agreed here. The fix I requested to be patched into 4.9.x when it was the stable release, back in the Feb/March timeframe, was from September 2016 but still hadn't made it into a stable release 5 or 6 months later.

> I wonder how the situation is with AMD and nVidia GPUs with open and
> closed driver stacks.

I don't have these problems with an NVIDIA GeForce GTX 970 on my desktop machine.

> It seems that if you run GNOME3 with GTK3 under Wayland and only GTK3
> apps with GDK_BACKEND=wayland and no X app, then it works well, but
> that's like forcing everyone to use just Android apps under ChromeOS.
>
> With libweston and libweston-desktop and further fixes in Xwayland,
> maybe in 2018 we will finally have what Wayland promised very long
> ago. I wouldn't blame outsiders if they looked at the Linux desktop
> and thought that there are too many variants and too much change with
> little stabilization going on.

A big reason why the Linux desktop seems like a lost cause.

> Then there's outstandingly stable software like GNU Emacs, FVWM, xterm
> or XMonad. Your config from a decade or two ago still works, with
> minimal to no deprecation disruption.

I prefer stable software that lets me get my job done, like i3, vim, etc. I rarely have problems running the latest versions included in Arch. The kernel is another story altogether. I frequently have to switch between linux and linux-lts, or build my own kernel with various patches, in order for my machines to run stably.

> So when it comes to open source video driver stacks, the best strategy
> is running one of the last two generations of GPU (Broadwell and
> Skylake) and always staying in that range, since older GPUs lose QA
> coverage as new GPUs come out. If the capabilities of a GPU are clear
> and you cannot expect newer OpenGL support in a newer Mesa, then it
> would make sense to have a stable but old i915 stack for old GPUs that
> doesn't change versus a new i915 stack for newer GPUs, but Linux is a
> monolithic design without driver ABIs, for good reasons that show
> their disadvantage when QA is insufficient.

My 2015 Broadwell-based MacBook Pro is not that old, yet I have i915 issues with it almost every kernel release.

-Eric
Re: [arch-general] Tobias Powalowski and his nonsensical maintenance decisions
On Fri, Apr 28, 2017 at 2:20 PM, Carsten Mattner wrote:
> Eric, does it also fail in XFCE or GNOME3? Like I wrote, I've found
> Plasma's compositor to be buggier.

I use i3 with compton as a compositor. Maybe I would have better luck running 4.10.x without compton; I haven't tried that yet. I reverted to linux-lts, which seems to be running fine on my early 2015 MacBook Pro 12,x, with the exception of a failed resume from hibernate every once in a while.

-Eric
Re: [arch-general] Tobias Powalowski and his nonsensical maintenance decisions
On Fri, Apr 28, 2017 at 12:29 PM, Carsten Mattner via arch-general wrote:
> The constant churn of refactorings and whatnot makes it impossible for
> all the hardware that, say, i915 supports to actually work reliably
> across kernel releases. What used to work flawlessly in 4.1 can be
> broken in 4.4 because the devs do not test with Intel GPUs older than
> Gen7, for example, all the while claiming it's supported in the now
> refactored but practically untested code.

Carsten, I agree with you about i915. I've been hitting this kernel panic regularly, about once per day, freezing my entire machine:

Bug 99295 - [Regression BDW] kernel panic in Intel i915 module, complete system freeze in 4.10-rc2
https://bugs.freedesktop.org/show_bug.cgi?id=99295

There's a fix that's been submitted to the tip, but no effort has been made to patch the bug in the 4.10.x stable series. It seems the devs don't care about having a stable kernel to use, only about moving the tip forward and staying on the bleeding edge. Shouldn't at least showstopper kernel panics be patched into the "stable" release?

I requested a fix on the tip to be patched into the 4.9.x stable series a couple of months ago because I tested the fix myself and verified it "worked for me", but it was subsequently reverted. I'm sure I don't know enough about the i915 driver to make these types of decisions about what should or should not be patched, other than to help with testing, but it would be nice if the i915 dev team made an effort to propagate fixes to stable as well.

-Eric
Re: [arch-general] Screen lock stopped working
On Sun, Sep 18, 2016 at 1:22 AM, Ralf Mardorf wrote:
> On Fri, 16 Sep 2016 12:14:30 -0400, David Rosenstrauch wrote:
> > I'm running the XFCE desktop, btw.
>
> On Arch Linux, using Xfce does not necessarily explain what screen
> lock / screensaver you are using.

Personally, I've noticed that with xscreensaver the screen locks but never enters the standby, suspend, or off DPMS power-saving modes. If I run "xset dpms force off", the monitor immediately powers off and comes back on when I hit a key or move the mouse. Also, xautolock is failing to detect inactivity and suspend my computer as I have it configured to do.

This started when I upgraded from linux 4.6.4 to 4.7.2 and continues to occur with 4.7.4. I've noticed it on two separate computers, one running Intel graphics and the other running the nvidia driver. The version of xscreensaver has been unchanged for me at 5.35 since 1 July.

-Eric
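[For anyone debugging the same symptoms, the DPMS state can be inspected and driven from the shell. This is a generic sketch assuming an X session; the timeout values are examples of mine, not settings from the mail.]

```shell
# Show whether DPMS is enabled and what the current timeouts are:
xset q | grep -A 2 "DPMS"

# Set standby/suspend/off timeouts to 10/15/20 minutes:
xset dpms 600 900 1200

# Force the monitor off immediately, as mentioned above; a key press or
# mouse movement wakes it again:
xset dpms force off
```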