Hi, I'm using LTSP with VirtualGL on Ubuntu and it works great.
My approach was to start vglclient from an ldm script (/usr/share/ldm/rc.d/X69-servergl):

    if boolean_is_true "$SERVER_GL"; then
        vglclient -detach
        # Read the VirtualGL documentation for the available environment
        # variables. yuv and pbo have been the best performance boosters
        # for my setup.
        VGL_COMPRESS=yuv
        VGL_READBACK=pbo
        CLIENT_ENV="$CLIENT_ENV VGL_COMPRESS=yuv VGL_READBACK=pbo SERVER_GL=$SERVER_GL"
    fi

...and to put an Xsession script on the server (/etc/X11/Xsession.d/90w-servergl) so that the whole X session runs under VirtualGL:

    if [ -n "$SERVER_GL" ]; then
        STARTUP="/usr/bin/env LD_PRELOAD=libdlfaker.so:librrfaker.so $STARTUP"
    fi

Regards,
JIS

_____________________________________________________________________
Ltsp-discuss mailing list. To un-subscribe, or change prefs, goto:
https://lists.sourceforge.net/lists/listinfo/ltsp-discuss
For additional LTSP help, try #ltsp channel on irc.freenode.net
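For anyone who wants to see what the Xsession.d snippet actually does before deploying it, here is a minimal standalone sketch. The values of SERVER_GL and STARTUP are hypothetical stand-ins (in a real session, SERVER_GL would arrive via the ldm CLIENT_ENV mechanism above, and STARTUP is set by the Xsession machinery); the if-block itself is the same one from the script.

```shell
#!/bin/sh
# Hypothetical stand-ins: in a real LTSP session SERVER_GL is exported
# into the environment (e.g. propagated through CLIENT_ENV from the ldm
# script) and STARTUP is set by /etc/X11/Xsession before the
# Xsession.d scripts run.
SERVER_GL=True
STARTUP="x-session-manager"

# The same test as in /etc/X11/Xsession.d/90w-servergl: when SERVER_GL
# is non-empty, prepend the VirtualGL interposer libraries so every GL
# application started in the session is redirected through VirtualGL.
if [ -n "$SERVER_GL" ]; then
    STARTUP="/usr/bin/env LD_PRELOAD=libdlfaker.so:librrfaker.so $STARTUP"
fi

echo "$STARTUP"
```

Running this prints the rewritten session command, which shows that the session manager (and hence everything it launches) inherits the LD_PRELOAD interposers.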