Dear project manager,

I am part of a student research group currently working with Project Chrono on a CUDA-based DEM simulation using Chrono::GPU. We are running our setup inside a Docker container on Windows (WSL2) and are encountering shared-library issues related to the Vulkan bridge dependencies. vulkaninfo fails inside the container with vkCreateInstance returning ERROR_INCOMPATIBLE_DRIVER, and the WSL host has no /usr/lib/wsl/lib/nvidia_icd.json present. CUDA works (nvidia-smi succeeds), but Vulkan cannot create an instance. Our errors and results are below.
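For reference, these are roughly the quick checks behind that summary (exact invocations may have differed slightly):

  # inside the container:
  nvidia-smi                       # succeeds: GPU visible, CUDA usable
  vulkaninfo --summary             # fails: vkCreateInstance -> ERROR_INCOMPATIBLE_DRIVER

  # on the WSL host:
  ls /usr/lib/wsl/lib/*icd*.json   # no matches (no nvidia_icd.json)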
Thank you,
Leah Richard

ERROR_INCOMPATIBLE_DRIVER, Chrono VSG demos abort with vsg::Exception

Environment
• Host: Windows + WSL2 Ubuntu (WSLg enabled)
• Workload: Project Chrono / chrono-wisc container image (chrono-wisc:benchmark3-prebuilt)
• GPU: NVIDIA GPU available and usable for CUDA inside the container (--gpus all works)
• Goal: run the Chrono VSG visualization demos inside Docker with window output (Vulkan)

What works (important)
1. CUDA compute works in-container for benchmark runs (FSI runs start and can complete for sane parameters).
2. The Chrono demo initializes terrain/vehicle and reaches the visualization setup stage before aborting (so CPU-side setup is fine).

What fails

A) VSG/Vulkan demo aborts
Running (inside the container):
  cd /opt/chrono/bin
  export LD_LIBRARY_PATH=/opt/chrono/lib:$LD_LIBRARY_PATH
  ./demo_VEH_CRMTerrain_WheeledVehicle
Result:
• Runs through terrain + vehicle creation
• Fails at “Create path…”
• Aborts with:
  ◦ terminate called after throwing an instance of 'vsg::Exception'
  ◦ Aborted (core dumped)

B) Vulkan tools inside the container cannot create a Vulkan instance
Inside the container:
  apt-get update && apt-get install -y vulkan-tools libvulkan1
  vulkaninfo --summary
Result:
• Cannot create a Vulkan instance
• vkCreateInstance failed with ERROR_INCOMPATIBLE_DRIVER
This strongly indicates the container's Vulkan loader cannot access a compatible ICD/driver stack.

Steps already attempted

1) Fixed command formatting / shell issues
• Resolved "bash: cd: too many arguments" by using proper line breaks / &&.

2) Verified this is not a general "laptop can't run it" issue
• A CUDA-side benchmark was tested with different --d0_multiplier values.
• d0_multiplier=1.0 succeeded with reasonable discretization:
  ◦ HSML=0.02, binSize0=0.04, gridSize=755 181 40
• Smaller multipliers caused "illegal memory access", which correlated with grid explosion:
  ◦ d0_multiplier=0.1 → gridSize=7550 1815 403 → CUDA illegal memory access
  ◦ For scale: 755 × 181 × 40 ≈ 5.5 million cells vs. 7550 × 1815 × 403 ≈ 5.5 billion cells; the latter exceeds the signed 32-bit range if linear cell indices are 32-bit.
  ◦ This appears to be parameter-driven indexing/overflow, not insufficient hardware.

3) Tried to wire the WSLg display + Vulkan libraries into the container
Attempted docker run with:
• --device=/dev/dxg (present in WSL)
• /dev/dri passthrough was attempted but failed:
  ◦ error adding custom device "/dev/dri": no such file or directory
  ◦ /dev/dri is not present on this WSL setup
• Mounted the WSL libraries and set the library path:
  ◦ -v /usr/lib/wsl:/usr/lib/wsl:ro
  ◦ LD_LIBRARY_PATH=/usr/lib/wsl/lib:/opt/chrono/lib:$LD_LIBRARY_PATH
• Mounted the X11 socket:
  ◦ -v /tmp/.X11-unix:/tmp/.X11-unix
  ◦ -e DISPLAY=$DISPLAY
• Set the NVIDIA container capabilities:
  ◦ NVIDIA_DRIVER_CAPABILITIES=compute,utility,graphics,display
Despite this, Vulkan inside the container still fails with ERROR_INCOMPATIBLE_DRIVER.

4) Investigated Vulkan ICD manifests on the WSL host
On the WSL host:
• No files matching:
  ◦ /usr/lib/wsl/lib/*icd*.json
• Found these ICD JSON files:
  ◦ /usr/share/vulkan/icd.d/intel_hasvk_icd.x86_64.json
  ◦ /usr/share/vulkan/icd.d/nouveau_icd.x86_64.json
  ◦ /usr/share/vulkan/icd.d/intel_icd.x86_64.json
  ◦ /usr/share/vulkan/icd.d/lvp_icd.x86_64.json
  ◦ /usr/share/vulkan/icd.d/radeon_icd.x86_64.json
  ◦ /usr/share/vulkan/icd.d/gfxstream_vk_icd.x86_64.json
  ◦ /usr/share/vulkan/icd.d/virtio_icd.x86_64.json
Notably absent: nvidia_icd.json under /usr/lib/wsl/lib, which many WSL Vulkan setups rely on for NVIDIA Vulkan in WSL.
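As a side note on what these ICD JSON manifests are (generic Vulkan loader behaviour, not Chrono-specific): each manifest just tells the loader which driver shared library to open. Paraphrased from memory, the lavapipe manifest looks roughly like this (exact path and version fields approximate):

  cat /usr/share/vulkan/icd.d/lvp_icd.x86_64.json
  {
      "file_format_version": "1.0.0",
      "ICD": {
          "library_path": "/usr/lib/x86_64-linux-gnu/libvulkan_lvp.so",
          "api_version": "..."
      }
  }

This is why, in step 5 below, mounting the manifest directory alone may not be enough: the library named in library_path must also be resolvable inside the container, otherwise the loader will still report ERROR_INCOMPATIBLE_DRIVER.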
5) Proposed next attempt (not yet confirmed successful)
Start the container while:
• mounting /usr/share/vulkan/icd.d
• forcing Vulkan to use a specific ICD via VK_ICD_FILENAMES, e.g.:
  ◦ gfxstream_vk_icd.x86_64.json (WSLg-relevant)
  ◦ or virtio_icd.x86_64.json as a fallback
Example:
  docker run --rm -it --gpus all \
    --device=/dev/dxg \
    -e DISPLAY=$DISPLAY \
    -v /tmp/.X11-unix:/tmp/.X11-unix \
    -v /usr/lib/wsl:/usr/lib/wsl:ro \
    -v /usr/share/vulkan/icd.d:/usr/share/vulkan/icd.d:ro \
    -e VK_ICD_FILENAMES=/usr/share/vulkan/icd.d/gfxstream_vk_icd.x86_64.json \
    -e LD_LIBRARY_PATH=/usr/lib/wsl/lib:/opt/chrono/lib:$LD_LIBRARY_PATH \
    -e NVIDIA_DRIVER_CAPABILITIES=compute,utility,graphics,display \
    chrono-wisc:benchmark3-prebuilt bash
Then re-test:
  VK_LOADER_DEBUG=all vulkaninfo --summary

Current diagnosis / suspected root cause
• The container cannot create a Vulkan instance because the Vulkan loader does not see a compatible Vulkan driver/ICD in the container environment.
• In WSL2, Vulkan support depends heavily on the Windows-side GPU driver + WSLg bridge components. The absence of an NVIDIA WSL ICD JSON (e.g., nvidia_icd.json) on the host suggests the WSL Vulkan driver stack may be incomplete/misaligned for NVIDIA Vulkan in this environment, or the container is not pointed at the correct ICD.
• VSG fails because Vulkan cannot initialize, so the Chrono demo aborts with vsg::Exception.

Evidence / error logs to include
• From inside the container:
  ◦ vkCreateInstance failed with ERROR_INCOMPATIBLE_DRIVER
  ◦ terminate called after throwing an instance of 'vsg::Exception'
• From the docker run attempt:
  ◦ /dev/dri: no such file or directory (WSL does not expose /dev/dri)

What we need from support / what we are asking
• How to properly enable Vulkan (WSLg) inside Docker on WSL2 with an NVIDIA GPU.
• Whether a missing/incorrect Vulkan ICD (e.g., no nvidia_icd.json under /usr/lib/wsl/lib) is expected, and what the supported fix is.
• Recommended container runtime configuration for Vulkan + WSLg + NVIDIA (ICD paths, required mounts, env vars such as VK_ICD_FILENAMES, etc.).
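P.S. One additional sanity check we are considering (not yet run): install Mesa's software Vulkan driver inside the container and force the loader to use it. Lavapipe is a CPU rasterizer, so this only tests the loader wiring, not NVIDIA acceleration; if it succeeds, the loader/mount setup is fine and the problem is isolated to the missing NVIDIA WSL ICD.

  # inside a container started without the host icd.d mount, so nothing is shadowed:
  apt-get update && apt-get install -y mesa-vulkan-drivers
  VK_ICD_FILENAMES=/usr/share/vulkan/icd.d/lvp_icd.x86_64.json vulkaninfo --summary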
