Christian – happy to hear that the driver update took care of the C::Sensor issue.

On a different note, this might be too late now, but if you are interested in 
C::Sensor (and, I imagine, control/autonomy) applications, we have a meeting here 
in Madison next month that focuses, among other things, on issues of autonomy 
and how Chrono simulation can help design autonomy solutions.
Here’s the link: https://sbel.wisc.edu/magic-2023/. Moving forward, I’ll add 
your email to the list of colleagues to whom we advertise these Chrono 
meetings.

Best of luck with your Chrono project.
Dan
---------------------------------------------
Bernard A. and Frances M. Weideman Professor
NVIDIA CUDA Fellow
Department of Mechanical Engineering
Department of Computer Science
University of Wisconsin - Madison
4150ME, 1513 University Avenue
Madison, WI 53706-1572
608 772 0914
http://sbel.wisc.edu/
http://projectchrono.org/
---------------------------------------------

From: [email protected] <[email protected]> On Behalf 
Of Christian Feller
Sent: Saturday, November 18, 2023 6:35 AM
To: ProjectChrono <[email protected]>
Subject: Re: [chrono] Re: Problems with Chrono OptiX sensors on Ampere GPUs

Hi all,

I upgraded to develop + the required OptiX release and asked the affected colleague 
to update his GPU driver (it was older than 530). Et voilà, everything runs as 
expected.

Interestingly, it seems that the key was the GPU driver. The "old" Chrono 7 
code with OptiX 7.2 also runs fine when using the most recent NVIDIA driver. 
Sometimes the simple solution is the best...

Note, however, that on Pascal and Turing GPUs everything runs fine with 
older drivers. On my Pascal card, even drivers <= 500 or so work. Ampere GPUs 
seem to be special in that regard (at least the RTX A1000).
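
In case it helps others figure out which case their machine falls into, here is a 
small diagnostic sketch (plain CUDA runtime API, nothing Chrono-specific, and only 
an illustration; the 5xx-style driver number itself is easiest to read off 
nvidia-smi). It prints the CUDA level the installed driver supports, the CUDA 
runtime the program was built against, and the compute capability of each visible 
GPU, so Ampere cards (SM 8.x) paired with an older driver stand out immediately:

// check_gpu_env.cpp -- hypothetical helper, not part of Chrono; link against cudart.
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    int driver_cuda = 0, runtime_cuda = 0;
    cudaDriverGetVersion(&driver_cuda);    // e.g. 12020 -> driver supports CUDA 12.2
    cudaRuntimeGetVersion(&runtime_cuda);  // e.g. 11060 -> built against CUDA 11.6
    std::printf("driver supports CUDA %d.%d, runtime built for CUDA %d.%d\n",
                driver_cuda / 1000, (driver_cuda % 1000) / 10,
                runtime_cuda / 1000, (runtime_cuda % 1000) / 10);

    int count = 0;
    cudaGetDeviceCount(&count);
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        std::printf("GPU %d: %s, compute capability %d.%d\n",
                    i, prop.name, prop.major, prop.minor);
    }
    return 0;
}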

Thanks,
Christian



Christian Feller <[email protected]> wrote on Thu, Nov 16, 2023, 09:47:
Hi Dan, Hi Jason,

Thank you both for your answers and support. We are of course aware that our 
setup is a bit behind current development. But as we have built up our vehicle 
modeling toolchain with dependencies on Chrono 7.0 base classes, upgrading to 
develop would probably involve some work. Before starting that endeavor, it would 
be helpful to have some feedback, such as:

- a user/developer who can confirm that OptiX sensors work just fine with the 
above or a similar setup (--> we should look for errors on our end that are 
independent of the Chrono or OptiX release)
- a user/developer who experiences or has experienced the same issue and has a 
potential fix (--> we can stop looking for things we are doing wrong)
- a user/developer who points us to a compatibility issue with Ampere GPUs and 
the above setup that we are not aware of (--> we can stop looking for other 
errors and go for newer releases)

That being said, I will take a stab at upgrading our toolchain to Chrono 
develop and newer CUDA + OptiX.

Thank you,
Christian
On Wednesday, November 15, 2023 at 9:56:53 PM UTC+1 JASON Z wrote:
Please consider upgrading Chrono to either the 8.0 release or the main branch 
of the chrono repo.

There have been updates since 7.0.0, and we recently upgraded OptiX to 
7.7.0 in the main repo.

It will be difficult for us to debug issues happening on previous versions.
On Wednesday, November 15, 2023 at 8:58:26 AM UTC-6 Christian Feller wrote:
Dear all,

We are observing some strange behavior when trying to run lidar sensors from 
the Chrono::Sensor module on PCs with Ampere GPUs (compute capability 8.6).

Our setup is as follows:
- Chrono 7.0.0
- OptiX 7.2.0
- CUDA 11.6.2
- USE_CUDA_NVRTC=OFF
- CUDA_ARCH_NAME=Auto

Everything runs fine on Pascal or Turing GPUs and we see the lidar point 
cloud data that we would expect. However, when a colleague of mine builds and 
runs the same code on his PC with an Ampere GPU, the model as such runs fine, 
but we get "flashing" lidar point cloud data. More precisely, the correct point 
cloud data is returned at discrete time points, but most of the time the sensor 
seems to return blank data (i.e., zero-intensity returns everywhere). We get the 
same behavior when using CUDA NVRTC.
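
For reference, a minimal sketch along the lines of the Chrono 7 lidar demos (the 
scene, lidar parameters, and header paths below are illustrative and may differ 
slightly on newer releases) that counts zero-intensity returns per XYZI buffer 
could look like this; blank frames show up as buffers where nearly every return 
has zero intensity:

// lidar_blank_frame_check.cpp -- illustrative sketch, not production code.
// One lidar in a trivial scene; reports, per retrieved XYZI buffer, the number
// of zero-intensity returns.
#include "chrono/core/ChMathematics.h"
#include "chrono/physics/ChSystemNSC.h"
#include "chrono/physics/ChBodyEasy.h"
#include "chrono_sensor/ChSensorManager.h"
#include "chrono_sensor/ChLidarSensor.h"
#include "chrono_sensor/filters/ChFilterPCfromDepth.h"
#include "chrono_sensor/filters/ChFilterAccess.h"
#include <cstdio>

using namespace chrono;
using namespace chrono::sensor;

int main() {
    ChSystemNSC sys;

    // A fixed box so the lidar has something to hit.
    auto box = chrono_types::make_shared<ChBodyEasyBox>(2, 2, 2, 1000, true, false);
    box->SetPos(ChVector<>(5, 0, 0));
    box->SetBodyFixed(true);
    sys.Add(box);

    auto manager = chrono_types::make_shared<ChSensorManager>(&sys);

    // 360 deg lidar at 10 Hz, mounted 1 m above the box (parameters are arbitrary).
    auto lidar = chrono_types::make_shared<ChLidarSensor>(
        box, 10.0f, ChFrame<double>(ChVector<>(0, 0, 1), QUNIT),
        900, 16,                                        // horizontal x vertical samples
        (float)(2 * CH_C_PI),                           // horizontal field of view
        (float)(CH_C_PI / 12), (float)(-CH_C_PI / 12),  // max / min vertical angle
        100.0f);                                        // max range [m]
    lidar->PushFilter(chrono_types::make_shared<ChFilterPCfromDepth>());
    lidar->PushFilter(chrono_types::make_shared<ChFilterXYZIAccess>());
    manager->AddSensor(lidar);

    double step = 1e-3;
    while (sys.GetChTime() < 5.0) {
        manager->Update();
        sys.DoStepDynamics(step);

        auto buf = lidar->GetMostRecentBuffer<UserXYZIBufferPtr>();
        if (buf->Buffer) {
            unsigned int total = buf->Width * buf->Height, zeros = 0;
            for (unsigned int i = 0; i < total; i++)
                if (buf->Buffer[i].intensity <= 0.0f)
                    zeros++;
            std::printf("t = %.3f  zero-intensity returns: %u / %u\n",
                        sys.GetChTime(), zeros, total);
        }
    }
    return 0;
}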

Is this a known issue or has anyone observed something similar? Might this 
problem be solved by upgrading to Chrono 8.0 and/or a more recent OptiX or CUDA 
release?

To my understanding, the above toolset should be compatible with the Ampere 8.6 
architecture, so the observed behavior appears really strange :/

Any help is highly appreciated!

Thanks,
Christian


Christian Feller

Lead Simulation Engineer
Danfoss Power Solutions GmbH & Co. OHG
Virtual Solutions & Verification
DPS R&D / Software






