Sabrina – a few comments:

  *   With an SPH-based fluid solver (whether Chrono::SPH or DualSPHysics), the 
size of the solid-phase elements will dictate the SPH spatial resolution 
(particle spacing), due to the approach these codes take to enforce the phase 
coupling. With DEM simulations of the size you mention, that is not a feasible 
approach.
  *   If you are interested in FSI problems with fewer and larger solid (rigid 
or flexible) objects, then an SPH solver for the fluid phase will work just 
fine. These are the types of problems we target with our current Chrono::FSI 
framework and Chrono::SPH fluid solver.  By the way, I am in contact with the 
DualSPHysics folks and will work with them to bring DualSPHysics into the new 
Chrono::FSI framework.
  *   For FSI problems like yours (with a DEM solid phase), other fluid solvers 
and different phase-coupling techniques are necessary.  One such option would 
be OpenFOAM with IBM (immersed boundary method).
  *   We will start working soon on an extension of Chrono::FSI to allow 
coupling Chrono multibody systems to other fluid solvers (SPH, but also 
OpenFOAM) by providing preCICE adapters; a minimal sketch of what such a 
coupling loop looks like follows this list. In principle, that approach could 
be extended to include a DEM solid phase (although I do not plan on doing that 
in the immediate future).
  *   It is also likely that we will look into providing an LBM option for 
Chrono::FSI.  I strongly oppose reinventing the wheel, so we will first look 
for an existing third-party alternative (and implement our own only if we do 
not find a suitable external solver).
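
As a teaser for what a preCICE adapter's control flow looks like, here is a 
minimal sketch assuming the preCICE v3 Python bindings (pyprecice). The real 
Chrono adapters will be C++; the participant/mesh/data names and the 
advance_solid() stub are placeholders, not anything that exists today, and 
checkpointing for implicit coupling is omitted.

    import numpy as np
    import precice

    # "Solid", "SolidMesh", "Forces", "Displacements" must match precice-config.xml
    participant = precice.Participant("Solid", "precice-config.xml", 0, 1)

    coords = np.random.rand(100, 3)                      # coupling-interface nodes
    vids = participant.set_mesh_vertices("SolidMesh", coords)

    def advance_solid(x, f, dt):
        return x + 0.0 * dt * f                          # stub for the solid/DEM step

    participant.initialize()
    while participant.is_coupling_ongoing():
        dt = participant.get_max_time_step_size()
        forces = participant.read_data("SolidMesh", "Forces", vids, dt)
        new_coords = advance_solid(coords, forces, dt)
        participant.write_data("SolidMesh", "Displacements", vids, new_coords - coords)
        coords = new_coords
        participant.advance(dt)
    participant.finalize()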

I believe you underestimate the work required to implement a fluid solver from 
scratch and couple it with a multibody and/or DEM solver.  The work on the 
preCICE interfaces will start relatively soon (in a couple of months or so). I 
think you’re better off waiting for us to have an updated architecture of the 
Chrono::FSI framework and first implementations of preCICE adapters for Chrono 
MBD and Chrono::SPH, at which point you could look into providing a similar 
adapter for DEME.

--Radu

From: [email protected] <[email protected]> On Behalf 
Of Sabrina Lanfranco
Sent: Monday, 25 August 2025 02:19
To: Ruochun Zhang <[email protected]>
Cc: ProjectChrono <[email protected]>
Subject: Re: [chrono] DEM-Engine SPH model integration


Thank you very much for your replies and for the suggestions.

To give you an idea of the scale of my work: during my Master’s thesis I 
carried out simulations with between 500,000 and 1,000,000 elements. For the 
fluid extension, I expect the order of magnitude to be similar.

Addressing the point raised by Ruochun about fluid velocities: in my case I 
need to consider both the solar wind impacting the surface and the rocket 
exhaust gases. These are therefore regimes that can exceed Mach 1. Moreover, 
the focus of my work will not be so much on defining the fluid model itself, 
but rather on analyzing the interactions with the regolith. For this reason, 
and based on your suggestions, it seems to me that the most reasonable options 
are:

  *   to evaluate CFD solvers based on FVM or FEA, if I can find a package 
suitable for my case;
  *   or to consider developing an LBM solver, which from what you say could be 
easier to integrate.

In the coming months I will be focusing on reviewing the literature, and I will 
certainly keep in mind the solver you are developing, should it become 
available in the near future.

I would also like to ask for your advice on hardware. At the moment I am using 
a workstation with two RTX A4000 GPUs: with this setup, a simulation with about 
one million particles and a simulated duration of 430 seconds takes around 6 
days of computation. Looking ahead to future co-simulations, I have the option 
to upgrade: which GPUs would you recommend, staying within a mid or mid-high 
range and without considering top-of-the-line models?

Thank you again for your availability and support.

Best regards,
Sabrina

On Sat, 23 Aug 2025 at 15:54, Ruochun Zhang <[email protected]> wrote:
Hi Dan,

Yes, I agree with that. It's only when the problem's scale is massive that you 
have to consider which method or tool would allow for a reasonable fluid 
representation at the cost of maybe 2 times the DEM system, not 10 times...

Speaking of using DEME along with another GPU tool like C::FSI, there's one 
thing I should have done long ago, and that's allowing the user to select the 
devices the solver runs on. Right now it just aggressively grabs the first two 
devices it sees and uses them, which is perhaps not the friendliest behavior 
for a collaborator...
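
Until such an option exists, a hedged workaround (assuming DEM-Engine and the 
fluid solver run in separate processes) is to restrict which GPUs each process 
can see via CUDA_VISIBLE_DEVICES before any CUDA context is created:

    import os
    # must be set before the solver initializes CUDA in this process
    os.environ["CUDA_VISIBLE_DEVICES"] = "0,1"   # GPUs handed to DEM-Engine
    # ... construct and run the DEM solver here ...
    # then launch the fluid solver in a separate process with, e.g.:
    #   CUDA_VISIBLE_DEVICES=2,3 python fluid_solver.py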

Ruochun
On Saturday, August 23, 2025 at 9:29:28 PM UTC+8 Dan Negrut wrote:
Ruochun – I read your email, super thoughtful.
Circling back to Sabrina – the question is also how big the problem is.
The FSI solver in Chrono has made huge progress over the last 12 months – Radu 
and Luning and Huzaifa can speak to that.
If we are talking about 1000 DEM particles here, then I think that the easiest 
way to go is to simply simulate the entire thing in Chrono: the DEM part, using 
DEME; the CFD side, in Chrono::FSI. The solution would be entirely on the GPU. 
We never optimized the communication for DEM-FSI “on-the-GPU” simulation since 
we’ve never been faced with such requests. But the big deal here is that the 
memory for DEME and FSI is GPU-resident and can therefore draw on the TB/s 
bandwidth and low latency of device memory (compared to host-device traffic). 
I truly think that if there were funding to do this GPU-GPU, FSI-DEME 
co-simulation, a full-blown Chrono solution would be top notch.
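
For illustration, here is a toy comparison (assuming CuPy; this is not Chrono 
or DEME code) of the device-resident path versus a host round-trip:

    import cupy as cp

    forces = cp.zeros((1_000_000, 3), dtype=cp.float32)  # lives in GPU memory
    on_gpu = forces.copy()        # device-to-device: stays on ~TB/s device memory
    on_host = cp.asnumpy(forces)  # device -> host: crosses PCIe/NVLink
    back = cp.asarray(on_host)    # host -> device: crosses it again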
However, if for the problem at hand Sabrina needs, say, 1,000,000 DEM particles, 
that’s a different story. I think no matter what approach is taken in that 
case, it’s going to be really, really slow if one fully resolves the dynamics 
of both the particles and the fluid.
Dan
---------------------------------------------
Bernard A. and Frances M. Weideman Professor
NVIDIA CUDA Fellow
Department of Mechanical Engineering
Department of Computer Science
University of Wisconsin - Madison
4150ME, 1513 University Avenue
Madison, WI 53706-1572
608 772 0914
http://sbel.wisc.edu/
http://projectchrono.org/
---------------------------------------------

From: [email protected] <[email protected]> On Behalf Of 
Ruochun Zhang
Sent: Saturday, August 23, 2025 8:00 AM
To: ProjectChrono <[email protected]>
Subject: Re: [chrono] DEM-Engine SPH model integration


Hi Sabrina,

This is very ambitious indeed. I can comment on it based on what I know, and 
you can decide if it is relevant for your research.

First, I hope the APIs provided in DEM-Engine are enough for you to include 
the thermal and electrostatic systems. It seems they are, but feel free to 
post threads here if you later discover other needs.

The biggest bottleneck in regolith–fluid simulations is the enormous scale 
required, and that's why Dan suggested using another, unified model for it. But 
since your focus is on building a comprehensive model, not an engineering 
solution for a particular problem, that's not an option, and I assume you'd 
want two-way coupling (i.e., as much coupling as possible) in your simulation. 
I'd also assume you don't need extreme fluid velocities, say above Mach 0.3. 
Then the biggest question is: since your DEM-side model is already heavy, how 
much emphasis would you like to put on the fluid part? Or, put another way, I 
think it's a question of which fluid–solid ping-pong paradigm to use, not which 
package to use. One thing is for sure: none of the approaches will be 
“convenient” to make happen.

Using SPH is fine, but I suspect you'll need markers much smaller than DEM 
particles, so limiting the overall problem scale is important. It may face more 
challenges if the Reynolds number is high. Also, it would involve integrating 
two GPU packages, which is a more serious software-engineering task; there 
might be people on the DualSPHysics forum who have tried that. I'd say that if 
you go this route, you are certainly treating the fluid part no less seriously 
than the DEM part, and consulting the developers there beforehand is definitely 
needed.

FVM- or FEA-based CFD solvers are fine too, and I can imagine myself 
building/using a PETSc-based solver for this task. The key would be to update 
the DEM particle-represented boundary (if moving mesh) or track/mark the moving 
boundary-influenced nodes (if immersed boundary), which has very little to do 
with DEM itself — it only needs particle pos/vel info, which DEM-Engine can 
certainly supply. I'd probably recommend an immersed boundary approach for 
reasons I'll give in the LBM-related part. This is also how I imagined 
DEM-Engine users would do fluid co-simulation. As you will have a lot of things 
to do on the host anyway (mesh adjustment, node marking...), you'll use 
DEM-Engine's tracker to bring the information to the host, update the mesh and 
fluid solver, run it, and then feed the fluid force back to DEM-Engine. This 
should position you more as a user of computing packages, rather than a solver 
developer. This approach can be used regardless of whether you consider the 
fluid an emphasis, since you can always choose to use fewer features of the 
solver to make the fluid part easier and faster, or do the opposite. But you 
probably won't be modifying the fluid solver itself very much, so there may be 
less flexibility in the coding.
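
To make that ping-pong loop concrete, here is a minimal host-side sketch in 
Python/NumPy. All four helper functions are hypothetical placeholders (they are 
not actual DEM-Engine, OpenFOAM, or PETSc calls): they stand in for the tracker 
query, the immersed-boundary marking, one fluid step, and the force hand-back, 
respectively; the real DEM advance call would follow step 4.

    import numpy as np

    N_PART, GRID = 1000, (64, 64, 64)        # toy sizes

    def dem_state():
        # placeholder for a DEM-Engine tracker query of particle positions/velocities
        return np.random.rand(N_PART, 3), np.zeros((N_PART, 3))

    def mark_immersed_cells(pos, radius=0.02):
        # crude rasterization: flag grid cells whose centers lie inside a particle
        centers = (np.indices(GRID).reshape(3, -1).T + 0.5) / GRID[0]
        mask = np.zeros(len(centers), dtype=bool)
        for p in pos[:20]:                   # only a few particles, to keep the toy fast
            mask |= np.linalg.norm(centers - p, axis=1) < radius
        return mask.reshape(GRID)

    def fluid_step(mask, dt):
        # placeholder for one step of the external CFD solver; it returns the
        # hydrodynamic force it computed on each particle
        return np.zeros((N_PART, 3))

    def apply_fluid_forces(forces):
        # placeholder for handing the forces back to DEM-Engine as external loads
        pass

    dt_fluid = 1e-3                          # typically much larger than the DEM step
    for step in range(10):
        pos, vel = dem_state()               # 1. bring particle state to the host
        mask = mark_immersed_cells(pos)      # 2. update the immersed-boundary marking
        forces = fluid_step(mask, dt_fluid)  # 3. advance the fluid by one step
        apply_fluid_forces(forces)           # 4. feed forces back; then advance DEM by dt_fluid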

You can also write your own fluid solver, but I think most likely that means 
the fluid is not a main focus of the research you want to present. And if you 
do, like Dan said, I would say LBM is a good choice. I only recently became 
interested in LBM’s usage in related co-simulations. Two main benefits:

  1.  It's fully Eulerian, and therefore easy to use alongside DEM, as the DEM 
particles are the only moving part. For the LBM part, you just mark the DEM 
particle-shadowed grid points as solid (a minimal sketch follows this list). 
It's similar to why I think the immersed boundary is better for your use case. 
The method is also, in general, easy to implement. You can literally ask 
ChatGPT to write one for you once you have read the basics.
  2.  It's massively parallel, and should go well with DEM-Engine on GPUs.
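
Since the claim above is that LBM is easy to implement, here is a minimal 
sketch (Python/NumPy) of a D2Q9 BGK step with full-way bounce-back on a solid 
mask. The fixed disk is only a stand-in for the DEM-particle-shadowed cells (in 
a co-simulation the mask would be rebuilt from particle positions every step), 
the forcing is the simplest shifted-equilibrium kind, and the boundaries are 
periodic; a teaching toy under those assumptions, not a production scheme.

    import numpy as np

    NX, NY, TAU, FX = 200, 100, 0.6, 1e-6   # grid, BGK relaxation time, body force
    # D2Q9 lattice: discrete velocities, weights, opposite directions (for bounce-back)
    cx = np.array([0, 1, 0, -1, 0, 1, -1, -1, 1])
    cy = np.array([0, 0, 1, 0, -1, 1, 1, -1, -1])
    w = np.array([4/9] + [1/9]*4 + [1/36]*4)
    opp = np.array([0, 3, 4, 1, 2, 7, 8, 5, 6])

    # solid mask: a disk standing in for cells shadowed by DEM particles
    X, Y = np.meshgrid(np.arange(NX), np.arange(NY), indexing="ij")
    solid = (X - NX // 4) ** 2 + (Y - NY // 2) ** 2 < 10 ** 2

    f = np.ones((9, NX, NY)) * w[:, None, None]     # populations, fluid at rest (rho = 1)
    for step in range(1000):
        rho = f.sum(axis=0)
        ux = ((cx[:, None, None] * f).sum(axis=0) + TAU * FX) / rho  # shifted-equilibrium forcing
        uy = (cy[:, None, None] * f).sum(axis=0) / rho
        cu = cx[:, None, None] * ux + cy[:, None, None] * uy
        feq = w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*(ux**2 + uy**2))
        fpost = f - (f - feq) / TAU                 # BGK collision
        fpost[:, solid] = f[opp][:, solid]          # bounce-back on solid cells
        for i in range(9):                          # streaming (periodic via np.roll)
            f[i] = np.roll(np.roll(fpost[i], cx[i], axis=0), cy[i], axis=1)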

The downside is that LBM is certainly much less used and appreciated than, say, 
FVM. While it should be very serviceable for you, convincing the community 
might be another issue. You could, of course, implement an FVM solver 
yourself; it's again very doable if you don't aim too high. It really doesn't 
matter if 
it's fully on GPU and only exchanges device arrays with DEM (“implementing a 
fluid model within DEM itself”), or if it brings data to the host for 
processing; I think in the grand scheme of such an ambitious project, it's a 
minor issue and we can always resolve it later if it matters.

As for the software we can provide: publishing a GPU-based LBM solver is a 
possibility in the longer term, but you have a PhD to finish, so it doesn't 
seem like you can wait for us. You could write it yourself, as making one that 
is at least usable is not too hard. I do have a plan to provide a 
performance-centric FEA/FVM-based fluid solver on GPU relatively soon. If you 
are going to spend a couple of months looking into the DEM model before having 
to consider the fluid, then the timing may line up. It should naturally go well 
in co-simulation with DEM-Engine or Chrono, as it's from the same family and 
also allows for step-wise advancing of the simulation. However, as it stands, 
we cannot make promises about a ready-to-use DEM-capable fluid solution right 
now.

Let me know if you have further questions,
Ruochun
On Friday, August 22, 2025 at 1:23:00 AM UTC+8 [email protected] wrote:
Thank you for your reply,
I was already aware of the possibilities offered by Chrono, but I necessarily 
have to continue using DEM-Engine, since my entire master's thesis was 
developed on it. By customizing the CUDA kernels, I was able to implement a 
thermal model 
and modify the electrostatic one. The goal was to build a comprehensive 
regolith model, not just a mechanical one, and moving to Chrono would mean 
losing this work. For my PhD, I will also need to extend what I have done so 
far to include interactions with plasma, which makes it essential to keep the 
electrostatic model. Thank you again for your response and for the suggestion 
regarding DEM-LBM. I now look forward to any comments from Ruochun as well.

Best regards,
Sabrina
On Thursday, 21 August 2025 at 18:46:13 UTC+2, Dan Negrut wrote:
Sabrina,
In theory, you can do this with the SPH solver in Chrono; hopefully my 
colleague Radu will comment on this. It would require very long sim times, 
because the number of SPH particles needed to capture the dynamics of the 
grains would be really large.
Another way to do it is DEM-LBM. Chrono has no support for this, and no plans 
to implement it in the immediate future. The sim times would probably be very 
long, 
but it’d be a nice approach. If Ruochun sees this, he might comment on this 
idea.
Lastly, you can homogenize this and represent the regolith–fluid interactions 
through a continuum and then use the CRM solver in Chrono. You’d need to have 
the right material model, which means that you’ll have to go beyond the 
hypo-elastoplastic material model that we have there right now (Drucker-Prager 
plasticity, with no cap).
Dan
---------------------------------------------
Bernard A. and Frances M. Weideman Professor
NVIDIA CUDA Fellow
Department of Mechanical Engineering
Department of Computer Science
University of Wisconsin - Madison
4150ME, 1513 University Avenue
Madison, WI 53706-1572
608 772 0914
http://sbel.wisc.edu/
http://projectchrono.org/
---------------------------------------------

From: [email protected] <[email protected]> On Behalf Of 
Sabrina Lanfranco
Sent: Thursday, August 21, 2025 11:36 AM
To: ProjectChrono <[email protected]>
Subject: [chrono] DEM-Engine SPH model integration


Hello everyone,
I am currently using DEM-Engine to model planetary regolith in scenarios 
involving interactions with space exploration objects. I would now like to 
extend this modeling to include the study of regolith–fluid interactions.

In your opinion, what would be the most convenient approach: integrating DEM 
with a solver such as DualSPHysics, or directly implementing a fluid model 
within DEM itself? In both cases, this would require modifications to the DEM 
codebase. That is why I am writing here, hoping to get some feedback from the 
developers: perhaps there is already something undocumented, or maybe you have 
already considered an approach in this direction.