OK, I know what you mean. But I have tested this pipeline, in which the shared data is not 
synchronized, many times (more than 700 runs, using a periodic build to run it automatically). 
The unsynchronized shared data was processed correctly every time. Have you actually run into 
a situation where shared data gets corrupted in a parallel task?

In some scenarios I want to build, in parallel, several artifacts for different platforms from 
the same source code, and then test all of the artifacts in parallel on their corresponding 
platforms (one artifact might be used on more than one platform).

build platforms: A, B, C
test platforms: A (A1, A2, A3), B (B1, B2), C (C1, C2)

                
source code --> A --> A1
                  --> A2
                  --> A3
            --> B --> B1
                  --> B2
            --> C --> C1
                  --> C2

The pipeline code looks like this:

pipeline {
    agent any
    stages {
        stage('Parallel Build') {
            steps {
                script {
                    def builds = [:]
                    stash name: 'src', includes: 'src/**'

                    def build_action = { platform, tests ->
                        // build on some agent node
                        node("${platform}") {
                            // does unstash need a thread mutex lock?
                            unstash name: 'src'
                            sh "make ${platform}"
                            // does archive need a thread mutex lock?
                            archive includes: "${platform}/**"
                        }
                        // nested parallel, outside of the node block
                        parallel tests
                    }

                    def test_action = { platform ->
                        // test on some agent node
                        node("${platform}") {
                            sh "make ${platform}-test"
                        }
                    }

                    def testsA = [:]
                    testsA["A1"] = { test_action("A1") }
                    testsA["A2"] = { test_action("A2") }
                    testsA["A3"] = { test_action("A3") }
                    builds["A"] = { build_action("A", testsA) }

                    def testsB = [:]
                    testsB["B1"] = { test_action("B1") }
                    testsB["B2"] = { test_action("B2") }
                    builds["B"] = { build_action("B", testsB) }

                    def testsC = [:]
                    testsC["C1"] = { test_action("C1") }
                    testsC["C2"] = { test_action("C2") }
                    builds["C"] = { build_action("C", testsC) }

                    parallel builds
                }
            }
        }
    }
}

I have been running this pipeline for some time and it works well. But I worry about the 
thread safety of unstash (and archive, and other steps). I want to find some evidence that 
Jenkins guarantees thread safety for these parallel steps and that nested parallel is OK. 
Otherwise I would have to split these platforms into different build jobs, which I think makes 
the projects harder to manage, on the assumption that Jenkins guarantees more safety for 
parallel jobs than for parallel steps. I just don't know what Jenkins does under the hood for 
parallel; I'm simply curious about this thread safety. If Jenkins takes care of this 
synchronization, that is the best case and I don't need to care about these threading issues. 
If not, I want to find the correct way to run parallel tasks.
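
For illustration only, here is a rough sketch I have not tested of what an explicit mutex 
could look like; it assumes the Lockable Resources plugin is installed and uses a made-up 
resource name 'archive-lock'. The lock step serializes the enclosed block across parallel 
branches:

def branches = [:]
for (p in ['A', 'B', 'C']) {
    def platform = p  // copy the loop variable so each closure captures its own value
    branches[platform] = {
        node(platform) {
            unstash name: 'src'
            sh "make ${platform}"
            // only one branch at a time enters this block
            lock('archive-lock') {
                archive includes: "${platform}/**"
            }
        }
    }
}
parallel branches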


------------------ Original Message ------------------
From: "jenkinsci-users" <kuisathave...@gmail.com>
Date: Sunday, October 31, 2021, 7:19 PM
To: "Jenkins Users" <jenkinsci-users@googlegroups.com>
Subject: Re: thread safety of scripted pipeline parallel and usage of nested parallel



The following example should work (I did not test it). In my case I have used maps like the 
"results" variable: a plain, unsynchronized map that stores data from all tasks but is never 
read by the different tasks. The other two cases, "data" and "mapSync", use concurrent 
classes; they are thread safe and synchronized, so you can share data across tasks. I don't 
know whether they are on the allow list for pipelines; if not, you have to approve their use 
in the Jenkins configuration. Finally, the last part of the pipeline uses nested parallel 
tasks. From my experience that is not a good idea: the explosion of parallel tasks is a little 
hard to control. There are other solutions, such as launching a job from those tasks and, 
inside that job, launching the parallel tasks; this way you only have one level of 
parallelism, which is easier to control and to understand when something goes wrong.

import groovy.transform.Field
import java.util.concurrent.atomic.AtomicInteger
import java.util.concurrent.ConcurrentHashMap


@Field def results = [:]
@Field AtomicInteger data = new AtomicInteger()
@Field ConcurrentHashMap mapSync = new ConcurrentHashMap<String, Integer>()


pipeline {
    agent any
    stages {
        stage('Parallel Build') {
            steps {
                script {
                    def builds = [:]
                    mapSync["odd"] = 0
                    mapSync["even"] = 0
                    stash name: 'src', includes: 'src/**'
                    // generate 1000 parallel blocks
                    for (int i = 0; i < 1000; i++) {
                        def index = i  // copy the loop variable for the closure
                        // make the Map of Closures
                        builds["$index"] = {
                            results["$index"] = 1
                            data.incrementAndGet()
                            if (index % 2 == 0) {
                                mapSync["even"] = mapSync["even"] + 1
                            } else {
                                mapSync["odd"] = mapSync["odd"] + 1
                            }
                        }
                    }
                    parallel builds
                    println results.toString()
                    println data
                    println mapSync
                }
            }
        }
    }
}
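
If you go with the approach of launching a job from the parallel branches, a rough sketch 
(also untested; the job name 'platform-tests' and its PLATFORM parameter are made up) could 
look like this. The parent pipeline keeps a single parallel level, and the downstream job runs 
its own single-level parallel over its test platforms:

def builds = [:]
for (p in ['A', 'B', 'C']) {
    def platform = p  // copy the loop variable so each closure captures its own value
    builds[platform] = {
        // each branch delegates to a hypothetical parameterized downstream job,
        // and that job runs its own single-level 'parallel' over its test platforms
        build job: 'platform-tests',
              parameters: [string(name: 'PLATFORM', value: platform)],
              wait: true
    }
}
parallel builds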
On Saturday, October 30, 2021 at 18:44:57 UTC+2, abstrakta wrote:

Thanks for your reply.
So the parallel directive is like spawning some Java threads? Do you have any pipeline code 
that demonstrates this thread-safety issue and how to fix it using Java types that are thread 
safe?
I guess that steps such as stash, unstash and archive should be thread safe internally, 
because I have found articles that run unstash in parallel on different agent nodes without 
any thread protection. Is my guess correct? I can't find any other articles that discuss this 
parallel thread-safety issue.

---Original---
From: "Ivan Fernandez Calvo" <kuisat...@gmail.com>
Date: Sun, Oct 31, 2021 00:03 AM
To: "Jenkins Users" <jenkins...@googlegroups.com>
Subject: Re: thread safety of scripted pipeline parallel and usage of nested parallel



No. If you plan to use shared variables across parallel steps you should use Java types that 
are thread safe; otherwise you will get random/weird results. I have several pipelines that 
use a map to store results from parallel steps.

On Saturday, October 30, 2021 at 14:18:29 UTC+2, abstrakta wrote:

Hi, Jenkins friends. I hope this is the right place to post this Jenkins usage question.

I find that the Scripted Pipeline parallel step works like threads. Unlike the Declarative 
Pipeline parallel, the Scripted Pipeline parallel uses only one executor, and each closure 
passed to parallel works like a thread.
My questions are:

1. Does Jenkins guarantee the data thread safety of parallel closures internally?

2. Do I need to care about the thread safety of the steps that execute inside a scripted 
parallel closure?

3. Are there any restrictions on the steps that can execute inside parallel? Can I use nested 
scripted parallel? Why does the documentation of the Declarative Pipeline parallel in the 
Pipeline Syntax reference say "Note that a stage must have one and only one of steps, stages, 
parallel, or matrix. It is not possible to nest a parallel or matrix block within a stage 
directive if that stage directive is nested within a parallel or matrix block itself."?

I have tested some nested pipeline code, which might cause a thread race condition, many 
times. Jenkins always gives the right answer: the shared data is modified correctly. Is this 
thread safety guaranteed by the design of the Jenkins parallel step?

The pipeline code looks like this:

pipeline {
    agent any
    stages {
        stage('Parallel Build') {
            steps {
                script {
                    def i = 0
                    def data = 0
                    def builds = [:]
                    stash name: 'src', includes: 'src/**'
                    // generate 1000 parallel blocks
                    for (i = 0; i < 1000; i++) {
                        // make the Map of Closures
                        builds["$i"] = {
                            // modify shared data, need thread mutex lock?
                            data++
                            // unstash or other command, need thread mutex lock?
                            unstash name: 'src'
                            def tests = [:]
                            // ... generate tests Map
                            // Can I use nested parallel?
                            parallel tests
                        }
                    }
                    parallel builds
                    println data // it always prints 1000
                }
            }
        }
    }
}

The variable data is always modified to 1000. So does Jenkins guarantee the thread safety of 
parallel?

 

