ChaiBapchya commented on a change in pull request #16336: S3 upload artifacts
URL: https://github.com/apache/incubator-mxnet/pull/16336#discussion_r330252338
 
 

 ##########
 File path: ci/Jenkinsfile_utils.groovy
 ##########
 @@ -150,12 +149,11 @@ def collect_test_results_windows(original_file_name, new_file_name) {
     if (fileExists(original_file_name)) {
         bat 'xcopy ' + original_file_name + ' ' + new_file_name + '*'
         archiveArtifacts artifacts: new_file_name
-        if (env.BRANCH_NAME == "master") {
-          try {
-            s3Upload(file:new_file_name, bucket:env.MXNET_CI_UNITTEST_ARTIFACT_BUCKET, path:utils.get_git_commit_hash().trim()+"-"+env.BUILD_TAG+"/"+new_file_name)
-          } catch (Exception e) {
-            sh "S3 Upload failed ${e}"
-          }
+        try {
+          s3Upload(file:new_file_name, bucket:env.MXNET_CI_UNITTEST_ARTIFACT_BUCKET, path:env.BRANCH_NAME+"-"+utils.get_git_commit_hash().trim()+"-"+env.BUILD_TAG+"/"+new_file_name)
 
 Review comment:
   However, we would also have to take care of the unix-gpu, unix-cpu, etc. jobs.
   We could use a directory structure like this:
   ```
   PR-12345/22/
      -> unix-cpu/
           -> unittest.xml
           -> train.xml
      -> unix-gpu/
           -> unittest.xml
   ```
   And so on, following the pattern:
   
   `PR-number/build-number/job-name/files`
   
   Alternatively, we could segregate them by job first, then by PR, followed by build; a rough sketch of how the upload path could be built is below.
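
   A minimal Groovy sketch of what an upload helper could look like with the `PR-number/build-number/job-name/files` layout. This is a sketch only, not the PR's implementation: `job_name` is a hypothetical argument the caller would pass (e.g. `unix-cpu`); `env.BRANCH_NAME` (which resolves to e.g. `PR-12345` for pull request builds) and `env.BUILD_NUMBER` are standard Jenkins environment variables, and the bucket variable is the one already used in this file.

   ```groovy
   // Sketch only: job_name (e.g. "unix-cpu") is a hypothetical parameter supplied by the caller.
   def upload_test_artifact(file_name, job_name) {
       try {
           // Key layout: PR-number/build-number/job-name/file, e.g. PR-12345/22/unix-cpu/unittest.xml
           def s3_path = "${env.BRANCH_NAME}/${env.BUILD_NUMBER}/${job_name}/${file_name}"
           s3Upload(file: file_name, bucket: env.MXNET_CI_UNITTEST_ARTIFACT_BUCKET, path: s3_path)
       } catch (Exception e) {
           // Artifact upload is best-effort; do not fail the build on an S3 error.
           echo "S3 upload failed: ${e}"
       }
   }
   ```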
