-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/30986/
-----------------------------------------------------------

Review request for Ambari and Vitalyi Brodetskyi.


Bugs: AMBARI-9618
    https://issues.apache.org/jira/browse/AMBARI-9618


Repository: ambari


Description
-------

Add checks to Ambaripreupload.py to minimize the chance of tarballs being
uploaded in parallel.

We should add the following checks:

    if dir exists wasb:///hdp/apps/{{ hdp_stack_version }}/mapreduce/:
        copy_tarballs_to_hdfs("/usr/hdp/current/hadoop-client/mapreduce.tar.gz",
            "wasb:///hdp/apps/{{ hdp_stack_version }}/mapreduce/",
            'hadoop-mapreduce-historyserver', params.mapred_user, params.hdfs_user,
            params.user_group)

    if dir exists wasb:///hdp/apps/{{ hdp_stack_version }}/sqoop/:
        copy_tarballs_to_hdfs("/usr/hdp/current/sqoop-client/sqoop.tar.gz",
            "wasb:///hdp/apps/{{ hdp_stack_version }}/sqoop/",
            'hadoop-mapreduce-historyserver', params.mapred_user, params.hdfs_user,
            params.user_group)

We should do this for every copy_tarballs_to_hdfs call in the file: check
whether the destination directory exists, and only then copy the tarball.
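The guard described above could be sketched as follows. This is only an
illustration of the pattern, not the actual Ambaripreupload.py code: the
helper names dir_exists and copy_if_dir_exists are hypothetical, and the
existence check here shells out to `hadoop fs -test -d`, which exits 0 when
the directory is present.

```python
import subprocess


def dir_exists(uri):
    # `hadoop fs -test -d <uri>` exits with status 0 iff the directory exists.
    return subprocess.call(["hadoop", "fs", "-test", "-d", uri]) == 0


def copy_if_dir_exists(local_tarball, dest_dir, copy_fn, exists_fn=dir_exists):
    # Copy the tarball only when the destination directory already exists,
    # per the check described above; otherwise skip the upload entirely.
    # Returns True if the copy was performed, False if it was skipped.
    if not exists_fn(dest_dir):
        return False
    copy_fn(local_tarball, dest_dir)
    return True
```

In the real script, copy_fn would be the existing copy_tarballs_to_hdfs call;
exists_fn is injectable here only to keep the sketch testable without a
running HDFS/WASB endpoint.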

The reason for doing this is that while Ambari is installing and uploading
the tarballs, the Ambaripreupload.py script is then less likely to create a
race condition on the tarball upload.


Diffs
-----

  ambari-server/src/main/resources/scripts/Ambaripreupload.py 89e0742 

Diff: https://reviews.apache.org/r/30986/diff/


Testing
-------

mvn clean test


Thanks,

Andrew Onischuk
