-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/46427/#review129761
-----------------------------------------------------------




ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/service_check.py
 (line 163)
<https://reviews.apache.org/r/46427/#comment193299>

    Doesn't succeed as smoke user.
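
For context, the check under discussion is the HSI port probe in service_check.py. A minimal sketch of that kind of TCP liveness probe (the function name and the host/port values below are illustrative, not taken from the patch):

```python
import socket

def is_port_open(host, port, timeout=5):
    """Probe liveness by attempting a TCP connection to host:port."""
    try:
        # create_connection resolves the host and connects, or raises OSError
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Illustrative usage, mirroring the HSI address/port seen in the log below:
# is_port_open("c6402.ambari.apache.org", 10500)
```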


- Swapan Shridhar


On April 20, 2016, 6:15 p.m., Swapan Shridhar wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/46427/
> -----------------------------------------------------------
> 
> (Updated April 20, 2016, 6:15 p.m.)
> 
> 
> Review request for Ambari, Alejandro Fernandez and Sumit Mohanty.
> 
> 
> Bugs: AMBARI-15985
>     https://issues.apache.org/jira/browse/AMBARI-15985
> 
> 
> Repository: ambari
> 
> 
> Description
> -------
> 
> Service Checks for Hive Server Interactive and LLAP
>  - Checks Port for HSI.
>  - Issues DB queries to ascertain the liveness of LLAP.
> 
> 
> Diffs
> -----
> 
>   
> ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/files/hiveLlapSmoke.sh
>  PRE-CREATION 
>   
> ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/params_linux.py
>  2414e8b 
>   
> ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/service_check.py
>  251e71f 
> 
> Diff: https://reviews.apache.org/r/46427/diff/
> 
> 
> Testing
> -------
> 
> - Tested with HSI installed and not installed.
> - New service checks added pass.
> 
> - Python UT passes.
> 
> ----------------------------------------------------------------------
> Ran 261 tests in 6.542s
> 
> OK
> ----------------------------------------------------------------------
> Total run:994
> Total errors:0
> Total failures:0
> OK
> 
> 
> Service Check output:
> =====================
> 
> 
> 2016-04-20 07:22:11,987 - Using hadoop conf dir: 
> /usr/hdp/current/hadoop-client/conf
> 2016-04-20 07:22:12,035 - call['ambari-python-wrap /usr/bin/hdp-select status 
> hive-server2'] {'timeout': 20}
> 2016-04-20 07:22:12,059 - call returned (0, 'hive-server2 - 2.5.0.0-157')
> 2016-04-20 07:22:12,064 - 
> 
> 
> 2016-04-20 07:22:12,064 - Running Hive Server checks
> 2016-04-20 07:22:12,065 - --------------------------
> 
> 2016-04-20 07:22:12,066 - Server Address List : ['c6402.ambari.apache.org'], 
> Port : 10000
> 2016-04-20 07:22:12,067 - Waiting for the Hive Server to start...
> 2016-04-20 07:22:12,067 - Execute['! beeline -u 
> 'jdbc:hive2://c6402.ambari.apache.org:10000/;transportMode=binary' -e '' 
> 2>&1| awk '{print}'|grep -i -e 'Connection refused' -e 'Invalid URL''] 
> {'path': ['/bin/', '/usr/bin/', '/usr/lib/hive/bin/', '/usr/sbin/'], 'user': 
> 'ambari-qa', 'timeout': 30}
> 2016-04-20 07:22:14,253 - Successfully connected to c6402.ambari.apache.org 
> on port 10000
> 2016-04-20 07:22:14,253 - Successfully stayed connected to Hive Server at 
> c6402.ambari.apache.org on port 10000 after 2.18644499779 seconds
> 2016-04-20 07:22:14,253 - 
> 
> 
> 2016-04-20 07:22:14,253 - Running Hive Server2 checks
> 2016-04-20 07:22:14,253 - --------------------------
> 
> 2016-04-20 07:22:14,255 - Server Address List : ['c6402.ambari.apache.org'], 
> Port : 10500
> 2016-04-20 07:22:14,255 - Waiting for the Hive Server2 to start...
> 2016-04-20 07:22:14,256 - Execute['! beeline -u 
> 'jdbc:hive2://c6402.ambari.apache.org:10500/;transportMode=binary' -e '' 
> 2>&1| awk '{print}'|grep -i -e 'Connection refused' -e 'Invalid URL''] 
> {'path': ['/bin/', '/usr/bin/', '/usr/lib/hive/bin/', '/usr/sbin/'], 'user': 
> 'ambari-qa', 'timeout': 30}
> 2016-04-20 07:22:16,689 - Successfully connected to c6402.ambari.apache.org 
> on port 10500
> 2016-04-20 07:22:16,689 - Successfully stayed connected to Hive Server2 at 
> c6402.ambari.apache.org on port 10500 after 2.43372511864 seconds
> 2016-04-20 07:22:16,689 - 
> 
> 
> 2016-04-20 07:22:16,689 - Running LLAP checks
> 2016-04-20 07:22:16,689 - -------------------
> 
> 2016-04-20 07:22:16,690 - File['/var/lib/ambari-agent/tmp/hiveLlapSmoke.sh'] 
> {'content': StaticFile('hiveLlapSmoke.sh'), 'mode': 0755}
> 2016-04-20 07:22:16,692 - checked_call['hostid'] {}
> 2016-04-20 07:22:16,696 - checked_call returned (0, 'a8c06640')
> 2016-04-20 07:22:16,697 - checked_call['env JAVA_HOME=/usr/jdk64/jdk1.8.0_60 
> /var/lib/ambari-agent/tmp/hiveLlapSmoke.sh llap_smoke_ida8c06640_date222016 
> prepare'] {'logoutput': True, 'try_sleep': 5, 'wait_for_finish': True, 
> 'tries': 2, 'user': 'hive', 'stderr': -1, 'path': ['/usr/sbin', 
> '/usr/local/bin', '/bin', '/usr/bin', 
> '/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent:/usr/hdp/current/hive-client/bin:/usr/hdp/current/hadoop-client/bin']}
> which: no hbase in 
> (/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent:/usr/sbin:/usr/local/bin:/bin:/usr/bin:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent:/usr/hdp/current/hive-client/bin:/usr/hdp/current/hadoop-client/bin)
> WARNING: Use "yarn jar" to launch YARN applications.
> Java HotSpot(TM) 64-Bit Server VM warning: Using the ParNew young collector 
> with the Serial old collector is deprecated and will likely be removed in a 
> future release
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in 
> [jar:file:/usr/hdp/2.5.0.0-157/hive2/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in 
> [jar:file:/usr/hdp/2.5.0.0-157/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
> explanation.
> SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
> 
> Logging initialized using configuration in 
> jar:file:/usr/hdp/2.5.0.0-157/hive2/lib/hive-common-2.1.0.2.5.0.0-157.jar!/hive-log4j2.properties
> OK
> Time taken: 19.924 seconds
> Query ID = hive_20160420072245_8d9ea0e3-430a-4a12-a929-692af32c5dd2
> Total jobs = 1
> Launching Job 1 out of 1
> 
> 
> Status: Running (Executing on YARN cluster with App id 
> application_1461121720303_0015)
> 
> Map 1: -/-    Reducer 2: 0/2  
> Map 1: 0/1    Reducer 2: 0/2  
> Map 1: 0(+1)/1        Reducer 2: 0/2  
> Map 1: 1/1    Reducer 2: 0/2  
> Map 1: 1/1    Reducer 2: 0(+1)/2      
> Map 1: 1/1    Reducer 2: 1(+1)/2      
> Map 1: 1/1    Reducer 2: 2/2  
> Status: DAG finished successfully in 2.39 seconds
> 
> 
> Query Execution Summary
> ----------------------------------------------------------------------------------------------
> OPERATION                            DURATION
> ----------------------------------------------------------------------------------------------
> Compile Query                           3.79s
> Prepare Plan                            0.83s
> Submit Plan                             1.55s
> Start                                   1.95s
> Finish                                  2.39s
> ----------------------------------------------------------------------------------------------
> 
> Task Execution Summary
> ----------------------------------------------------------------------------------------------
>   VERTICES   DURATION(ms)  CPU_TIME(ms)  GC_TIME(ms)  
> INPUT_RECORDS  OUTPUT_RECORDS
> ----------------------------------------------------------------------------------------------
>      Map 1         266.00             0            0              2           
>     2
>  Reducer 2        1027.00             0            0              2           
>     0
> ----------------------------------------------------------------------------------------------
> 
> LLAP IO Summary
> ----------------------------------------------------------------------------------------------
>   VERTICES ROWGROUPS  META_HIT  META_MISS  DATA_HIT  DATA_MISS  
> ALLOCATION     USED  TOTAL_IO
> ----------------------------------------------------------------------------------------------
>      Map 1         0         0          0        0B         0B          0B    
>    0B     0.00s
> ----------------------------------------------------------------------------------------------
> 
> 
> Loading data to table default.llap_smoke_ida8c06640_date222016
> Table default.llap_smoke_ida8c06640_date222016 stats: [numFiles=2, numRows=2, 
> totalSize=672, rawDataSize=204]
> OK
> Time taken: 11.164 seconds
> OK
> 2
> Time taken: 1.016 seconds, Fetched: 1 row(s)
> 2016-04-20 07:22:58,631 - checked_call returned (0, '2', 'which: no hbase in 
> (/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent:/usr/sbin:/usr/local/bin:/bin:/usr/bin:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent:/usr/hdp/current/hive-client/bin:/usr/hdp/current/hadoop-client/bin)\nWARNING:
>  Use "yarn jar" to launch YARN applications.\nJava HotSpot(TM) 64-Bit Server 
> VM warning: Using the ParNew young collector with the Serial old collector is 
> deprecated and will likely be removed in a future release\nSLF4J: Class path 
> contains multiple SLF4J bindings.\nSLF4J: Found binding in 
> [jar:file:/usr/hdp/2.5.0.0-157/hive2/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]\nSLF4J
>  : Found binding in 
> [jar:file:/usr/hdp/2.5.0.0-157/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]\nSLF4J:
>  See http://www.slf4j.org/codes.html#multiple_bindings for an 
> explanation.\nSLF4J: Actual binding is of type 
> [org.apache.logging.slf4j.Log4jLoggerFactory]\n\nLogging initialized using 
> configuration in 
> jar:file:/usr/hdp/2.5.0.0-157/hive2/lib/hive-common-2.1.0.2.5.0.0-157.jar!/hive-log4j2.properties\nOK\nTime
>  taken: 19.924 seconds\nQuery ID = 
> hive_20160420072245_8d9ea0e3-430a-4a12-a929-692af32c5dd2\nTotal jobs = 
> 1\nLaunching Job 1 out of 1\n\n\nStatus: Running (Executing on YARN cluster 
> with App id application_1461121720303_0015)\n\nMap 1: -/-\tReducer 2: 
> 0/2\t\nMap 1: 0/1\tReducer 2: 0/2\t\nMap 1: 0(+1)/1\tReducer 2: 0/2\t\nMap 1: 
> 1/1\tReducer 2: 0/2\t\nMap 1: 1/1\tReducer 2: 0(+1)/2\t\nMap 1: 1/1\tReducer 2: 
> 1(+1)/2\t\nMap 1: 1/1\tReducer 2: 2/2\t\nStatus: DAG finished successfully in 
> 2.39 seconds\n\n\nQuery Execution Summary\n-----------------------
>  
> -----------------------------------------------------------------------\n\x1b[2K\x1b[36;1mOPERATION
>                             
> DURATION\n\x1b[22;0m----------------------------------------------------------------------------------------------\nCompile
>  Query                           3.79s\nPrepare Plan                            
> 0.83s\nSubmit Plan                             1.55s\nStart                     
>               1.95s\nFinish                                  
> 2.39s\n----------------------------------------------------------------------------------------------\n\nTask
>  Execution 
> Summary\n----------------------------------------------------------------------------------------------\n\x1b[2K\x1b[36;1m
>   VERTICES   DURATION(ms)  CPU_TIME(ms)  GC_TIME(ms)  INPUT_RECORDS  
> OUTPUT_RECORDS\n\x1b[22;0m----------------------------------------------------------------------------------------------\n
>      Map 1         266.00             0            0              2             
>   2\n Reducer
>   2        1027.00             0            0              2               
> 0\n----------------------------------------------------------------------------------------------\n\nLLAP
>  IO 
> Summary\n----------------------------------------------------------------------------------------------\n\x1b[2K\x1b[36;1m
>   VERTICES ROWGROUPS  META_HIT  META_MISS  DATA_HIT  DATA_MISS  ALLOCATION     
> USED  
> TOTAL_IO\n\x1b[22;0m----------------------------------------------------------------------------------------------\n
>      Map 1         0         0          0        0B         0B          0B      
>  0B     
> 0.00s\n----------------------------------------------------------------------------------------------\n\n\nLoading
>  data to table default.llap_smoke_ida8c06640_date222016\nTable 
> default.llap_smoke_ida8c06640_date222016 stats: [numFiles=2, numRows=2, 
> totalSize=672, rawDataSize=204]\nOK\nTime taken: 11.164 seconds\nOK\nTime 
> taken: 1.016 seconds, Fetched: 1 row(s)')
> 2016-04-20 07:22:58,632 - 
> 
> 
> 2016-04-20 07:22:58,632 - Running HCAT checks
> 2016-04-20 07:22:58,632 - -------------------
> 
> 2016-04-20 07:22:58,633 - checked_call['hostid'] {}
> 2016-04-20 07:22:58,664 - checked_call returned (0, 'a8c06640')
> 2016-04-20 07:22:58,665 - File['/var/lib/ambari-agent/tmp/hcatSmoke.sh'] 
> {'content': StaticFile('hcatSmoke.sh'), 'mode': 0755}
> 2016-04-20 07:22:58,671 - Execute['env JAVA_HOME=/usr/jdk64/jdk1.8.0_60 
> /var/lib/ambari-agent/tmp/hcatSmoke.sh hcatsmokeida8c06640_date222016 prepare 
> true'] {'logoutput': True, 'path': ['/usr/sbin', '/usr/local/bin', '/bin', 
> '/usr/bin', 
> '/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent:/usr/hdp/current/hive-client/bin:/usr/hdp/current/hadoop-client/bin'],
>  'tries': 3, 'user': 'ambari-qa', 'try_sleep': 5}
> WARNING: Use "yarn jar" to launch YARN applications.
> OK
> t2
> t3
> Time taken: 2.882 seconds
> WARNING: Use "yarn jar" to launch YARN applications.
> OK
> Time taken: 1.913 seconds
> WARNING: Use "yarn jar" to launch YARN applications.
> OK
> Time taken: 2.447 seconds
> 2016-04-20 07:23:22,748 - ExecuteHadoop['fs -test -e 
> /apps/hive/warehouse/hcatsmokeida8c06640_date222016'] {'logoutput': True, 
> 'bin_dir': 
> '/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent:/usr/hdp/current/hive-client/bin:/usr/hdp/current/hadoop-client/bin',
>  'user': 'hdfs', 'conf_dir': '/usr/hdp/current/hadoop-client/conf'}
> 2016-04-20 07:23:22,750 - Execute['hadoop --config 
> /usr/hdp/current/hadoop-client/conf fs -test -e 
> /apps/hive/warehouse/hcatsmokeida8c06640_date222016'] {'logoutput': True, 
> 'try_sleep': 0, 'environment': {}, 'tries': 1, 'user': 'hdfs', 'path': 
> ['/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent:/usr/hdp/current/hive-client/bin:/usr/hdp/current/hadoop-client/bin']}
> 2016-04-20 07:23:26,116 - Execute[' /var/lib/ambari-agent/tmp/hcatSmoke.sh 
> hcatsmokeida8c06640_date222016 cleanup true'] {'logoutput': True, 'path': 
> ['/usr/sbin', '/usr/local/bin', '/bin', '/usr/bin', 
> '/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent:/usr/hdp/current/hive-client/bin:/usr/hdp/current/hadoop-client/bin'],
>  'tries': 3, 'user': 'ambari-qa', 'try_sleep': 5}
> WARNING: Use "yarn jar" to launch YARN applications.
> OK
> Time taken: 2.41 seconds
> 2016-04-20 07:23:33,199 - 
> 
> 
> 2016-04-20 07:23:33,200 - Running WEBHCAT checks
> 2016-04-20 07:23:33,200 - ---------------------
> 
> 2016-04-20 07:23:33,201 - File['/var/lib/ambari-agent/tmp/templetonSmoke.sh'] 
> {'content': StaticFile('templetonSmoke.sh'), 'mode': 0755}
> 2016-04-20 07:23:33,208 - 
> File['/var/lib/ambari-agent/tmp/idtest.ambari-qa.1461137013.2.pig'] {'owner': 
> 'hdfs', 'content': Template('templeton_smoke.pig.j2')}
> 2016-04-20 07:23:33,209 - Writing 
> File['/var/lib/ambari-agent/tmp/idtest.ambari-qa.1461137013.2.pig'] because 
> it doesn't exist
> 2016-04-20 07:23:33,211 - Changing owner for 
> /var/lib/ambari-agent/tmp/idtest.ambari-qa.1461137013.2.pig from 0 to hdfs
> 2016-04-20 07:23:33,212 - 
> HdfsResource['/tmp/idtest.ambari-qa.1461137013.2.pig'] {'security_enabled': 
> False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': 
> [EMPTY], 'source': 
> '/var/lib/ambari-agent/tmp/idtest.ambari-qa.1461137013.2.pig', 'dfs_type': 
> '', 'default_fs': 'hdfs://c6402.ambari.apache.org:8020', 
> 'hdfs_resource_ignore_file': 
> '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 
> 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 
> 'hdfs', 'owner': 'ambari-qa', 'hadoop_conf_dir': 
> '/usr/hdp/current/hadoop-client/conf', 'type': 'file', 'action': 
> ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', 
> u'/mr-history/done', u'/app-logs', u'/tmp']}
> 2016-04-20 07:23:33,216 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 
> 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET 
> '"'"'http://c6402.ambari.apache.org:50070/webhdfs/v1/tmp/idtest.ambari-qa.1461137013.2.pig?op=GETFILESTATUS&user.name=hdfs'"'"'
>  1>/tmp/tmpoaRcSF 2>/tmp/tmpR8QhoN''] {'logoutput': None, 'quiet': False}
> 2016-04-20 07:23:33,248 - call returned (0, '')
> 2016-04-20 07:23:33,249 - Creating new file 
> /tmp/idtest.ambari-qa.1461137013.2.pig in DFS
> 2016-04-20 07:23:33,249 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 
> 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT -T 
> /var/lib/ambari-agent/tmp/idtest.ambari-qa.1461137013.2.pig 
> '"'"'http://c6402.ambari.apache.org:50070/webhdfs/v1/tmp/idtest.ambari-qa.1461137013.2.pig?op=CREATE&user.name=hdfs&overwrite=True'"'"'
>  1>/tmp/tmpxQAovU 2>/tmp/tmpxJuz4Q''] {'logoutput': None, 'quiet': False}
> 2016-04-20 07:23:33,391 - call returned (0, '')
> 2016-04-20 07:23:33,393 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 
> 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT 
> '"'"'http://c6402.ambari.apache.org:50070/webhdfs/v1/tmp/idtest.ambari-qa.1461137013.2.pig?op=SETOWNER&user.name=hdfs&owner=ambari-qa&group='"'"'
>  1>/tmp/tmp4ldXiK 2>/tmp/tmpIAfHEU''] {'logoutput': None, 'quiet': False}
> 2016-04-20 07:23:33,444 - call returned (0, '')
> 2016-04-20 07:23:33,446 - 
> HdfsResource['/tmp/idtest.ambari-qa.1461137013.2.in'] {'security_enabled': 
> False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': 
> [EMPTY], 'source': '/etc/passwd', 'dfs_type': '', 'default_fs': 
> 'hdfs://c6402.ambari.apache.org:8020', 'hdfs_resource_ignore_file': 
> '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 
> 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 
> 'hdfs', 'owner': 'ambari-qa', 'hadoop_conf_dir': 
> '/usr/hdp/current/hadoop-client/conf', 'type': 'file', 'action': 
> ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', 
> u'/mr-history/done', u'/app-logs', u'/tmp']}
> 2016-04-20 07:23:33,447 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 
> 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET 
> '"'"'http://c6402.ambari.apache.org:50070/webhdfs/v1/tmp/idtest.ambari-qa.1461137013.2.in?op=GETFILESTATUS&user.name=hdfs'"'"'
>  1>/tmp/tmpUsJQfS 2>/tmp/tmppDeIX8''] {'logoutput': None, 'quiet': False}
> 2016-04-20 07:23:33,470 - call returned (0, '')
> 2016-04-20 07:23:33,471 - Creating new file 
> /tmp/idtest.ambari-qa.1461137013.2.in in DFS
> 2016-04-20 07:23:33,471 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 
> 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT -T /etc/passwd 
> '"'"'http://c6402.ambari.apache.org:50070/webhdfs/v1/tmp/idtest.ambari-qa.1461137013.2.in?op=CREATE&user.name=hdfs&overwrite=True'"'"'
>  1>/tmp/tmpVmK5JJ 2>/tmp/tmpfSyVVj''] {'logoutput': None, 'quiet': False}
> 2016-04-20 07:23:33,571 - call returned (0, '')
> 2016-04-20 07:23:33,572 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 
> 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT 
> '"'"'http://c6402.ambari.apache.org:50070/webhdfs/v1/tmp/idtest.ambari-qa.1461137013.2.in?op=SETOWNER&user.name=hdfs&owner=ambari-qa&group='"'"'
>  1>/tmp/tmpF_YVbM 2>/tmp/tmpKTftBM''] {'logoutput': None, 'quiet': False}
> 2016-04-20 07:23:33,600 - call returned (0, '')
> 2016-04-20 07:23:33,601 - HdfsResource[None] {'security_enabled': False, 
> 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 
> 'dfs_type': '', 'default_fs': 'hdfs://c6402.ambari.apache.org:8020', 
> 'hdfs_resource_ignore_file': 
> '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 
> 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 
> 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': 
> '/usr/hdp/current/hadoop-client/conf', 'immutable_paths': 
> [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp']}
> 2016-04-20 07:23:33,601 - 
> Execute['/var/lib/ambari-agent/tmp/templetonSmoke.sh c6402.ambari.apache.org 
> ambari-qa 50111 idtest.ambari-qa.1461137013.2.pig no_keytab false kinit 
> no_principal'] {'logoutput': True, 'path': 
> ['/usr/sbin:/sbin:/usr/local/bin:/bin:/usr/bin'], 'tries': 3, 'try_sleep': 5}
> 
> 
> Thanks,
> 
> Swapan Shridhar
> 
>
