-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/46427/#review129787
-----------------------------------------------------------


Ship it!




Ship It!

- Sumit Mohanty


On April 20, 2016, 7:37 p.m., Swapan Shridhar wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/46427/
> -----------------------------------------------------------
> 
> (Updated April 20, 2016, 7:37 p.m.)
> 
> 
> Review request for Ambari, Alejandro Fernandez and Sumit Mohanty.
> 
> 
> Bugs: AMBARI-15985
>     https://issues.apache.org/jira/browse/AMBARI-15985
> 
> 
> Repository: ambari
> 
> 
> Description
> -------
> 
> Service Checks for Hive Server Interactive and LLAP:
>  - Checks the port for HSI.
>  - Issues DB queries to ascertain liveness of LLAP (a minimal sketch follows 
> below).
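> 
> For reference, the check reduces to two steps: a reachability probe of the 
> HSI port and a smoke query routed through the LLAP-enabled HiveServer2 
> endpoint. The following is a minimal, illustrative Python sketch only; the 
> function names, host/port arguments, and smoke-table name are assumptions, 
> not the actual service_check.py code, which drives this through Ambari's 
> resource framework and the hiveLlapSmoke.sh script.
> 
>     import socket
>     import subprocess
> 
>     def check_hsi_port(host, port, timeout=30):
>         # Hypothetical probe: HSI is considered reachable if the TCP
>         # connect succeeds within the timeout.
>         sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
>         sock.settimeout(timeout)
>         try:
>             sock.connect((host, port))
>             return True
>         except socket.error:
>             return False
>         finally:
>             sock.close()
> 
>     def check_llap_liveness(host, port, smoke_table):
>         # Hypothetical liveness check: run a query against the
>         # LLAP-enabled HiveServer2 endpoint via beeline and treat a
>         # non-zero exit code as failure.
>         url = "jdbc:hive2://{0}:{1}/;transportMode=binary".format(host, port)
>         try:
>             subprocess.check_output(
>                 ["beeline", "-u", url, "-e",
>                  "SELECT COUNT(*) FROM {0}".format(smoke_table)])
>             return True
>         except subprocess.CalledProcessError:
>             return False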
> 
> 
> Diffs
> -----
> 
>   
> ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/files/hiveLlapSmoke.sh
>  PRE-CREATION 
>   
> ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/params_linux.py
>  2414e8b 
>   
> ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/service_check.py
>  251e71f 
> 
> Diff: https://reviews.apache.org/r/46427/diff/
> 
> 
> Testing
> -------
> 
> - Tested with HSI installed and not installed.
> - The newly added service checks pass.
> 
> - Python UTs pass.
> 
> ----------------------------------------------------------------------
> Ran 261 tests in 6.542s
> 
> OK
> ----------------------------------------------------------------------
> Total run:994
> Total errors:0
> Total failures:0
> OK
> 
> 
> Service Check output:
> =================
> 
> 
> 2016-04-20 19:30:27,256 - Using hadoop conf dir: 
> /usr/hdp/current/hadoop-client/conf
> 2016-04-20 19:30:27,308 - call['ambari-python-wrap /usr/bin/hdp-select status 
> hive-server2'] {'timeout': 20}
> 2016-04-20 19:30:27,336 - call returned (0, 'hive-server2 - 2.5.0.0-157')
> 2016-04-20 19:30:27,341 - Running Hive Server checks
> 2016-04-20 19:30:27,342 - --------------------------
> 
> 2016-04-20 19:30:27,342 - Server Address List : ['c6402.ambari.apache.org'], 
> Port : 10000
> 2016-04-20 19:30:27,342 - Waiting for the Hive Server to start...
> 2016-04-20 19:30:27,343 - Execute['! beeline -u 
> 'jdbc:hive2://c6402.ambari.apache.org:10000/;transportMode=binary' -e '' 
> 2>&1| awk '{print}'|grep -i -e 'Connection refused' -e 'Invalid URL''] 
> {'path': ['/bin/', '/usr/bin/', '/usr/lib/hive/bin/', '/usr/sbin/'], 'user': 
> 'ambari-qa', 'timeout': 30}
> 2016-04-20 19:30:30,016 - Successfully connected to c6402.ambari.apache.org 
> on port 10000
> 2016-04-20 19:30:30,016 - Successfully stayed connected to 'Hive Server' on 
> host: c6402.ambari.apache.org and port 10000 after 2.6735098362 seconds
> 2016-04-20 19:30:30,016 - Running Hive Server2 checks
> 2016-04-20 19:30:30,016 - --------------------------
> 
> 2016-04-20 19:30:30,017 - Server Address List : ['c6402.ambari.apache.org'], 
> Port : 10500
> 2016-04-20 19:30:30,017 - Waiting for the Hive Server2 to start...
> 2016-04-20 19:30:30,018 - Execute['! beeline -u 
> 'jdbc:hive2://c6402.ambari.apache.org:10500/;transportMode=binary' -e '' 
> 2>&1| awk '{print}'|grep -i -e 'Connection refused' -e 'Invalid URL''] 
> {'path': ['/bin/', '/usr/bin/', '/usr/lib/hive/bin/', '/usr/sbin/'], 'user': 
> 'ambari-qa', 'timeout': 30}
> 2016-04-20 19:30:34,219 - Successfully connected to c6402.ambari.apache.org 
> on port 10500
> 2016-04-20 19:30:34,219 - Successfully stayed connected to 'Hive Server2' on 
> host: c6402.ambari.apache.org and port 10500 after 4.2018828392 seconds
> 2016-04-20 19:30:34,219 - Running LLAP checks
> 2016-04-20 19:30:34,219 - -------------------
> 
> 2016-04-20 19:30:34,221 - File['/var/lib/ambari-agent/tmp/hiveLlapSmoke.sh'] 
> {'content': StaticFile('hiveLlapSmoke.sh'), 'mode': 0755}
> 2016-04-20 19:30:34,224 - checked_call['hostid'] {}
> 2016-04-20 19:30:34,235 - checked_call returned (0, 'a8c06640')
> 2016-04-20 19:30:34,236 - Execute['env JAVA_HOME=/usr/jdk64/jdk1.8.0_60 
> /var/lib/ambari-agent/tmp/hiveLlapSmoke.sh /usr/hdp 
> llap_smoke_ida8c06640_date302016 prepare'] {'logoutput': True, 'try_sleep': 
> 5, 'wait_for_finish': True, 'tries': 1, 'user': 'hive', 'stderr': -1, 'path': 
> ['/usr/sbin', '/usr/local/bin', '/bin', '/usr/bin', 
> '/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent:/usr/hdp/current/hive-client/bin:/usr/hdp/current/hadoop-client/bin']}
> which: no hbase in 
> (/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent:/usr/sbin:/usr/local/bin:/bin:/usr/bin:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent:/usr/hdp/current/hive-client/bin:/usr/hdp/current/hadoop-client/bin)
> WARNING: Use "yarn jar" to launch YARN applications.
> Java HotSpot(TM) 64-Bit Server VM warning: Using the ParNew young collector 
> with the Serial old collector is deprecated and will likely be removed in a 
> future release
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in 
> [jar:file:/usr/hdp/2.5.0.0-157/hive2/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in 
> [jar:file:/usr/hdp/2.5.0.0-157/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
> explanation.
> SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
> 
> Logging initialized using configuration in 
> jar:file:/usr/hdp/2.5.0.0-157/hive2/lib/hive-common-2.1.0.2.5.0.0-157.jar!/hive-log4j2.properties
> OK
> Time taken: 21.683 seconds
> Query ID = hive_20160420193104_47af1752-fbeb-4d7e-afd3-f680c12a0f5d
> Total jobs = 1
> Launching Job 1 out of 1
> 
> 
> Status: Running (Executing on YARN cluster with App id 
> application_1461173999701_0012)
> 
> Map 1: -/-    Reducer 2: 0/2  
> Map 1: 0/1    Reducer 2: 0/2  
> Map 1: 0(+1)/1        Reducer 2: 0/2  
> Map 1: 1/1    Reducer 2: 0/2  
> Map 1: 1/1    Reducer 2: 0(+1)/2      
> Map 1: 1/1    Reducer 2: 1(+1)/2      
> Map 1: 1/1    Reducer 2: 2/2  
> Status: DAG finished successfully in 4.81 seconds
> 
> 
> Query Execution Summary
> ----------------------------------------------------------------------------------------------
> OPERATION                            DURATION
> ----------------------------------------------------------------------------------------------
> Compile Query                           3.50s
> Prepare Plan                            0.99s
> Submit Plan                             1.67s
> Start                                   0.76s
> Finish                                  4.81s
> ----------------------------------------------------------------------------------------------
> 
> Task Execution Summary
> ----------------------------------------------------------------------------------------------
>   VERTICES   DURATION(ms)  CPU_TIME(ms)  GC_TIME(ms)  
> INPUT_RECORDS  OUTPUT_RECORDS
> ----------------------------------------------------------------------------------------------
>      Map 1        1082.00             0            0              2           
>     2
>  Reducer 2        2403.00             0            0              2           
>     0
> ----------------------------------------------------------------------------------------------
> 
> LLAP IO Summary
> ----------------------------------------------------------------------------------------------
>   VERTICES ROWGROUPS  META_HIT  META_MISS  DATA_HIT  DATA_MISS  
> ALLOCATION     USED  TOTAL_IO
> ----------------------------------------------------------------------------------------------
>      Map 1         0         0          0        0B         0B          0B    
>    0B     0.00s
> ----------------------------------------------------------------------------------------------
> 
> 
> Loading data to table default.llap_smoke_ida8c06640_date302016
> Table default.llap_smoke_ida8c06640_date302016 stats: [numFiles=2, numRows=2, 
> totalSize=672, rawDataSize=204]
> OK
> Time taken: 13.035 seconds
> OK
> 2
> Time taken: 2.177 seconds, Fetched: 1 row(s)
> 2016-04-20 19:31:21,178 - Running HCAT checks
> 2016-04-20 19:31:21,178 - -------------------
> 
> 2016-04-20 19:31:21,179 - checked_call['hostid'] {}
> 2016-04-20 19:31:21,208 - checked_call returned (0, 'a8c06640')
> 2016-04-20 19:31:21,209 - File['/var/lib/ambari-agent/tmp/hcatSmoke.sh'] 
> {'content': StaticFile('hcatSmoke.sh'), 'mode': 0755}
> 2016-04-20 19:31:21,224 - Execute['env JAVA_HOME=/usr/jdk64/jdk1.8.0_60 
> /var/lib/ambari-agent/tmp/hcatSmoke.sh hcatsmokeida8c06640_date312016 prepare 
> true'] {'logoutput': True, 'path': ['/usr/sbin', '/usr/local/bin', '/bin', 
> '/usr/bin', 
> '/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent:/usr/hdp/current/hive-client/bin:/usr/hdp/current/hadoop-client/bin'],
>  'tries': 3, 'user': 'ambari-qa', 'try_sleep': 5}
> WARNING: Use "yarn jar" to launch YARN applications.
> OK
> t2
> t3
> Time taken: 5.182 seconds
> WARNING: Use "yarn jar" to launch YARN applications.
> OK
> Time taken: 2.044 seconds
> WARNING: Use "yarn jar" to launch YARN applications.
> OK
> Time taken: 2.817 seconds
> 2016-04-20 19:31:58,584 - ExecuteHadoop['fs -test -e 
> /apps/hive/warehouse/hcatsmokeida8c06640_date312016'] {'logoutput': True, 
> 'bin_dir': 
> '/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent:/usr/hdp/current/hive-client/bin:/usr/hdp/current/hadoop-client/bin',
>  'user': 'hdfs', 'conf_dir': '/usr/hdp/current/hadoop-client/conf'}
> 2016-04-20 19:31:58,590 - Execute['hadoop --config 
> /usr/hdp/current/hadoop-client/conf fs -test -e 
> /apps/hive/warehouse/hcatsmokeida8c06640_date312016'] {'logoutput': True, 
> 'try_sleep': 0, 'environment': {}, 'tries': 1, 'user': 'hdfs', 'path': 
> ['/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent:/usr/hdp/current/hive-client/bin:/usr/hdp/current/hadoop-client/bin']}
> 2016-04-20 19:32:01,625 - Execute[' /var/lib/ambari-agent/tmp/hcatSmoke.sh 
> hcatsmokeida8c06640_date312016 cleanup true'] {'logoutput': True, 'path': 
> ['/usr/sbin', '/usr/local/bin', '/bin', '/usr/bin', 
> '/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/sbin:/sbin:/usr/lib/ambari-server/*:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/var/lib/ambari-agent:/var/lib/ambari-agent:/usr/hdp/current/hive-client/bin:/usr/hdp/current/hadoop-client/bin'],
>  'tries': 3, 'user': 'ambari-qa', 'try_sleep': 5}
> WARNING: Use "yarn jar" to launch YARN applications.
> OK
> Time taken: 2.468 seconds
> 2016-04-20 19:32:08,730 - Running WEBHCAT checks
> 2016-04-20 19:32:08,730 - ---------------------
> 
> 2016-04-20 19:32:08,731 - File['/var/lib/ambari-agent/tmp/templetonSmoke.sh'] 
> {'content': StaticFile('templetonSmoke.sh'), 'mode': 0755}
> 2016-04-20 19:32:08,738 - 
> File['/var/lib/ambari-agent/tmp/idtest.ambari-qa.1461180728.73.pig'] 
> {'owner': 'hdfs', 'content': Template('templeton_smoke.pig.j2')}
> 2016-04-20 19:32:08,739 - Writing 
> File['/var/lib/ambari-agent/tmp/idtest.ambari-qa.1461180728.73.pig'] because 
> it doesn't exist
> 2016-04-20 19:32:08,743 - Changing owner for 
> /var/lib/ambari-agent/tmp/idtest.ambari-qa.1461180728.73.pig from 0 to hdfs
> 2016-04-20 19:32:08,744 - 
> HdfsResource['/tmp/idtest.ambari-qa.1461180728.73.pig'] {'security_enabled': 
> False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': 
> [EMPTY], 'source': 
> '/var/lib/ambari-agent/tmp/idtest.ambari-qa.1461180728.73.pig', 'dfs_type': 
> '', 'default_fs': 'hdfs://c6402.ambari.apache.org:8020', 
> 'hdfs_resource_ignore_file': 
> '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 
> 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 
> 'hdfs', 'owner': 'ambari-qa', 'hadoop_conf_dir': 
> '/usr/hdp/current/hadoop-client/conf', 'type': 'file', 'action': 
> ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', 
> u'/mr-history/done', u'/app-logs', u'/tmp']}
> 2016-04-20 19:32:08,751 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 
> 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET 
> '"'"'http://c6402.ambari.apache.org:50070/webhdfs/v1/tmp/idtest.ambari-qa.1461180728.73.pig?op=GETFILESTATUS&user.name=hdfs'"'"'
>  1>/tmp/tmpMdR_jN 2>/tmp/tmpJLxau4''] {'logoutput': None, 'quiet': False}
> 2016-04-20 19:32:08,795 - call returned (0, '')
> 2016-04-20 19:32:08,796 - Creating new file 
> /tmp/idtest.ambari-qa.1461180728.73.pig in DFS
> 2016-04-20 19:32:08,797 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 
> 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT -T 
> /var/lib/ambari-agent/tmp/idtest.ambari-qa.1461180728.73.pig 
> '"'"'http://c6402.ambari.apache.org:50070/webhdfs/v1/tmp/idtest.ambari-qa.1461180728.73.pig?op=CREATE&user.name=hdfs&overwrite=True'"'"'
>  1>/tmp/tmpgRJGcJ 2>/tmp/tmpavM5TI''] {'logoutput': None, 'quiet': False}
> 2016-04-20 19:32:08,960 - call returned (0, '')
> 2016-04-20 19:32:08,968 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 
> 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT 
> '"'"'http://c6402.ambari.apache.org:50070/webhdfs/v1/tmp/idtest.ambari-qa.1461180728.73.pig?op=SETOWNER&user.name=hdfs&owner=ambari-qa&group='"'"'
>  1>/tmp/tmpgRGtrp 2>/tmp/tmpD5KTED''] {'logoutput': None, 'quiet': False}
> 2016-04-20 19:32:09,041 - call returned (0, '')
> 2016-04-20 19:32:09,042 - 
> HdfsResource['/tmp/idtest.ambari-qa.1461180728.73.in'] {'security_enabled': 
> False, 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': 
> [EMPTY], 'source': '/etc/passwd', 'dfs_type': '', 'default_fs': 
> 'hdfs://c6402.ambari.apache.org:8020', 'hdfs_resource_ignore_file': 
> '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 
> 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 
> 'hdfs', 'owner': 'ambari-qa', 'hadoop_conf_dir': 
> '/usr/hdp/current/hadoop-client/conf', 'type': 'file', 'action': 
> ['create_on_execute'], 'immutable_paths': [u'/apps/hive/warehouse', 
> u'/mr-history/done', u'/app-logs', u'/tmp']}
> 2016-04-20 19:32:09,043 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 
> 'curl -sS -L -w '"'"'%{http_code}'"'"' -X GET 
> '"'"'http://c6402.ambari.apache.org:50070/webhdfs/v1/tmp/idtest.ambari-qa.1461180728.73.in?op=GETFILESTATUS&user.name=hdfs'"'"'
>  1>/tmp/tmpT7ncU1 2>/tmp/tmpBy8iMr''] {'logoutput': None, 'quiet': False}
> 2016-04-20 19:32:09,104 - call returned (0, '')
> 2016-04-20 19:32:09,105 - Creating new file 
> /tmp/idtest.ambari-qa.1461180728.73.in in DFS
> 2016-04-20 19:32:09,106 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 
> 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT -T /etc/passwd 
> '"'"'http://c6402.ambari.apache.org:50070/webhdfs/v1/tmp/idtest.ambari-qa.1461180728.73.in?op=CREATE&user.name=hdfs&overwrite=True'"'"'
>  1>/tmp/tmpnxRUA_ 2>/tmp/tmpNBhXEx''] {'logoutput': None, 'quiet': False}
> 2016-04-20 19:32:09,244 - call returned (0, '')
> 2016-04-20 19:32:09,246 - call['ambari-sudo.sh su hdfs -l -s /bin/bash -c 
> 'curl -sS -L -w '"'"'%{http_code}'"'"' -X PUT 
> '"'"'http://c6402.ambari.apache.org:50070/webhdfs/v1/tmp/idtest.ambari-qa.1461180728.73.in?op=SETOWNER&user.name=hdfs&owner=ambari-qa&group='"'"'
>  1>/tmp/tmpn_Z7pG 2>/tmp/tmpL668wn''] {'logoutput': None, 'quiet': False}
> 2016-04-20 19:32:09,273 - call returned (0, '')
> 2016-04-20 19:32:09,273 - HdfsResource[None] {'security_enabled': False, 
> 'hadoop_bin_dir': '/usr/hdp/current/hadoop-client/bin', 'keytab': [EMPTY], 
> 'dfs_type': '', 'default_fs': 'hdfs://c6402.ambari.apache.org:8020', 
> 'hdfs_resource_ignore_file': 
> '/var/lib/ambari-agent/data/.hdfs_resource_ignore', 'hdfs_site': ..., 
> 'kinit_path_local': 'kinit', 'principal_name': 'missing_principal', 'user': 
> 'hdfs', 'action': ['execute'], 'hadoop_conf_dir': 
> '/usr/hdp/current/hadoop-client/conf', 'immutable_paths': 
> [u'/apps/hive/warehouse', u'/mr-history/done', u'/app-logs', u'/tmp']}
> 2016-04-20 19:32:09,275 - 
> Execute['/var/lib/ambari-agent/tmp/templetonSmoke.sh c6402.ambari.apache.org 
> ambari-qa 50111 idtest.ambari-qa.1461180728.73.pig no_keytab false kinit 
> no_principal'] {'logoutput': True, 'path': 
> ['/usr/sbin:/sbin:/usr/local/bin:/bin:/usr/bin'], 'tries': 3, 'try_sleep': 5}
> 
> 
> Thanks,
> 
> Swapan Shridhar
> 
>
