I think it's really a mortal blow to Livy for the REPL scenario. What I can do, I think, is monitor Spark metrics: when the driver's memory usage reaches a high level, I will isolate the session.
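The monitoring idea above can be sketched as a simple threshold check. This is a minimal sketch, not part of Livy or Spark: the `HighUsageThreshold` value and the `shouldIsolate` helper are assumptions; the used/max byte figures would in practice come from the driver's JVM heap gauges in Spark's metrics system.

```scala
// Sketch: decide whether a Livy session's driver should be isolated,
// based on driver heap usage. Threshold and helper names are hypothetical.
object DriverMemoryGuard {
  val HighUsageThreshold = 0.85 // assumption: 85% of max heap counts as "high"

  // usedBytes/maxBytes would be read from Spark's driver metrics
  // (e.g. JVM heap gauges); here they are plain parameters.
  def shouldIsolate(usedBytes: Long, maxBytes: Long): Boolean =
    usedBytes.toDouble / maxBytes.toDouble >= HighUsageThreshold

  def main(args: Array[String]): Unit = {
    val max = 1024L * 1024 * 1024                 // 1 GiB driver heap
    println(shouldIsolate(900L * 1024 * 1024, max)) // ~88% used
    println(shouldIsolate(500L * 1024 * 1024, max)) // ~49% used
  }
}
```

A real monitor would poll these gauges periodically and mark the offending session before the driver is OOM-killed, rather than reacting after the fact.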
2018-11-14
lk_hadoop
From: "Harsch, Tim"
Sent: 2018-11-14 05:52
Subject: Re: about LI
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143
Failing this attempt. Failing the application.
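For context on the log above: exit code 143 conventionally means the process received SIGTERM (the shell convention of 128 plus the signal number, 15), which is typically what YARN sends when it kills a container, for example for exceeding its memory limit. A quick check of the arithmetic:

```scala
// The 128 + signal convention behind YARN's "exit code 143":
// SIGTERM is signal 15, so a SIGTERM-killed process reports 128 + 15.
object ExitCode143 {
  val Sigterm = 15
  val exitCode = 128 + Sigterm

  def main(args: Array[String]): Unit =
    println(exitCode) // prints 143
}
```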
2018-11-12
lk_hadoop
From: "Rabe, Jens"
Sent: 2018-11-12 14:55
Subject: RE: about LIVY-424
To: "user@livy.incubator.apache.org"
Cc:
Do y
Thank you j...@nanthrax.net, I have resolved it by setting
livy.repl.enable-hive-context = true
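For reference, this setting belongs in Livy's configuration file (a minimal fragment, assuming the default `conf/livy.conf` layout):

```properties
# conf/livy.conf — enable the Hive context in Livy REPL sessions,
# so spark.sql in a session can see the Hive metastore databases
livy.repl.enable-hive-context = true
```

Livy must be restarted for the change to take effect, and the driver still needs a valid hive-site.xml on its classpath to reach the metastore.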
2017-11-23
lk_hadoop
From: Jean-Baptiste Onofré <j...@nanthrax.net>
Sent: 2017-11-22 21:45
Subject: Re: livy can't list databases
To: "user"<user@livy.incubator.apache.org>
Cc:
Hi
It seems it can't read the metadata. I have configured Livy with SPARK_HOME and
run under YARN mode, and hive-site.xml is also copied to SPARK_HOME/conf/.
But when I use spark-shell:
scala> spark.sql("show databases").show
+-------------+
| databaseName|
+-------------+
|      default|
| tpcds_carbon|
|tpcds_carbon2|
| tpcds_indexr|
|tpcds_parquet|
| tpcds_source|
+-------------+
2017-11-22
lk_hadoop