William Zhou created SPARK-13107:
------------------------------------
                Summary: [Spark SQL] 'hiveconf' parameters passed on the Beeline
command line are not applied once the Beeline session starts
Key: SPARK-13107
URL: https://issues.apache.org/jira/browse/SPARK-13107
Project: Spark
Issue Type: Bug
Components: Spark Shell
Affects Versions: 1.5.1
Reporter: William Zhou
Steps to reproduce: start Beeline with hive.exec.dynamic.partition.mode
overridden to nonstrict via --hiveconf, then run an insert that relies on
dynamic partitioning. The insert fails with the strict-mode error, so the
--hiveconf value was never picked up by the session.

[root@51-196-100-7 hive-testbench-hive13]# beeline --hiveconf hive.exec.dynamic.partition.mode=nonstrict
Connected to: Spark SQL (version 1.5.1)
Driver: Hive JDBC (version 1.2.1.spark)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 1.2.1.spark by Apache Hive
0: jdbc:hive2://ha-cluster/default> use tpcds_bin_partitioned_orc_500;
+---------+--+
| result |
+---------+--+
+---------+--+
No rows selected (1.053 seconds)
0: jdbc:hive2://ha-cluster/default> insert overwrite table store_sales partition (ss_sold_date)
    select
      ss.ss_sold_date_sk,
      ss.ss_sold_time_sk,
      ss.ss_item_sk,
      ss.ss_customer_sk,
      ss.ss_cdemo_sk,
      ss.ss_hdemo_sk,
      ss.ss_addr_sk,
      ss.ss_store_sk,
      ss.ss_promo_sk,
      ss.ss_ticket_number,
      ss.ss_quantity,
      ss.ss_wholesale_cost,
      ss.ss_list_price,
      ss.ss_sales_price,
      ss.ss_ext_discount_amt,
      ss.ss_ext_sales_price,
      ss.ss_ext_wholesale_cost,
      ss.ss_ext_list_price,
      ss.ss_ext_tax,
      ss.ss_coupon_amt,
      ss.ss_net_paid,
      ss.ss_net_paid_inc_tax,
      ss.ss_net_profit,
      dd.d_date as ss_sold_date
    from tpcds_text_500.store_sales ss
    join tpcds_text_500.date_dim dd
      on (ss.ss_sold_date_sk = dd.d_date_sk);
Error: org.apache.spark.SparkException: Dynamic partition strict mode requires
at least one static partition column. To turn this off set
hive.exec.dynamic.partition.mode=nonstrict (state=,code=0)
0: jdbc:hive2://ha-cluster/default>
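If the startup override is being dropped, setting the property from inside the
session is a possible workaround, since a SET statement goes through the live
session rather than the Beeline command line. A minimal sketch, assuming the
Spark SQL thrift server honors session-level SET (it is standard HiveQL);
output is not reproduced:

    -- inspect the effective value; under this bug it presumably still reads 'strict'
    set hive.exec.dynamic.partition.mode;
    -- apply the override in-session, then re-run the insert above
    set hive.exec.dynamic.partition.mode=nonstrict;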