spark-shell can't import the default hive-site.xml options properly.

2015-02-01 Thread guxiaobo1982
Hi, In order to let a local spark-shell connect to a remote Spark standalone cluster and access Hive tables there, I must put the hive-site.xml file into the local Spark installation's conf path, but spark-shell can't even import the default settings there. I found two errors: property
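For context, a minimal hive-site.xml of the kind being described might look like the sketch below, placed under $SPARK_HOME/conf/ on the machine running spark-shell. The metastore host and port here are hypothetical placeholders, not values from the original thread.

<?xml version="1.0"?>
<!-- Minimal sketch of a hive-site.xml dropped into $SPARK_HOME/conf/
     so that a local spark-shell can reach a remote Hive metastore.
     The host and port below are hypothetical placeholders. -->
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host.example.com:9083</value>
    <description>URI of the remote Hive metastore service.</description>
  </property>
</configuration>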

Re: spark-shell can't import the default hive-site.xml options properly.

2015-02-01 Thread Denny Lee
Cool! For all the times I had been modifying the hive-site.xml I had only popped in the integer values - learn something new every day, eh?! On Sun Feb 01 2015 at 9:36:23 AM Ted Yu yuzhih...@gmail.com wrote: Looking at common/src/java/org/apache/hadoop/hive/conf/HiveConf.java :

Re: spark-shell can't import the default hive-site.xml options properly.

2015-02-01 Thread Denny Lee
I may be missing something here, but typically the hive-site.xml configurations do not require you to place an "s" within the configuration value itself. Both the retry.delay and socket.timeout values are in seconds, so you should only need to place the integer value (which is in seconds). On Sun Feb
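As a sketch of what Denny describes, the two properties would then carry bare integer values interpreted as seconds; the numbers below are illustrative, not taken from the thread.

<!-- Plain integer values, interpreted as seconds (illustrative numbers) -->
<property>
  <name>hive.metastore.client.connect.retry.delay</name>
  <value>5</value>
</property>
<property>
  <name>hive.metastore.client.socket.timeout</name>
  <value>600</value>
</property>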

Re: spark-shell can't import the default hive-site.xml options properly.

2015-02-01 Thread Ted Yu
Looking at common/src/java/org/apache/hadoop/hive/conf/HiveConf.java : METASTORE_CLIENT_CONNECT_RETRY_DELAY("hive.metastore.client.connect.retry.delay", "1s", new TimeValidator(TimeUnit.SECONDS), "Number of seconds for the client to wait between consecutive connection attempts"), It
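Since HiveConf declares the default as "1s" backed by a TimeValidator, the property can also be written with an explicit time-unit suffix in hive-site.xml; the value below is illustrative, not from the thread.

<!-- Value with an explicit seconds suffix, as accepted by Hive's TimeValidator
     (illustrative value) -->
<property>
  <name>hive.metastore.client.connect.retry.delay</name>
  <value>5s</value>
</property>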