Hi,
In order to let a local spark-shell connect to a remote Spark standalone
cluster and access Hive tables there, I have to put the hive-site.xml file into
the local Spark installation's conf path. But spark-shell can't even import the
default settings there; I found two errors:
Cool! All the times I had been modifying hive-site.xml I had only
dropped in the integer values - learn something new every day, eh?!
On Sun Feb 01 2015 at 9:36:23 AM Ted Yu yuzhih...@gmail.com wrote:
Looking at common/src/java/org/apache/hadoop/hive/conf/HiveConf.java :
I may be missing something here, but typically the hive-site.xml
configurations do not require you to place an "s" within the configuration
itself. Both the retry.delay and socket.timeout values are in seconds, so
you should only need to supply the integer value (which is in seconds).
On Sun Feb
Looking at common/src/java/org/apache/hadoop/hive/conf/HiveConf.java :
METASTORE_CLIENT_CONNECT_RETRY_DELAY("hive.metastore.client.connect.retry.delay",
    "1s",
    new TimeValidator(TimeUnit.SECONDS),
    "Number of seconds for the client to wait between consecutive
    connection attempts"),
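To illustrate the behavior described above - a bare number falls back to the
declared default unit, while a suffix like "s" or "m" overrides it - here is a
minimal standalone sketch. The class and method names are mine, not Hive's;
the real parsing is done by Hive's TimeValidator:

```java
import java.util.concurrent.TimeUnit;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch of unit-suffix parsing: "1s" -> 1 second, bare "5" -> default unit.
// Not Hive's actual implementation, just the idea discussed in this thread.
public class TimeValueSketch {
    private static final Pattern TIME = Pattern.compile("(\\d+)\\s*([a-z]*)");

    static long toSeconds(String value, TimeUnit defaultUnit) {
        Matcher m = TIME.matcher(value.trim().toLowerCase());
        if (!m.matches()) {
            throw new IllegalArgumentException("bad time value: " + value);
        }
        long n = Long.parseLong(m.group(1));
        String unit = m.group(2);
        // Empty suffix means the property's declared default unit applies.
        TimeUnit u = unit.isEmpty() ? defaultUnit
                   : unit.startsWith("ms") ? TimeUnit.MILLISECONDS
                   : unit.startsWith("s")  ? TimeUnit.SECONDS
                   : unit.startsWith("m")  ? TimeUnit.MINUTES
                   : unit.startsWith("h")  ? TimeUnit.HOURS
                   : TimeUnit.DAYS;
        return TimeUnit.SECONDS.convert(n, u);
    }

    public static void main(String[] args) {
        System.out.println(toSeconds("1s", TimeUnit.SECONDS)); // 1
        System.out.println(toSeconds("5", TimeUnit.SECONDS));  // 5
        System.out.println(toSeconds("2m", TimeUnit.SECONDS)); // 120
    }
}
```

So "1s" and a bare "1" land on the same value for these two properties, which
is why only plugging in integers has been working all along.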
It