You might need hive-site.xml
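In practice that usually means making sure hive-site.xml ends up in Spark's conf directory so a HiveContext can locate the metastore. A minimal sketch (the paths are illustrative, and a local demo layout stands in for a real cluster's /etc/hive/conf and $SPARK_HOME/conf):

```shell
# Simulated layout so the commands are self-contained; on a real cluster
# you would copy from the Hive client's conf dir into Spark's conf dir.
mkdir -p demo/hive-conf demo/spark/conf
echo '<configuration/>' > demo/hive-conf/hive-site.xml

# The actual step: make hive-site.xml visible to Spark
cp demo/hive-conf/hive-site.xml demo/spark/conf/
ls demo/spark/conf/hive-site.xml
```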
From: Peter Zhang <zhangju...@gmail.com>
Sent: Monday, January 18, 2016 9:08 PM
Subject: Re: SparkR with Hive integration
To: Jeff Zhang <zjf...@gmail.com>
Cc: <user@spark.apache.org>
Hi all,
http://spark.apache.org/docs/latest/sparkr.html#sparkr-dataframes
From Hive tables

You can also create SparkR DataFrames from Hive tables. To do this we will need
to create a HiveContext which can access tables in the Hive MetaStore. Note
that Spark should have been built with Hive support.
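A minimal sketch of what that page describes, using the Spark 1.x SparkR API (the table name and query below are only illustrative):

```r
library(SparkR)

# sparkRHive.init() requires a Spark build with Hive support (-Phive)
sc <- sparkR.init()
hiveContext <- sparkRHive.init(sc)

# The HiveContext can read tables registered in the Hive MetaStore
sql(hiveContext, "CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
results <- sql(hiveContext, "FROM src SELECT key, value")
head(results)
```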
Thanks,
I will try.
Peter
On January 19, 2016 at 12:44:46, Jeff Zhang (zjf...@gmail.com) wrote:
Please make sure you export environment variable HADOOP_CONF_DIR which contains
the core-site.xml
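Something like the following, in the shell that launches SparkR (the path is an assumption; point it at wherever your cluster keeps core-site.xml):

```shell
# Must be exported before starting SparkR so Spark picks up the
# Hadoop configuration (core-site.xml, hdfs-site.xml) from this dir
export HADOOP_CONF_DIR=/etc/hadoop/conf
echo "HADOOP_CONF_DIR=$HADOOP_CONF_DIR"
```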
On Mon, Jan 18, 2016 at 8:23 PM, Peter Zhang wrote:
> Hi all,
>
> http://spark.apache.org/docs/latest/sparkr.html#sparkr-dataframes
> From Hive tables
>