Hi,

To create a Hive table, do I need to add hive-site.xml to the spark/conf
directory?
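
For reference, here is roughly what I am trying (Scala; the table name and
columns below are just placeholders):

  import org.apache.spark.sql.hive.HiveContext

  // sc is the existing SparkContext (e.g. the one spark-shell provides)
  val hiveContext = new HiveContext(sc)
  hiveContext.sql("CREATE TABLE IF NOT EXISTS my_table (id INT, name STRING)")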

On Fri, Mar 6, 2015 at 11:12 PM, Michael Armbrust <mich...@databricks.com>
wrote:

> It's not required, but even if you don't have Hive installed you probably
> still want to use the HiveContext. From earlier in that doc:
>
> In addition to the basic SQLContext, you can also create a HiveContext,
>> which provides a superset of the functionality provided by the basic
>> SQLContext. Additional features include the ability to write queries using
>> the more complete HiveQL parser, access to HiveUDFs, and the ability to
>> read data from Hive tables. To use a HiveContext, *you do not need to
>> have an existing Hive setup*, and all of the data sources available to a
>> SQLContext are still available. HiveContext is only packaged separately to
>> avoid including all of Hive’s dependencies in the default Spark build. If
>> these dependencies are not a problem for your application then using
>> HiveContext is recommended for the 1.2 release of Spark. Future releases
>> will focus on bringing SQLContext up to feature parity with a HiveContext.
>
>
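> For completeness, a minimal sketch along the lines of the example in that
> guide (Scala; sc is an existing SparkContext, and the table name is just a
> placeholder):
>
>   // This works without an existing Hive installation; if no hive-site.xml
>   // is found, a local metastore is created automatically in the current
>   // directory.
>   val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
>   sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
>   sqlContext.sql("SELECT key, value FROM src").collect().foreach(println)
>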
> On Fri, Mar 6, 2015 at 7:22 AM, Yin Huai <yh...@databricks.com> wrote:
>
>> Hi Edmon,
>>
>> No, you do not need to install Hive to use Spark SQL.
>>
>> Thanks,
>>
>> Yin
>>
>> On Fri, Mar 6, 2015 at 6:31 AM, Edmon Begoli <ebeg...@gmail.com> wrote:
>>
>>> Does Spark SQL require an installation of Hive to run correctly?
>>>
>>> I could not tell from this statement:
>>>
>>> https://spark.apache.org/docs/latest/sql-programming-guide.html#compatibility-with-apache-hive
>>>
>>> Thank you,
>>> Edmon
>>>
>>
>>
>
