1. There is no way to set the storage level through a properties file in
Spark; Spark only provides the "def persist(newLevel: StorageLevel)" API
for this.
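
As a minimal Scala sketch of that API (the app name and sample RDD are
illustrative, not from the thread):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

object PersistExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("persist-example").setMaster("local[*]")
    val sc = new SparkContext(conf)

    val rdd = sc.parallelize(1 to 100)

    // The storage level can only be chosen through this API call on the
    // RDD itself; there is no configuration/properties-file equivalent.
    rdd.persist(StorageLevel.MEMORY_AND_DISK)

    println(rdd.count()) // first action materializes and caches the RDD
    sc.stop()
  }
}
```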

2015-10-23 19:03 GMT+08:00 Xuefu Zhang <xzh...@cloudera.com>:

> Quick answers:
> 1. You can set pretty much any Spark configuration in Hive using the set
> command.
> 2. No, you have to make that call yourself.
>
>
>
> On Thu, Oct 22, 2015 at 10:32 PM, Jone Zhang <joyoungzh...@gmail.com>
> wrote:
>
>> 1. How can I set the storage level when I use Hive on Spark?
>> 2. Is there any plan to dynamically choose between Hive on MapReduce
>> and Hive on Spark, based on SQL features?
>>
>> Thanks in advance
>> Best regards
>>
>
>
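
To illustrate point 1 above, Spark properties can be set in the Hive
session before running a query (these property names are real Spark/Hive
settings, but the values are only examples):

```sql
set hive.execution.engine=spark;
set spark.executor.memory=4g;
set spark.executor.cores=2;
```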
