>>> connection string / jdbc call / s3 client... You really don't want to
>>> use a straight .map(func). You'll end up instantiating a connection on
>>> every iteration.
>>>
>>> Hope this is somewhat helpful.
>>>
>>> Gary
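A minimal sketch of the pattern Gary is warning about, assuming a JDBC lookup (the URL, table, and the `rdd: RDD[String]` input are illustrative, not from the thread): open one connection per partition with `mapPartitions`, instead of one per record as a plain `.map(func)` would.

```
import java.sql.DriverManager
import org.apache.spark.rdd.RDD

def enrich(rdd: RDD[String]): RDD[(String, Option[String])] =
  rdd.mapPartitions { keys =>
    // One connection per partition, created on the executor --
    // not one per record as .map(func) would end up doing.
    val conn = DriverManager.getConnection("jdbc:postgresql://dbhost/mydb") // placeholder URL
    val stmt = conn.prepareStatement("SELECT value FROM lookup WHERE key = ?")
    // Materialize before closing, or the lazy iterator would
    // try to read from an already-closed connection.
    val out = keys.map { k =>
      stmt.setString(1, k)
      val rs = stmt.executeQuery()
      (k, if (rs.next()) Some(rs.getString(1)) else None)
    }.toVector
    stmt.close(); conn.close()
    out.iterator
  }
```

For long-lived jobs, a lazily initialized connection pool in a singleton object on each executor avoids even the per-partition setup cost.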
>> On 21 September 2017 at 06:31, Weichen Xu wrote:
>>
>> Spark does not allow executor code to use `sparkSession`.
>> But I think you can move all json files to one directory, then:
>>
>> ```
>> spark.read.json("/path/to/jsonFileDir")
>> ```
>> But if you want to get the filename at the same time, you can use
>> ```
>> spark.sparkContext.wholeTextFiles("/path/to/jsonFileDir")...
>> ```
>>
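A sketch of the `wholeTextFiles` route Weichen mentions: it yields `(path, content)` pairs, one per file, so the filename rides along with the data (the column names here are made up):

```
val spark = org.apache.spark.sql.SparkSession.builder().getOrCreate()
import spark.implicits._

// RDD[(String, String)] of (fullFilePath, fileContent), one element per file.
val df = spark.sparkContext
  .wholeTextFiles("/path/to/jsonFileDir")
  .toDF("filename", "json")
```

Note that `wholeTextFiles` reads each file into a single record, so it suits many small files rather than a few huge ones.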
>> On Thu, Sep 21, 2017 at 9:18 PM, Riccardo Ferrari wrote:
>>
>> Depends on your use-case; however, broadcasting
>> <https://spark.apache.org/docs/2.2.0/rdd-programming-guide.html#broadcast-variables>
>> could be a better option.
>>
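A hedged sketch of the broadcast option Riccardo links to, assuming the driver builds a small read-only lookup `Map` (the names and data are illustrative): the broadcast value is shipped to each executor once, rather than serialized into every task closure.

```
val spark = org.apache.spark.sql.SparkSession.builder().getOrCreate()

val lookup = Map("a" -> 1, "b" -> 2)            // built on the driver
val bcast = spark.sparkContext.broadcast(lookup)

val enriched = spark.sparkContext
  .parallelize(Seq("a", "b", "c"))
  .map(k => (k, bcast.value.getOrElse(k, 0)))   // executors read bcast.value
```

This fits the original question: you cannot ship `sparkSession` itself to executors, but you can broadcast the read-only data the executors actually need.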
>> On Thu, Sep 21, 2017 at 2:03 PM, Chackravarthy Esakkimuthu <
>> chaku.mi...@gmail.com> wrote:
>>
>> Hi,
>>
>> I want to know how to pass sparkSession from driver to executor.
>>
>> I have a spark program (batch job) which does the following:
>>
>> #
>>
>> val spark = SparkSession.builder().appName("SampleJob").config(
>> "spark.master", "local").getOrCreate()
>>
>> val df = this