>>>> Everything is great. Success.
>>>>
>>>> I removed all JVMs except jdk8 for compilation.
>>>>
>>>> I used jdk8 so I know which libraries were linked in the build process.
>>>> I also used my local version of maven, not the apt install version.
>>>> Nothing higher, just jdk8.
>>>
>>> So anyway, once in a while I do Spark projects in Scala with Eclipse.
>>>
>>> For that I don't use maven or anything. I prefer to make use of the
>>> build path and external jars. This way I know exactly which libraries
>>> I am linking to.
>>
>> Creating a jar in Eclipse is straightforward for spark-submit.
>>
>>
>> Anyway, as you can see below, I am pointing Jupyter at Spark with
>> findspark.init('/opt/spark').
>> That's OK, everything is fine.
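(For reference: as far as I know, findspark.init() essentially just exports SPARK_HOME and puts Spark's python/ directory, plus the bundled py4j zip, on sys.path. A rough sketch of the idea, not the library's actual code:)

```python
import os
import sys
from glob import glob

def init_spark(spark_home):
    """Rough sketch (an assumption, not findspark's real implementation)
    of what findspark.init() does: export SPARK_HOME and expose pyspark
    on sys.path so `import pyspark` works inside Jupyter."""
    os.environ["SPARK_HOME"] = spark_home
    # pyspark itself lives under $SPARK_HOME/python
    sys.path.insert(0, os.path.join(spark_home, "python"))
    # py4j ships as a zip under $SPARK_HOME/python/lib
    for zip_path in glob(os.path.join(spark_home, "python", "lib", "py4j-*.zip")):
        sys.path.insert(0, zip_path)

init_spark("/opt/spark")
```

(So whatever jars get used at runtime still come from under that SPARK_HOME, not from the sketch above.)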
>>
> That's OK everything is fine.
>
> With the compiled version of Spark there is a jars directory which I have
> been using in Eclipse.
>
> With my own compiled-from-source version there is no jars directory.
>
> Where have all the jars gone?
> I am not sure how findspark.init('/opt/spark') is locating the
> libraries unless it is finding them from