I would still have to propagate the file into that directory somehow, and
the option is documented as being for legacy jobs only (deprecated?), so
no, I have not experimented with it yet.
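For context, the behavior being discussed can be sketched outside of Spark. The idea behind putting a directory on the executor classpath is that any file in that directory becomes resolvable through the executor's class loader, which is exactly the lookup a typical library performs for its properties file. A minimal, self-contained sketch (file and property names are hypothetical, and the temp directory stands in for a directory that would be on spark.executor.extraClassPath):

```java
import java.io.InputStream;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

public class ResourceLookupDemo {
    public static void main(String[] args) throws Exception {
        // Stand-in for a directory that would sit on the executor classpath:
        // any file placed in it becomes a resolvable classloader resource.
        Path dir = Files.createTempDirectory("conf");
        Files.writeString(dir.resolve("some-lib.properties"), "mode=test\n");

        // A directory URL (ending in '/') is treated as a classpath root.
        URL[] urls = { dir.toUri().toURL() };
        try (URLClassLoader cl = new URLClassLoader(urls)) {
            // This is the lookup a library typically performs internally.
            try (InputStream in = cl.getResourceAsStream("some-lib.properties")) {
                Properties props = new Properties();
                props.load(in);
                System.out.println(props.getProperty("mode")); // prints "test"
            }
        }
    }
}
```

If the directory is on the executor classpath, the library's own getResourceAsStream call succeeds with no changes to the library; the open question in this thread is how to get the file into such a directory on every executor in the first place.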

On Wed, Dec 2, 2015 at 12:53 AM Rishi Mishra <rmis...@snappydata.io> wrote:

> Did you try using *spark.executor.extraClassPath*? Classpath resources
> placed there will be accessible through the executor's class loader,
> which executes your job.
>
> On Wed, Dec 2, 2015 at 2:15 AM, Charles Allen <
> charles.al...@metamarkets.com> wrote:
>
>> Is there a way to pass configuration file resources to be resolvable
>> through the classloader?
>>
>> For example, if I'm using a (non-Spark) library that reads a
>> some-lib.properties file from the classpath/classloader, can I pass that
>> file along so that, when the library asks the classloader for the
>> resource, it is able to find it?
>>
>> One potential solution is to take the files and package them as resources
>> in a jar, and include the jar as part of the spark job, but that feels like
>> a hack instead of an actual solution.
>>
>> Is there any support or planned support for such a thing?
>>
>> https://github.com/apache/spark/pull/9118 seems to tackle a similar
>> problem, but in a hard-coded way.
>>
>> Thank you,
>> Charles Allen
>>
>
>
>
> --
> Regards,
> Rishitesh Mishra,
> SnappyData . (http://www.snappydata.io/)
>
> https://in.linkedin.com/in/rishiteshmishra
>