Hi, thanks. This is part of the solution I found after writing the
question. The other part is that I needed to write the input stream to
a temporary file. I would prefer not to write any temporary file, but
the ssl.keystore.location property seems to expect a file path.
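
For reference, here is a minimal sketch of the workaround described above:
copy the keystore bundled under src/main/resources out of the uber JAR into
a temp file, then hand that path to the Kafka SSL properties. The resource
names, passwords, and env vars are placeholders, and depending on where the
Kafka consumer is actually created (driver vs. executors) the copy may need
to happen on each executor as well.

import java.io.File
import java.nio.file.{Files, StandardCopyOption}

object KeystoreFromJar {
  // Copy a keystore packaged on the classpath out of the uber JAR into a
  // temporary file, because ssl.keystore.location expects a filesystem path.
  def materialize(resource: String): String = {
    val in = getClass.getResourceAsStream(resource) // e.g. "/kafka.client.keystore.jks"
    require(in != null, s"resource not found on classpath: $resource")
    val tmp = File.createTempFile("keystore-", ".jks")
    tmp.deleteOnExit()
    try Files.copy(in, tmp.toPath, StandardCopyOption.REPLACE_EXISTING)
    finally in.close()
    tmp.getAbsolutePath
  }

  // Kafka consumer SSL params for the streaming job; the keystore/truststore
  // file names and the env-var passwords below are illustrative only.
  def kafkaSslParams(): Map[String, String] = Map(
    "security.protocol"       -> "SSL",
    "ssl.keystore.location"   -> materialize("/kafka.client.keystore.jks"),
    "ssl.keystore.password"   -> sys.env("KEYSTORE_PASSWORD"),
    "ssl.truststore.location" -> materialize("/kafka.client.truststore.jks"),
    "ssl.truststore.password" -> sys.env("TRUSTSTORE_PASSWORD")
  )
}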

On Tue, Dec 25, 2018 at 5:26 AM Anastasios Zouzias <zouz...@gmail.com> wrote:
>
> Hi Colin,
>
> You can place your certificates under src/main/resources and include them in
> the uber JAR; see e.g.:
> https://stackoverflow.com/questions/40252652/access-files-in-resources-directory-in-jar-from-apache-spark-streaming-context
>
> Best,
> Anastasios
>
> On Mon, Dec 24, 2018 at 10:29 PM Colin Williams 
> <colin.williams.seat...@gmail.com> wrote:
>>
>> I've been trying to read from Kafka via a Spark streaming client. I
>> found out the Spark cluster doesn't have the certificates deployed. I
>> then tried using the same local certificates I've been testing with by
>> packing them in an uber JAR and getting a File handle from the
>> classloader resource, but I'm getting a FileNotFoundException.
>> These are JKS certificates. Is anybody aware of how to package
>> certificates in a JAR with a Kafka client, preferably the Spark one?
>>
>>
>
>
> --
> -- Anastasios Zouzias

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org
