Can you try replacing the path in your code with the full HDFS URI? Like:
sc.textFile("hdfs://...").collect().foreach(println)
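The UnknownHostException comes from the authority component of the URI: Hadoop treats the text between `hdfs://` and the next `/` as a host (or HA nameservice) name and tries to resolve it. A minimal sketch of what the HDFS client sees, using the path from this thread:

```scala
import java.net.URI

// "affinio" sits in the authority slot of the URI, so the HDFS client
// must resolve it either via DNS or via an HA nameservice entry
// (dfs.nameservices) in hdfs-site.xml on the machine running the task.
val uri = new URI("hdfs://affinio/tmp/Input")
println(uri.getHost) // → affinio
println(uri.getPath) // → /tmp/Input
```

If `affinio` is an HA nameservice rather than a DNS-resolvable host, only a process that can see the cluster's `hdfs-site.xml` will be able to resolve it.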
Thanks
Best Regards
On Tue, Sep 29, 2015 at 1:45 AM, Stephen Hankinson
wrote:
> Hi,
>
> Wondering if anyone can help me with the issue I am having.
>
> I am
That's strange; for some reason your Hadoop configuration is not being
picked up by Spark.
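When the driver or executors can't see the Hadoop client configuration, one common remedy is to point HADOOP_CONF_DIR at the directory holding core-site.xml and hdfs-site.xml in spark-env.sh on every node. A sketch, assuming the configs live at the usual /etc/hadoop/conf location (adjust the path to your cluster):

```shell
# spark-env.sh -- assumed config location; must be set on all nodes
# that launch Spark tasks, not just the submitting machine.
export HADOOP_CONF_DIR=/etc/hadoop/conf
```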
Thanks
Best Regards
On Wed, Sep 30, 2015 at 9:11 PM, Stephen Hankinson
wrote:
> When I use hdfs://affinio/tmp/Input it gives the same error about
> UnknownHostException affinio.
>
>
Hi,
Wondering if anyone can help me with the issue I am having.
I am receiving an UnknownHostException when running a custom jar with Spark
on Mesos. The issue does not happen when running spark-shell.
My spark-env.sh contains the following:
export