Yes, you were pointing to HDFS on a loopback address...
From: Jenna Hoole <jenna.ho...@gmail.com>
Sent: Monday, February 26, 2018 1:11:35 PM
To: Yinan Li; user@spark.apache.org
Subject: Re: Spark on K8s - using files fetched by init-container?
Oh, duh. I completely forgot that file:// is a prefix I can use. Up and
running now :)
Thank you so much!
Jenna
On Mon, Feb 26, 2018 at 1:00 PM, Yinan Li wrote:
> OK, it looks like you will need to use
> `file:///var/spark-data/spark-files/flights.csv` instead.
> The files specified through --files are localized by the init-container
> to /var/spark-data/spark-files by default. So in your case, the file should
> be located at /var/spark-data/spark-files/flights.csv locally in the
> container.
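
[The localization rule described above can be sketched as a small helper. This is a minimal illustration, not Spark API: the function name `localized_uri` and the hard-coded default directory are assumptions taken from the thread; in a real deployment the directory is configurable.]

```python
from urllib.parse import urlparse
import posixpath

# Default directory where the Spark-on-K8s init-container places files
# listed via --files, per the explanation in this thread.
LOCAL_FILES_DIR = "/var/spark-data/spark-files"

def localized_uri(remote_uri: str) -> str:
    """Map a --files URI (e.g. an hdfs:// location) to the file:// URI
    where the init-container makes it available inside the pod."""
    name = posixpath.basename(urlparse(remote_uri).path)
    return "file://" + posixpath.join(LOCAL_FILES_DIR, name)

print(localized_uri("hdfs://namenode:8020/data/flights.csv"))
# file:///var/spark-data/spark-files/flights.csv
```
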
> On Mon, Feb 26, 2018 at 10:51 AM, Jenna Hoole wrote:
>> This is probably stupid user error, but I can't for the life of me figure
>> out how to access the files that are staged by the init-container.
>> I'm trying to run the SparkR example data-manipulation.R which requires the
>> path to its datafile. I supply the hdfs location via --files and then the
>> full