Thanks for the answer. I am now trying to set HADOOP_HOME but the issue still
persists. Also, I can see only windows-utils.exe in my HADOOP_HOME, but no
WINUTILS.EXE.
I do not have Hadoop installed on my system, as I am not using HDFS, but I
am using Spark 1.3.1 prebuilt with Hadoop 2.6. Am I
This is a hadoop-side stack trace: it looks like the code is trying to get the
filesystem permissions by running
%HADOOP_HOME%\bin\WINUTILS.EXE ls -F
and something is triggering a null pointer exception.
There isn't any HADOOP- JIRA with this specific stack trace in it, so it's not a
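[Editor's note: a minimal sketch of the setup the message above implies. The D:\hadoop location is a hypothetical example, not a path from this thread; the key points are that HADOOP_HOME must point at the directory *containing* bin, and that, as far as I know, Hadoop looks for a binary named exactly winutils.exe, so a file named windows-utils.exe will not be picked up.]

```python
import ntpath  # Windows path semantics, even if you test this elsewhere
import os

# Hypothetical layout: HADOOP_HOME names the folder that CONTAINS bin,
# not the bin folder itself.
hadoop_home = r"D:\hadoop"
os.environ["HADOOP_HOME"] = hadoop_home

# This is the file Hadoop shells out to for permission checks like `ls -F`;
# it must be named winutils.exe, and it must exist before Spark starts.
winutils = ntpath.join(hadoop_home, "bin", "winutils.exe")
print(winutils)  # D:\hadoop\bin\winutils.exe
```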
This code is in Python. Also, I tried with a forward slash at the end, with the
same result.
On 26 Apr 2015 01:36, Jeetendra Gangele gangele...@gmail.com wrote:

Extra forward slash at the end; sometimes I have seen this kind of issue.

On 25 April 2015 at 20:50, Jeetendra Gangele gangele...@gmail.com wrote:

loc = D:\\Project\\Spark\\code\\news\\jsonfeeds\\

On 25 April 2015 at 20:49, Jeetendra Gangele gangele...@gmail.com wrote:

Hi Ayan, can you try the line below?

loc = D:\\Project\\Spark\\code\\news\\jsonfeeds

Also, if this code is in Scala, why no val on newsY? Is it defined above?

newsY = sc.textFile(loc)
print newsY.count()

On 25 April 2015 at 20:08, ayan guha guha.a...@gmail.com wrote:

Hi,

I am facing this weird issue. I am on Windows, and I am trying to load all
files within a folder. Here is my code -

loc =