Alex,
Since you are trying to interact with a remote fs, you also need to provide
the remote fs endpoint details. You can do it in two ways (sketched below):
1. Include the remote fs URL in the file path, e.g.:
hdfs://<host>:<port>/user/alex/output/
2. Add the remote fs to the config used by the operator. For this, you can
override the ge…
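For illustration, a minimal properties.xml sketch of both options; the
operator name "fileOut", host, and port are placeholders I am assuming, and
option 2 here shows one common variant (setting fs.defaultFS) since the exact
override suggested above may differ:

<!-- Option 1: embed the remote endpoint in the operator's file path
     (operator name and endpoint are hypothetical). -->
<property>
  <name>dt.operator.fileOut.prop.filePath</name>
  <value>hdfs://namenode.example.com:8020/user/alex/output</value>
</property>

<!-- Option 2 (assumed variant): make the remote cluster the default fs
     in the configuration the operator uses. -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://namenode.example.com:8020</value>
</property>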
I suspect that since the hdfs jar is marked provided, it is not on the test
classpath, so the HDFS filesystem implementation is not found. If you mark
it with the test scope it will be included. There may be other issues with
the path, as Chandni…
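As a concrete sketch of that suggestion, the hadoop-hdfs entry from the POM
quoted below could be switched to the test scope (assuming the artifact is
not needed at compile time):

<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-hdfs</artifactId>
  <version>2.7.2</version>
  <scope>test</scope>
</dependency>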
On Apr 3, 2016 11:58 AM, "Chandni Singh" wrote:
Since this is happening with the FileOutputOperator, can you please let me
know the value of the 'filePath' property in the configuration?
Thanks,
Chandni
On Sun, Apr 3, 2016 at 11:26 AM, Chandni Singh wrote:
Hi Alex,
What is the input directory path that you have specified in the
configuration?
On Apr 3, 2016 7:57 AM, "McCullough, Alex" wrote:
I have the dependencies defined in the POM but get the same error when running
in the IDE.

<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-hdfs</artifactId>
  <version>2.7.2</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.7.2</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-common</artifactId>
  <version>2.7.2</version>
  <scope>provided</scope>
</dependency>
Alex,
Local mode in the IDE is fully embedded; it does not use any external
Hadoop install. Please try launching the app through the CLI (in local
mode), or include the HDFS dependencies in your application pom
(with provided scope).
Thanks
--
sent from mobile
On Apr 3, 2016 7:29 AM, "McCullough, Alex" wrote:
Hello All,
I am trying to get my local test application to run, but when I try I get an
error. The test application runs on my local machine and attempts to
connect to a remote HDFS cluster.
I have made sure I have the same version of Hadoop that is on the cluster
installed locally and…