+1 on making this an Apex feature. It would be good to have all output
stores supported with this kind of functionality.
~Bhupesh
On Thu, Mar 31, 2016 at 11:15 AM, Priyanka Gugale
wrote:
> +1 for this feature for all output operators. I have faced this issue while
> working on CassandraOutput o
Alex,
Since you are trying to interact with a remote fs you also need to provide
the remote fs endpoint details. You can do it in two ways.
1. Include the remote fs URL in the file path, e.g.:
hdfs://<namenode-host>:<port>/user/alex/output/
2. Add the remote fs to the config used by the operator. For this, you can
override the ge
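A rough Java sketch of both approaches, using only the standard Hadoop FileSystem API; the namenode-host:8020 endpoint and the output path are illustrative placeholders, not values from this thread:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class RemoteFsAccessSketch {
      public static void main(String[] args) throws Exception {
        // Approach 1: the fully qualified path itself carries the endpoint details.
        Path outputPath = new Path("hdfs://namenode-host:8020/user/alex/output/");

        // Approach 2: put the endpoint into the Configuration the operator uses,
        // e.g. from an overridden method that supplies the FileSystem instance.
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode-host:8020");  // placeholder endpoint
        FileSystem fs = FileSystem.newInstance(conf);

        System.out.println("Resolved fs: " + fs.getUri() + ", writing under: " + outputPath);
        fs.close();
      }
    }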
I suspect that since the hdfs jar is marked provided, it is not in the test
classpath, so the HDFS filesystem implementation is not found. If you mark
it with the test scope it will be included. There may be other issues with
the path as Chandni
On Apr 3, 2016 11:58 AM, "Chandni Singh" wrote:
> Since t
Since this is happening with the FileOutputOperator, can you please let me
know the value of the property 'filePath' in the configuration?
Thanks,
Chandni
On Sun, Apr 3, 2016 at 11:26 AM, Chandni Singh
wrote:
> Hi Alex,
>
> What is the input directory path that you have specified in the
> co
Hi Alex,
If you can reproduce the error with 2 partitions, then the issue must be
something else. Can you share some info about the DAG? Are the HDHT
partitions parallel partitioned?
You can send me the info offline if it's not suitable for the mailing list.
Thanks
On Sun, Apr 3, 2016 at 7:20 AM, McCul
Hi Alex,
What is the input directory path that you have specified in the
configuration?
On Apr 3, 2016 7:57 AM, "McCullough, Alex"
wrote:
> I have the dependencies defined in the POM but get the same error when
> running in the IDE.
>
> <dependency>
>   <groupId>org.apache.hadoop</groupId>
>   <artifactId>hadoop-hdfs</artifactId>
>   <version>2.7.2</version>
> </dependency>
+1. Thank you, Thomas. I'll add my mentor sign-off after this makes it
into wiki.
--Chris Nauroth
On 4/2/16, 10:45 PM, "Thomas Weise" wrote:
>Below is the draft for what should be our final *incubator* report.
>
>Will wait for feedback till EOD Monday and then transfer to Wiki.
>
>Thanks
>
I have the dependencies defined in the POM but get the same error when running
in the IDE.
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-hdfs</artifactId>
  <version>2.7.2</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.7.2</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-common</artifactId>
  <version>2.7.2</version>
  <scope>provided</scope>
</dependency>
On 4
Alex,
Local mode in the IDE is fully embedded. It does not use any external
Hadoop install. Please try launching the app through the CLI (in local
mode), or include the HDFS dependencies in your application pom
(with provided scope).
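For the embedded route, a minimal sketch using Apex's LocalMode API; MyApplication stands in for your StreamingApplication implementation, and the run duration is arbitrary:

    import org.apache.hadoop.conf.Configuration;
    import com.datatorrent.api.LocalMode;

    public class RunEmbedded {
      public static void main(String[] args) throws Exception {
        LocalMode lma = LocalMode.newInstance();
        Configuration conf = new Configuration(false);
        // The whole DAG runs inside this JVM; no external Hadoop installation is involved.
        lma.prepareDAG(new MyApplication(), conf);  // MyApplication is a placeholder
        LocalMode.Controller lc = lma.getController();
        lc.run(10000);  // run the embedded application for roughly 10 seconds
      }
    }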
Thanks
--
sent from mobile
On Apr 3, 2016 7:29 AM,
Hello All,
I am trying to get my local test application to run, but when I try I get an
error. This test application is running on my local machine and attempting to
connect to a remote HDFS cluster.
I have made sure I have the same version of Hadoop that is on the cluster
installed locally and
Hey Thomas,
It is a 15-node cluster, I included the cluster summary below. If I run on 1
partition there are no issues, other than it backs up a bit, but as soon as I move to
multiple partitions I get those errors. We are using an
AbstractFileInputOperator to read the file off of HDFS so I started with t
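For background on how such a reader is usually partitioned, a hedged sketch of an application that statically partitions a Malhar file input operator; the LineByLineFileInputOperator class, the setPartitionCount setter, and the paths are recalled from the Malhar library and should be treated as assumptions to verify against your version:

    import org.apache.hadoop.conf.Configuration;
    import com.datatorrent.api.DAG;
    import com.datatorrent.api.StreamingApplication;
    import com.datatorrent.lib.io.ConsoleOutputOperator;
    import com.datatorrent.lib.io.fs.LineByLineFileInputOperator;

    public class PartitionedReadApp implements StreamingApplication {
      @Override
      public void populateDAG(DAG dag, Configuration conf) {
        LineByLineFileInputOperator reader =
            dag.addOperator("reader", new LineByLineFileInputOperator());
        reader.setDirectory("hdfs://namenode-host:8020/user/alex/input");  // placeholder path
        reader.setPartitionCount(4);  // assumed setter for static partitioning of the reader

        ConsoleOutputOperator console = dag.addOperator("console", new ConsoleOutputOperator());
        dag.addStream("lines", reader.output, console.input);
      }
    }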