@data-artisans.com> wrote:
>
> Hi,
>
> Maybe this is an access rights issue? Could you try to create and write to
> the same file (same directory) in some other way (manually?), using the same
> user and the same machine as the Flink job would?
>
> Maybe there will be some hint in h
Hi all,
I'm just trying to use an HDFS file as the sink for my flink stream job. I
use the following line to do so.
stream.writeAsText("hdfs://hadoop-master:9000/user/isuru/foo");
I have not set "fs.hdfs.hadoopconf" in my Flink configuration, as it should
work with a fully qualified HDFS path like the one above.
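For what it's worth, the reasoning behind skipping "fs.hdfs.hadoopconf" is that a fully qualified path already carries everything needed to locate the filesystem: the scheme, the namenode authority (host:port), and the path. A purely illustrative sketch using plain `java.net.URI` (not Flink API) to show what the path above decomposes into:

```java
// Illustrative only: decompose a fully qualified HDFS path into the
// pieces a filesystem client needs. With all three present, no extra
// Hadoop configuration should be required just to resolve the target.
public class HdfsUriSketch {
    static String schemeOf(String path) {
        return java.net.URI.create(path).getScheme();
    }

    static String authorityOf(String path) {
        return java.net.URI.create(path).getAuthority();
    }

    static String pathOf(String path) {
        return java.net.URI.create(path).getPath();
    }

    public static void main(String[] args) {
        String p = "hdfs://hadoop-master:9000/user/isuru/foo";
        System.out.println("scheme    = " + schemeOf(p));    // hdfs
        System.out.println("authority = " + authorityOf(p)); // hadoop-master:9000
        System.out.println("path      = " + pathOf(p));      // /user/isuru/foo
    }
}
```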
You can check your topic's setup or create another topic to try this out.
>
> Hope this will help you.
>
> Best Regards,
> Tony Wei
>
> 2017-08-29 12:26 GMT+08:00 Isuru Suriarachchi <isur...@gmail.com>:
Hi all,
I'm trying to implement a Flink consumer which consumes a Kafka topic with
3 partitions. I've set the parallelism of the execution environment to 3 as
I want to make sure that each Kafka partition is consumed by a separate
parallel task in Flink. My first question is whether it's always
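To make the setup concrete, here is a simplified, purely illustrative sketch of how 3 Kafka partitions could map onto 3 parallel subtasks when parallelism equals the partition count. The `assignSubtask` helper is hypothetical and only models a round-robin rule; Flink's actual Kafka partition assignment logic is more involved:

```java
// Hypothetical sketch (NOT Flink's real assignment code): with a round-robin
// rule and parallelism equal to the partition count, each subtask ends up
// with exactly one partition.
public class PartitionAssignmentSketch {
    // Simplified round-robin assignment of a partition to a subtask index.
    static int assignSubtask(int partition, int parallelism) {
        return partition % parallelism;
    }

    public static void main(String[] args) {
        int parallelism = 3;
        int partitions = 3;
        for (int p = 0; p < partitions; p++) {
            System.out.println("partition " + p
                    + " -> subtask " + assignSubtask(p, parallelism));
        }
    }
}
```

Under this simplified rule, partitions 0, 1, and 2 land on subtasks 0, 1, and 2 respectively, which is the one-partition-per-task layout the question is after.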