> > printOnTaskManager() which writes to the standard
> > out of the TaskManager processes, usually to .out files in the ./log
> > folder.
> >
> > Best, Fabian
> >
> > 2017-03-01 7:16 GMT+01:00 Pawan Manishka Gunarathna <
> > pawan.manis...@gmail.com>:
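(For illustration: a minimal sketch of the two print variants mentioned above, using the DataSet API from flink-java and hypothetical input elements.)

    import org.apache.flink.api.java.DataSet;
    import org.apache.flink.api.java.ExecutionEnvironment;

    public class PrintSketch {
        public static void main(String[] args) throws Exception {
            ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
            DataSet<String> data = env.fromElements("a", "b", "c");

            // print() brings the result back and prints it on the client.
            data.print();

            // printOnTaskManager() writes to the TaskManagers' standard out
            // (the .out files under ./log) and needs an explicit execute().
            data.printOnTaskManager("my-sink");
            env.execute("print-on-taskmanager");
        }
    }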
>
> methods for DataSet (including print()) will just add
> operators to the plan but not really run it. If the DASInputFormat has no
> error, you can run the plan by calling environment.execute().
>
> Best,
> Xingcan
>
> On Wed, Mar 1, 2017 at 12:17 PM, Pawan Manishka Gunarathna <
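(To illustrate that execution model: a minimal sketch with a hypothetical file path, using the built-in TextInputFormat in place of the custom format.)

    import org.apache.flink.api.java.DataSet;
    import org.apache.flink.api.java.ExecutionEnvironment;
    import org.apache.flink.api.java.io.TextInputFormat;
    import org.apache.flink.core.fs.Path;

    public class ExecuteSketch {
        public static void main(String[] args) throws Exception {
            ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

            // createInput() only declares the source; nothing runs yet.
            DataSet<String> lines =
                    env.createInput(new TextInputFormat(new Path("file:///tmp/input.txt")));

            // Adding a sink still only extends the plan ...
            lines.writeAsText("file:///tmp/output");

            // ... the plan is actually run by execute().
            env.execute("run-my-input-format");
        }
    }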
Hi,
I have implemented the Flink InputFormat interface for my data source.
It has our own data type, *Record*. So my class looks as follows:
public class DASInputFormat implements InputFormat {
}
So when I execute the print() method, my console shows the Flink execution,
but nothing is printed.
> > one because it is not easy to implement a generic method to
> > automatically split a query into multiple (preferably equal-sized)
> > partial queries.
> >
> > Best, Fabian
> >
> > 2017-01-24 6:31 GMT+01:00 Pawan Manishka Gunarathna <
> > pawan.manis...@g
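(To make the "partial queries" idea concrete: a small plain-Java sketch, with a hypothetical table and key range, of the equal-sized splits such a mechanism would have to produce.)

    public class PartialQueries {
        public static void main(String[] args) {
            long minId = 0;            // assumed known key range
            long maxId = 1_000_000;
            int numPartitions = 4;

            long step = (maxId - minId + numPartitions) / numPartitions;
            for (int i = 0; i < numPartitions; i++) {
                long from = minId + i * step;
                long to = Math.min(from + step - 1, maxId);
                // Each partial query covers an equal-sized, non-overlapping key range.
                System.out.printf("SELECT * FROM events WHERE id BETWEEN %d AND %d%n", from, to);
            }
        }
    }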
Hi,
I'm currently working on a Flink InputFormat related implementation, so I
have written a class as follows:
public class MyInputFormat extends RichInputFormat<Row, InputSplit>
implements ResultTypeQueryable<Row> {
But here I'm getting an error with the *Row* parameter. I can't import
*org.apache.flink.types.Row*.
I have
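(A small sketch of the type side of this, assuming a Flink version where org.apache.flink.types.Row ships with flink-core and a hypothetical two-field schema: this is the kind of TypeInformation a ResultTypeQueryable<Row> format would return from getProducedType().)

    import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
    import org.apache.flink.api.java.typeutils.RowTypeInfo;
    import org.apache.flink.types.Row;

    public class RowTypeSketch {
        public static void main(String[] args) {
            // Field types of the rows the format would produce.
            RowTypeInfo producedType = new RowTypeInfo(
                    BasicTypeInfo.STRING_TYPE_INFO,
                    BasicTypeInfo.LONG_TYPE_INFO);

            Row row = new Row(2);
            row.setField(0, "metric-a");
            row.setField(1, 42L);

            System.out.println(producedType + " -> " + row);
        }
    }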
and open() them. If there are no InputSplits, there is
> no work to be done and open() will not be called.
> You can tweak the behavior by implementing your own InputSplits and
> InputSplitAssigner which assigns exactly one input split to each task.
>
> Fabian
>
> 2017-01-23 8:44 GMT+01:00
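(A sketch of such an assigner that hands each task exactly one split, written against the single-method InputSplitAssigner interface of that Flink era; newer versions add a second method that would also need implementing.)

    import org.apache.flink.core.io.InputSplit;
    import org.apache.flink.core.io.InputSplitAssigner;

    public class OneSplitPerTaskAssigner implements InputSplitAssigner {

        private final InputSplit[] splits;
        private final boolean[] assigned;

        public OneSplitPerTaskAssigner(InputSplit[] splits) {
            this.splits = splits;
            this.assigned = new boolean[splits.length];
        }

        @Override
        public InputSplit getNextInputSplit(String host, int taskIndex) {
            // Each parallel task gets exactly the split matching its index, once.
            if (taskIndex < splits.length && !assigned[taskIndex]) {
                assigned[taskIndex] = true;
                return splits[taskIndex];
            }
            return null; // no more work for this task
        }
    }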
Hi,
When we are implementing the Flink *InputFormat* interface, if the
*input split creation* part is already handled by our data analytics server
APIs, can we go directly to the second phase of the Flink InputFormat
execution? Basically I need to know whether we can read those InputSplits
directly,
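(For illustration, the "second phase" can be driven by hand outside of a job: create the splits, then open each split and iterate its records. A minimal sketch with a hypothetical file path, using the built-in TextInputFormat.)

    import org.apache.flink.api.java.io.TextInputFormat;
    import org.apache.flink.configuration.Configuration;
    import org.apache.flink.core.fs.FileInputSplit;
    import org.apache.flink.core.fs.Path;

    public class ManualReadSketch {
        public static void main(String[] args) throws Exception {
            TextInputFormat format = new TextInputFormat(new Path("file:///tmp/input.txt"));
            format.configure(new Configuration());

            // Phase 1: create the input splits (normally done on the JobManager).
            FileInputSplit[] splits = format.createInputSplits(2);

            // Phase 2: open each split and read its records (normally done by the tasks).
            for (FileInputSplit split : splits) {
                format.open(split);
                while (!format.reachedEnd()) {
                    String record = format.nextRecord(null);
                    if (record != null) {
                        System.out.println(record);
                    }
                }
                format.close();
            }
        }
    }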
n/java/es/accenture/flink/Sources/KuduInputFormat.java
>
>
> On 19/1/17 6:03, "Pawan Manishka Gunarathna"
> wrote:
>
> Hi,
> When we are implementing that InputFormat Interface, if we have that
> Input
> split part in our data analytics server APIs can we
'JDBCInputFormat' in Flink. Can you provide some information
regarding how that JDBCInputFormat execution happens?
Thanks,
Pawan
On Mon, Jan 16, 2017 at 3:37 PM, Pawan Manishka Gunarathna <
pawan.manis...@gmail.com> wrote:
> Hi Fabian,
> Thanks for providing that information.
>
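(For reference, a rough sketch of how JDBCInputFormat is typically set up; this assumes the flink-jdbc module and its builder API of that era, with a hypothetical H2 database and schema, so the exact method names should be checked against the version in use.)

    import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
    import org.apache.flink.api.java.DataSet;
    import org.apache.flink.api.java.ExecutionEnvironment;
    import org.apache.flink.api.java.io.jdbc.JDBCInputFormat;
    import org.apache.flink.api.java.typeutils.RowTypeInfo;
    import org.apache.flink.types.Row;

    public class JdbcReadSketch {
        public static void main(String[] args) throws Exception {
            ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

            // Field types of the columns selected below (hypothetical schema).
            RowTypeInfo rowType = new RowTypeInfo(
                    BasicTypeInfo.INT_TYPE_INFO,
                    BasicTypeInfo.STRING_TYPE_INFO);

            JDBCInputFormat jdbcInput = JDBCInputFormat.buildJDBCInputFormat()
                    .setDrivername("org.h2.Driver")          // hypothetical driver
                    .setDBUrl("jdbc:h2:mem:analytics")       // hypothetical URL
                    .setQuery("SELECT id, name FROM events")
                    .setRowTypeInfo(rowType)
                    .finish();

            DataSet<Row> rows = env.createInput(jdbcInput);
            rows.print();
        }
    }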
> You can also mix this; i.e. define the InputSplit type (T) in
> your implementation, but leave
> OT to the user.
>
> Regards,
> Chesnay
>
>
>
> On 18.01.2017 04:52, Pawan Manishka Gunarathna wrote:
>
>> Hi,
>> Yeah, I also wrote it in the way you have written.
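(A sketch of that mix, assuming GenericInputSplit as the fixed split type: the base class pins down T while the produced type OT stays generic for the user.)

    import org.apache.flink.api.common.io.RichInputFormat;
    import org.apache.flink.core.io.GenericInputSplit;

    // Fixes the split type once; subclasses only decide what OT is and how
    // nextRecord() produces records of that type.
    public abstract class FixedSplitInputFormat<OT>
            extends RichInputFormat<OT, GenericInputSplit> {
    }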
like this?
>
> public class MyInputFormat implements InputFormat<MyType, InputSplit> {
>
>}
>
> Regards,
> Chesnay
>
> On 17.01.2017 04:18, Pawan Manishka Gunarathna wrote:
>
>> Hi,
>>
>> I'm currently working on Flink InputFormat Int
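(Filling in that declaration, a minimal compilable sketch: it assumes String as the produced type, uses the GenericInputSplit and DefaultInputSplitAssigner classes from flink-core, and just emits a few dummy records per split.)

    import java.io.IOException;

    import org.apache.flink.api.common.io.DefaultInputSplitAssigner;
    import org.apache.flink.api.common.io.InputFormat;
    import org.apache.flink.api.common.io.statistics.BaseStatistics;
    import org.apache.flink.configuration.Configuration;
    import org.apache.flink.core.io.GenericInputSplit;
    import org.apache.flink.core.io.InputSplitAssigner;

    public class MyInputFormat implements InputFormat<String, GenericInputSplit> {

        private int emitted;

        @Override
        public void configure(Configuration parameters) {
            // nothing to configure in this sketch
        }

        @Override
        public BaseStatistics getStatistics(BaseStatistics cachedStatistics) {
            return cachedStatistics; // no statistics available
        }

        @Override
        public GenericInputSplit[] createInputSplits(int minNumSplits) {
            GenericInputSplit[] splits = new GenericInputSplit[minNumSplits];
            for (int i = 0; i < minNumSplits; i++) {
                splits[i] = new GenericInputSplit(i, minNumSplits);
            }
            return splits;
        }

        @Override
        public InputSplitAssigner getInputSplitAssigner(GenericInputSplit[] inputSplits) {
            return new DefaultInputSplitAssigner(inputSplits);
        }

        @Override
        public void open(GenericInputSplit split) throws IOException {
            emitted = 0; // e.g. open a connection or query for this split here
        }

        @Override
        public boolean reachedEnd() {
            return emitted >= 3; // emit three dummy records per split
        }

        @Override
        public String nextRecord(String reuse) {
            return "record-" + emitted++;
        }

        @Override
        public void close() {
            // e.g. release per-split resources here
        }
    }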
Hi,
I'm currently working on a Flink InputFormat interface implementation. I'm
writing a Java program to read data from a file using the InputFormat
interface. I'm using a Maven project and have added the following dependency
to the pom.xml:
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-core</artifactId>
    <version>1.1.4</version>
</dependency>
> [1]
> https://github.com/apache/flink/blob/master/flink-core/
> src/main/java/org/apache/flink/api/common/io/InputFormat.java
>
>
> 2017-01-16 6:18 GMT+01:00 Pawan Manishka Gunarathna <
> pawan.manis...@gmail.com>:
>
> > Hi,
> >
> > we have a data an
Hi,
we have a data analytics server that has analytics data tables. So I need
to write a custom *Java* implementation to read data from that data source
and do processing (*batch* processing) using Apache Flink. Basically it's
like a new client connector for Flink.
So it would be great if you ca