What do you think about moving "Row" not into the core module, but into the Scala module?

-----Original Message-----
From: Aljoscha Krettek [mailto:aljos...@apache.org]
Sent: Monday, November 28, 2016 3:00 PM
To: dev@flink.apache.org
Subject: Re: Move Row, RowInputFormat to core package

If we move it to core, we have to untangle it from Scala, as Timo said. The reason is that we wou
(https://github.com/apache/flink/compare/master...tonycox:FLINK-2186-x)
(https://travis-ci.org/tonycox/flink/builds/178846355)

-----Original Message-----
From: Flavio Pompermaier [mailto:pomperma...@okkam.it]
Sent: Friday, November 25, 2016 5:59 PM
To: dev@flink.apache.org
Subject: Re: Move Row, RowInputFormat to core package
Fully agree with Timo :)

On Fri, Nov 25, 2016 at 2:30 PM, Timo Walther wrote:

> Hi Anton,
>
> I would also support the idea of moving Row and RowTypeInfo to Flink core.
> I think there are many real-world use cases where a variable-length record
> that supports null values is required. However, I think that those classes
> need to be reworked first. They should not depend on Scala.
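For context, the variable-length, null-supporting record described above could be sketched roughly as follows. This is only an illustrative stand-in under my own naming (`SimpleRow` is a hypothetical class), not Flink's actual Row implementation:

```java
// Illustrative sketch: a variable-length record whose fields may be null,
// similar in spirit to the Row type discussed in this thread.
// NOT Flink's actual implementation.
public class SimpleRow {
    private final Object[] fields;

    // The arity (number of fields) is fixed per instance but can differ
    // between instances, unlike a fixed-size tuple type.
    public SimpleRow(int arity) {
        this.fields = new Object[arity];
    }

    public void setField(int pos, Object value) {
        fields[pos] = value;
    }

    public Object getField(int pos) {
        return fields[pos];
    }

    public int getArity() {
        return fields.length;
    }

    @Override
    public String toString() {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < fields.length; i++) {
            if (i > 0) sb.append(',');
            sb.append(fields[i] == null ? "null" : fields[i].toString());
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        SimpleRow row = new SimpleRow(3);
        row.setField(0, "flink");
        row.setField(2, 42);      // field 1 is left as null
        System.out.println(row);  // prints "flink,null,42"
    }
}
```

Untangling such a class from Scala would mean expressing it in plain Java like this, so that flink-core does not pull in a Scala dependency.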
Hello,

In Scala, case classes can store a huge number of fields, which is really helpful for reading wide CSV files, but this is used only in the Table API.

What about this issue (https://issues.apache.org/jira/browse/FLINK-2186): should we use the Table API in the machine learning library?

To solve the issue #readC
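The wide-CSV use case mentioned above could be sketched as parsing one line into a variable-length field array that maps empty cells to null. This is an illustrative sketch only (`CsvLineParser` is a hypothetical name), not Flink's actual CSV reader:

```java
import java.util.Arrays;
import java.util.regex.Pattern;

// Illustrative sketch: split one CSV line into a variable-length field
// array, treating empty cells as null. Not Flink's readCsvFile.
public class CsvLineParser {
    public static Object[] parseLine(String line, char delimiter) {
        // limit = -1 keeps trailing empty cells instead of dropping them
        String[] cells = line.split(Pattern.quote(String.valueOf(delimiter)), -1);
        Object[] fields = new Object[cells.length];
        for (int i = 0; i < cells.length; i++) {
            fields[i] = cells[i].isEmpty() ? null : cells[i];
        }
        return fields;
    }

    public static void main(String[] args) {
        Object[] fields = parseLine("a,,c,", ',');
        System.out.println(Arrays.toString(fields)); // [a, null, c, null]
    }
}
```

A record with per-instance arity and null fields sidesteps the fixed-width tuple and case-class shapes, which is the practical argument for having such a type available outside the Table API.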