[jira] [Commented] (FLINK-16726) ScalarFunction throws Given parameters of function 'func' do not match any signature.

2020-03-25 Thread Matrix42 (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-16726?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17067323#comment-17067323
 ] 

Matrix42 commented on FLINK-16726:
--

Thanks [~jark].




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-16726) ScalarFunction throws Given parameters of function 'func' do not match any signature.

2020-03-25 Thread Matrix42 (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-16726?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17066728#comment-17066728
 ] 

Matrix42 commented on FLINK-16726:
--

[~libenchao] It works.
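
The archive does not preserve the exact suggestion that worked; a common workaround for this mismatch is to force the DECIMAL literal to DOUBLE in the query itself. A minimal sketch, assuming the tableEnv, source table, and func registration from the attached reproducer:

{code:java}
// Hedged sketch of the cast workaround; "tableEnv", "source" and "func" are
// assumed from the attached reproducer (Flinktest.zip), not from this thread.
tableEnv.registerFunction("func", new UDF3());
Table result = tableEnv.sqlQuery(
        "SELECT func(s, 1, CAST(2.2 AS DOUBLE)) FROM source");
{code}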




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-16726) ScalarFunction throws Given parameters of function 'func' do not match any signature.

2020-03-25 Thread Matrix42 (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-16726?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17066695#comment-17066695
 ] 

Matrix42 commented on FLINK-16726:
--

Hello [~jark], could you take a look at this problem? My UDF needs a double, but Flink gives it a decimal. Thanks.




--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Commented] (FLINK-16727) cannot cast 2020-11-12 as class java.time.LocalDate

2020-03-25 Thread Matrix42 (Jira)


[ 
https://issues.apache.org/jira/browse/FLINK-16727?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17066487#comment-17066487
 ] 

Matrix42 commented on FLINK-16727:
--

Exception in thread "main" java.lang.AssertionError: cannot cast 2020-11-12 as 
class java.time.LocalDate
 at org.apache.calcite.sql.SqlLiteral.getValueAs(SqlLiteral.java:351)
 at 
org.apache.calcite.sql.SqlCallBinding.getOperandLiteralValue(SqlCallBinding.java:217)
 at 
org.apache.flink.table.planner.functions.utils.ScalarSqlFunction$$anon$1$$anonfun$1.apply(ScalarSqlFunction.scala:92)
 at 
org.apache.flink.table.planner.functions.utils.ScalarSqlFunction$$anon$1$$anonfun$1.apply(ScalarSqlFunction.scala:88)
 at 
scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
 at 
scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
 at scala.collection.immutable.Range.foreach(Range.scala:160)
 at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
 at scala.collection.AbstractTraversable.map(Traversable.scala:104)
 at 
org.apache.flink.table.planner.functions.utils.ScalarSqlFunction$$anon$1.inferReturnType(ScalarSqlFunction.scala:88)
 at org.apache.calcite.sql.SqlOperator.inferReturnType(SqlOperator.java:470)
 at org.apache.calcite.sql.SqlOperator.validateOperands(SqlOperator.java:437)
 at org.apache.calcite.sql.SqlFunction.deriveType(SqlFunction.java:303)
 at org.apache.calcite.sql.SqlFunction.deriveType(SqlFunction.java:219)
 at 
org.apache.calcite.sql.validate.SqlValidatorImpl$DeriveTypeVisitor.visit(SqlValidatorImpl.java:5599)
 at 
org.apache.calcite.sql.validate.SqlValidatorImpl$DeriveTypeVisitor.visit(SqlValidatorImpl.java:5586)
 at org.apache.calcite.sql.SqlCall.accept(SqlCall.java:139)
 at 
org.apache.calcite.sql.validate.SqlValidatorImpl.deriveTypeImpl(SqlValidatorImpl.java:1690)
 at 
org.apache.calcite.sql.validate.SqlValidatorImpl.deriveType(SqlValidatorImpl.java:1675)
 at org.apache.calcite.sql.SqlAsOperator.deriveType(SqlAsOperator.java:133)
 at 
org.apache.calcite.sql.validate.SqlValidatorImpl$DeriveTypeVisitor.visit(SqlValidatorImpl.java:5599)
 at 
org.apache.calcite.sql.validate.SqlValidatorImpl$DeriveTypeVisitor.visit(SqlValidatorImpl.java:5586)
 at org.apache.calcite.sql.SqlCall.accept(SqlCall.java:139)
 at 
org.apache.calcite.sql.validate.SqlValidatorImpl.deriveTypeImpl(SqlValidatorImpl.java:1690)
 at 
org.apache.calcite.sql.validate.SqlValidatorImpl.deriveType(SqlValidatorImpl.java:1675)
 at 
org.apache.calcite.sql.validate.SqlValidatorImpl.expandSelectItem(SqlValidatorImpl.java:478)
 at 
org.apache.calcite.sql.validate.SqlValidatorImpl.validateSelectList(SqlValidatorImpl.java:4104)
 at 
org.apache.calcite.sql.validate.SqlValidatorImpl.validateSelect(SqlValidatorImpl.java:3388)
 at 
org.apache.calcite.sql.validate.SelectNamespace.validateImpl(SelectNamespace.java:60)
 at 
org.apache.calcite.sql.validate.AbstractNamespace.validate(AbstractNamespace.java:84)
 at 
org.apache.calcite.sql.validate.SqlValidatorImpl.validateNamespace(SqlValidatorImpl.java:1007)
 at 
org.apache.calcite.sql.validate.SqlValidatorImpl.validateQuery(SqlValidatorImpl.java:967)
 at org.apache.calcite.sql.SqlSelect.validate(SqlSelect.java:216)
 at 
org.apache.calcite.sql.validate.SqlValidatorImpl.validateScopedExpression(SqlValidatorImpl.java:942)
 at 
org.apache.calcite.sql.validate.SqlValidatorImpl.validate(SqlValidatorImpl.java:649)
 at 
org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$validate(FlinkPlannerImpl.scala:125)
 at 
org.apache.flink.table.planner.calcite.FlinkPlannerImpl.validate(FlinkPlannerImpl.scala:104)
 at 
org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:127)
 at 
org.apache.flink.table.planner.delegation.ParserImpl.parse(ParserImpl.java:85)
 at 
org.apache.flink.table.api.internal.TableEnvironmentImpl.sqlQuery(TableEnvironmentImpl.java:464)
 at com.lorinda.template.TestDateFunction.main(TestDateFunction.java:41)

Process finished with exit code 1


[jira] [Created] (FLINK-16727) cannot cast 2020-11-12 as class java.time.LocalDate

2020-03-23 Thread Matrix42 (Jira)
Matrix42 created FLINK-16727:


 Summary: cannot cast 2020-11-12 as class java.time.LocalDate
 Key: FLINK-16727
 URL: https://issues.apache.org/jira/browse/FLINK-16727
 Project: Flink
  Issue Type: Bug
  Components: Table SQL / Planner
Affects Versions: 1.10.0
 Environment: [^Flinktest.zip]
Reporter: Matrix42
 Attachments: Flinktest.zip

I defined a ScalarFunction as follows:

 
{code:java}
public class DateFunc extends ScalarFunction {

    public String eval(Date date) {
        return date.toString();
    }

    @Override
    public TypeInformation getResultType(Class[] signature) {
        return Types.STRING;
    }

    @Override
    public TypeInformation[] getParameterTypes(Class[] signature) {
        return new TypeInformation[]{Types.INT};
    }
}
{code}
I use it in SQL: `select func(DATE '2020-11-12') as a from source`. Flink throws 'cannot cast 2020-11-12 as class java.time.LocalDate'.

 

The full code is in [^Flinktest.zip]; the main class is com.lorinda.template.TestDateFunction.
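
Note that the declared parameter type (Types.INT) does not match the eval(Date) signature above. A hedged sketch of a consistent declaration, assuming java.sql.Date as the bridging class for SQL DATE (whether this avoids the reported assertion is not confirmed in this thread):

{code:java}
@Override
public TypeInformation[] getParameterTypes(Class[] signature) {
    // Types.SQL_DATE corresponds to java.sql.Date in the legacy type system;
    // it replaces the Types.INT declaration above, which does not match eval(Date).
    return new TypeInformation[]{Types.SQL_DATE};
}
{code}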



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Created] (FLINK-16726) ScalarFunction throws Given parameters of function 'func' do not match any signature.

2020-03-23 Thread Matrix42 (Jira)
Matrix42 created FLINK-16726:


 Summary: ScalarFunction throws Given parameters of function 'func' 
do not match any signature.
 Key: FLINK-16726
 URL: https://issues.apache.org/jira/browse/FLINK-16726
 Project: Flink
  Issue Type: Bug
  Components: Table SQL / Planner
Affects Versions: 1.10.0
 Environment: [^Flinktest.zip]
Reporter: Matrix42
 Attachments: Flinktest.zip

I wrote a ScalarFunction as follows:

 
{code:java}
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.table.functions.ScalarFunction;

public class UDF3 extends ScalarFunction {

    public String eval(String s, int a, double d) {
        return s + a + d;
    }

    @Override
    public boolean isDeterministic() {
        return true;
    }

    @Override
    public TypeInformation<?> getResultType(Class<?>[] signature) {
        return Types.STRING;
    }

    @Override
    public TypeInformation<?>[] getParameterTypes(Class<?>[] signature) {
        return new TypeInformation<?>[]{Types.STRING, Types.INT, Types.DOUBLE};
    }
}
{code}
I use it in SQL: `select func(s, 1, 2.2) from source`. Flink throws the following exception:

 

 
{noformat}
Exception in thread "main" org.apache.flink.table.api.ValidationException: SQL validation failed. Given parameters of function 'func' do not match any signature.
Actual: (java.lang.String, java.lang.Integer, java.math.BigDecimal)
Expected: (java.lang.String, int, double)
 at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$validate(FlinkPlannerImpl.scala:129)
 at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.validate(FlinkPlannerImpl.scala:104)
 at org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:127)
 at org.apache.flink.table.planner.delegation.ParserImpl.parse(ParserImpl.java:85)
 at org.apache.flink.table.api.internal.TableEnvironmentImpl.sqlQuery(TableEnvironmentImpl.java:464)
 at com.lorinda.template.TestUDF3.main(TestUDF3.java:40){noformat}
 

The full code is in [^Flinktest.zip]; the class name is com.lorinda.template.TestUDF3.
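
An alternative to casting in SQL is to let the function accept the DECIMAL literal directly. A hedged sketch of an extra eval overload (not confirmed as the resolution of this ticket):

{code:java}
// Hedged sketch: an additional overload that accepts the java.math.BigDecimal
// the planner derives for the literal 2.2 and delegates to the double version.
public String eval(String s, int a, java.math.BigDecimal d) {
    return eval(s, a, d.doubleValue());
}
{code}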

 

 

 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


[jira] [Updated] (FLINK-11958) flink on windows yarn deploy failed

2019-03-19 Thread Matrix42 (JIRA)


 [ 
https://issues.apache.org/jira/browse/FLINK-11958?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matrix42 updated FLINK-11958:
-
Affects Version/s: 1.7.2
Fix Version/s: 1.8.1
   1.8.0
   1.7.3




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-11958) flink on windows yarn deploy failed

2019-03-18 Thread Matrix42 (JIRA)
Matrix42 created FLINK-11958:


 Summary: flink on windows yarn deploy failed
 Key: FLINK-11958
 URL: https://issues.apache.org/jira/browse/FLINK-11958
 Project: Flink
  Issue Type: Bug
  Components: Deployment / YARN
Reporter: Matrix42
Assignee: Matrix42


Flink Version: 1.7.2

Hadoop Version: 2.7.5

YARN log:

Application application_1551710861615_0002 failed 1 times due to AM Container for appattempt_1551710861615_0002_01 exited with exitCode: 1
For more detailed output, check application tracking page: http://DESKTOP-919H80J:8088/cluster/app/application_1551710861615_0002 Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_1551710861615_0002_01_01
Exit code: 1
Stack trace: ExitCodeException exitCode=1:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:585)
at org.apache.hadoop.util.Shell.run(Shell.java:482)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:776)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:212)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Shell output: 移动了 1 个文件。 (English: 1 file(s) moved.)
Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.

 

jobmanager.err:
'$JAVA_HOME' 不是内部或外部命令,也不是可运行的程序或批处理文件。

English: '$JAVA_HOME' is not recognized as an internal or external command, operable program or batch file.
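
The jobmanager.err line shows the root cause: the generated container launch command contains the literal string '$JAVA_HOME', i.e. Unix-style variable syntax that cmd.exe does not expand on Windows. A hedged sketch of the cross-platform YARN idiom (not necessarily the actual patch for this ticket):

{code:java}
import org.apache.hadoop.yarn.api.ApplicationConstants;

public class LaunchCommandSketch {

    // Hedged sketch, not the actual FLINK-11958 fix: YARN expands this marker
    // to ${JAVA_HOME} on Linux and %JAVA_HOME% on Windows when it materializes
    // the container launch script, avoiding the hard-coded "$JAVA_HOME".
    public static String javaCommand() {
        return ApplicationConstants.Environment.JAVA_HOME.$$() + "/bin/java";
    }
}
{code}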

 

 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-11140) Can not create a Path from an empty string while use BasePathBucketAssigner

2018-12-12 Thread Matrix42 (JIRA)
Matrix42 created FLINK-11140:


 Summary: Can not create a Path from an empty string while use 
BasePathBucketAssigner
 Key: FLINK-11140
 URL: https://issues.apache.org/jira/browse/FLINK-11140
 Project: Flink
  Issue Type: Bug
  Components: Streaming
Reporter: Matrix42
Assignee: Matrix42


While using BasePathBucketAssigner, Flink throws an exception:
{code:java}
Caused by: java.lang.IllegalArgumentException: Can not create a Path from an empty string
 at org.apache.flink.core.fs.Path.checkAndTrimPathArg(Path.java:168)
 at org.apache.flink.core.fs.Path.<init>(Path.java:181)
 at org.apache.flink.core.fs.Path.<init>(Path.java:108)
 at org.apache.flink.streaming.api.functions.sink.filesystem.Buckets.assembleBucketPath(Buckets.java:309)
 at org.apache.flink.streaming.api.functions.sink.filesystem.Buckets.getOrCreateBucketForBucketId(Buckets.java:278)
 at org.apache.flink.streaming.api.functions.sink.filesystem.Buckets.onElement(Buckets.java:265)
 at org.apache.flink.streaming.api.functions.sink.filesystem.StreamingFileSink.invoke(StreamingFileSink.java:370)
 at org.apache.flink.streaming.api.operators.StreamSink.processElement(StreamSink.java:56)
 at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.pushToOperator(OperatorChain.java:579)
 at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:554)
 at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:534)
 at org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:718)
 at org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:696)
 at org.apache.flink.streaming.api.operators.StreamSourceContexts$NonTimestampContext.collect(StreamSourceContexts.java:104)
 at org.apache.flink.streaming.api.functions.source.FromElementsFunction.run(FromElementsFunction.java:164)
 at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:94)
 at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:58)
 at org.apache.flink.streaming.runtime.tasks.SourceStreamTask.run(SourceStreamTask.java:99)
 at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:300)
 at org.apache.flink.runtime.taskmanager.Task.run(Task.java:704)
 at java.lang.Thread.run(Thread.java:748)
{code}
Reason:

BasePathBucketAssigner#getBucketId returns an empty string.
{code:java}
@Override
public String getBucketId(T element, BucketAssigner.Context context) {
    return "";
}
{code}
When constructing a Path, the checkAndTrimPathArg method checks pathString and throws an IllegalArgumentException if it is empty.
{code:java}
public Path(String pathString) {
    pathString = checkAndTrimPathArg(pathString);

    // ...
}{code}
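
A hedged workaround sketch (not the actual fix for this ticket): a bucket assigner that returns a non-empty, constant bucket id, so assembleBucketPath never constructs a Path from an empty string:

{code:java}
import org.apache.flink.core.io.SimpleVersionedSerializer;
import org.apache.flink.streaming.api.functions.sink.filesystem.BucketAssigner;
import org.apache.flink.streaming.api.functions.sink.filesystem.bucketassigners.SimpleVersionedStringSerializer;

// Hedged sketch: keeps all part files under <basePath>/data instead of <basePath> itself.
public class SingleBucketAssigner<T> implements BucketAssigner<T, String> {

    @Override
    public String getBucketId(T element, BucketAssigner.Context context) {
        return "data"; // any non-empty, constant id avoids new Path("")
    }

    @Override
    public SimpleVersionedSerializer<String> getSerializer() {
        return SimpleVersionedStringSerializer.INSTANCE;
    }
}
{code}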



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] flink pull request #5801: [FLINK-9121] [flip6] Remove Flip6 prefixes and oth...

2018-07-15 Thread Matrix42
Github user Matrix42 commented on a diff in the pull request:

https://github.com/apache/flink/pull/5801#discussion_r202548776
  
--- Diff: 
flink-yarn/src/main/java/org/apache/flink/yarn/YarnClusterDescriptor.java ---
@@ -49,30 +54,45 @@ public YarnClusterDescriptor(
 
@Override
protected String getYarnSessionClusterEntrypoint() {
-   return YarnApplicationMasterRunner.class.getName();
+   return YarnSessionClusterEntrypoint.class.getName();
}
 
@Override
protected String getYarnJobClusterEntrypoint() {
-   throw new UnsupportedOperationException("The old Yarn 
descriptor does not support proper per-job mode.");
+   return YarnJobClusterEntrypoint.class.getName();
}
 
@Override
-   public YarnClusterClient deployJobCluster(
-   ClusterSpecification clusterSpecification,
-   JobGraph jobGraph,
-   boolean detached) {
-   throw new UnsupportedOperationException("Cannot deploy a 
per-job yarn cluster yet.");
+   public ClusterClient deployJobCluster(
+   ClusterSpecification clusterSpecification,
+   JobGraph jobGraph,
+   boolean detached) throws ClusterDeploymentException {
+
+   // this is required because the slots are allocated lazily
+   jobGraph.setAllowQueuedScheduling(true);
+
+   try {
+   return deployInternal(
+   clusterSpecification,
+   "Flink per-job cluster",
+   getYarnJobClusterEntrypoint(),
+   jobGraph,
+   detached);
+   } catch (Exception e) {
+   throw new ClusterDeploymentException("Could not deploy 
Yarn job cluster.", e);
+   }
}
 
@Override
-   protected ClusterClient 
createYarnClusterClient(AbstractYarnClusterDescriptor descriptor, int 
numberTaskManagers, int slotsPerTaskManager, ApplicationReport report, 
Configuration flinkConfiguration, boolean perJobCluster) throws Exception {
-   return new YarnClusterClient(
-   descriptor,
-   numberTaskManagers,
-   slotsPerTaskManager,
-   report,
+   protected ClusterClient createYarnClusterClient(
+   AbstractYarnClusterDescriptor descriptor,
+   int numberTaskManagers,
+   int slotsPerTaskManager,
+   ApplicationReport report,
+   Configuration flinkConfiguration,
+   boolean perJobCluster) throws Exception {
+   return new RestClusterClient<>(
--- End diff --

Why not return a YarnClusterClient here?


---


[GitHub] flink issue #5785: [FLINK-9108][docs] Fix invalid link

2018-04-04 Thread Matrix42
Github user Matrix42 commented on the issue:

https://github.com/apache/flink/pull/5785
  
@zentol Was this closed by accident?


---


[GitHub] flink pull request #5785: [FLINK-9108][docs] Fix invalid link

2018-03-29 Thread Matrix42
GitHub user Matrix42 opened a pull request:

https://github.com/apache/flink/pull/5785

[FLINK-9108][docs] Fix invalid link

## What is the purpose of the change

Fix invalid link


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/Matrix42/flink FLINK-9108

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/5785.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #5785


commit a1497d313ca79794a44c134a3a034064ac441977
Author: Matrix42 <934336389@...>
Date:   2018-03-29T10:47:55Z

[FLINK-9108][docs] Fix invalid link




---


[jira] [Created] (FLINK-9108) Invalid ProcessWindowFunction link in documentation

2018-03-29 Thread Matrix42 (JIRA)
Matrix42 created FLINK-9108:
---

 Summary: Invalid ProcessWindowFunction link in documentation
 Key: FLINK-9108
 URL: https://issues.apache.org/jira/browse/FLINK-9108
 Project: Flink
  Issue Type: Bug
  Components: Documentation
Reporter: Matrix42
Assignee: Matrix42
 Attachments: QQ截图20180329184203.png

!QQ截图20180329184203.png!



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] flink issue #5770: [FLINK-9093][document] add a backup js,in case of can't a...

2018-03-27 Thread Matrix42
Github user Matrix42 commented on the issue:

https://github.com/apache/flink/pull/5770
  
I saw there is just a flink.js in the Flink project, so I didn't add it. I will 
add the jQuery file to the Flink website.


---


[GitHub] flink pull request #5770: [FLINK-9093][document] add a backup js,in case of ...

2018-03-26 Thread Matrix42
GitHub user Matrix42 opened a pull request:

https://github.com/apache/flink/pull/5770

[FLINK-9093][document] Add a backup js, in case Google can't be accessed

## What is the purpose of the change

If Google can't be accessed, jquery.min.js can't be loaded, which means the 
documentation can't be listed.

## Brief change log

add a backup js

## Verifying this change

Tested in the locally built HTML docs.



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/Matrix42/flink jquery

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/5770.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #5770


commit 5a8718c83365224665633d9600cbd92aca758bb7
Author: Matrix42 <934336389@...>
Date:   2018-03-27T06:28:07Z

[FLINK-9093] add a backup js,in case of can't access Google




---


[jira] [Created] (FLINK-9093) If Google can't be accessed, the documentation can't be used

2018-03-26 Thread Matrix42 (JIRA)
Matrix42 created FLINK-9093:
---

 Summary: If Google can't be accessed, the documentation can't be used
 Key: FLINK-9093
 URL: https://issues.apache.org/jira/browse/FLINK-9093
 Project: Flink
  Issue Type: Improvement
  Components: Documentation
Reporter: Matrix42
Assignee: Matrix42
 Attachments: QQ截图20180327142447.png

!QQ截图20180327142447.png!

These links can't be visited.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] flink pull request #5592: [hotfix] fix javadoc link of ClusterClient#trigger...

2018-02-27 Thread Matrix42
GitHub user Matrix42 opened a pull request:

https://github.com/apache/flink/pull/5592

[hotfix] fix javadoc link of ClusterClient#triggerSavepoint

## What is the purpose of the change

fix javadoc link of ClusterClient#triggerSavepoint

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/Matrix42/flink doc

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/5592.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #5592


commit 09691377720361cd2d05007371dbc579aa876bc0
Author: Matrix42 <934336389@...>
Date:   2018-02-28T02:51:42Z

[hotfix] fix javadoc link of ClusterClient#triggerSavepoint




---


[GitHub] flink pull request #5574: [FLINK-8772][kafka] fix missing log parameter

2018-02-24 Thread Matrix42
GitHub user Matrix42 opened a pull request:

https://github.com/apache/flink/pull/5574

[FLINK-8772][kafka] fix missing log parameter


## Brief change log

fix FlinkKafkaConsumerBase missing log parameter


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/Matrix42/flink kafka-connector

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/5574.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #5574


commit 8bc5ee108c3223526ed1d0ceff032a7b63698520
Author: Matrix42 <934336389@...>
Date:   2018-02-24T13:52:44Z

[FLINK-8772][kafka] fix missing log parameter




---


[jira] [Created] (FLINK-8772) FlinkKafkaConsumerBase partition discovery is missing a log parameter

2018-02-24 Thread Matrix42 (JIRA)
Matrix42 created FLINK-8772:
---

 Summary: FlinkKafkaConsumerBase partition discovery is missing a log parameter
 Key: FLINK-8772
 URL: https://issues.apache.org/jira/browse/FLINK-8772
 Project: Flink
  Issue Type: Bug
  Components: Kafka Connector
Affects Versions: 1.4.0
Reporter: Matrix42
 Fix For: 1.4.2






--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[GitHub] flink pull request #5237: [hotfix][doc] fix typo in filesystems.md

2018-01-03 Thread Matrix42
GitHub user Matrix42 opened a pull request:

https://github.com/apache/flink/pull/5237

[hotfix][doc] fix typo in filesystems.md

## Brief change log

* fix typo in filesystems.md


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/Matrix42/flink typo

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/5237.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #5237


commit 2952f39004225632351e58421d19a791e80a67c3
Author: Matrix42 <934336389@...>
Date:   2018-01-03T15:32:45Z

[hotfix][doc] fix typo in filesystems.md




---


[GitHub] flink pull request #5180: [FLINK-8292] Remove unnecessary force cast in Data...

2018-01-02 Thread Matrix42
Github user Matrix42 closed the pull request at:

https://github.com/apache/flink/pull/5180


---


[GitHub] flink pull request #5180: [FLINK-8292] Remove unnecessary force cast in Data...

2017-12-19 Thread Matrix42
GitHub user Matrix42 opened a pull request:

https://github.com/apache/flink/pull/5180

[FLINK-8292] Remove unnecessary force cast in DataStreamSource

## What is the purpose of the change

Remove unnecessary force cast in DataStreamSource

## Brief change log

  - Removed the unnecessary cast in DataStreamSource by returning `this` instead.


## Verifying this change

This change is already covered by existing tests, such as 
DataStreamTest.testParallelism()

## Does this pull request potentially affect one of the following parts:

  - Dependencies (does it add or upgrade a dependency): ( no)
  - The public API, i.e., is any changed class annotated with 
`@Public(Evolving)`: ( no)
  - The serializers: (no)
  - The runtime per-record code paths (performance sensitive): ( no )
  - Anything that affects deployment or recovery: JobManager (and its 
components), Checkpointing, Yarn/Mesos, ZooKeeper: (no)
  - The S3 file system connector: (no )

## Documentation

  - Does this pull request introduce a new feature? (no)
  - If yes, how is the feature documented? (not applicable / docs / 
JavaDocs / not documented)


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/Matrix42/flink DataStreamSource

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/5180.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #5180


commit d5c55d2a73fdbcc0dfadd05ad87203ca70d64eac
Author: Matrix42 <934336...@qq.com>
Date:   2017-12-19T08:20:39Z

[FLINK-8292] Remove unnecessary force cast in DataStreamSource




---


[jira] [Created] (FLINK-8292) Remove unnecessary force cast in DataStreamSource

2017-12-19 Thread Matrix42 (JIRA)
Matrix42 created FLINK-8292:
---

 Summary: Remove unnecessary force cast in DataStreamSource
 Key: FLINK-8292
 URL: https://issues.apache.org/jira/browse/FLINK-8292
 Project: Flink
  Issue Type: Improvement
  Components: DataStream API
Reporter: Matrix42
Priority: Trivial


In DataStreamSource there is a cast that can be replaced by returning `this`.
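
A hypothetical illustration of the pattern (the issue does not name the exact method): when a fluent setter is inherited, overriding it with a covariant return type lets the subclass return `this` instead of down-casting the superclass result.

{code:java}
// Hypothetical names; this only illustrates the cast-vs-`this` pattern from the description.
class FluentBase {
    FluentBase setParallelism(int p) { return this; }
}

class FluentSource extends FluentBase {
    @Override
    FluentSource setParallelism(int p) { // covariant return type
        super.setParallelism(p);
        return this;                     // instead of: return (FluentSource) super.setParallelism(p);
    }
}
{code}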



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[jira] [Updated] (FLINK-8292) Remove unnecessary force cast in DataStreamSource

2017-12-19 Thread Matrix42 (JIRA)

 [ 
https://issues.apache.org/jira/browse/FLINK-8292?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matrix42 updated FLINK-8292:

Fix Version/s: 1.4.1
   1.5.0

> Remove unnecessary force cast in DataStreamSource
> -
>
> Key: FLINK-8292
> URL: https://issues.apache.org/jira/browse/FLINK-8292
> Project: Flink
>  Issue Type: Improvement
>  Components: DataStream API
>    Reporter: Matrix42
>Priority: Trivial
> Fix For: 1.5.0, 1.4.1
>
>
> In DataStreamSource there is a cast that can be replaced by returning `this`.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)


[GitHub] flink pull request #5164: [hotfix][javadoc] fix typo in StreamExecutionEnvir...

2017-12-13 Thread Matrix42
GitHub user Matrix42 opened a pull request:

https://github.com/apache/flink/pull/5164

[hotfix][javadoc] fix typo in StreamExecutionEnvironment javadoc

## What is the purpose of the change

fix typo in StreamExecutionEnvironment javadoc


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/Matrix42/flink master

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/5164.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #5164


commit ad4e99607d7f655a3a5a8dff94cef82606c3bcaf
Author: Matrix42 <934336...@qq.com>
Date:   2017-12-13T15:59:07Z

[hotfix][javadoc] fix typo in StreamExecutionEnvironment javadoc




---


[GitHub] flink pull request #5152: [hotfix][javadoc]fix spelling mistake in StreamEle...

2017-12-12 Thread Matrix42
GitHub user Matrix42 opened a pull request:

https://github.com/apache/flink/pull/5152

[hotfix][javadoc]fix spelling mistake in StreamElement javadoc

## What is the purpose of the change

fix spelling mistake in StreamElement javadoc

## Documentation

  - Does this pull request introduce a new feature? (no)
  - If yes, how is the feature documented? (JavaDocs )


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/Matrix42/flink master

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/5152.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #5152


commit e921d05fd9cfeb053cced067969e27f87da03225
Author: Matrix42 <934336...@qq.com>
Date:   2017-12-12T10:10:15Z

[hotfix][javadoc]fix spelling mistake in StreamElement javadoc




---


[GitHub] flink pull request #5151: [hotfix][javadoc]fix spelling mistake in StreamEle...

2017-12-12 Thread Matrix42
Github user Matrix42 closed the pull request at:

https://github.com/apache/flink/pull/5151


---


[GitHub] flink pull request #5151: [hotfix][javadoc]fix spelling mistake in StreamEle...

2017-12-12 Thread Matrix42
GitHub user Matrix42 opened a pull request:

https://github.com/apache/flink/pull/5151

[hotfix][javadoc]fix spelling mistake in StreamElement javadoc

## What is the purpose of the change

fix spelling mistake in StreamElement javadoc

## Documentation

  - Does this pull request introduce a new feature? (no)
  - If yes, how is the feature documented? ( JavaDocs )


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/Matrix42/flink master

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/flink/pull/5151.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #5151


commit 75629f834eb66145aa0c9efe28bc5c603c76d530
Author: Matrix42 <934336...@qq.com>
Date:   2017-11-27T09:34:01Z

fix document

commit 5dd1678006ce4a9b06f393404062c9bfa7ff6284
Author: Matrix42 <934336...@qq.com>
Date:   2017-11-28T01:55:06Z

Merge branch 'master' of github.com:apache/flink

commit c47e0c6cca9d31f4642fcbf92da4b1877216d428
Author: Matrix42 <934336...@qq.com>
Date:   2017-11-29T03:33:07Z

Merge branch 'master' of github.com:apache/flink

commit 02ba3878195d213ec7ffdf70845e3776e05e320d
Author: Matrix42 <934336...@qq.com>
Date:   2017-12-11T07:44:36Z

Merge branch 'master' of github.com:apache/flink

commit e3d070e2dad84b7eb6dd8d8cbe8d4f04ae26a8a2
Author: Matrix42 <934336...@qq.com>
Date:   2017-12-12T10:10:15Z

[hotfix][javadoc]fix spelling mistake in StreamElement javadoc




---


[GitHub] flink pull request #5077: [docs] fix wrong package name

2017-11-27 Thread Matrix42
GitHub user Matrix42 opened a pull request:

https://github.com/apache/flink/pull/5077

[docs] fix wrong package name



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/Matrix42/flink master

Alternatively you can review and apply these changes as the patch at:

https://github.co