[ https://issues.apache.org/jira/browse/FLINK-9698?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16529406#comment-16529406 ]

Jeff Zhang commented on FLINK-9698:
-----------------------------------

Could anyone let me know why Flink requires that case classes be static and 
globally accessible? Similar code works in Spark, so I believe it should work 
in Flink as well, since both systems need to serialize these classes, ship 
them to remote hosts, and execute them there.
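One reason the restriction exists can be reproduced outside Flink with plain Java serialization: a case class declared inside a method silently captures any enclosing values its body touches, and those captures are serialized along with each instance. A minimal sketch (the object and method names here are mine, not from Flink or the issue):

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

object SerDemo {
  // Top-level ("static") case class: no hidden outer references.
  case class TopLevel(id: Int, name: String)

  // Something that cannot be Java-serialized.
  class NotSerializableContext

  def roundTrips(obj: AnyRef): Boolean =
    try {
      new ObjectOutputStream(new ByteArrayOutputStream()).writeObject(obj)
      true
    } catch { case _: NotSerializableException => false }

  // A case class declared inside a method captures enclosing values it uses;
  // here that capture drags the non-serializable context into every instance.
  def localCaseClassRoundTrips(): Boolean = {
    val context = new NotSerializableContext
    case class Local(id: Int) { def ctx = context }
    roundTrips(Local(1))
  }

  def main(args: Array[String]): Unit = {
    println(roundTrips(TopLevel(1, "jeff")))   // true
    println(localCaseClassRoundTrips())        // false: captured context fails
  }
}
```

A top-level case class has no such hidden constructor fields, so a remote JVM can deserialize it knowing only the class itself, which is presumably why Flink insists on static, globally accessible classes.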

> "case class must be static and globally accessible" is too constrained
> ----------------------------------------------------------------------
>
>                 Key: FLINK-9698
>                 URL: https://issues.apache.org/jira/browse/FLINK-9698
>             Project: Flink
>          Issue Type: Improvement
>            Reporter: Jeff Zhang
>            Priority: Major
>
> The following code can reproduce this issue. 
> {code}
> import org.apache.flink.api.scala._
> import org.apache.flink.table.api.TableEnvironment
>
> object BatchJob {
>   def main(args: Array[String]) {
>     // set up the batch execution environment
>     val env = ExecutionEnvironment.getExecutionEnvironment
>     val tenv = TableEnvironment.getTableEnvironment(env)
>     // case class declared inside main: triggers the error below
>     case class Person(id: Int, name: String)
>     val ds = env.fromElements(Person(1, "jeff"), Person(2, "andy"))
>     tenv.registerDataSet("table_1", ds)
>   }
> }
> {code}
> Although I have a workaround, declaring the case class outside of the main 
> method, this workaround won't work in scala-shell. 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
