Re: flink dataStream operate dataSet

2016-09-05 Thread Aljoscha Krettek
Hi,
right now it is not possible to mix the DataSet and the DataStream APIs. The
reason for the "task not serializable" error is that referencing the DataSet
inside the map function forces Flink to serialize the DataSet, which is not
possible.
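The reply does not spell out a workaround, but a common pattern is to materialize the side data into a plain local collection once per worker (in Flink, typically inside a RichMapFunction's open() method) so that no DataSet is captured in the closure. Below is a minimal, Flink-free Scala sketch of just that pattern; `loadSideData` and `computeScore` are hypothetical stand-ins for the poster's `HBaseWrite.fullScan()` and `computerScore`, and the toy scoring logic (shared-prefix length) is invented for illustration:

```scala
// Sketch of the "materialized side data" pattern, with no Flink dependency.
// The side data is loaded once into a local Seq; each streamed word is then
// scored against every row of that Seq.
object SideDataPattern {
  // Stand-in for HBaseWrite.fullScan(): the table rows as a local collection.
  def loadSideData(): Seq[(Int, String)] = Seq((1, "foo"), (2, "bar"))

  // Stand-in for computerScore(word, row): here, length of the shared prefix.
  def computeScore(word: String, row: (Int, String)): Int =
    word.zip(row._2).takeWhile { case (a, b) => a == b }.length

  // Score a single word against all side-data rows, keeping the best score.
  def scoreWord(sideData: Seq[(Int, String)])(word: String): (String, Int) =
    (word, sideData.map(row => computeScore(word, row)).max)
}
```

In a real Flink job, the loading step would run on each task manager inside `open()`, so the side data never travels as part of a serialized closure; this only works when the table fits comfortably in each worker's memory.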

Cheers,
Aljoscha

On Tue, 30 Aug 2016 at 16:31  wrote:

> Hi,
> I have a problem: one stream of data is read from RabbitMQ as a DataStream,
> and other data comes from an HBase table as a DataSet. The two look like this:
>
>  val words = connectHelper.readFromRabbitMq(...)  // words is a DataStream[String]
>  val dataSet = HBaseWrite.fullScan()              // dataSet is a DataSet[(Int, String)]
>
>  words.map { word =>
>    val res = dataSet.map { y =>
>      val score = computerScore(word, y)
>      (word, score)
>    }
>    HBaseWrite.writeToTable(res, ...)
>  }
>
> The error is "task not serializable"; what is the solution?
> How can a DataSet be operated on from within a DataStream?


flink dataStream operate dataSet

2016-08-30 Thread rimin515