Re: Error while reading from Hadoop sequence file

2018-12-11 Thread Akshay Mendole
…fo, or else what exactly is the exception there.

Best,
Stefan

> On 10. Dec 2018, at 13:35, Akshay Mendole wrote:
>
> Hi,
> I have been facing issues while trying to read from an HDFS sequence file.
>
> This is my code snippet
>
> D…

Re: Error while reading from Hadoop sequence file

2018-12-10 Thread Akshay Mendole
Could anyone please help me with this?

Thanks,
Akshay

On Mon, 10 Dec 2018, 6:05 pm Akshay Mendole wrote:
> Hi,
> I have been facing issues while trying to read from an HDFS sequence file.
>
> This is my code snippet
>
> DataSource<Tuple2<Text, Text>> input = env
>     .createInput(Hadoo…

Error while reading from Hadoop sequence file

2018-12-10 Thread Akshay Mendole
Hi,
I have been facing issues while trying to read from an HDFS sequence file.

This is my code snippet:

    DataSource<Tuple2<Text, Text>> input = env
            .createInput(HadoopInputs.readSequenceFile(Text.class, Text.class, ravenDataDir),
                    TypeInformation.of(new TypeHint<Tuple2<Text, Text>>() {}));

Upon executing this…
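[Editor's note: for readers landing on this thread, here is a self-contained sketch of what the snippet above appears to be doing, reconstructed from the visible calls. The class name, input path, and the trailing print are placeholders, and it assumes the flink-hadoop-compatibility dependency is on the classpath.]

    import org.apache.flink.api.common.typeinfo.TypeHint;
    import org.apache.flink.api.common.typeinfo.TypeInformation;
    import org.apache.flink.api.java.ExecutionEnvironment;
    import org.apache.flink.api.java.operators.DataSource;
    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.hadoopcompatibility.HadoopInputs;
    import org.apache.hadoop.io.Text;

    public class ReadSequenceFileJob {
        public static void main(String[] args) throws Exception {
            ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

            // Placeholder for the "ravenDataDir" path used in the original snippet.
            String inputPath = "hdfs:///path/to/sequence/files";

            // readSequenceFile wraps Hadoop's SequenceFileInputFormat; the key and
            // value classes must match what the file was written with.
            DataSource<Tuple2<Text, Text>> input = env.createInput(
                    HadoopInputs.readSequenceFile(Text.class, Text.class, inputPath),
                    TypeInformation.of(new TypeHint<Tuple2<Text, Text>>() {}));

            input.first(10).print();
        }
    }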

Re: OutOfMemoryError while doing join operation in Flink

2018-11-27 Thread Akshay Mendole
…cover this overhead memory, or set one slot for each task manager.

Best,
Zhijiang

------------------------------------------------------------------
From: Akshay Mendole
Sent: Friday, 23 November 2018, 02:54
To: trohrmann
Cc: zhijiang; user; Shreesha M…
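[Editor's note: for anyone hitting the same limits, a rough sketch of the flink-conf.yaml keys this advice maps to in the Flink 1.x releases of that era; the values are illustrative, not recommendations.]

    # flink-conf.yaml (illustrative values)
    taskmanager.heap.size: 8192m       # total heap per TaskManager; raise it to give joins more room
    taskmanager.memory.fraction: 0.7   # share of the heap given to Flink's managed memory (sorting, hashing)
    taskmanager.numberOfTaskSlots: 1   # one slot per TM, so a single task gets the whole budget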

Re: OutOfMemoryError while doing join operation in Flink

2018-11-22 Thread Akshay Mendole
…our join result is quite large. One record is 1 GB large. Try to decrease it or give more memory to your TMs.

Cheers,
Till

On Thu, Nov 22, 2018 at 1:08 PM Akshay Mendole wrote:
> Hi Zhijiang,
> Thanks for the quick reply. My…
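[Editor's note: one way to act on this inside the job itself is to pick the join strategy explicitly rather than letting the optimizer materialize a side that does not fit in memory. A minimal DataSet API fragment, assuming two Tuple2<String, String> inputs named r1 and r2 — the names and the merge logic are placeholders, not from the thread.]

    import org.apache.flink.api.common.operators.base.JoinOperatorBase.JoinHint;
    import org.apache.flink.api.common.typeinfo.Types;
    import org.apache.flink.api.java.tuple.Tuple2;

    // REPARTITION_SORT_MERGE repartitions and sorts both sides (spilling to disk
    // as needed) instead of building an in-memory hash table of one side.
    r1.join(r2, JoinHint.REPARTITION_SORT_MERGE)
      .where(0)
      .equalTo(0)
      .with((left, right) -> Tuple2.of(left.f0, left.f1 + "|" + right.f1))
      .returns(Types.TUPLE(Types.STRING, Types.STRING));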

OutOfMemoryError while doing join operation in Flink

2018-11-21 Thread Akshay Mendole
Hi,
We are converting one of our Pig pipelines to Flink using Apache Beam. The Pig pipeline reads two different data sets (R1 & R2) from HDFS, enriches them, joins them, and dumps the result back to HDFS. The data set R1 is skewed: it has a few keys with a lot of records. When we converted the…
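[Editor's note: since the question centers on key skew, one common mitigation — not something proposed in the thread itself — is to salt the skewed side's keys and replicate the other side, so a hot key spreads across several parallel join tasks. A sketch against the DataSet API, assuming both inputs are Tuple2<String, String>; the names r1, r2, and the SALT constant are placeholders.]

    import java.util.concurrent.ThreadLocalRandom;
    import org.apache.flink.api.common.typeinfo.Types;
    import org.apache.flink.api.java.DataSet;
    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.api.java.tuple.Tuple3;
    import org.apache.flink.util.Collector;

    final int SALT = 16;  // number of shards per hot key

    // Skewed side R1: tag every record with a random salt in [0, SALT).
    DataSet<Tuple3<String, Integer, String>> saltedR1 = r1
            .map(t -> Tuple3.of(t.f0, ThreadLocalRandom.current().nextInt(SALT), t.f1))
            .returns(Types.TUPLE(Types.STRING, Types.INT, Types.STRING));

    // Other side R2: replicate each record once per salt value so every
    // shard of a hot key still finds its matching partner.
    DataSet<Tuple3<String, Integer, String>> expandedR2 = r2
            .flatMap((Tuple2<String, String> t, Collector<Tuple3<String, Integer, String>> out) -> {
                for (int s = 0; s < SALT; s++) {
                    out.collect(Tuple3.of(t.f0, s, t.f1));
                }
            })
            .returns(Types.TUPLE(Types.STRING, Types.INT, Types.STRING));

    // Join on (key, salt); the hot key's records are now spread over SALT tasks.
    saltedR1.join(expandedR2)
            .where(0, 1)
            .equalTo(0, 1)
            .with((a, b) -> Tuple3.of(a.f0, a.f1, a.f2 + "|" + b.f2))
            .returns(Types.TUPLE(Types.STRING, Types.INT, Types.STRING));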