Exception in using updateStateByKey

2015-04-27 Thread Sea
Hi, all: I use the function updateStateByKey in Spark Streaming. I need to store the states for one minute, I set spark.cleaner.ttl to 120, and the batch duration is 2 seconds, but it throws an exception: Caused by: org.apache.hadoop.ipc.RemoteException(java.io.FileNotFoundException): File does not exist:
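For context, a minimal sketch of the kind of setup described above, in Scala against the Spark Streaming 1.x API. The app name, checkpoint path, input source, and state function are assumptions for illustration; only the 2-second batch duration and spark.cleaner.ttl=120 come from the message.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StateApp {
  def main(args: Array[String]): Unit = {
    // spark.cleaner.ttl = 120, as reported in the original post
    val conf = new SparkConf()
      .setAppName("updateStateByKey-demo")   // hypothetical app name
      .set("spark.cleaner.ttl", "120")

    // 2-second batch duration, as stated in the post
    val ssc = new StreamingContext(conf, Seconds(2))
    ssc.checkpoint("hdfs:///spark/ck")       // hypothetical checkpoint dir; updateStateByKey requires one

    // hypothetical input; the thread does not say what the real source is
    val lines = ssc.socketTextStream("localhost", 9999)
    val pairs = lines.map(w => (w, 1L))

    // running count per key; expiring state after one minute is left out of this sketch
    val state = pairs.updateStateByKey[Long] { (newValues: Seq[Long], old: Option[Long]) =>
      Some(old.getOrElse(0L) + newValues.sum)
    }
    state.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```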

Re: Exception in using updateStateByKey

2015-04-27 Thread Ted Yu
Which Hadoop release are you using? Can you check the HDFS audit log to see who deleted spark/ck/hdfsaudit/receivedData/0/log-1430139541443-1430139601443, and when? Cheers On Mon, Apr 27, 2015 at 6:21 AM, Sea 261810...@qq.com wrote: Hi, all: I use the function updateStateByKey in Spark Streaming, I
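As an aside (an observation on the file name above, not something stated in the thread): the two numbers in the missing write-ahead log file are epoch-millisecond timestamps, and they are exactly one minute apart, matching the one-minute state window from the first message.

```scala
// timestamps copied from the missing log file name above
val start = 1430139541443L
val end   = 1430139601443L
println(end - start)            // 60000 ms
println((end - start) / 1000)   // 60 seconds: the file covers one minute of received data
```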

Re: Exception in using updateStateByKey

2015-04-27 Thread Sea
I set it to 240; it happens again when 240 seconds are reached. -- From: 261810726 261810...@qq.com; Date: Mon, Apr 27, 2015 10:24; To: Ted Yu yuzhih...@gmail.com; Subject: Re: Exception in using updateStateByKey Yes, I can make

Re: Exception in using updateStateByKey

2015-04-27 Thread Sea
Subject: Re: Exception in using updateStateByKey Can you make the value for spark.cleaner.ttl larger? Cheers On Mon, Apr 27, 2015 at 7:13 AM, Sea 261810...@qq.com wrote: my hadoop version is 2.2.0, the hdfs-audit.log is too large. The problem is that, when the checkpoint info is deleted
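Ted Yu's suggestion amounts to raising spark.cleaner.ttl so that metadata and write-ahead-log data are not removed while the streaming state still needs them. A minimal sketch of setting it; the value 600 is an assumption for illustration, the thread itself only reports 120 and 240.

```scala
import org.apache.spark.SparkConf

// Raise spark.cleaner.ttl well beyond the time the checkpoint / received data
// must remain available (600 seconds here is an assumed example value).
val conf = new SparkConf()
  .setAppName("updateStateByKey-demo")   // hypothetical app name
  .set("spark.cleaner.ttl", "600")

// Equivalently, at submit time (shown as a comment):
//   spark-submit --conf spark.cleaner.ttl=600 ...
```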