Yes, I use version 1.6. Thanks, Ted.
> Begin forwarded message:
>
> From: Robin East
> Subject: Re: spark graphx storage RDD memory leak
> Date: April 12, 2016 at 2:13:10 AM GMT+8
> To: zhang juntao
> Cc: Ted Yu , dev@spark.apache.org
>
> this looks like htt
> vprog = (id, attr, msg) => math.min(attr, msg),
> sendMsg = sendMessage,
> mergeMsg = (a, b) => math.min(a, b))
> } // end of connectedComponents
> }
> thanks
> juntao
>
>
>> Begin forwarded message:
>>
>> From: Ted Yu <yuzhih...@gmail.com>
>> Subject: Re: spark graphx storage RDD memory leak
vprog = (id, attr, msg) => math.min(attr, msg),
sendMsg = sendMessage,
mergeMsg = (a, b) => math.min(a, b))
} // end of connectedComponents
}
thanks
juntao
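For reference, the truncated snippet above appears to be the tail of GraphX's `ConnectedComponents.run`. A sketch of the full method, reconstructed from the Spark 1.6 source for context (treat the details as approximate, not verbatim from this thread):

```scala
import scala.reflect.ClassTag
import org.apache.spark.graphx._

// Sketch of GraphX 1.6 ConnectedComponents.run, reconstructed for context.
def run[VD: ClassTag, ED: ClassTag](graph: Graph[VD, ED]): Graph[VertexId, ED] = {
  // label every vertex with its own id
  val ccGraph = graph.mapVertices((vid, _) => vid)
  // propagate the smaller component id across each edge
  def sendMessage(edge: EdgeTriplet[VertexId, ED]): Iterator[(VertexId, VertexId)] = {
    if (edge.srcAttr < edge.dstAttr) Iterator((edge.dstId, edge.srcAttr))
    else if (edge.srcAttr > edge.dstAttr) Iterator((edge.srcId, edge.dstAttr))
    else Iterator.empty
  }
  val initialMessage = Long.MaxValue
  Pregel(ccGraph, initialMessage, activeDirection = EdgeDirection.Either)(
    vprog = (id, attr, msg) => math.min(attr, msg),
    sendMsg = sendMessage,
    mergeMsg = (a, b) => math.min(a, b))
} // end of connectedComponents
```

Each Pregel iteration inside this call caches a new vertex and edge RDD, which is where the storage juntao observes comes from.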
> Begin forwarded message:
>
> From: Ted Yu
> Subject: Re: spark graphx storage RDD memory leak
> Date: April 11, 2016 at 1:15:23 AM GMT
I see the following code toward the end of the method:
// Unpersist the RDDs hidden by newly-materialized RDDs
oldMessages.unpersist(blocking = false)
prevG.unpersistVertices(blocking = false)
prevG.edges.unpersist(blocking = false)
Wouldn't the above achieve the same effect?
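The unpersist calls Ted quotes sit at the end of each Pregel iteration. A simplified sketch of that loop shape (assumed structure, not the verbatim Pregel source): the new graph and messages are cached and materialized first, and only then are the previous iteration's RDDs dropped.

```scala
import org.apache.spark.graphx._

// Simplified shape of the GraphX Pregel loop the quoted unpersist calls
// come from; a sketch under assumptions, not verbatim Spark source.
def pregelLoopSketch[VD, ED, A](g0: Graph[VD, ED])(
    step: Graph[VD, ED] => (Graph[VD, ED], VertexRDD[A])): Graph[VD, ED] = {
  var g = g0.cache()
  var (nextG, messages) = step(g)
  var activeMessages = messages.cache().count()
  while (activeMessages > 0) {
    val prevG = g
    val oldMessages = messages
    g = nextG.cache()
    val r = step(g)
    nextG = r._1
    messages = r._2
    // materialize the new message RDD before dropping what it was built from
    activeMessages = messages.cache().count()
    // the lines Ted quotes: unpersist the RDDs hidden by newly-materialized RDDs
    oldMessages.unpersist(blocking = false)
    prevG.unpersistVertices(blocking = false)
    prevG.edges.unpersist(blocking = false)
  }
  g
}
```

Note that only the *previous* iteration's RDDs are dropped; the final iteration's graph stays cached, which is consistent with some storage remaining after the job.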
Hi experts,
I'm reporting a problem with Spark GraphX. I use Zeppelin to submit Spark jobs; note that the Scala environment shares the same SparkContext and SQLContext instance. I call the connected components algorithm for some business logic, and found that every time a job finished, some graph storage RDDs were still cached and never released.
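One way to inspect the leftover storage juntao describes, and to clear it between jobs on a shared Zeppelin context, is `SparkContext.getPersistentRDDs`. A sketch (the helper name is mine, not from the thread):

```scala
import org.apache.spark.SparkContext

// List everything still marked as cached on the shared SparkContext and
// drop it. Useful between jobs when a long-lived Zeppelin context keeps
// accumulating graph storage. Helper name and usage are assumptions.
def unpersistLeftovers(sc: SparkContext): Unit = {
  sc.getPersistentRDDs.foreach { case (id, rdd) =>
    println(s"unpersisting RDD $id (${rdd.name})")
    rdd.unpersist(blocking = false)
  }
}
```

Calling this after the connected components job finishes should show whether the remaining storage is the final cached graph or genuinely leaked intermediate RDDs.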