I think you'd have to say more about "stopped working". Is the GC
thrashing? Does the UI respond? Is the CPU busy or not?
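
For what it's worth, one quick way to answer the GC question yourself: in
local mode the driver and the executor threads share a single JVM, so you can
poll the standard GC MXBeans from a side thread while the job runs. A minimal,
Spark-agnostic sketch (the helper name is just for illustration):

import java.lang.management.ManagementFactory
import scala.collection.JavaConverters._

// Diagnostic-only sketch: summarise cumulative GC activity of the current JVM.
// In local mode this JVM runs both the driver and the "executor" threads.
def gcSummary(): String =
  ManagementFactory.getGarbageCollectorMXBeans.asScala
    .map(b => s"${b.getName}: ${b.getCollectionCount} collections, ${b.getCollectionTime} ms total")
    .mkString("; ")

// Print this every few seconds from a background thread while KMeans runs;
// if collection time climbs almost as fast as wall-clock time, the GC is thrashing.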

On Mon, Mar 16, 2015 at 4:25 AM, Xi Shen <davidshe...@gmail.com> wrote:
> Hi,
>
> I am running k-means using Spark in local mode. My data set is about 30k
> records, and I set k = 1000.
>
> The algorithm started and finished 13 jobs according to the UI monitor, then
> it stopped working.
>
> The last log I saw was:
>
> [Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned
> broadcast 16
>
> There are many similar log lines, but it always seems to stop at the 16th
> broadcast.
>
> If I lower the k value, the algorithm terminates. So I just want to know
> what's wrong with k = 1000.
>
>
> Thanks,
> David
>
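
For reference, I'm assuming the setup above looks roughly like the sketch
below (MLlib KMeans in local mode; the input path, parsing, and iteration
count are placeholders I made up). With k = 1000, every iteration ships 1000
centers to the tasks and computes on the order of 30k x 1000 distances, so
memory pressure in the single local JVM grows quickly with k:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.clustering.KMeans
import org.apache.spark.mllib.linalg.Vectors

object KMeansRepro {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("kmeans-k1000").setMaster("local[4]")
    val sc = new SparkContext(conf)

    // Placeholder input: one whitespace-separated numeric vector per line.
    val points = sc.textFile("data/points.txt")
      .map(line => Vectors.dense(line.split("\\s+").map(_.toDouble)))
      .cache()

    // k = 1000 as reported above; 20 iterations is an assumed value.
    val model = KMeans.train(points, 1000, 20)
    println(s"cost = ${model.computeCost(points)}")

    sc.stop()
  }
}

That is only a guess at the shape of the program; if yours differs (for
example, different dimensionality or the ML pipeline API instead of MLlib),
the details change, so sharing your actual call would help.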

