Re: How to get applicationId for yarn mode (both yarn-client and yarn-cluster mode)

2015-05-13 Thread thanhtien522
Earthson wrote: Finally, I've found two ways: 1. search the output for a line like "Submitted application application_1416319392519_0115"; 2. use a specific AppName, so we can query the ApplicationID (YARN). -- Hi Earthson, can you explain more about case 2? How can we query the ApplicationID by the AppName?
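One way case 2 can work, though the thread does not spell it out, is Hadoop's YarnClient API, which lists applications so you can match on the AppName you set when submitting. A minimal Java sketch, with "MySparkJob" as a hypothetical AppName:

    import java.util.EnumSet;
    import java.util.List;

    import org.apache.hadoop.yarn.api.records.ApplicationReport;
    import org.apache.hadoop.yarn.api.records.YarnApplicationState;
    import org.apache.hadoop.yarn.client.api.YarnClient;
    import org.apache.hadoop.yarn.conf.YarnConfiguration;

    public class FindAppByName {
        public static void main(String[] args) throws Exception {
            // Talks to the ResourceManager configured in yarn-site.xml.
            YarnClient yarn = YarnClient.createYarnClient();
            yarn.init(new YarnConfiguration());
            yarn.start();

            // List running applications and match on the AppName we chose.
            List<ApplicationReport> apps =
                yarn.getApplications(EnumSet.of(YarnApplicationState.RUNNING));
            for (ApplicationReport app : apps) {
                if ("MySparkJob".equals(app.getName())) {  // hypothetical AppName
                    System.out.println("ApplicationId: " + app.getApplicationId());
                }
            }
            yarn.stop();
        }
    }

The same lookup is available from the shell with yarn application -list, grepping for the AppName.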

Re: Spark on YARN: java.lang.ClassCastException SerializedLambda to org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1

2015-01-22 Thread thanhtien522
Update: I deployed a standalone Spark on localhost, set the master to spark://localhost:7077, and hit the same issue. I don't know how to solve it.

Spark on YARN: java.lang.ClassCastException SerializedLambda to org.apache.spark.api.java.function.Function in instance of org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1

2015-01-20 Thread thanhtien522
Hi, I'm trying to run Spark on a YARN cluster by setting the master to yarn-client in Java code. It works fine with a count task but fails with other commands, throwing a ClassCastException: java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field
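This exception usually means the jar containing the lambda-bearing classes never reached the executors, so the SerializedLambda cannot be resolved against its target class on the worker side; count() succeeds because it ships no user lambda. A minimal sketch of shipping the jar explicitly via SparkConf.setJars, with the jar path, input path, and app name as placeholders:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class YarnClientExample {
        public static void main(String[] args) {
            // Ship the application jar to the executors so lambdas
            // (serialized as java.lang.invoke.SerializedLambda) can be
            // deserialized against the same classes on the worker side.
            SparkConf conf = new SparkConf()
                .setAppName("MySparkJob")                        // hypothetical name
                .setMaster("yarn-client")
                .setJars(new String[] {"/path/to/my-app.jar"});  // placeholder path

            JavaSparkContext sc = new JavaSparkContext(conf);
            JavaRDD<String> lines = sc.textFile("hdfs:///tmp/input.txt");  // placeholder
            // map() and filter() need the lambda's class on the executors;
            // a bare count() on the input does not.
            long n = lines.map(s -> s.toLowerCase())
                          .filter(s -> s.contains("spark"))
                          .count();
            System.out.println(n);
            sc.stop();
        }
    }

The same effect comes from passing the jar to spark-submit, which puts it on the executor classpath automatically.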

Re: No disk single pass RDD aggregation

2014-12-17 Thread thanhtien522
Jim Carroll wrote: Okay, I have an RDD that I want to run an aggregate over, but it insists on spilling to disk even though I structured the processing to only require a single pass. In other words, I can do all of my processing one entry in the RDD at a time without persisting anything.
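For reference, the single-pass shape Jim describes maps onto JavaRDD.aggregate, which folds each partition one element at a time and then merges the per-partition results; the call itself caches or persists nothing. A minimal Java sketch with a placeholder input path:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class SinglePassAggregate {
        public static void main(String[] args) {
            JavaSparkContext sc = new JavaSparkContext(
                new SparkConf().setAppName("SinglePassAggregate"));

            JavaRDD<String> lines = sc.textFile("hdfs:///tmp/input.txt");  // placeholder

            // One pass over each partition: seqOp folds each element into a
            // running value, combOp merges the per-partition values.
            long totalChars = lines.aggregate(
                0L,
                (acc, line) -> acc + line.length(),  // seqOp: one element at a time
                (a, b) -> a + b);                    // combOp: merge partition results

            System.out.println("total characters: " + totalChars);
            sc.stop();
        }
    }

Whether Spark still spills around such an aggregation is governed separately by shuffle settings (spark.shuffle.spill in the 1.x-era releases this thread is about), which is the behavior Jim is actually asking about.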

Re: Submitting Spark job on Unix cluster from dev environment (Windows)

2014-11-09 Thread thanhtien522
Yeah, it works. I turned off the firewall on my Windows machine and it worked. Thanks so much.
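The thread stops here, but the underlying cause is that executors on the Unix cluster open connections back to the driver running on the Windows machine. Rather than disabling the firewall entirely, one can pin the driver's host and port and open only those. A minimal sketch, where the master URL, host IP, and port are placeholders:

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class DriverBehindFirewall {
        public static void main(String[] args) {
            // Pin the address the executors use to reach the driver, so a
            // single inbound firewall rule suffices.
            SparkConf conf = new SparkConf()
                .setAppName("SubmitFromWindows")          // hypothetical name
                .setMaster("spark://unix-master:7077")    // placeholder master URL
                .set("spark.driver.host", "10.0.0.5")     // placeholder: Windows host IP
                .set("spark.driver.port", "51000");       // fixed port to allow inbound

            JavaSparkContext sc = new JavaSparkContext(conf);
            // ... job code ...
            sc.stop();
        }
    }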