Failed when starting Spark 1.5.0 standalone cluster

2015-09-09 Thread Netwaver
Hi Spark experts, I am trying to migrate my Spark cluster from 1.4.1 to the latest 1.5.0, but I meet the below issue when running the start-all.sh script. Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/launcher/Main Caused by:

Re:Re: Failed when starting Spark 1.5.0 standalone cluster

2015-09-09 Thread Netwaver
t; <yuzhih...@gmail.com> wrote: See the following announcement: http://search-hadoop.com/m/q3RTtojAyW1dabFk On Wed, Sep 9, 2015 at 9:05 PM, Netwaver <wanglong_...@163.com> wrote: Hi Spark experts, I am trying to migrate my Spark cluster from 1.4.1 to the latest

Re:Re: Possible issue for Spark SQL/DataFrame

2015-08-12 Thread Netwaver
On Mon, Aug 10, 2015 at 12:06 PM, Netwaver wanglong_...@163.com wrote: Hi Spark experts, I am now using Spark 1.4.1 and trying the Spark SQL/DataFrame API with a text file in the below format: id gender height 1 M 180 2

Possible issue for Spark SQL/DataFrame

2015-08-10 Thread Netwaver
Hi Spark experts, I am now using Spark 1.4.1 and trying the Spark SQL/DataFrame API with a text file in the below format: id gender height 1 M 180 2 F 167 ... ... But I meet
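The thread body is truncated, but the described input is a whitespace-delimited file with an `id gender height` header row. As a hedged illustration of the parsing step Spark would have to perform on such a file (plain Python rather than Spark, so it runs standalone; the sample rows are the ones visible in the snippet):

```python
# Parse a whitespace-delimited "id gender height" file, as described in the
# thread, into typed records. This is a plain-Python stand-in for the
# RDD-to-DataFrame conversion step; the sample rows come from the snippet.
raw = """id gender height
1 M 180
2 F 167"""

lines = raw.splitlines()
header = lines[0].split()            # column names from the first line
rows = []
for line in lines[1:]:
    id_, gender, height = line.split()
    # id and height are numeric columns; gender stays a string
    rows.append({"id": int(id_), "gender": gender, "height": int(height)})

print(header)
print(rows)
```

Note the header line must be skipped (or filtered out) before the numeric conversion, or `int("id")` raises a `ValueError` — a common pitfall when loading headered text files as raw RDDs.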

How can I know currently supported functions in Spark SQL

2015-08-06 Thread Netwaver
Hi All, I am using Spark 1.4.1, and I want to know how I can find the complete list of functions supported in Spark SQL; currently I only know 'sum', 'count', 'min', and 'max'. Thanks a lot.

Re:Re: How can I know currently supported functions in Spark SQL

2015-08-06 Thread Netwaver
Thanks for your kind help. At 2015-08-06 19:28:10, Todd Nist tsind...@gmail.com wrote: They are covered here in the docs: http://spark.apache.org/docs/1.4.1/api/scala/index.html#org.apache.spark.sql.functions$ On Thu, Aug 6, 2015 at 5:52 AM, Netwaver wanglong_...@163.com wrote: Hi
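The question names only four aggregates, and the answer points at the full `org.apache.spark.sql.functions` scaladoc for the rest. As a hedged plain-Python sketch (not Spark) of what those four named aggregates compute, applied to the `height` values from the earlier thread's sample data:

```python
# Plain-Python equivalents of the four aggregate functions named in the
# thread (sum, count, min, max), applied to the sample height column.
heights = [180, 167]

agg = {
    "sum": sum(heights),     # total of the column
    "count": len(heights),   # number of rows
    "min": min(heights),     # smallest value
    "max": max(heights),     # largest value
}
print(agg)
```

In Spark SQL these would be expressed against a DataFrame column rather than a Python list, and the linked scaladoc enumerates the many additional functions (string, date, math, window) beyond these four.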

spark streaming program failed on Spark 1.4.1

2015-08-03 Thread Netwaver
Hi All, I have a Spark Streaming + Kafka program written in Scala. It works well on Spark 1.3.1, but after I migrated my Spark cluster to 1.4.1 and reran this program, I met the below exception: ERROR scheduler.ReceiverTracker: Deregistered receiver for stream 0: Error starting