
2016-06-14 Thread Kun Liu
-- Kun Liu, M.S. in Computer Science, New York University. Phone: (917) 864-1016

Re: Spark Assembly jar ?

2016-06-14 Thread Egor Pahomov
It's strange to me that having and supporting a fat jar was never an important thing. Our scenario is this: we have a big application where Spark is just another library for data processing. So we cannot create a small jar and feed it to the Spark scripts - we need to call Spark from the application. And

Re: Utilizing YARN AM RPC port field

2016-06-14 Thread Mingyu Kim
Thanks for the pointers, Steve! The first option sounds like the most lightweight and non-disruptive option among them. So, we can add a configuration that enables socket initialization; Spark AM will create a ServerSocket if the socket init is enabled and set it on SparkContext. If
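The "first option" above can be sketched roughly as follows. This is an illustrative sketch only: the configuration key `spark.yarn.am.socketInit.enabled` is a hypothetical name invented here, not a real Spark setting, and the real change would live inside the YARN ApplicationMaster code.

```scala
import java.net.ServerSocket

// Hypothetical sketch: the AM opens a plain ServerSocket on an ephemeral port
// (port 0) only when the feature is enabled; the chosen port is what would be
// reported in YARN's AM RPC port field instead of a dummy value.
val socketInitEnabled =
  sys.props.getOrElse("spark.yarn.am.socketInit.enabled", "false").toBoolean

val rpcPort: Int =
  if (socketInitEnabled) {
    val server = new ServerSocket(0) // bind to any free port
    server.getLocalPort              // this port would be registered with YARN
  } else {
    -1                               // current behavior: a dummy port
  }
```

Binding to port 0 and reading back `getLocalPort` avoids port conflicts on busy cluster nodes, which is presumably why this option is the least disruptive.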

Re: spark-ec2 scripts with spark-2.0.0-preview

2016-06-14 Thread Shivaram Venkataraman
Can you open an issue on https://github.com/amplab/spark-ec2 ? I think we should be able to escape the version string and pass the 2.0.0-preview through the scripts Shivaram On Tue, Jun 14, 2016 at 12:07 PM, Sunil Kumar wrote: > Hi, > > The spark-ec2 scripts are

Re: Custom receiver to connect MySQL database

2016-06-14 Thread dvlpr
I have tried it, but it gives me an error because something is missing in my code. -- View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/Custom-receiver-to-connect-MySQL-database-tp17895p17912.html Sent from the Apache Spark Developers List mailing list archive

Re: Spark Assembly jar ?

2016-06-14 Thread Reynold Xin
You just need to run normal packaging, and all the scripts are now set up to run without the assembly jars. On Tuesday, June 14, 2016, Franklyn D'souza wrote: > Just wondering where the spark-assembly jar has gone in 2.0. I've been > reading that it's been removed, but

Spark Assembly jar ?

2016-06-14 Thread Franklyn D'souza
Just wondering where the spark-assembly jar has gone in 2.0. I've been reading that it's been removed, but I'm not sure what the new workflow is.

Re: Custom receiver to connect MySQL database

2016-06-14 Thread Matthias Niehoff
You must add an output operation to your normal stream application that uses the receiver. Calling print() on the DStream will do the job. 2016-06-14 9:29 GMT+02:00 dvlpr : > Hi folks, > I have written some code for a custom receiver to get data from a MySQL db. > Below
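The fix Matthias describes can be sketched as below. This assumes the `CustomReceiver(url, username, password)` class from the original post; the connection arguments are placeholders. Without an output operation such as print(), Spark Streaming refuses to start and reports that no output operations are registered, which matches the error the poster is seeing.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Minimal driver sketch: receiverStream() wires the custom receiver in, and
// print() is the output operation that actually triggers execution.
val conf = new SparkConf().setAppName("custom-receiver-demo").setMaster("local[2]")
val ssc = new StreamingContext(conf, Seconds(5))

// url/username/password are placeholders for the poster's MySQL settings.
val stream = ssc.receiverStream(
  new CustomReceiver("jdbc:mysql://localhost/db", "user", "pass"))
stream.print() // without this, start() fails: no output operations registered

ssc.start()
ssc.awaitTermination()
```

Note that `local[2]` (or more) is needed when testing locally: one thread runs the receiver, and at least one more is needed to process the received data.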

Re: Databricks SparkPerf with Spark 2.0

2016-06-14 Thread Michael Armbrust
NoSuchMethodError always means that you are compiling against a different classpath than is available at runtime, so it sounds like you are on the right track. The project is not abandoned, we're just busy with the release. It would be great if you could open a pull request. On Tue, Jun 14,

Re: Return binary mode in ThriftServer

2016-06-14 Thread lalit sharma
+1 for bringing this back. Binary mode needs to be present for working with data visualization tools. --Regards, Lalit On Tue, Jun 14, 2016 at 7:07 PM, Raymond Honderdors < raymond.honderd...@sizmek.com> wrote: > I experienced something similar using Spark+MicroStrategy > > I reformatted the

RE: Return binary mode in ThriftServer

2016-06-14 Thread Raymond Honderdors
I experienced something similar using Spark+MicroStrategy. I reformatted the commands file locally and recompiled Spark 2.0, after which the issue was resolved, but I am not sure I made the change in the correct direction. 1. SPARK-14947 Raymond

Re: Return binary mode in ThriftServer

2016-06-14 Thread Chris Fregly
+1 on bringing it back. It was causing all sorts of problems on my end that were not obvious without digging in. I was having problems building Spark as well, with the --hive-thriftserver flag; I also thought I was doing something wrong on my end. > On Jun 13, 2016, at 9:11 PM, Reynold Xin

Re: [YARN] Small fix for yarn.Client to use buildPath (not Path.SEPARATOR)

2016-06-14 Thread Jacek Laskowski
Hi Steve and Sean, Didn't expect such a warm welcome from Sean and you! Since I'm with Spark on YARN these days, let me see what I can do to make it nicer. Thanks! I'm going to change Spark to use buildPath first. And then propose another patch to use Environment.CLASS_PATH_SEPARATOR instead.

Re: tpcds q1 - java.lang.NegativeArraySizeException

2016-06-14 Thread Ovidiu-Cristian MARCU
I confirm the same exception for other queries as well. I was able to reproduce it many times. Queries 1, 3 and 5 failed with the same exception. Queries 2 and 4 are running ok. I am using TPCDSQueryBenchmark and I have used the following settings:

Re: Databricks SparkPerf with Spark 2.0

2016-06-14 Thread Adam Roberts
Fixed the below problem, grepped for spark.version, noticed some instances of 1.5.2 being declared, changed to 2.0.0-preview in spark-tests/project/SparkTestsBuild.scala Next one to fix is: 16/06/14 12:52:44 INFO ContextCleaner: Cleaned shuffle 9 Exception in thread "main"
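The version bump Adam describes would look roughly like the following. This is a hypothetical excerpt, not the actual contents of spark-tests/project/SparkTestsBuild.scala; the module names shown are illustrative, and the point is simply that every hard-coded "1.5.2" is replaced with the 2.0.0-preview version string.

```scala
// Hypothetical excerpt from spark-tests/project/SparkTestsBuild.scala after the fix.
val sparkVersion = "2.0.0-preview" // was "1.5.2" in several places

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql"  % sparkVersion % "provided"
)
```

Centralizing the version in one `val` also avoids the grep-and-replace exercise next time the benchmark moves to a new Spark release.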

Re: Utilizing YARN AM RPC port field

2016-06-14 Thread Steve Loughran
On 14 Jun 2016, at 01:30, Mingyu Kim > wrote: Hi all, YARN provides a way for an ApplicationMaster to register an RPC port so that a client outside the YARN cluster can reach the application for any RPCs, but Spark's YARN AMs simply register a dummy

Re: [YARN] Small fix for yarn.Client to use buildPath (not Path.SEPARATOR)

2016-06-14 Thread Steve Loughran
if you want to be able to build up CPs on Windows to run on a Linux cluster, or vice versa, you really need to be using the Environment.CLASS_PATH_SEPARATOR field, "<CPS>". This is expanded in the cluster, not in the client. Although tagged as @Public, @Unstable, it's been in there since YARN-1824 &

Databricks SparkPerf with Spark 2.0

2016-06-14 Thread Adam Roberts
Hi, I'm working on having "SparkPerf" ( https://github.com/databricks/spark-perf) run with Spark 2.0. I noticed a few pull requests not yet accepted, so I'm concerned this project has been abandoned - it's proven very useful in the past for quality assurance, as we can easily exercise lots of Spark

Custom receiver to connect MySQL database

2016-06-14 Thread dvlpr
Hi folks, I have written some code for a custom receiver to get data from a MySQL db. Below is the code: class CustomReceiver(url: String, username: String, password: String) extends Receiver[String](StorageLevel.MEMORY_AND_DISK_2) with Logging { case class customer(c_sk: Int, c_add_sk: Int, c_first:
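Since the snippet above is truncated by the archive, here is a minimal, self-contained version of what such a receiver looks like. The JDBC details (table and column names) are illustrative, and the `with Logging` mixin from the original is dropped since that trait is Spark-internal; the essential contract is onStart/onStop plus store() to hand rows to Spark.

```scala
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

// Minimal custom receiver sketch modeled on the post above.
class CustomReceiver(url: String, username: String, password: String)
  extends Receiver[String](StorageLevel.MEMORY_AND_DISK_2) {

  def onStart(): Unit = {
    // Receive on a separate thread so onStart() returns immediately.
    new Thread("MySQL Receiver") {
      override def run(): Unit = receive()
    }.start()
  }

  def onStop(): Unit = {
    // The receive loop checks isStopped(), so nothing extra to clean up here.
  }

  private def receive(): Unit = {
    val conn = java.sql.DriverManager.getConnection(url, username, password)
    try {
      // Illustrative query; the poster's schema uses a "customer" table.
      val rs = conn.createStatement().executeQuery("SELECT * FROM customer")
      while (!isStopped() && rs.next()) {
        store(rs.getString(1)) // push each row (here, the first column) to Spark
      }
    } finally {
      conn.close()
      restart("Finished reading; restarting to poll again")
    }
  }
}
```

A common cause of the "something is missing" error mentioned later in the thread is leaving onStart/onStop unimplemented (they are abstract on Receiver) or, as Matthias points out, starting the stream without any output operation.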