Re: Apache Spark Installation error

2018-05-31 Thread Irving Duran
You probably need your environment to recognize "spark-shell" as a command. Maybe try "sudo ln -s /path/to/spark-shell /usr/bin/spark-shell". Have you tried "./spark-shell" from the directory it lives in to see if it works? Thank You, Irving Duran On Thu, May 31, 2018 at 9:00 AM Remil Mohanan wrote:
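
A minimal sketch of the two suggestions above, assuming the distribution is unpacked under /opt/spark (a hypothetical path, not taken from the thread):

    # Run the shell directly from the unpacked distribution
    cd /opt/spark/bin && ./spark-shell

    # Or put the bin directory on the PATH instead of symlinking into /usr/bin
    export PATH="$PATH:/opt/spark/bin"
    spark-shell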

Re: Spark Installation to work on Spark Streaming and MLlib

2016-06-10 Thread Ram Krishna
Thanks for the suggestion. Can you suggest where and how to start from scratch to work on Spark? On Fri, Jun 10, 2016 at 8:18 PM, Holden Karau wrote: > So that's a bit complicated - you might want to start with reading the > code for the existing algorithms

Re: Spark Installation to work on Spark Streaming and MLlib

2016-06-10 Thread Holden Karau
So that's a bit complicated - you might want to start with reading the code for the existing algorithms and go from there. If your goal is to contribute the algorithm to Spark you should probably take a look at the JIRA as well as the contributing to Spark guide on the wiki. Also we have a

Re: Spark Installation to work on Spark Streaming and MLlib

2016-06-10 Thread Ram Krishna
Hi All, how do I add a new ML algorithm to Spark MLlib? On Fri, Jun 10, 2016 at 12:50 PM, Ram Krishna wrote: > Hi All, > > I am new to this field, I want to implement a new ML algo using Spark > MLlib. What is the procedure? > > -- > Regards, > Ram Krishna KT

Re: Spark Installation to work on Spark Streaming and MLlib

2016-06-10 Thread Holden Karau
Hi Ram, Not super certain what you are looking to do. Are you looking to add a new algorithm to Spark MLlib for streaming or use Spark MLlib on streaming data? Cheers, Holden On Friday, June 10, 2016, Ram Krishna wrote: > Hi All, > > I am new to this field, I

Spark Installation to work on Spark Streaming and MLlib

2016-06-10 Thread Ram Krishna
Hi All, I am new to this field and I want to implement a new ML algo using Spark MLlib. What is the procedure? -- Regards, Ram Krishna KT

Spark installation

2015-02-10 Thread King sami
Hi, I'm new to Spark. I want to install it on my local machine (Ubuntu 12.04). Could you please help me install Spark step by step on my machine and run some Scala programs? Thanks

Re: Spark installation

2015-02-10 Thread Mohit Singh
For a local machine, I don't think there is anything to install. Just unzip the distribution and run $SPARK_DIR/bin/spark-shell and that will open up a REPL... On Tue, Feb 10, 2015 at 3:25 PM, King sami kgsam...@gmail.com wrote: Hi, I'm new to Spark. I want to install it on my local machine (Ubuntu 12.04) Could
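
A minimal sketch of that local setup, assuming a pre-built package downloaded from the Spark site; the file name below is illustrative, not one mentioned in the thread:

    # Unpack the pre-built distribution and point SPARK_DIR at it
    tar -xzf spark-1.2.0-bin-hadoop2.4.tgz
    export SPARK_DIR="$PWD/spark-1.2.0-bin-hadoop2.4"

    # Launch the interactive Scala REPL and try a tiny job
    "$SPARK_DIR/bin/spark-shell"
    # scala> sc.parallelize(1 to 100).sum()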

Re: Spark Installation Maven PermGen OutOfMemoryException

2014-12-27 Thread varun sharma
This works for me: export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m" mvn -DskipTests clean package

Re: Spark Installation Maven PermGen OutOfMemoryException

2014-12-24 Thread Vladimir Protsenko
Medasani From: so...@cloudera.com Date: Tue, 23 Dec 2014 15:39:59 + Subject: Re: Spark Installation Maven PermGen OutOfMemoryException To: gdm...@outlook.com CC: protsenk...@gmail.com; user@spark.apache.org The text there is actually unclear. In Java 8, you still need to set

Re: Spark Installation Maven PermGen OutOfMemoryException

2014-12-24 Thread Sean Owen
, Vladimir Protsenko 2014-12-23 19:45 GMT+04:00 Guru Medasani gdm...@outlook.com: Thanks for the clarification Sean. Best Regards, Guru Medasani From: so...@cloudera.com Date: Tue, 23 Dec 2014 15:39:59 + Subject: Re: Spark Installation Maven PermGen OutOfMemoryException To: gdm

Re: Spark Installation Maven PermGen OutOfMemoryException

2014-12-24 Thread Vladimir Protsenko
Subject: Re: Spark Installation Maven PermGen OutOfMemoryException To: gdm...@outlook.com CC: protsenk...@gmail.com; user@spark.apache.org The text there is actually unclear. In Java 8, you still need to set the max heap size (-Xmx2g). The optional bit is the -XX:MaxPermSize=512M

Spark Installation Maven PermGen OutOfMemoryException

2014-12-23 Thread Vladimir Protsenko
? export MAVEN_OPTS=`-Xmx=1500m -XX:MaxPermSize=512m -XX:ReservedCodeCacheSize=512m` doesn't help. What is a straightforward way to start using Spark?

Re: Spark Installation Maven PermGen OutOfMemoryException

2014-12-23 Thread Sean Owen
-XX:ReservedCodeCacheSize=512m` doesn't help. What is a straightforward way to start using Spark?

RE: Spark Installation Maven PermGen OutOfMemoryException

2014-12-23 Thread Guru Medasani
Subject: Re: Spark Installation Maven PermGen OutOfMemoryException To: protsenk...@gmail.com CC: user@spark.apache.org You might try a little more. The official guidance suggests 2GB: https://spark.apache.org/docs/latest/building-spark.html#setting-up-mavens-memory-usage On Tue, Dec 23

Re: Spark Installation Maven PermGen OutOfMemoryException

2014-12-23 Thread Sean Owen
The text there is actually unclear. In Java 8, you still need to set the max heap size (-Xmx2g). The optional bit is the -XX:MaxPermSize=512M actually. Java 8 no longer has a separate permanent generation. On Tue, Dec 23, 2014 at 3:32 PM, Guru Medasani gdm...@outlook.com wrote: Hi Vladimir,
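
A hedged sketch of the settings discussed in this thread, following the build guide linked above; the Java 7 vs Java 8 split reflects Sean's note that Java 8 has no separate permanent generation:

    # Java 7: max heap plus PermGen and code cache sizes
    export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"

    # Java 8: -XX:MaxPermSize is no longer needed (the JVM only warns and ignores it)
    export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"

    mvn -DskipTests clean package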

RE: Spark Installation Maven PermGen OutOfMemoryException

2014-12-23 Thread Guru Medasani
Thanks for the clarification Sean. Best Regards, Guru Medasani From: so...@cloudera.com Date: Tue, 23 Dec 2014 15:39:59 + Subject: Re: Spark Installation Maven PermGen OutOfMemoryException To: gdm...@outlook.com CC: protsenk...@gmail.com; user@spark.apache.org The text

Re: Spark Installation

2014-07-09 Thread 田毅
Hi Srikrishna, the reason for this issue is that you uploaded the assembly jar to HDFS twice. Pasting your command would allow a better diagnosis. 田毅

Re: Spark Installation

2014-07-08 Thread Sean Owen
On Tue, Jul 8, 2014 at 4:07 AM, Srikrishna S srikrishna...@gmail.com wrote: Hi All, Does anyone know what the command-line arguments to mvn are to generate the pre-built binary for Spark on Hadoop 2 / CDH5? I would like to pull in a recent bug fix in spark-master and rebuild the binaries in

Re: Spark Installation

2014-07-08 Thread Srikrishna S
Hi All, I tried the make-distribution script and it worked well. I was able to compile the Spark binary on our CDH5 cluster. Once I compiled Spark, I copied the binaries in the dist folder over to all the other machines in the cluster. However, I ran into an issue while submitting a job in
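
A hedged sketch of the kind of CDH5-targeted build being described; the profile name and CDH version string are illustrative assumptions in the style of the Spark 1.x building docs, not values quoted in this thread:

    # Build against a CDH5 Hadoop version (profile and version string are illustrative)
    export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M"
    mvn -Pyarn -Phadoop-2.3 -Dhadoop.version=2.3.0-cdh5.1.0 -DskipTests clean package

    # make-distribution.sh wraps a similar Maven invocation and also assembles the dist/ folder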

Spark Installation

2014-07-07 Thread Srikrishna S
Hi All, does anyone know what the command-line arguments to mvn are to generate the pre-built binary for Spark on Hadoop 2 / CDH5? I would like to pull in a recent bug fix from spark-master and rebuild the binaries in exactly the same way as the ones provided on the website. I have tried

Re: error with cdh 5 spark installation

2014-06-04 Thread Patrick Wendell
Hey Chirag, Those init scripts are part of the Cloudera Spark package (they are not in the Spark project itself) so you might try e-mailing their support lists directly. - Patrick On Wed, Jun 4, 2014 at 7:19 AM, chirag lakhani chirag.lakh...@gmail.com wrote: I recently spun up an AWS cluster

Re: error with cdh 5 spark installation

2014-06-04 Thread Sean Owen
Spark is already part of the distribution, and the core CDH5 parcel. You shouldn't need extra steps unless you're doing something special. It may be that this is the very cause of the error when trying to install over the existing services. On Wed, Jun 4, 2014 at 3:19 PM, chirag lakhani