Re: Apache Spark Installation error

2018-05-31 Thread Irving Duran
You probably need "spark-shell" to be recognized as a command in your
environment.  Maybe try "sudo ln -s /path/to/spark-shell
/usr/bin/spark-shell".  Have you also tried running "./spark-shell" from the
directory that contains it, to see if it works there?
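
A minimal sketch of the PATH approach (assuming Spark was unpacked to
/opt/spark; adjust the path to your install):

export PATH=$PATH:/opt/spark/bin
spark-shell

Or run it directly without touching PATH:

/opt/spark/bin/spark-shell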

Thank You,

Irving Duran


On Thu, May 31, 2018 at 9:00 AM Remil Mohanan  wrote:

> Hi there,
>
> I am not able to execute the spark-shell command. Can you please help?
>
> Thanks
>
> Remil
>


Re: Spark Installation to work on Spark Streaming and MLlib

2016-06-10 Thread Ram Krishna
Thanks for the suggestion. Can you suggest where and how I should start
from scratch to work on Spark?

On Fri, Jun 10, 2016 at 8:18 PM, Holden Karau  wrote:

> So that's a bit complicated - you might want to start with reading the
> code for the existing algorithms and go from there. If your goal is to
> contribute the algorithm to Spark you should probably take a look at the
> JIRA as well as the contributing to Spark guide on the wiki. Also we have a
> separate list (dev@) for people looking to work on Spark itself.
> I'd also recommend a smaller first project building new functionality in
> Spark as a good starting point rather than adding a new algorithm right
> away, since you learn a lot in the process of making your first
> contribution.
>
> On Friday, June 10, 2016, Ram Krishna  wrote:
>
>> Hi All,
>>
>> How do I add a new ML algorithm to Spark MLlib?
>>
>> On Fri, Jun 10, 2016 at 12:50 PM, Ram Krishna 
>> wrote:
>>
>>> Hi All,
>>>
>>> I am new to this field, and I want to implement a new ML algorithm using
>>> Spark MLlib. What is the procedure?
>>>
>>> --
>>> Regards,
>>> Ram Krishna KT
>>>
>>>
>>>
>>>
>>>
>>>
>>
>>
>> --
>> Regards,
>> Ram Krishna KT
>>
>>
>>
>>
>>
>>
>
> --
> Cell : 425-233-8271
> Twitter: https://twitter.com/holdenkarau
>
>


-- 
Regards,
Ram Krishna KT


Re: Spark Installation to work on Spark Streaming and MLlib

2016-06-10 Thread Holden Karau
So that's a bit complicated - you might want to start with reading the code
for the existing algorithms and go from there. If your goal is to
contribute the algorithm to Spark you should probably take a look at the
JIRA as well as the contributing to Spark guide on the wiki. Also we have a
separate list (dev@) for people looking to work on Spark itself.
I'd also recommend a smaller first project building new functionality in
Spark as a good starting point rather than adding a new algorithm right
away, since you learn a lot in the process of making your first
contribution.

On Friday, June 10, 2016, Ram Krishna  wrote:

> Hi All,
>
> How do I add a new ML algorithm to Spark MLlib?
>
> On Fri, Jun 10, 2016 at 12:50 PM, Ram Krishna wrote:
>
>> Hi All,
>>
>> I am new to this field, and I want to implement a new ML algorithm using
>> Spark MLlib. What is the procedure?
>>
>> --
>> Regards,
>> Ram Krishna KT
>>
>>
>>
>>
>>
>>
>
>
> --
> Regards,
> Ram Krishna KT
>
>
>
>
>
>

-- 
Cell : 425-233-8271
Twitter: https://twitter.com/holdenkarau


Re: Spark Installation to work on Spark Streaming and MLlib

2016-06-10 Thread Ram Krishna
Hi All,

How do I add a new ML algorithm to Spark MLlib?

On Fri, Jun 10, 2016 at 12:50 PM, Ram Krishna 
wrote:

> Hi All,
>
> I am new to this field, and I want to implement a new ML algorithm using
> Spark MLlib. What is the procedure?
>
> --
> Regards,
> Ram Krishna KT
>
>
>
>
>
>


-- 
Regards,
Ram Krishna KT


Re: Spark Installation to work on Spark Streaming and MLlib

2016-06-10 Thread Holden Karau
Hi Ram,

Not super certain what you are looking to do. Are you looking to add a new
algorithm to Spark MLlib for streaming or use Spark MLlib on streaming data?

Cheers,

Holden

On Friday, June 10, 2016, Ram Krishna  wrote:

> Hi All,
>
> I am new to this field, and I want to implement a new ML algorithm using
> Spark MLlib. What is the procedure?
>
> --
> Regards,
> Ram Krishna KT
>
>
>
>
>
>

-- 
Cell : 425-233-8271
Twitter: https://twitter.com/holdenkarau


Spark Installation to work on Spark Streaming and MLlib

2016-06-10 Thread Ram Krishna
Hi All,

I am new to this field, and I want to implement a new ML algorithm using
Spark MLlib. What is the procedure?

-- 
Regards,
Ram Krishna KT


Re: Spark installation

2015-02-10 Thread prabeesh k
Refer to this blog for step-by-step installation.

On 11 February 2015 at 03:42, Mohit Singh  wrote:

> For a local machine, I don't think there is anything to install. Just unzip
> it and run $SPARK_DIR/bin/spark-shell, and that will open up a REPL...
>
> On Tue, Feb 10, 2015 at 3:25 PM, King sami  wrote:
>
>> Hi,
>>
>> I'm new to Spark. I want to install it on my local machine (Ubuntu 12.04).
>> Could you please help me install Spark step by step on my machine and
>> run some Scala programs?
>>
>> Thanks
>>
>
>
>
> --
> Mohit
>
> "When you want success as badly as you want the air, then you will get it.
> There is no other secret of success."
> -Socrates
>


Re: Spark installation

2015-02-10 Thread Mohit Singh
For a local machine, I don't think there is anything to install. Just unzip
it and run $SPARK_DIR/bin/spark-shell, and that will open up a REPL...
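
A minimal sketch, assuming you downloaded a prebuilt tarball from the
downloads page (the exact file name depends on the version and Hadoop build
you picked):

tar -xzf spark-1.2.0-bin-hadoop2.4.tgz
cd spark-1.2.0-bin-hadoop2.4
./bin/spark-shell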

On Tue, Feb 10, 2015 at 3:25 PM, King sami  wrote:

> Hi,
>
> I'm new to Spark. I want to install it on my local machine (Ubuntu 12.04).
> Could you please help me install Spark step by step on my machine and
> run some Scala programs?
>
> Thanks
>



-- 
Mohit

"When you want success as badly as you want the air, then you will get it.
There is no other secret of success."
-Socrates


Spark installation

2015-02-10 Thread King sami
Hi,

I'm new to Spark. I want to install it on my local machine (Ubuntu 12.04).
Could you please help me install Spark step by step on my machine and
run some Scala programs?

Thanks


Re: Spark Installation Maven PermGen OutOfMemoryException

2014-12-27 Thread varun sharma
This works for me:

export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M
-XX:ReservedCodeCacheSize=512m" && mvn -DskipTests clean package







Re: Spark Installation Maven PermGen OutOfMemoryException

2014-12-24 Thread Vladimir Protsenko
Thanks. Bad mistake.

2014-12-24 14:02 GMT+04:00 Sean Owen :

> That command is still wrong. It is -Xmx3g with no =.
> On Dec 24, 2014 9:50 AM, "Vladimir Protsenko" 
> wrote:
>
>> The 64-bit Java 8 RPM downloaded from the official Oracle site solved my
>> problem. And I did not need to set the max heap size; the final memory
>> shown at the end of the Maven build was 81/1943M. I want to learn Spark,
>> so I have no restriction on choosing a Java version.
>>
>> Guru Medasani, thanks for the tip.
>>
>> I will repeat the info that I wrongly sent only to Sean: I have tried export
>> MAVEN_OPTS=`-Xmx=3g -XX:MaxPermSize=1g -XX:ReservedCodeCacheSize=1g` and
>> it doesn't work either.
>>
>> Best Regards,
>> Vladimir Protsenko
>>
>> 2014-12-23 19:45 GMT+04:00 Guru Medasani :
>>
>>> Thanks for the clarification Sean.
>>>
>>> Best Regards,
>>> Guru Medasani
>>>
>>>
>>>
>>>
>>> > From: so...@cloudera.com
>>> > Date: Tue, 23 Dec 2014 15:39:59 +
>>> > Subject: Re: Spark Installation Maven PermGen OutOfMemoryException
>>> > To: gdm...@outlook.com
>>> > CC: protsenk...@gmail.com; user@spark.apache.org
>>>
>>> >
>>> > The text there is actually unclear. In Java 8, you still need to set
>>> > the max heap size (-Xmx2g). The optional bit is the
>>> > "-XX:MaxPermSize=512M" actually. Java 8 no longer has a separate
>>> > permanent generation.
>>> >
>>> > On Tue, Dec 23, 2014 at 3:32 PM, Guru Medasani 
>>> wrote:
>>> > > Hi Vladimir,
>>> > >
>>> > > From the link Sean posted, if you use Java 8 there is the following
>>> note.
>>> > >
>>> > > Note: For Java 8 and above this step is not required.
>>> > >
>>> > > So if you have no problems using Java 8, give it a shot.
>>> > >
>>> > > Best Regards,
>>> > > Guru Medasani
>>> > >
>>> >
>>> >
>>>
>>
>>


Re: Spark Installation Maven PermGen OutOfMemoryException

2014-12-24 Thread Sean Owen
That command is still wrong. It is -Xmx3g with no =.
On Dec 24, 2014 9:50 AM, "Vladimir Protsenko"  wrote:

> The 64-bit Java 8 RPM downloaded from the official Oracle site solved my
> problem. And I did not need to set the max heap size; the final memory
> shown at the end of the Maven build was 81/1943M. I want to learn Spark,
> so I have no restriction on choosing a Java version.
>
> Guru Medasani, thanks for the tip.
>
> I will repeat the info that I wrongly sent only to Sean: I have tried export
> MAVEN_OPTS=`-Xmx=3g -XX:MaxPermSize=1g -XX:ReservedCodeCacheSize=1g` and
> it doesn't work either.
>
> Best Regards,
> Vladimir Protsenko
>
> 2014-12-23 19:45 GMT+04:00 Guru Medasani :
>
>> Thanks for the clarification Sean.
>>
>> Best Regards,
>> Guru Medasani
>>
>>
>>
>>
>> > From: so...@cloudera.com
>> > Date: Tue, 23 Dec 2014 15:39:59 +
>> > Subject: Re: Spark Installation Maven PermGen OutOfMemoryException
>> > To: gdm...@outlook.com
>> > CC: protsenk...@gmail.com; user@spark.apache.org
>>
>> >
>> > The text there is actually unclear. In Java 8, you still need to set
>> > the max heap size (-Xmx2g). The optional bit is the
>> > "-XX:MaxPermSize=512M" actually. Java 8 no longer has a separate
>> > permanent generation.
>> >
>> > On Tue, Dec 23, 2014 at 3:32 PM, Guru Medasani 
>> wrote:
>> > > Hi Vladimir,
>> > >
>> > > From the link Sean posted, if you use Java 8 there is the following
>> note.
>> > >
>> > > Note: For Java 8 and above this step is not required.
>> > >
>> > > So if you have no problems using Java 8, give it a shot.
>> > >
>> > > Best Regards,
>> > > Guru Medasani
>> > >
>> >
>> >
>>
>
>


Re: Spark Installation Maven PermGen OutOfMemoryException

2014-12-24 Thread Vladimir Protsenko
The 64-bit Java 8 RPM downloaded from the official Oracle site solved my
problem. And I did not need to set the max heap size; the final memory
shown at the end of the Maven build was 81/1943M. I want to learn Spark,
so I have no restriction on choosing a Java version.

Guru Medasani, thanks for the tip.

I will repeat the info that I wrongly sent only to Sean: I have tried export
MAVEN_OPTS=`-Xmx=3g -XX:MaxPermSize=1g -XX:ReservedCodeCacheSize=1g` and it
doesn't work either.

Best Regards,
Vladimir Protsenko

2014-12-23 19:45 GMT+04:00 Guru Medasani :

> Thanks for the clarification Sean.
>
> Best Regards,
> Guru Medasani
>
>
>
>
> > From: so...@cloudera.com
> > Date: Tue, 23 Dec 2014 15:39:59 +
> > Subject: Re: Spark Installation Maven PermGen OutOfMemoryException
> > To: gdm...@outlook.com
> > CC: protsenk...@gmail.com; user@spark.apache.org
>
> >
> > The text there is actually unclear. In Java 8, you still need to set
> > the max heap size (-Xmx2g). The optional bit is the
> > "-XX:MaxPermSize=512M" actually. Java 8 no longer has a separate
> > permanent generation.
> >
> > On Tue, Dec 23, 2014 at 3:32 PM, Guru Medasani 
> wrote:
> > > Hi Vladimir,
> > >
> > > From the link Sean posted, if you use Java 8 there is the following
> note.
> > >
> > > Note: For Java 8 and above this step is not required.
> > >
> > > So if you have no problems using Java 8, give it a shot.
> > >
> > > Best Regards,
> > > Guru Medasani
> > >
> >
> >
>


RE: Spark Installation Maven PermGen OutOfMemoryException

2014-12-23 Thread Guru Medasani
Thanks for the clarification, Sean.

Best Regards,
Guru Medasani




> From: so...@cloudera.com
> Date: Tue, 23 Dec 2014 15:39:59 +
> Subject: Re: Spark Installation Maven PermGen OutOfMemoryException
> To: gdm...@outlook.com
> CC: protsenk...@gmail.com; user@spark.apache.org
> 
> The text there is actually unclear. In Java 8, you still need to set
> the max heap size (-Xmx2g). The optional bit is the
> "-XX:MaxPermSize=512M" actually. Java 8 no longer has a separate
> permanent generation.
> 
> On Tue, Dec 23, 2014 at 3:32 PM, Guru Medasani  wrote:
> > Hi Vladimir,
> >
> > From the link Sean posted, if you use Java 8 there is the following note.
> >
> > Note: For Java 8 and above this step is not required.
> >
> > So if you have no problems using Java 8, give it a shot.
> >
> > Best Regards,
> > Guru Medasani
> >
> 
> 
  

Re: Spark Installation Maven PermGen OutOfMemoryException

2014-12-23 Thread Sean Owen
The text there is actually unclear. In Java 8, you still need to set
the max heap size (-Xmx2g). The optional bit is the
"-XX:MaxPermSize=512M" actually. Java 8 no longer has a separate
permanent generation.
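
To make that concrete, a hedged sketch of the two variants (the 2g/512m
values follow the build-guide setting quoted elsewhere in this thread):

# Java 7 and below: size both the heap and the permanent generation
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"

# Java 8: no permanent generation, so the MaxPermSize flag can be dropped
export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"
mvn -DskipTests clean package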

On Tue, Dec 23, 2014 at 3:32 PM, Guru Medasani  wrote:
> Hi Vladimir,
>
> From the link Sean posted, if you use Java 8 there is the following note.
>
> Note: For Java 8 and above this step is not required.
>
> So if you have no problems using Java 8, give it a shot.
>
> Best Regards,
> Guru Medasani
>




RE: Spark Installation Maven PermGen OutOfMemoryException

2014-12-23 Thread Guru Medasani
Hi Vladimir,
From the link Sean posted, if you use Java 8 there is the following note.
Note: For Java 8 and above this step is not required.

So if you have no problems using Java 8, give it a shot.

Best Regards,
Guru Medasani




> From: so...@cloudera.com
> Date: Tue, 23 Dec 2014 15:04:42 +
> Subject: Re: Spark Installation Maven PermGen OutOfMemoryException
> To: protsenk...@gmail.com
> CC: user@spark.apache.org
> 
> You might try a little more. The official guidance suggests 2GB:
> 
> https://spark.apache.org/docs/latest/building-spark.html#setting-up-mavens-memory-usage
> 
> 
> On Tue, Dec 23, 2014 at 2:57 PM, Vladimir Protsenko
>  wrote:
> > I am installing Spark 1.2.0 on CentOS 6.6. I just downloaded the code from
> > GitHub, installed Maven, and am trying to compile the system:
> >
> > git clone https://github.com/apache/spark.git
> > git checkout v1.2.0
> > mvn -DskipTests clean package
> >
> > leads to an OutOfMemoryException. How much memory does it require?
> >
> > export MAVEN_OPTS=`-Xmx=1500m -XX:MaxPermSize=512m
> > -XX:ReservedCodeCacheSize=512m` doesn't help.
> >
> > What is a straightforward way to start using Spark?
> >
> >
> >
> >
> >
> 
> 
  

RE: Spark Installation Maven PermGen OutOfMemoryException

2014-12-23 Thread Somnath Pandeya
I think you should use a minimum of 2 GB of memory when building it with Maven.

-Somnath

-Original Message-
From: Vladimir Protsenko [mailto:protsenk...@gmail.com]
Sent: Tuesday, December 23, 2014 8:28 PM
To: user@spark.apache.org
Subject: Spark Installation Maven PermGen OutOfMemoryException

I am installing Spark 1.2.0 on CentOS 6.6. I just downloaded the code from
GitHub, installed Maven, and am trying to compile the system:

git clone https://github.com/apache/spark.git
git checkout v1.2.0
mvn -DskipTests clean package

leads to an OutOfMemoryException. How much memory does it require?

export MAVEN_OPTS=`-Xmx=1500m -XX:MaxPermSize=512m 
-XX:ReservedCodeCacheSize=512m` doesn't help.

What is a straightforward way to start using Spark?










Re: Spark Installation Maven PermGen OutOfMemoryException

2014-12-23 Thread Sean Owen
You might try a little more. The official guidance suggests 2GB:

https://spark.apache.org/docs/latest/building-spark.html#setting-up-mavens-memory-usage
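
The setting that page recommends looks like the following (as also posted
elsewhere in this thread; the MaxPermSize flag matters only on Java 7 and
below):

export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"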


On Tue, Dec 23, 2014 at 2:57 PM, Vladimir Protsenko
 wrote:
> I am installing Spark 1.2.0 on CentOS 6.6. I just downloaded the code from
> GitHub, installed Maven, and am trying to compile the system:
>
> git clone https://github.com/apache/spark.git
> git checkout v1.2.0
> mvn -DskipTests clean package
>
> leads to an OutOfMemoryException. How much memory does it require?
>
> export MAVEN_OPTS=`-Xmx=1500m -XX:MaxPermSize=512m
> -XX:ReservedCodeCacheSize=512m` doesn't help.
>
> What is a straightforward way to start using Spark?
>
>
>
>
>




Spark Installation Maven PermGen OutOfMemoryException

2014-12-23 Thread Vladimir Protsenko
I am installing Spark 1.2.0 on CentOS 6.6. I just downloaded the code from
GitHub, installed Maven, and am trying to compile the system:

git clone https://github.com/apache/spark.git
git checkout v1.2.0
mvn -DskipTests clean package

leads to an OutOfMemoryException. How much memory does it require?

export MAVEN_OPTS=`-Xmx=1500m -XX:MaxPermSize=512m
-XX:ReservedCodeCacheSize=512m` doesn't help.

What is a straightforward way to start using Spark?







Re: Spark Installation

2014-07-08 Thread 田毅
Hi Srikrishna,

The reason for this issue is that the assembly jar was uploaded to HDFS twice.

Pasting your command would make for a better diagnosis.
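
A hedged way to check for and clear a stale staged copy (the staging path is
an assumption based on the default YARN staging location; adjust the user and
application directory to your cluster):

hadoop fs -ls /user/$USER/.sparkStaging
hadoop fs -rm -r /user/$USER/.sparkStaging/application_XXXX   # stale app dir

Alternatively, keeping a single canonical assembly jar in HDFS and pointing
spark.yarn.jar at it avoids re-uploading the jar on every submit.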



田毅 (Tian Yi)
===
Orange Cloud Platform Product Line
Big Data Products Department
AsiaInfo-Linkage Technologies (China), Inc.
Mobile: 13910177261
Tel: 010-82166322
Fax: 010-82166617
QQ: 20057509
MSN: yi.t...@hotmail.com
Address: AsiaInfo-Linkage Tower, East Zone, Yard 10, Dongbeiwang West Road, Haidian District, Beijing


===

On Jul 9, 2014, at 3:03 AM, Srikrishna S wrote:

> Hi All,
> 
> 
> I tried the make distribution script and it worked well. I was able to
> compile the spark binary on our CDH5 cluster. Once I compiled Spark, I
> copied over the binaries in the dist folder to all the other machines
> in the cluster.
> 
> However, I ran into an issue while submitting a job in yarn-client mode. I
> get an error message that says the following:
> Resource 
> file:/opt/spark/spark-1.0.0-bin-hadoop2/lib/spark-assembly-1.1.0-SNAPSHOT-hadoop2.3.0.jar"
> changed on src filesystem (expected 1404845211000, was 1404845404000)
> 
> My end goal is to submit a job (that uses MLLib) in our Yarn cluster.
> 
> Any thoughts anyone?
> 
> Regards,
> Krishna
> 
> 
> 
> On Tue, Jul 8, 2014 at 9:49 AM, Sandy Ryza  wrote:
>> 
>> Hi Srikrishna,
>> 
>> The binaries are built with something like
>> mvn package -Pyarn -Dhadoop.version=2.3.0-cdh5.0.1 
>> -Dyarn.version=2.3.0-cdh5.0.1
>> 
>> -Sandy
>> 
>> 
>> On Tue, Jul 8, 2014 at 3:14 AM, 田毅  wrote:
>>> 
>>> try this command:
>>> 
>>> make-distribution.sh --hadoop 2.3.0-cdh5.0.0 --with-yarn --with-hive
>>> 
>>> 
>>> 
>>> 
>>> 田毅 (Tian Yi)
>>> ===
>>> Orange Cloud Platform Product Line
>>> Big Data Products Department
>>> AsiaInfo-Linkage Technologies (China), Inc.
>>> Mobile: 13910177261
>>> Tel: 010-82166322
>>> Fax: 010-82166617
>>> QQ: 20057509
>>> MSN: yi.t...@hotmail.com
>>> Address: AsiaInfo-Linkage Tower, East Zone, Yard 10, Dongbeiwang West Road, Haidian District, Beijing
>>> 
>>> 
>>> ===
>>> 
>>> On Jul 8, 2014, at 11:53 AM, Krishna Sankar wrote:
>>> 
>>> Couldn't find any reference to CDH in pom.xml - neither in the profiles
>>> nor in the hadoop.version. Am also wondering how the CDH-compatible
>>> artifact was compiled.
>>> Cheers
>>> 
>>> 
>>> 
>>> On Mon, Jul 7, 2014 at 8:07 PM, Srikrishna S  
>>> wrote:
 
 Hi All,
 
 Does anyone know what the command line arguments to mvn are to generate
 the pre-built binary for Spark on Hadoop 2-CDH5?
 
 I would like to pull in a recent bug fix in spark-master and rebuild the 
 binaries in the exact same way that was used for that provided on the 
 website.
 
 I have tried the following:
 
 mvn install -Pyarn -Dhadoop.version=2.3.0-cdh5.0.1
 
 And it doesn't quite work.
 
 Any thoughts anyone?
 
>>> 
>>> 
>> 
> 



Re: Spark Installation

2014-07-08 Thread Srikrishna S
Hi All,


I tried the make distribution script and it worked well. I was able to
compile the spark binary on our CDH5 cluster. Once I compiled Spark, I
copied over the binaries in the dist folder to all the other machines
in the cluster.

However, I ran into an issue while submitting a job in yarn-client mode. I
get an error message that says the following:
Resource 
file:/opt/spark/spark-1.0.0-bin-hadoop2/lib/spark-assembly-1.1.0-SNAPSHOT-hadoop2.3.0.jar"
changed on src filesystem (expected 1404845211000, was 1404845404000)

My end goal is to submit a job (that uses MLLib) in our Yarn cluster.

Any thoughts anyone?

Regards,
Krishna



On Tue, Jul 8, 2014 at 9:49 AM, Sandy Ryza  wrote:
>
> Hi Srikrishna,
>
> The binaries are built with something like
> mvn package -Pyarn -Dhadoop.version=2.3.0-cdh5.0.1 
> -Dyarn.version=2.3.0-cdh5.0.1
>
> -Sandy
>
>
> On Tue, Jul 8, 2014 at 3:14 AM, 田毅  wrote:
>>
>> try this command:
>>
>> make-distribution.sh --hadoop 2.3.0-cdh5.0.0 --with-yarn --with-hive
>>
>>
>>
>>
>> 田毅 (Tian Yi)
>> ===
>> Orange Cloud Platform Product Line
>> Big Data Products Department
>> AsiaInfo-Linkage Technologies (China), Inc.
>> Mobile: 13910177261
>> Tel: 010-82166322
>> Fax: 010-82166617
>> QQ: 20057509
>> MSN: yi.t...@hotmail.com
>> Address: AsiaInfo-Linkage Tower, East Zone, Yard 10, Dongbeiwang West Road, Haidian District, Beijing
>>
>>
>> ===
>>
>> On Jul 8, 2014, at 11:53 AM, Krishna Sankar wrote:
>>
>> Couldn't find any reference to CDH in pom.xml - neither in the profiles
>> nor in the hadoop.version. Am also wondering how the CDH-compatible
>> artifact was compiled.
>> Cheers
>> 
>>
>>
>> On Mon, Jul 7, 2014 at 8:07 PM, Srikrishna S  wrote:
>>>
>>> Hi All,
>>>
>>> Does anyone know what the command line arguments to mvn are to generate the
>>> pre-built binary for Spark on Hadoop 2-CDH5?
>>>
>>> I would like to pull in a recent bug fix in spark-master and rebuild the 
>>> binaries in the exact same way that was used for that provided on the 
>>> website.
>>>
>>> I have tried the following:
>>>
>>> mvn install -Pyarn -Dhadoop.version=2.3.0-cdh5.0.1
>>>
>>> And it doesn't quite work.
>>>
>>> Any thoughts anyone?
>>>
>>
>>
>


Re: Spark Installation

2014-07-08 Thread Sandy Ryza
Hi Srikrishna,

The binaries are built with something like
mvn package -Pyarn -Dhadoop.version=2.3.0-cdh5.0.1
-Dyarn.version=2.3.0-cdh5.0.1

-Sandy


On Tue, Jul 8, 2014 at 3:14 AM, 田毅  wrote:

> try this command:
>
> make-distribution.sh --hadoop 2.3.0-cdh5.0.0 --with-yarn --with-hive
>
>
>
>
> 田毅 (Tian Yi)
> ===
> Orange Cloud Platform Product Line
> Big Data Products Department
> AsiaInfo-Linkage Technologies (China), Inc.
> Mobile: 13910177261
> Tel: 010-82166322
> Fax: 010-82166617
> QQ: 20057509
> MSN: yi.t...@hotmail.com
> Address: AsiaInfo-Linkage Tower, East Zone, Yard 10, Dongbeiwang West Road, Haidian District, Beijing
>
>
> ===
>
> On Jul 8, 2014, at 11:53 AM, Krishna Sankar wrote:
>
> Couldn't find any reference to CDH in pom.xml - neither in the profiles
> nor in the hadoop.version. Am also wondering how the CDH-compatible
> artifact was compiled.
> Cheers
> 
>
>
> On Mon, Jul 7, 2014 at 8:07 PM, Srikrishna S 
> wrote:
>
>> Hi All,
>>
>> Does anyone know what the command line arguments to mvn are to generate
>> the pre-built binary for Spark on Hadoop 2-CDH5?
>>
>> I would like to pull in a recent bug fix in spark-master and rebuild the
>> binaries in the exact same way that was used for that provided on the
>> website.
>>
>> I have tried the following:
>>
>> mvn install -Pyarn -Dhadoop.version=2.3.0-cdh5.0.1
>>
>> And it doesn't quite work.
>>
>> Any thoughts anyone?
>>
>>
>
>


Re: Spark Installation

2014-07-08 Thread 田毅
try this command:

make-distribution.sh --hadoop 2.3.0-cdh5.0.0 --with-yarn --with-hive




田毅 (Tian Yi)
===
Orange Cloud Platform Product Line
Big Data Products Department
AsiaInfo-Linkage Technologies (China), Inc.
Mobile: 13910177261
Tel: 010-82166322
Fax: 010-82166617
QQ: 20057509
MSN: yi.t...@hotmail.com
Address: AsiaInfo-Linkage Tower, East Zone, Yard 10, Dongbeiwang West Road, Haidian District, Beijing


===

On Jul 8, 2014, at 11:53 AM, Krishna Sankar wrote:

> Couldn't find any reference to CDH in pom.xml - neither in the profiles nor
> in the hadoop.version. Am also wondering how the CDH-compatible artifact was compiled.
> Cheers
> 
> 
> 
> On Mon, Jul 7, 2014 at 8:07 PM, Srikrishna S  wrote:
> Hi All,
>  
> Does anyone know what the command line arguments to mvn are to generate the
> pre-built binary for Spark on Hadoop 2-CDH5?
> 
> I would like to pull in a recent bug fix in spark-master and rebuild the 
> binaries in the exact same way that was used for that provided on the website.
> 
> I have tried the following: 
> 
> mvn install -Pyarn -Dhadoop.version=2.3.0-cdh5.0.1
> 
> And it doesn't quite work.
> 
> Any thoughts anyone?
> 
> 



Re: Spark Installation

2014-07-08 Thread Sean Owen
On Tue, Jul 8, 2014 at 4:07 AM, Srikrishna S 
wrote:

> Hi All,
>
> Does anyone know what the command line arguments to mvn are to generate
> the pre-built binary for Spark on Hadoop 2-CDH5?
>
> I would like to pull in a recent bug fix in spark-master and rebuild the
> binaries in the exact same way that was used for that provided on the
> website.
>
> I have tried the following:
>
> mvn install -Pyarn -Dhadoop.version=2.3.0-cdh5.0.1
>
> And it doesn't quite work.
>

It would be a lot more helpful to say what didn't work exactly.


Re: Spark Installation

2014-07-07 Thread Krishna Sankar
Couldn't find any reference to CDH in pom.xml - neither in the profiles nor
in the hadoop.version. Am also wondering how the CDH-compatible artifact was
compiled.
Cheers



On Mon, Jul 7, 2014 at 8:07 PM, Srikrishna S 
wrote:

> Hi All,
>
> Does anyone know what the command line arguments to mvn are to generate
> the pre-built binary for Spark on Hadoop 2-CDH5?
>
> I would like to pull in a recent bug fix in spark-master and rebuild the
> binaries in the exact same way that was used for that provided on the
> website.
>
> I have tried the following:
>
> mvn install -Pyarn -Dhadoop.version=2.3.0-cdh5.0.1
>
> And it doesn't quite work.
>
> Any thoughts anyone?
>
>


Re: Spark Installation

2014-07-07 Thread Jaideep Dhok
Hi Srikrishna,
You can use the make-distribution script in Spark to generate the binary.
Example - ./make-distribution.sh --tgz --hadoop HADOOP_VERSION

The above script calls maven, so you can look into it to get the exact mvn
command too.
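
For the CDH5 build discussed in this thread, a hedged concrete invocation
might look like this (the exact version string is an assumption; use the one
matching your cluster):

./make-distribution.sh --tgz --hadoop 2.3.0-cdh5.0.1 --with-yarn --with-hive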

Thanks,
Jaideep


On Tue, Jul 8, 2014 at 8:37 AM, Srikrishna S 
wrote:

> Hi All,
>
> Does anyone know what the command line arguments to mvn are to generate
> the pre-built binary for Spark on Hadoop 2-CDH5?
>
> I would like to pull in a recent bug fix in spark-master and rebuild the
> binaries in the exact same way that was used for that provided on the
> website.
>
> I have tried the following:
>
> mvn install -Pyarn -Dhadoop.version=2.3.0-cdh5.0.1
>
> And it doesn't quite work.
>
> Any thoughts anyone?
>
>



Spark Installation

2014-07-07 Thread Srikrishna S
Hi All,

Does anyone know what the command line arguments to mvn are to generate the
pre-built binary for Spark on Hadoop 2-CDH5?

I would like to pull in a recent bug fix in spark-master and rebuild the
binaries in the exact same way that was used for that provided on the
website.

I have tried the following:

mvn install -Pyarn -Dhadoop.version=2.3.0-cdh5.0.1

And it doesn't quite work.

Any thoughts anyone?


Re: error with cdh 5 spark installation

2014-06-04 Thread Sean Owen
Spark is already part of the distribution, and the core CDH5 parcel.
You shouldn't need extra steps unless you're doing something special.
It may be that this is the very cause of the error when trying to
install over the existing services.
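
A quick hedged way to check what is already installed before adding packages
(Debian/Ubuntu, matching the apt-get output quoted below; the service name is
taken from the init script mentioned in the error):

dpkg -l | grep -i spark
service spark-master status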


On Wed, Jun 4, 2014 at 3:19 PM, chirag lakhani  wrote:
> I recently spun up an AWS cluster with cdh 5 using Cloudera Manager.  I am
> trying to install spark and simply used the install command, as stated in
> the CDH 5 documentation.
>
>
> sudo apt-get install spark-core spark-master spark-worker spark-python
>
> I get the following error
>
> Setting up spark-master
> (0.9.0+cdh5.0.1+33-1.cdh5.0.1.p0.25~precise-cdh5.0.1) ...
>  * Starting Spark master (spark-master):
> invoke-rc.d: initscript spark-master, action "start" failed.
> dpkg: error processing spark-master (--configure):
>  subprocess installed post-installation script returned error exit status 1
> Errors were encountered while processing:
>  spark-master
>
>
> Has anyone else encountered this?  Does anyone have any suggestions of what
> to do about it?
>
> Chirag
>
>


Re: error with cdh 5 spark installation

2014-06-04 Thread Patrick Wendell
Hey Chirag,

Those init scripts are part of the Cloudera Spark package (they are
not in the Spark project itself) so you might try e-mailing their
support lists directly.

- Patrick

On Wed, Jun 4, 2014 at 7:19 AM, chirag lakhani  wrote:
> I recently spun up an AWS cluster with cdh 5 using Cloudera Manager.  I am
> trying to install spark and simply used the install command, as stated in
> the CDH 5 documentation.
>
>
> sudo apt-get install spark-core spark-master spark-worker spark-python
>
> I get the following error
>
> Setting up spark-master
> (0.9.0+cdh5.0.1+33-1.cdh5.0.1.p0.25~precise-cdh5.0.1) ...
>  * Starting Spark master (spark-master):
> invoke-rc.d: initscript spark-master, action "start" failed.
> dpkg: error processing spark-master (--configure):
>  subprocess installed post-installation script returned error exit status 1
> Errors were encountered while processing:
>  spark-master
>
>
> Has anyone else encountered this?  Does anyone have any suggestions of what
> to do about it?
>
> Chirag
>
>


error with cdh 5 spark installation

2014-06-04 Thread chirag lakhani
I recently spun up an AWS cluster with cdh 5 using Cloudera Manager.  I am
trying to install spark and simply used the install command, as stated in
the CDH 5 documentation.


sudo apt-get install spark-core spark-master spark-worker spark-python

I get the following error

Setting up spark-master
(0.9.0+cdh5.0.1+33-1.cdh5.0.1.p0.25~precise-cdh5.0.1) ...
 * Starting Spark master (spark-master):
invoke-rc.d: initscript spark-master, action "start" failed.
dpkg: error processing spark-master (--configure):
 subprocess installed post-installation script returned error exit status 1
Errors were encountered while processing:
 spark-master


Has anyone else encountered this?  Does anyone have any suggestions of what
to do about it?

Chirag