Re: Spark 1.5.0 java.lang.OutOfMemoryError: PermGen space

2015-09-12 Thread Jagat Singh
Sorry, to answer your question fully:

The job starts tasks; a few of them fail and some are successful. The
failed ones have the PermGen error in their logs.

But ultimately the full job is marked as failed and the session quits.
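
One way to pull the failed executors' logs on YARN is the standard log
aggregation command (a sketch, assuming log aggregation is enabled; the
application id is a placeholder):

# Fetch the aggregated container logs and show the PermGen stack traces.
yarn logs -applicationId <application_id> | grep -B 2 -A 10 PermGen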


On Sun, Sep 13, 2015 at 10:48 AM, Jagat Singh  wrote:

> Hi Davies,
>
> This was the first query on the new version.
>
> The one which ran successfully was the Spark Pi example:
>
> ./bin/spark-submit --class org.apache.spark.examples.SparkPi \
>
> --master yarn-client \
>
> --num-executors 3 \
>
> --driver-memory 4g \
>
> --executor-memory 2g \
>
> --executor-cores 1 \
>
> --queue default \
>
> lib/spark-examples*.jar \
>
> 10
>
> Then I tried using spark-shell, which was started without any extra
> memory, garbage collection, or PermGen configuration:
>
> ./bin/spark-shell --num-executors 2 --executor-memory 512m --master yarn-client
>
> val t1 = sqlContext.sql("select count(*) from table")
>
> t1.show
>
> This one fails with the PermGen error.
>
> On Monday I will try the suggested solution of passing extra PermGen
> space to the driver.
>
> Thanks,
>
> On Sat, Sep 12, 2015 at 2:57 AM, Davies Liu  wrote:
>
>> Did this happen immediately after you started the cluster, or after
>> running some queries?
>>
>> Is this in local mode or cluster mode?
>>
>> On Fri, Sep 11, 2015 at 3:00 AM, Jagat Singh 
>> wrote:
>> > Hi,
>> >
>> > We have queries which were running fine on the 1.4.1 system.
>> >
>> > We are testing the upgrade, and even a simple query like
>> >
>> > val t1 = sqlContext.sql("select count(*) from table")
>> >
>> > t1.show
>> >
>> > This works perfectly fine on 1.4.1 but throws an OOM error in 1.5.0.
>> >
>> > Are there any changes in the default memory settings from 1.4.1 to 1.5.0?
>> >
>> > Thanks,
>> >
>> >
>> >
>>
>
>


Re: Spark 1.5.0 java.lang.OutOfMemoryError: PermGen space

2015-09-12 Thread Jagat Singh
Hi Davies,

This was the first query on the new version.

The one which ran successfully was the Spark Pi example:

./bin/spark-submit --class org.apache.spark.examples.SparkPi \

--master yarn-client \

--num-executors 3 \

--driver-memory 4g \

--executor-memory 2g \

--executor-cores 1 \

--queue default \

lib/spark-examples*.jar \

10

Then I tried using spark-shell, which was started without any extra memory,
garbage collection, or PermGen configuration:

./bin/spark-shell --num-executors 2 --executor-memory 512m --master yarn-client

val t1 = sqlContext.sql("select count(*) from table")

t1.show

This one fails with the PermGen error.

On Monday I will try the suggested solution of passing extra PermGen space
to the driver.
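
For reference, a minimal sketch of what that might look like (assuming YARN
client mode; 256m is only an assumed starting value, not a verified setting):

# The MaxPermSize values below are guesses to be tuned, not verified settings.
./bin/spark-shell --num-executors 2 --executor-memory 512m \
  --master yarn-client \
  --driver-java-options "-XX:MaxPermSize=256m" \
  --conf spark.executor.extraJavaOptions=-XX:MaxPermSize=256m

The --driver-java-options flag raises the PermGen cap on the driver JVM
before it starts, and spark.executor.extraJavaOptions does the same for the
executor JVMs launched on YARN.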

Thanks,

On Sat, Sep 12, 2015 at 2:57 AM, Davies Liu  wrote:

> Did this happen immediately after you started the cluster, or after
> running some queries?
>
> Is this in local mode or cluster mode?
>
> On Fri, Sep 11, 2015 at 3:00 AM, Jagat Singh  wrote:
> > Hi,
> >
> > We have queries which were running fine on the 1.4.1 system.
> >
> > We are testing the upgrade, and even a simple query like
> >
> > val t1 = sqlContext.sql("select count(*) from table")
> >
> > t1.show
> >
> > This works perfectly fine on 1.4.1 but throws an OOM error in 1.5.0.
> >
> > Are there any changes in the default memory settings from 1.4.1 to 1.5.0?
> >
> > Thanks,
> >
> >
> >
>


Re: Spark 1.5.0 java.lang.OutOfMemoryError: PermGen space

2015-09-11 Thread Davies Liu
Did this happen immediately after you started the cluster, or after running
some queries?

Is this in local mode or cluster mode?

On Fri, Sep 11, 2015 at 3:00 AM, Jagat Singh  wrote:
> Hi,
>
> We have queries which were running fine on the 1.4.1 system.
>
> We are testing the upgrade, and even a simple query like
>
> val t1 = sqlContext.sql("select count(*) from table")
>
> t1.show
>
> This works perfectly fine on 1.4.1 but throws an OOM error in 1.5.0.
>
> Are there any changes in the default memory settings from 1.4.1 to 1.5.0?
>
> Thanks,
>
>
>




Spark 1.5.0 java.lang.OutOfMemoryError: PermGen space

2015-09-11 Thread Jagat Singh
Hi,

We have queries which were running fine on the 1.4.1 system.

We are testing the upgrade, and even a simple query like

val t1 = sqlContext.sql("select count(*) from table")

t1.show

This works perfectly fine on 1.4.1 but throws an OOM error in 1.5.0.

Are there any changes in the default memory settings from 1.4.1 to 1.5.0?
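
To compare the effective defaults between the two versions, the resolved
properties can be dumped from spark-shell on each (a sketch using the
standard SparkConf API; nothing here is specific to our job):

// Print every resolved Spark property so the 1.4.1 and 1.5.0
// outputs can be diffed side by side.
sc.getConf.getAll.sorted.foreach { case (k, v) => println(s"$k=$v") }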

Thanks,


Re: Spark 1.5.0 java.lang.OutOfMemoryError: PermGen space

2015-09-11 Thread Ted Yu
Have you seen this thread?

http://search-hadoop.com/m/q3RTtPPuSvBu0rj2


> On Sep 11, 2015, at 3:00 AM, Jagat Singh  wrote:
> 
> Hi,
> 
> We have queries which were running fine on the 1.4.1 system.
> 
> We are testing the upgrade, and even a simple query like
> val t1 = sqlContext.sql("select count(*) from table")
> 
> t1.show
> 
> This works perfectly fine on 1.4.1 but throws an OOM error in 1.5.0.
> 
> Are there any changes in the default memory settings from 1.4.1 to 1.5.0?
> 
> Thanks,
> 
> 
> 
>