The thing is, the class it is complaining about is part of the Spark
assembly jar, not in my extra jar. The assembly jar was compiled with
-Phive, which is proven by the fact that it works with the same SPARK_HOME
when run as a shell.
On 23 July 2015 at 17:33, Akhil Das wrote:
You can try adding that jar to SPARK_CLASSPATH (it's deprecated, though) in
the spark-env.sh file.
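For example, a minimal sketch of the relevant line in conf/spark-env.sh
(the jar path here is a placeholder; point it at your actual extra jar):

    # SPARK_CLASSPATH is deprecated in favor of --jars / spark.driver.extraClassPath,
    # but it is still honored when set in spark-env.sh:
    export SPARK_CLASSPATH="$SPARK_CLASSPATH:/path/to/your-extra.jar"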
Thanks
Best Regards
On Tue, Jul 21, 2015 at 7:34 PM, Michal Haris michal.ha...@visualdna.com
wrote:
I have a Spark program that uses DataFrames to query Hive, and I run it
both as a spark-shell for ...
...@servian.com.au, user user@spark.apache.org
Subject: Re: 1.4.0 regression: out-of-memory errors on small data
I have never seen an issue like this. Setting the PermGen size to 256m
should solve the problem. Can you send me your test file and the command
used to launch the Spark shell or your application?
Date: Monday, July 6, 2015 at 11:41 AM
To: Denny Lee denny.g@gmail.com
Cc: Simeon Simeonov s...@swoop.com, Andy Huang andy.hu...@servian.com.au,
user user@spark.apache.org
Subject: Re: 1.4.0 regression: out-of-memory errors on small data
Hi Sim,
I think the right way to set the PermGen size is through driver extra JVM
options, i.e.
--conf "spark.driver.extraJavaOptions=-XX:MaxPermSize=256m"
We have hit the same issue in the Spark shell when registering a temp table.
We observed it happening with those who had JDK 6. The problem went away
after installing JDK 8. This was only for the tutorial materials, which were
about loading a Parquet file.
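For reference, the tutorial step was roughly along these lines (a sketch,
not the actual material; the file path is a placeholder):

    // load a Parquet file and register it as a temp table (Spark 1.4 shell)
    scala> val df = sqlContext.read.parquet("/tmp/sample.parquet")
    scala> df.registerTempTable("sample")
    scala> sqlContext.sql("SELECT count(*) FROM sample").show()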
Regards
Andy
On Sat, Jul 4, 2015 at 2:54 AM,
I had run into the same problem, where everything was working swimmingly
with Spark 1.3.1. When I switched to Spark 1.4, either upgrading to Java 8
(from Java 7) or knocking up the PermGen size solved my issue.
HTH!
On Mon, Jul 6, 2015 at 8:31 AM Andy Huang andy.hu...@servian.com.au
To: Denny Lee denny.g@gmail.com
Cc: Andy Huang andy.hu...@servian.com.au, Simeon Simeonov s...@swoop.com,
user user@spark.apache.org
Subject: Re: 1.4.0 regression: out-of-memory errors on small data
Sim,
Can you increase the PermGen size? Please let me know what your setting is
when the problem disappears.
Thanks,
Yin
On Sun, Jul 5, 2015 at 5:59 PM, Denny Lee denny.g@gmail.com wrote:
I had run into the same problem where everything was working swimmingly
with Spark 1.3.1. When I ...
Hi Sim,
Seems you have already set the PermGen size to 256m, right? I notice that in
your shell session you created a HiveContext (which further increases the
memory consumption on PermGen). But the Spark shell has already created a
HiveContext for you (sqlContext); you can use asInstanceOf to access it.
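i.e., something like this, rather than constructing a second HiveContext
(a sketch; in a -Phive build the pre-created sqlContext is already a
HiveContext):

    scala> import org.apache.spark.sql.hive.HiveContext
    scala> val hiveContext = sqlContext.asInstanceOf[HiveContext]
    scala> hiveContext.sql("SHOW TABLES").show()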
Hi Sim,
Spark 1.4.0's memory consumption on PermGen is higher than Spark 1.3's
(explained in https://issues.apache.org/jira/browse/SPARK-8776). Can you
add --conf spark.driver.extraJavaOptions=-XX:MaxPermSize=256m to the
command you used to launch the Spark shell? This will increase the PermGen
size.
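For example (a sketch of the full launch command; adjust the master and any
other flags to your setup):

    bin/spark-shell --conf "spark.driver.extraJavaOptions=-XX:MaxPermSize=256m"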
I wonder if this could be a side effect of SPARK-3928. Does ending the path
with *.parquet work?
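i.e., something like this, where the directory is a placeholder for
wherever your files live:

    scala> val df = sqlContext.read.parquet("/data/events/*.parquet")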
----- Original message -----
From: Exie tfind...@prodevelop.com.au
Date: 06/30/2015 9:20 PM (GMT-05:00)
To: user@spark.apache.org
Subject: 1.4.0 regression: out-of-memory errors on small data
So ...