hey,
Thanks, now it worked. :)
On Wed, Jun 15, 2016 at 6:59 PM, Jeff Zhang wrote:
Then the only solution is to increase your driver memory, but it is still limited by your machine's physical memory. Use "--driver-memory".
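For instance, a minimal sketch of the launch command (the script name app.py and the 12g value are placeholders; keep the value below the machine's 16 GB of RAM):

```shell
# Increase the driver JVM's heap at launch; in local mode this is the
# memory the whole application gets. "app.py" is a hypothetical script.
spark-submit --master "local[4]" --driver-memory 12g app.py
```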
On Thu, Jun 16, 2016 at 9:53 AM, spR wrote:
Hey,
But I just have one machine. I am running everything on my laptop. Won't I
be able to do this processing in local mode then?
Regards,
Tejaswini
On Wed, Jun 15, 2016 at 6:32 PM, Jeff Zhang wrote:
You are using local mode; --executor-memory won't take effect in local mode. Please use a cluster mode instead.
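For reference, a hedged sketch of the distinction (the master URL and script name are placeholders):

```shell
# --executor-memory only takes effect when executors run as separate JVMs
# under a cluster manager, e.g. a standalone master (hypothetical host):
spark-submit --master spark://master-host:7077 --executor-memory 12g app.py

# In local mode there are no separate executor JVMs; tasks run inside the
# driver JVM, whose size is controlled by --driver-memory instead.
```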
On Thu, Jun 16, 2016 at 9:32 AM, Jeff Zhang wrote:
Specify --executor-memory in your spark-submit command.
On Thu, Jun 16, 2016 at 9:01 AM, spR wrote:
hey,
I did this in my notebook, but I still get the same error. Is this the right way to do it?
from pyspark import SparkConf
conf = (SparkConf()
.setMaster("local[4]")
.setAppName("My app")
.set("spark.executor.memory", "12g"))
sc.conf = conf
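(A likely reason the snippet above has no effect, sketched under the assumption that the notebook was started via PySpark: the driver JVM's heap is fixed at launch, so assigning a conf to an already-running SparkContext is ignored. One workaround is to set the submit arguments in the environment before starting the notebook; the values here are illustrative.)

```shell
# Hypothetical workaround: supply memory flags before the notebook's JVM
# starts, instead of mutating sc.conf afterwards. The trailing
# "pyspark-shell" token is required by PySpark's launcher.
export PYSPARK_SUBMIT_ARGS="--master local[4] --driver-memory 12g pyspark-shell"
```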
On Wed, Jun 15, 2016 at
Thank you. Can you please tell me how to increase the executor memory?
On Wed, Jun 15, 2016 at 5:59 PM, Jeff Zhang wrote:
>>> Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded

It is an OOM on the executor. Please try to increase the executor memory with "--executor-memory".
On Thu, Jun 16, 2016 at 8:54 AM, spR wrote:
Hey,

error trace -
Py4JJavaError                             Traceback (most recent call last)
<ipython-input> in <module>()
----> 1 temp.take(2)
Could you paste the full stack trace?
On Thu, Jun 16, 2016 at 7:24 AM, spR wrote:
Hi,

I am getting this error while executing a query using sqlcontext.sql. The table has around 2.5 GB of data to be scanned.

First I get an out-of-memory exception, even though I have 16 GB of RAM. Then my notebook dies and I get the error below:

Py4JNetworkError: An error occurred while trying to connect to