Re: System memory 186646528 must be at least 4.718592E8.

2016-05-13 Thread satish saley
Thank you. Looking at the source code helped :)

I set spark.testing.memory to 512 MB (536870912 bytes; the property takes a raw byte count) and it worked :)
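For example, the override can be passed at submit time (a sketch using the pi.py example from this thread; the value is a byte count, 536870912 = 512 MB):

```shell
# spark.testing.memory is read as a raw byte count, not a size string.
spark-submit --conf spark.testing.memory=536870912 pi.py
```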

private def getMaxMemory(conf: SparkConf): Long = {
  val systemMemory = conf.getLong("spark.testing.memory",
    Runtime.getRuntime.maxMemory)
  val reservedMemory = conf.getLong("spark.testing.reservedMemory",
    if (conf.contains("spark.testing")) 0 else RESERVED_SYSTEM_MEMORY_BYTES)
  val minSystemMemory = reservedMemory * 1.5
  if (systemMemory < minSystemMemory) {
    throw new IllegalArgumentException(s"System memory $systemMemory must " +
      s"be at least $minSystemMemory. Please use a larger heap size.")
  }
  ...
}

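The threshold in the error is easy to reproduce by hand. Below is a plain-Python sketch of the same check (the 300 MB reserve matches RESERVED_SYSTEM_MEMORY_BYTES in branch-1.6; the function name is just for illustration). It shows why a 186646528-byte heap fails against the minimum of 1.5 × 300 MB = 471859200 bytes, i.e. the 4.718592E8 in the error:

```python
# Plain-Python mirror of the getMaxMemory check above (illustrative only).
RESERVED_SYSTEM_MEMORY_BYTES = 300 * 1024 * 1024  # 314572800 bytes, as in branch-1.6

def check_system_memory(system_memory_bytes, testing=False):
    # spark.testing zeroes the reserve, which is why spark.testing.memory
    # can be used to work around the check in tests.
    reserved = 0 if testing else RESERVED_SYSTEM_MEMORY_BYTES
    min_system_memory = reserved * 1.5  # 471859200 == 4.718592E8
    if system_memory_bytes < min_system_memory:
        raise ValueError(
            f"System memory {system_memory_bytes} must be at least {min_system_memory}.")
    return system_memory_bytes - reserved  # usable memory before the fraction is applied

check_system_memory(512 * 1024 * 1024)  # a 512 MB heap clears the minimum
# check_system_memory(186646528) raises, reproducing the error in this thread
```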



Re: System memory 186646528 must be at least 4.718592E8.

2016-05-13 Thread Ted Yu
Here is related code:

  val executorMemory = conf.getSizeAsBytes("spark.executor.memory")
  if (executorMemory < minSystemMemory) {
    throw new IllegalArgumentException(s"Executor memory $executorMemory " +
      s"must be at least $minSystemMemory.")
  }

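For reference, getSizeAsBytes accepts unit-suffixed size strings. A simplified Python mirror of the common single-letter suffixes (the real parser in Spark also accepts longer forms such as "512mb"; this sketch is just for intuition):

```python
# Simplified mirror of Spark's size-string parsing (single-letter suffixes only).
UNITS = {"b": 1, "k": 1024, "m": 1024 ** 2, "g": 1024 ** 3, "t": 1024 ** 4}

def size_as_bytes(size, default_unit="b"):
    size = size.strip().lower()
    if size and size[-1].isalpha():
        number, unit = size[:-1], size[-1]
    else:
        number, unit = size, default_unit
    return int(number) * UNITS[unit]

size_as_bytes("512m")  # 536870912 -- enough to clear the 471859200-byte minimum
```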


System memory 186646528 must be at least 4.718592E8.

2016-05-13 Thread satish saley
Hello,
I am running the
https://github.com/apache/spark/blob/branch-1.6/examples/src/main/python/pi.py
example, but am facing the following exception.

What is the unit of the memory values reported in the error?

Following are the configs:

--master local[*]
--deploy-mode client
--name PysparkExample
--py-files py4j-0.9-src.zip,pyspark.zip,
--verbose


pi.py/py4j-0.9-src.zip/py4j/protocol.py", line 308, in get_return_value

py4j.protocol.Py4JJavaError: An error occurred while calling
None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.IllegalArgumentException: System memory 186646528 must be at
least 4.718592E8. Please use a larger heap size.
    at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:193)
    at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:175)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:354)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:288)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:457)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381)
    at py4j.Gateway.invoke(Gateway.java:214)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
    at py4j.GatewayConnection.run(GatewayConnection.java:209)
    at java.lang.Thread.run(Thread.java:745)