I am running on m3.xlarge instances on AWS with 12 GB worker memory and 10
GB executor memory.
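
Roughly, those are set along these lines (assuming the standalone deployment):

  # conf/spark-env.sh on each worker
  export SPARK_WORKER_MEMORY=12g

  # conf/spark-defaults.conf, or --executor-memory on spark-submit
  spark.executor.memory  10g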

On Sun, Feb 1, 2015, 12:41 PM Arush Kharbanda <ar...@sigmoidanalytics.com>
wrote:

> What is the machine configuration you are running it on?
>
> On Mon, Feb 2, 2015 at 1:46 AM, Ankur Srivastava <
> ankur.srivast...@gmail.com> wrote:
>
>> I am using the log4j.properties file that ships with the Spark
>> distribution, in the conf directory.
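>>
>> For reference, it is essentially the shipped template, roughly:
>>
>> # Set everything to be logged to the console
>> log4j.rootCategory=INFO, console
>> log4j.appender.console=org.apache.log4j.ConsoleAppender
>> log4j.appender.console.target=System.err
>> log4j.appender.console.layout=org.apache.log4j.PatternLayout
>> log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
>>
>> # Settings to quiet third party logs that are too verbose
>> log4j.logger.org.eclipse.jetty=WARN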
>>
>> Thanks
>> Ankur
>>
>> On Sat, Jan 31, 2015, 12:36 AM Arush Kharbanda <
>> ar...@sigmoidanalytics.com> wrote:
>>
>>> Can you share your log4j file?
>>>
>>> On Sat, Jan 31, 2015 at 1:35 PM, Arush Kharbanda <
>>> ar...@sigmoidanalytics.com> wrote:
>>>
>>>> Hi Ankur,
>>>>
>>>> It's running fine for me on Spark 1.1 with changes to the log4j
>>>> properties file.
>>>>
>>>> Thanks
>>>> Arush
>>>>
>>>> On Fri, Jan 30, 2015 at 9:49 PM, Ankur Srivastava <
>>>> ankur.srivast...@gmail.com> wrote:
>>>>
>>>>> Hi Arush
>>>>>
>>>>> I have configured log4j by updating the log4j.properties file in the
>>>>> SPARK_HOME/conf folder.
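>>>>>
>>>>> In effect it is just the root logger level bumped up, something like:
>>>>>
>>>>> # SPARK_HOME/conf/log4j.properties
>>>>> log4j.rootCategory=DEBUG, console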
>>>>>
>>>>> If it were a log4j defect, we would get errors in debug mode in all apps.
>>>>>
>>>>> Thanks
>>>>> Ankur
>>>>>
>>>>> On Fri, Jan 30, 2015, Arush Kharbanda <ar...@sigmoidanalytics.com> wrote:
>>>>>
>>>>> Hi Ankur,
>>>>>
>>>>> How are you enabling the debug log level? It should be a log4j
>>>>> configuration. Even if there were an issue, it would be in log4j and
>>>>> not in Spark.
>>>>>
>>>>> Thanks
>>>>> Arush
>>>>>
>>>>> On Fri, Jan 30, 2015 at 4:24 AM, Ankur Srivastava <
>>>>> ankur.srivast...@gmail.com> wrote:
>>>>>
>>>>>> Hi,
>>>>>>
>>>>>> Whenever I enable DEBUG-level logs for my Spark cluster, all the
>>>>>> executors die with the exception below when I run a job. On disabling
>>>>>> the DEBUG logs, my jobs move on to the next step.
>>>>>>
>>>>>>
>>>>>> I am on Spark 1.1.0.
>>>>>>
>>>>>> Is this a known issue with Spark?
>>>>>>
>>>>>> Thanks
>>>>>> Ankur
>>>>>>
>>>>>> 2015-01-29 22:27:42,467 [main] INFO  org.apache.spark.SecurityManager
>>>>>> - SecurityManager: authentication disabled; ui acls disabled; users with
>>>>>> view permissions: Set(ubuntu); users with modify permissions: Set(ubuntu)
>>>>>>
>>>>>> 2015-01-29 22:27:42,478 [main] DEBUG org.apache.spark.util.AkkaUtils
>>>>>> - In createActorSystem, requireCookie is: off
>>>>>>
>>>>>> 2015-01-29 22:27:42,871
>>>>>> [driverPropsFetcher-akka.actor.default-dispatcher-4] INFO
>>>>>> akka.event.slf4j.Slf4jLogger - Slf4jLogger started
>>>>>>
>>>>>> 2015-01-29 22:27:42,912
>>>>>> [driverPropsFetcher-akka.actor.default-dispatcher-4] INFO  Remoting -
>>>>>> Starting remoting
>>>>>>
>>>>>> 2015-01-29 22:27:43,057
>>>>>> [driverPropsFetcher-akka.actor.default-dispatcher-4] INFO  Remoting -
>>>>>> Remoting started; listening on addresses :[akka.tcp://
>>>>>> driverPropsFetcher@10.77.9.155:36035]
>>>>>>
>>>>>> 2015-01-29 22:27:43,060
>>>>>> [driverPropsFetcher-akka.actor.default-dispatcher-4] INFO  Remoting -
>>>>>> Remoting now listens on addresses: [akka.tcp://
>>>>>> driverPropsFetcher@10.77.9.155:36035]
>>>>>>
>>>>>> 2015-01-29 22:27:43,067 [main] INFO  org.apache.spark.util.Utils -
>>>>>> Successfully started service 'driverPropsFetcher' on port 36035.
>>>>>>
>>>>>> 2015-01-29 22:28:13,077 [main] ERROR
>>>>>> org.apache.hadoop.security.UserGroupInformation -
>>>>>> PriviledgedActionException as:ubuntu
>>>>>> cause:java.util.concurrent.TimeoutException: Futures timed out after [30
>>>>>> seconds]
>>>>>>
>>>>>> Exception in thread "main"
>>>>>> java.lang.reflect.UndeclaredThrowableException: Unknown exception in doAs
>>>>>>
>>>>>>         at
>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1134)
>>>>>>
>>>>>>         at
>>>>>> org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:52)
>>>>>>
>>>>>>         at
>>>>>> org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:113)
>>>>>>
>>>>>>         at
>>>>>> org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:156)
>>>>>>
>>>>>>         at
>>>>>> org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
>>>>>>
>>>>>> Caused by: java.security.PrivilegedActionException:
>>>>>> java.util.concurrent.TimeoutException: Futures timed out after [30 
>>>>>> seconds]
>>>>>>
>>>>>>         at java.security.AccessController.doPrivileged(Native Method)
>>>>>>
>>>>>>         at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>>>
>>>>>>         at
>>>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>>>>>>
>>>>>>         ... 4 more
>>>>>>
>>>>>> Caused by: java.util.concurrent.TimeoutException: Futures timed out
>>>>>> after [30 seconds]
>>>>>>
>>>>>>         at
>>>>>> scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>>>>>>
>>>>>>         at
>>>>>> scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>>>>>>
>>>>>>         at
>>>>>> scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>>>>>>
>>>>>>         at
>>>>>> scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>>>>>>
>>>>>>         at scala.concurrent.Await$.result(package.scala:107)
>>>>>>
>>>>>>         at
>>>>>> org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$run$1.apply$mcV$sp(CoarseGrainedExecutorBackend.scala:125)
>>>>>>
>>>>>>         at
>>>>>> org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:53)
>>>>>>
>>>>>>         at
>>>>>> org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:52)
>>>>>>
>>>>>>         ... 7 more
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>>
>>>>> *Arush Kharbanda* || Technical Teamlead
>>>>>
>>>>> ar...@sigmoidanalytics.com || www.sigmoidanalytics.com
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>>
>>>> *Arush Kharbanda* || Technical Teamlead
>>>>
>>>> ar...@sigmoidanalytics.com || www.sigmoidanalytics.com
>>>>
>>>
>>>
>>>
>>> --
>>>
>>> *Arush Kharbanda* || Technical Teamlead
>>>
>>> ar...@sigmoidanalytics.com || www.sigmoidanalytics.com
>>>
>>
>
>
> --
>
> *Arush Kharbanda* || Technical Teamlead
>
> ar...@sigmoidanalytics.com || www.sigmoidanalytics.com
>
