Re: Configuring log4j Spark

2016-03-30 Thread Guillermo Ortiz
I changed the place of --files and works.

(IT DOESN'T WORK)
spark-submit  --conf spark.metrics.conf=metrics.properties --name
"myProject" --master yarn-cluster --class myCompany.spark.MyClass *--files
/opt/myProject/conf/log4j.properties* --jars $SPARK_CLASSPATH
--executor-memory 1024m --num-executors 5  --executor-cores 1
--driver-memory 1024m  /opt/myProject/myJar.jar

(IT WORKS)
spark-submit  --conf spark.metrics.conf=metrics.properties --name
"myProject" --master yarn-cluster --class myCompany.spark.MyClass  --jars
$SPARK_CLASSPATH --executor-memory 1024m --num-executors 5
 --executor-cores 1 --driver-memory 1024m *--files
/opt/myProject/conf/log4j.properties*  /opt/myProject/myJar.jar

I don't think I made any other changes.
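For the executor-level question in the quoted message below: --files only ships log4j.properties into each YARN container's working directory; the executor JVMs still have to be told to read it. A minimal sketch, assuming the paths from this thread and standard Spark 1.x options:

```shell
spark-submit \
  --master yarn-cluster \
  --files /opt/myProject/conf/log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  --class myCompany.spark.MyClass \
  /opt/myProject/myJar.jar
```

The bare file name in -Dlog4j.configuration is deliberate: YARN localizes --files into the container's working directory, which is on the container classpath, so no absolute path is needed on the executor side.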



2016-03-30 15:42 GMT+02:00 Guillermo Ortiz:

> I'm trying to configure log4j in Spark.
>
> spark-submit  --conf spark.metrics.conf=metrics.properties --name
> "myProject" --master yarn-cluster --class myCompany.spark.MyClass *--files
> /opt/myProject/conf/log4j.properties* --jars $SPARK_CLASSPATH
> --executor-memory 1024m --num-executors 5  --executor-cores 1
> --driver-memory 1024m  /opt/myProject/myJar.jar
>
> I have this log4j.properties
> log4j.rootCategory=DEBUG, RollingAppender, myConsoleAppender
> #log4j.logger.mycompany.spark=DEBUG
> log4j.category.myCompany.spark=DEBUG
> spark.log.dir=/opt/myProject/log
> spark.log.file=spark.log
>
> log4j.appender.myConsoleAppender=org.apache.log4j.ConsoleAppender
> log4j.appender.myConsoleAppender.layout=org.apache.log4j.PatternLayout
> log4j.appender.myConsoleAppender.Target=System.out
> log4j.appender.myConsoleAppender.layout.ConversionPattern=%d [%t] %-5p %c
> - %m%n
>
> log4j.appender.RollingAppender=org.apache.log4j.DailyRollingFileAppender
> log4j.appender.RollingAppender.MaxFileSize=50MB
> log4j.appender.RollingAppender.MaxBackupIndex=5
> log4j.appender.RollingAppender.layout.ConversionPattern=%d{dd MMM 
> HH:mm:ss,SSS} %-5p [%t] (%C.%M:%L) %x - %m%n
> log4j.appender.RollingAppender.File=${spark.log.dir}/${spark.log.file}
> log4j.appender.RollingAppender.layout=org.apache.log4j.PatternLayout
> log4j.appender.RollingAppender.layout.ConversionPattern=[%p] %d %c %M -
> %m%n
>
> With this I see the driver log at DEBUG level, but the executor logs at
> INFO level. Why do the executor logs not come out at DEBUG level?
> I'm using Spark 1.5.0
>
>
>


Configuring log4j Spark

2016-03-30 Thread Guillermo Ortiz
I'm trying to configure log4j in Spark.

spark-submit  --conf spark.metrics.conf=metrics.properties --name
"myProject" --master yarn-cluster --class myCompany.spark.MyClass *--files
/opt/myProject/conf/log4j.properties* --jars $SPARK_CLASSPATH
--executor-memory 1024m --num-executors 5  --executor-cores 1
--driver-memory 1024m  /opt/myProject/myJar.jar

I have this log4j.properties
log4j.rootCategory=DEBUG, RollingAppender, myConsoleAppender
#log4j.logger.mycompany.spark=DEBUG
log4j.category.myCompany.spark=DEBUG
spark.log.dir=/opt/myProject/log
spark.log.file=spark.log

log4j.appender.myConsoleAppender=org.apache.log4j.ConsoleAppender
log4j.appender.myConsoleAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.myConsoleAppender.Target=System.out
log4j.appender.myConsoleAppender.layout.ConversionPattern=%d [%t] %-5p %c -
%m%n

log4j.appender.RollingAppender=org.apache.log4j.DailyRollingFileAppender
log4j.appender.RollingAppender.MaxFileSize=50MB
log4j.appender.RollingAppender.MaxBackupIndex=5
log4j.appender.RollingAppender.layout.ConversionPattern=%d{dd MMM 
HH:mm:ss,SSS} %-5p [%t] (%C.%M:%L) %x - %m%n
log4j.appender.RollingAppender.File=${spark.log.dir}/${spark.log.file}
log4j.appender.RollingAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.RollingAppender.layout.ConversionPattern=[%p] %d %c %M - %m%n

With this I see the driver log at DEBUG level, but the executor logs at
INFO level. Why do the executor logs not come out at DEBUG level?
I'm using Spark 1.5.0
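As an aside on the posted file: RollingAppender.layout.ConversionPattern is set twice (with plain java.util.Properties loading, the later line wins), and MaxFileSize/MaxBackupIndex belong to RollingFileAppender, not DailyRollingFileAppender, which rolls by DatePattern and ignores them. A cleaned-up sketch of the same configuration, keeping the size-based options by switching the appender class:

```properties
log4j.rootCategory=DEBUG, RollingAppender, myConsoleAppender
log4j.category.myCompany.spark=DEBUG
spark.log.dir=/opt/myProject/log
spark.log.file=spark.log

log4j.appender.myConsoleAppender=org.apache.log4j.ConsoleAppender
log4j.appender.myConsoleAppender.Target=System.out
log4j.appender.myConsoleAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.myConsoleAppender.layout.ConversionPattern=%d [%t] %-5p %c - %m%n

# RollingFileAppender honors MaxFileSize/MaxBackupIndex;
# DailyRollingFileAppender would ignore them and roll by DatePattern instead.
log4j.appender.RollingAppender=org.apache.log4j.RollingFileAppender
log4j.appender.RollingAppender.File=${spark.log.dir}/${spark.log.file}
log4j.appender.RollingAppender.MaxFileSize=50MB
log4j.appender.RollingAppender.MaxBackupIndex=5
log4j.appender.RollingAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.RollingAppender.layout.ConversionPattern=[%p] %d %c %M - %m%n
```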


Configuring log4j

2015-12-18 Thread Afshartous, Nick

Hi,


Am trying to configure log4j on an AWS EMR 4.2 Spark cluster for a streaming 
job submitted in client mode.


I changed


   /etc/spark/conf/log4j.properties


to use a FileAppender. However, the INFO logging still goes to the console.


Thanks for any suggestions,

--

Nick


From the console:

Adding default property:
spark.driver.extraJavaOptions=-Dlog4j.configuration=file:///etc/spark/conf/log4j.properties
 -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70
-XX:MaxHeapFreeRatio=70 -XX:+CMSClassUnloadingEnabled -XX:MaxPermSize=512M
-XX:OnOutOfMemoryError='kill -9 %p'



Re: Configuring log4j

2015-12-18 Thread Afshartous, Nick

Found the issue: a conflict between setting Java options in both 
spark-defaults.conf and on the spark-submit command line.
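A sketch of the conflict described, assuming the EMR defaults shown in the earlier message (the exact override behavior is worth verifying on your Spark version): driver Java options given at submit time can replace spark.driver.extraJavaOptions from spark-defaults.conf wholesale, silently dropping the -Dlog4j.configuration flag, so the flag must be repeated wherever the options end up:

```shell
# spark-defaults.conf (per the EMR console output) already carries:
#   spark.driver.extraJavaOptions=-Dlog4j.configuration=file:///etc/spark/conf/log4j.properties ...
# If Java options are also passed at submit time, repeat the log4j flag
# there, e.g. (myApp.jar is a hypothetical application jar):
spark-submit \
  --driver-java-options "-Dlog4j.configuration=file:///etc/spark/conf/log4j.properties -XX:MaxPermSize=512M" \
  myApp.jar
```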

--

Nick



From: Afshartous, Nick <nafshart...@turbine.com>
Sent: Friday, December 18, 2015 11:46 AM
To: user@spark.apache.org
Subject: Configuring log4j



Hi,


Am trying to configure log4j on an AWS EMR 4.2 Spark cluster for a streaming 
job submitted in client mode.


I changed


   /etc/spark/conf/log4j.properties


to use a FileAppender. However, the INFO logging still goes to the console.


Thanks for any suggestions,

--

Nick


From the console:

Adding default property:
spark.driver.extraJavaOptions=-Dlog4j.configuration=file:///etc/spark/conf/log4j.properties
 -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=70
-XX:MaxHeapFreeRatio=70 -XX:+CMSClassUnloadingEnabled -XX:MaxPermSize=512M
-XX:OnOutOfMemoryError='kill -9 %p'



Re: Configuring Log4J (Spark 1.5 on EMR 4.1)

2015-11-20 Thread Igor Berman
Try assembling log4j.xml or log4j.properties into your jar; that should get
you what you want. Note, however, that behavior will differ once you move to
a multi-node cluster.
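What "assemble log4j.properties in your jar" looks like with a standard Maven/sbt layout (file names are illustrative; log4j 1.x loads a log4j.properties found at the root of the classpath, which is where resources/ lands in the assembled jar):

```text
src/
└── main/
    ├── resources/
    │   └── log4j.properties    # ends up at the jar root, on the classpath
    └── scala/
        └── MyClass.scala       # hypothetical application code
```

As noted, results can still differ on a multi-node cluster, since each executor JVM resolves the file from its own classpath.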

On 20 November 2015 at 05:10, Afshartous, Nick <nafshart...@turbine.com>
wrote:

>
> < log4j.properties file only exists on the master and not the slave nodes,
> so you are probably running into
> https://issues.apache.org/jira/browse/SPARK-11105, which has already been
> fixed in the not-yet-released Spark 1.6.0. EMR will upgrade to Spark 1.6.0
> once it is released.
>
> Thanks for the info, though this is a single-node cluster so that can't be
> the cause of the error (which is in the driver log).
> --
>   Nick
> 
> From: Jonathan Kelly [jonathaka...@gmail.com]
> Sent: Thursday, November 19, 2015 6:45 PM
> To: Afshartous, Nick
> Cc: user@spark.apache.org
> Subject: Re: Configuring Log4J (Spark 1.5 on EMR 4.1)
>
> This file only exists on the master and not the slave nodes, so you are
> probably running into https://issues.apache.org/jira/browse/SPARK-11105,
> which has already been fixed in the not-yet-released Spark 1.6.0. EMR will
> upgrade to Spark 1.6.0 once it is released.
>
> ~ Jonathan
>
> On Thu, Nov 19, 2015 at 1:30 PM, Afshartous, Nick <nafshart...@turbine.com>
> wrote:
>
> Hi,
>
> On Spark 1.5 on EMR 4.1 the message below appears in stderr in the Yarn UI.
>
>   ERROR StatusLogger No log4j2 configuration file found. Using default
> configuration: logging only errors to the console.
>
> I do see that there is
>
>/usr/lib/spark/conf/log4j.properties
>
> Can someone please advise on how to set up log4j properly.
>
> Thanks,
> --
>   Nick
>
> Notice: This communication is for the intended recipient(s) only and may
> contain confidential, proprietary, legally protected or privileged
> information of Turbine, Inc. If you are not the intended recipient(s),
> please notify the sender at once and delete this communication.
> Unauthorized use of the information in this communication is strictly
> prohibited and may be unlawful. For those recipients under contract with
> Turbine, Inc., the information in this communication is subject to the
> terms and conditions of any applicable contracts or agreements.
>
> -
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
>
>
>


Configuring Log4J (Spark 1.5 on EMR 4.1)

2015-11-19 Thread Afshartous, Nick

Hi,

On Spark 1.5 on EMR 4.1 the message below appears in stderr in the Yarn UI.

  ERROR StatusLogger No log4j2 configuration file found. Using default 
configuration: logging only errors to the console.

I do see that there is

   /usr/lib/spark/conf/log4j.properties

Can someone please advise on how to set up log4j properly.

Thanks,
--
  Nick
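The "ERROR StatusLogger" line is itself a clue: StatusLogger is Log4j 2's internal logger, while /usr/lib/spark/conf/log4j.properties is a Log4j 1.x file, so some library on the classpath (not Spark's own logging) is initializing Log4j 2 and finding no log4j2 configuration. A minimal log4j2.xml sketch to quiet it (the file name and classpath location are the defaults Log4j 2 searches for; the pattern is illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
  <Appenders>
    <Console name="Console" target="SYSTEM_OUT">
      <PatternLayout pattern="%d [%t] %-5p %c - %m%n"/>
    </Console>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="Console"/>
    </Root>
  </Loggers>
</Configuration>
```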




RE: Configuring Log4J (Spark 1.5 on EMR 4.1)

2015-11-19 Thread Afshartous, Nick

< log4j.properties file only exists on the master and not the slave nodes, so 
you are probably running into 
https://issues.apache.org/jira/browse/SPARK-11105, which has already been fixed 
in the not-yet-released Spark 1.6.0. EMR will upgrade to Spark 1.6.0 once it is 
released.

Thanks for the info, though this is a single-node cluster so that can't be the 
cause of the error (which is in the driver log).
--
  Nick

From: Jonathan Kelly [jonathaka...@gmail.com]
Sent: Thursday, November 19, 2015 6:45 PM
To: Afshartous, Nick
Cc: user@spark.apache.org
Subject: Re: Configuring Log4J (Spark 1.5 on EMR 4.1)

This file only exists on the master and not the slave nodes, so you are 
probably running into https://issues.apache.org/jira/browse/SPARK-11105, which 
has already been fixed in the not-yet-released Spark 1.6.0. EMR will upgrade to 
Spark 1.6.0 once it is released.

~ Jonathan

On Thu, Nov 19, 2015 at 1:30 PM, Afshartous, Nick <nafshart...@turbine.com> wrote:

Hi,

On Spark 1.5 on EMR 4.1 the message below appears in stderr in the Yarn UI.

  ERROR StatusLogger No log4j2 configuration file found. Using default 
configuration: logging only errors to the console.

I do see that there is

   /usr/lib/spark/conf/log4j.properties

Can someone please advise on how to set up log4j properly.

Thanks,
--
  Nick



