Re: [Spark 1.5.2] Log4j Configuration for executors

2016-04-19 Thread Prashant Sharma
Maybe you can try creating it before running the app.

[Spark 1.5.2] Log4j Configuration for executors

2016-04-18 Thread Divya Gehlot
Hi, I tried configuring logs to be written to file for the Spark driver and executors. I have two separate log4j properties files, for the Spark driver and executor respectively. It is writing logs for the Spark driver, but for the executor logs I am getting the error below: java.io.FileNotFoundException:
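A common way to set this up (a sketch; the property file names, class name, and jar name below are hypothetical) is to ship both files with `--files` and point each JVM at its own configuration via `extraJavaOptions`:

```shell
# Ship both hypothetical log4j files to the cluster, then tell each JVM
# which one to load. With --files, the files land in each executor
# container's working directory, so a bare file: reference resolves there.
spark-submit \
  --files log4j-driver.properties,log4j-executor.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:log4j-driver.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:log4j-executor.properties" \
  --class com.example.MyApp myapp.jar
```

Note that in yarn-cluster mode the driver also runs inside a container, so the driver-side path may need the same working-directory-relative treatment as the executors.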

Re: Spark with log4j

2015-12-21 Thread Igor Berman
…it doesn’t solve my issue. Still prints logs on the console only. Thanks, Kalpesh Jadhav. From: Ted Yu [mailto:yuzhih...@gmail.com] Sent: Friday, December 18, 2015 9:15 PM To: Kalpesh Jadhav Cc: user

Re: Spark with log4j

2015-12-21 Thread Siva
Hi Kalpesh, Just to add, you could use "yarn logs -applicationId <application ID>" to see aggregated logs once the application is finished. Thanks, Sivakumar Bhavanari. On Mon, Dec 21, 2015 at 3:56 PM, Zhan Zhang wrote: Hi Kalpesh, If you are using Spark on YARN, it may not work.
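For example (the application ID below is a made-up placeholder; use the one printed by spark-submit or shown in the ResourceManager UI):

```shell
# Fetch the aggregated container logs for a finished YARN application.
yarn logs -applicationId application_1450000000000_0001
```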

Re: Spark with log4j

2015-12-21 Thread Zhan Zhang
Hi Kalpesh, If you are using Spark on YARN, it may not work, because you are writing logs to files other than stdout/stderr, which YARN log aggregation may not pick up. As I understand it, YARN only aggregates logs written to stdout/stderr, and the local cache will be deleted (within a configured timeframe). To check it, at
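One way around this (a sketch; the category names and levels are illustrative, following the style of Spark's conf/log4j.properties.template) is to keep the root logger on a console appender, so executor output goes to stderr where YARN log aggregation can collect it:

```properties
# Route everything to the console so YARN log aggregation picks it up.
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
# Quiet Spark's own chatter while keeping application logging at INFO.
log4j.logger.org.apache.spark=WARN
```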

RE: Spark with log4j

2015-12-21 Thread Kalpesh Jadhav
Hi Siva, Through this command it doesn’t print the log.info messages that I have written in the application. Thanks, Kalpesh Jadhav. From: Siva [mailto:sbhavan...@gmail.com] Sent: Tuesday, December 22, 2015 6:27 AM To: Zhan Zhang Cc: Kalpesh Jadhav; user@spark.apache.org Subject: Re: Spark

RE: Spark with log4j

2015-12-21 Thread Kalpesh Jadhav
Subject: Re: Spark with log4j Hi Kalpesh, If you are using Spark on YARN, it may not work, because you write logs to files other than stdout/stderr, which YARN log aggregation may not pick up. As I understand, YARN only aggregates logs in stdout/stderr, and the local cache will be deleted (within a configured timeframe

RE: Spark with log4j

2015-12-21 Thread Kalpesh Jadhav
Hi Ted, Thanks for your response, but it doesn’t solve my issue. It still prints logs on the console only. Thanks, Kalpesh Jadhav. From: Ted Yu [mailto:yuzhih...@gmail.com] Sent: Friday, December 18, 2015 9:15 PM To: Kalpesh Jadhav Cc: user Subject: Re: Spark with log4j See this thread

Re: Spark with log4j

2015-12-18 Thread Ted Yu
See this thread: http://search-hadoop.com/m/q3RTtEor1vYWbsW which mentioned: SPARK-11105 Distribute the log4j.properties files from the client to the executors. FYI On Fri, Dec 18, 2015 at 7:23 AM, Kalpesh Jadhav <kalpesh.jad...@citiustech.com> wrote: Hi all, I am new to spark, I am

Spark with log4j

2015-12-18 Thread Kalpesh Jadhav
Hi all, I am new to Spark, and I am trying to use log4j for logging in my application. But somehow the logs are not getting written to the specified file. I have created the application using Maven, and kept a log.properties file in the resources folder. The application is written in Scala. If there is any
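For reference, log4j 1.x auto-loads a file named exactly `log4j.properties` from the classpath root, so a file named `log.properties` will not be picked up by default. A minimal sketch of a file appender (the output path is illustrative):

```properties
# Minimal log4j 1.x file appender; /tmp/myapp.log is an example path.
log4j.rootLogger=INFO, file
log4j.appender.file=org.apache.log4j.FileAppender
log4j.appender.file.File=/tmp/myapp.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{ISO8601} %-5p %c - %m%n
```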

Re: Spark Streaming Log4j Inside Eclipse

2015-09-29 Thread Ashish Soni
stackoverflow.com/questions/28840438/how-to-override-sparks-log4j-properties-per-driver From: Ashish Soni Date: Monday, September 28, 2015 at 5:18 PM To: user Subject: Spark Streaming Log4j Inside Eclipse I

Re: Spark Streaming Log4j Inside Eclipse

2015-09-29 Thread Adrian Tanase
Date: Monday, September 28, 2015 at 5:18 PM To: user Subject: Spark Streaming Log4j Inside Eclipse I need to turn off the verbose logging of Spark Streaming code when I am running inside Eclipse. I tried creating a log4j.properties file and placed it inside /src/main/resources but I do not see

Re: Spark Streaming Log4j Inside Eclipse

2015-09-29 Thread Shixiong Zhu
wrote: You also need to provide it as a parameter to spark-submit: http://stackoverflow.com/questions/28840438/how-to-override-sparks-log4j-properties-per-driver From: Ashish Soni

Re: Spark Streaming Log4j Inside Eclipse

2015-09-28 Thread Ashish Soni
stackoverflow.com/questions/28840438/how-to-override-sparks-log4j-properties-per-driver From: Ashish Soni Date: Monday, September 28, 2015 at 5:18 PM To: user Subject: Spark Streaming Log4j Inside Eclipse I need to turn off the verbose logging of Spark Streaming code when I

Re: Spark Streaming Log4j Inside Eclipse

2015-09-28 Thread Adrian Tanase
You also need to provide it as a parameter to spark-submit: http://stackoverflow.com/questions/28840438/how-to-override-sparks-log4j-properties-per-driver From: Ashish Soni Date: Monday, September 28, 2015 at 5:18 PM To: user Subject: Spark Streaming Log4j Inside Eclipse I need to turn off

Re: Spark Streaming Log4j Inside Eclipse

2015-09-28 Thread Shixiong Zhu
From: Ashish Soni Date: Monday, September 28, 2015 at 5:18 PM To: user Subject: Spark Streaming Log4j Inside Eclipse I need to turn off the verbose logging of Spark Streaming code when I am running inside Eclipse. I tried creati

Spark Streaming Log4j Inside Eclipse

2015-09-28 Thread Ashish Soni
Hi All, I need to turn off the verbose logging of Spark Streaming code when I am running inside Eclipse. I tried creating a log4j.properties file and placed it inside /src/main/resources, but I do not see it taking any effect. Please help, as I am not sure what else needs to be done to change the log at
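When the properties file is not picked up in an IDE run, the levels can also be forced programmatically before the streaming context starts (a sketch using the log4j 1.x API that Spark 1.x bundles; the object name is illustrative):

```scala
import org.apache.log4j.{Level, Logger}

object QuietSpark {
  def main(args: Array[String]): Unit = {
    // Silence Spark's internal chatter before building the StreamingContext.
    Logger.getLogger("org.apache.spark").setLevel(Level.WARN)
    Logger.getLogger("org.apache.spark.streaming").setLevel(Level.WARN)
    // ... create SparkConf / StreamingContext here as usual ...
  }
}
```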

missing method in my slf4j after excluding Spark ZK log4j

2014-05-12 Thread Adrian Mocanu
into this. Spark uses log4j v1.2.17 and slf4j-log4j12 1.7.2. I use slf4j 1.7.5, logback 1.0.13, and log4j-over-slf4j 1.7.5. I think my slf4j 1.7.5 doesn't agree with what ZooKeeper expects in its log4j v1.2.17, because I get a missing-method error: java.lang.NoSuchMethodError
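A common fix for this class of binding conflict (a sketch against the versions mentioned above; adjust artifacts to your build) is to exclude the `slf4j-log4j12` binding and log4j itself from the Spark dependency in sbt, so logback is the only slf4j binding on the classpath and `log4j-over-slf4j` absorbs any direct log4j calls:

```scala
// build.sbt: keep logback as the sole slf4j binding by excluding
// Spark's (and ZooKeeper's transitive) log4j bridge and log4j itself.
libraryDependencies += ("org.apache.spark" %% "spark-core" % "0.9.1")
  .exclude("org.slf4j", "slf4j-log4j12")
  .exclude("log4j", "log4j")

libraryDependencies ++= Seq(
  "ch.qos.logback" % "logback-classic" % "1.0.13",
  "org.slf4j"      % "log4j-over-slf4j" % "1.7.5"
)
```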

Re: missing method in my slf4j after excluding Spark ZK log4j

2014-05-12 Thread Sean Owen
dependency and was told that it was gone. However I still find it part of zookeeper imports. This is fine since I exclude it myself in the sbt file, but another issue arises. I wonder if anyone else has run into this. Spark uses log4j v1.2.17 and slf4j-log4j12:1.7.2 I use slf4j 1.7.5, logback

Re: missing method in my slf4j after excluding Spark ZK log4j

2014-05-12 Thread Tathagata Das
since I exclude it myself in the sbt file, but another issue arises. I wonder if anyone else has run into this. Spark uses log4j v1.2.17 and slf4j-log4j12:1.7.2 I use slf4j 1.7.5, logback 1.0.13, and log4joverslf4j v 1.7.5 I think my slf4j 1.7.5 doesn't agree with what

Re: missing method in my slf4j after excluding Spark ZK log4j

2014-05-12 Thread Paul Brown
, in Spark 0.9 (I now use 0.9.1), about removing the log4j dependency, and was told that it was gone. However I still find it part of the zookeeper imports. This is fine since I exclude it myself in the sbt file, but another issue arises. I wonder if anyone else has run into this. Spark uses log4j