Thanks, but I do not want to log my own program info; I just do not want Spark
to write all of its log output to my console. I want Spark to write its log
output to a file that I specify.
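
Something like this in log4j.properties is what I mean (the file path is just a
placeholder):

    # Send all of Spark's log output to a file instead of the console
    log4j.rootCategory=INFO, file
    log4j.appender.file=org.apache.log4j.FileAppender
    log4j.appender.file.File=/home/hadoop/spark/logs/spark.log
    log4j.appender.file.layout=org.apache.log4j.PatternLayout
    log4j.appender.file.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

As far as I can tell, a standalone app has to hand this file to log4j when the
JVM starts, e.g. by putting it on the classpath or by passing
-Dlog4j.configuration=file:/home/hadoop/spark/conf/log4j.properties to the JVM;
setting "log4j.configuration" through SparkConf does not reach log4j.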



On Tue, Mar 11, 2014 at 11:49 AM, Robin Cjc <cjcro...@gmail.com> wrote:

> Hi lihu,
>
> you can extend the org.apache.spark.Logging trait, then use functions such as
> logInfo(). They will log according to the config in your
> log4j.properties.
>
> Best Regards,
> Chen Jingci
>
>
> On Tue, Mar 11, 2014 at 11:36 AM, lihu <lihu...@gmail.com> wrote:
>
>> Hi,
>>    I use Spark 0.9, and when I run the spark-shell, logging works properly
>> according to the log4j.properties in the SPARK_HOME/conf directory. But when I
>> run a standalone app, I do not know how to configure it.
>>
>>   I tried to set it with SparkConf, like this:
>>
>>   val conf = new SparkConf()
>>   conf.set("log4j.configuration", "/home/hadoop/spark/conf/log4j.properties")
>>
>>   but it does not work.
>>
>>   This question may be simple, but I could not find anything about it on the
>> web, and I think the answer may be helpful for many people who are not
>> familiar with Spark.
>>
>>
>
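
For reference, a rough sketch of the Logging approach Chen describes, assuming
Spark 0.9 where org.apache.spark.Logging is a public trait (the class name and
messages below are made up):

    import org.apache.spark.Logging

    // Mixing in the trait gives logInfo/logWarning/logError, which go through
    // Spark's log4j setup, i.e. whatever log4j.properties has been loaded.
    class MyJob extends Logging {
      def run(): Unit = {
        logInfo("starting job")       // placeholder message
        logWarning("nothing to do")   // placeholder message
      }
    }

That logs my own messages, though, which is not what I am after here.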


-- 
Best Wishes!

Li Hu (李浒) | Graduate Student
Institute for Interdisciplinary Information Sciences (IIIS, http://iiis.tsinghua.edu.cn/)
Tsinghua University, China

Email: lihu...@gmail.com
Tel: +86 15120081920
Homepage: http://iiis.tsinghua.edu.cn/zh/lihu/
