If you use ">", only stdout (the output of print or println) is written to the 
log file; the other log messages (INFO, WARN, ERROR) are, I believe, not 
written to the log file. But tee can do that.
The following command (with the help of the tee command) writes the output 
both to the screen (stdout) and to the file.
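Spark's own log messages (INFO, WARN, ERROR) typically go to stderr, so to capture them as well you can merge stderr into stdout before piping to tee. A minimal sketch of the idea (the echo lines stand in for a Spark job; /tmp/demo.log is an illustrative path):

```shell
# A command that writes to both streams (a stand-in for spark-submit,
# whose INFO/WARN/ERROR logging goes to stderr). Merging stderr into
# stdout with 2>&1 before the pipe lets tee capture both streams,
# showing them on screen and writing them to the file.
{ echo "result on stdout"; echo "INFO message on stderr" >&2; } 2>&1 | tee /tmp/demo.log
```

With a plain "> /tmp/demo.log" instead, only the first line would reach the file and the stderr line would stay on the terminal.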

Thanks.



> On Jul 21, 2016, at 1:48 PM, <luohui20...@sina.com> wrote:
> 
> 
> got it.
> 
> difference:
> > : all messages go to the log file, leaving no messages in STDOUT
> tee: all messages go to the log file and STDOUT at the same time.
> 
> --------------------------------
>  
> Thanks & Best regards!
> San.Luo
> 
> ----- Original Message -----
> From: Chanh Le <giaosu...@gmail.com>
> To: luohui20...@sina.com
> Cc: focus <focushe...@qq.com>, user <user@spark.apache.org>
> Subject: Re: run spark apps in linux crontab
> Date: July 21, 2016, 11:38 AM
> 
> you should use command.sh | tee file.log
> 
>> On Jul 21, 2016, at 10:36 AM, <luohui20...@sina.com> wrote:
>> 
>> 
>> thank you focus, and all.
>> this problem was solved by adding the line ". /etc/profile" to my shell script.
>> 
>> 
>> --------------------------------
>>  
>> Thanks & Best regards!
>> San.Luo
>> 
>> ----- Original Message -----
>> From: "focus" <focushe...@qq.com>
>> To: "luohui20001" <luohui20...@sina.com>, 
>> "user@spark.apache.org" <user@spark.apache.org>
>> Subject: Re: run spark apps in linux crontab
>> Date: July 20, 2016, 6:11 PM
>> 
>> Hi, I just met this problem, too! The reason is that the crontab runtime 
>> doesn't have the variables you defined, such as $SPARK_HOME.
>> I defined the $SPARK_HOME and other variables in /etc/profile like this:
>> 
>> export MYSCRIPTS=/opt/myscripts
>> export SPARK_HOME=/opt/spark
>> 
>> then, in my crontab job script daily_job.sh
>> 
>> #!/bin/sh
>> 
>> . /etc/profile
>> 
>> $SPARK_HOME/bin/spark-submit $MYSCRIPTS/fix_fh_yesterday.py
>> 
>> then, in crontab -e
>> 
>> 0 8 * * * /home/user/daily_job.sh
>> 
>> hope this helps~
>> 
>> 
>> 
>> 
>> ------------------ Original ------------------
>> From: "luohui20001" <luohui20...@sina.com>;
>> Date: July 20, 2016 (Wednesday), 6:00 PM
>> To: "user@spark.apache.org" <user@spark.apache.org>;
>> Subject: run spark apps in linux crontab
>> 
>> hi guys:
>>       I added a spark-submit job to my Linux crontab list using the means 
>> below; however, none of them works. If I change it to a normal shell script, 
>> it runs fine. I don't quite understand why. I checked the 8080 web UI of my 
>> Spark cluster: no job was submitted, and there are no messages in /home/hadoop/log.
>>       Any idea is welcome.
>> 
>> [hadoop@master ~]$ crontab -e
>> 1.
>> 22 21 * * * sh /home/hadoop/shellscripts/run4.sh > /home/hadoop/log
>> 
>> and run4.sh contains:
>> $SPARK_HOME/bin/spark-submit --class com.abc.myclass --total-executor-cores 
>> 10 --jars $SPARK_HOME/lib/MyDep.jar $SPARK_HOME/MyJar.jar  > /home/hadoop/log
>> 
>> 2.
>> 22 21 * * * $SPARK_HOME/bin/spark-submit --class com.abc.myclass 
>> --total-executor-cores 10 --jars $SPARK_HOME/lib/MyDep.jar 
>> $SPARK_HOME/MyJar.jar  > /home/hadoop/log
>> 
>> 3.
>> 22 21 * * * /usr/lib/spark/bin/spark-submit --class com.abc.myclass 
>> --total-executor-cores 10 --jars /usr/lib/spark/lib/MyDep.jar 
>> /usr/lib/spark/MyJar.jar  > /home/hadoop/log
>> 
>> 4.
>> 22 21 * * * hadoop /usr/lib/spark/bin/spark-submit --class com.abc.myclass 
>> --total-executor-cores 10 --jars /usr/lib/spark/lib/MyDep.jar 
>> /usr/lib/spark/MyJar.jar  > /home/hadoop/log
>> 
>> --------------------------------
>>  
>> Thanks & Best regards!
>> San.Luo
> 
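Putting the two fixes from this thread together (sourcing /etc/profile so cron sees $SPARK_HOME, and redirecting stderr as well as stdout), a sketch of the job script and crontab entry could look like this; the paths mirror the examples above and may need adjusting for your cluster:

```shell
#!/bin/sh
# run4.sh -- cron starts with a minimal environment, so load the
# variables (SPARK_HOME, MYSCRIPTS, ...) defined in /etc/profile first.
. /etc/profile

$SPARK_HOME/bin/spark-submit --class com.abc.myclass \
  --total-executor-cores 10 \
  --jars $SPARK_HOME/lib/MyDep.jar \
  $SPARK_HOME/MyJar.jar

# crontab -e entry: run at 21:22 daily, appending both stdout and
# stderr to the log so the INFO/WARN/ERROR messages are kept too:
# 22 21 * * * /home/hadoop/shellscripts/run4.sh >> /home/hadoop/log 2>&1
```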
