[ 
https://issues.apache.org/jira/browse/SPARK-7977?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14991399#comment-14991399
 ] 

sparkerjin commented on SPARK-7977:
-----------------------------------

Hi,
 
I ran spark-submit in cluster mode, but there was no output about the driver 
ID or the status of the driver.

The details are as follows:
1. Run spark-submit in cluster mode:
[root@jasonspark02 spark-1.5.1-bin-hadoop2.4]# bin/spark-submit --deploy-mode 
cluster --class org.apache.spark.examples.SparkPi 
./lib/spark-examples-1.5.1-hadoop2.4.0.jar
Running Spark using the REST application submission protocol.
15/11/05 02:13:43 INFO rest.RestSubmissionClient: Submitting a request to 
launch an application in spark://jasonspark02:7077.
15/11/05 02:13:43 WARN rest.RestSubmissionClient: Unable to connect to server 
spark://jasonspark02:7077.
Warning: Master endpoint spark://jasonspark02:7077 was not a REST server. 
Falling back to legacy submission gateway instead.
15/11/05 02:13:43 WARN util.NativeCodeLoader: Unable to load native-hadoop 
library for your platform... using builtin-java classes where applicable
[root@jasonspark02 spark-1.5.1-bin-hadoop2.4]#

2. Problem:
We can see that there is no information about the driver ID or the state of 
the driver. I think users need this information to find out more about the 
job.

Running spark-submit in cluster mode with --verbose also does not produce 
this info.
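As a side note, once a user does have the driver ID, the driver state can be 
queried after the fact with spark-submit --status (supported for standalone 
cluster mode). A minimal sketch of pulling the ID out of the client log, 
assuming the ClientEndpoint log line format shown in the expected output 
further below (the hostname and ID are just the ones from my run):

```shell
# Extract the driver ID from the submission log. The log line format is
# taken from the deploy.ClientEndpoint output shown later in this report.
log='15/11/05 03:08:45 INFO deploy.ClientEndpoint: Driver successfully submitted as driver-20151105030845-0000'
driver_id=$(echo "$log" | grep -o 'driver-[0-9]*-[0-9]*')
echo "$driver_id"   # driver-20151105030845-0000

# With the ID in hand, the state can then be polled (standalone cluster mode):
# bin/spark-submit --master spark://jasonspark02:7077 --status "$driver_id"
```

Of course this only works if the ID is printed in the first place, which is 
exactly what is missing here.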

3. Reason:
I looked into the code and found that --verbose (or -v) is not passed into 
the childArgs in spark-submit, so Client.scala uses Level.WARN as the default 
log level.

4. Expected:
I think users should see the driver ID and status after submitting a job, for 
example:
[root@jasonpark02 spark-1.5.1-bin-hadoop2.4]# bin/spark-submit --deploy-mode 
cluster --conf spark.ego.uname=u1 --conf spark.ego.passwd=u1  --class 
org.apache.spark.examples.SparkPi ./lib/spark-examples-1.5.1-hadoop2.4.0.jar
Running Spark using the REST application submission protocol.
15/11/05 03:08:43 INFO rest.RestSubmissionClient: Submitting a request to 
launch an application in spark://jasonspark02:7077.
15/11/05 03:08:44 WARN rest.RestSubmissionClient: Unable to connect to server 
spark://jasonspark02:7077.
Warning: Master endpoint spark://jinspark02:7077 was not a REST server. Falling 
back to legacy submission gateway instead.
15/11/05 03:08:44 WARN util.NativeCodeLoader: Unable to load native-hadoop 
library for your platform... using builtin-java classes where applicable
15/11/05 03:08:44 INFO spark.SecurityManager: Changing view acls to: root
15/11/05 03:08:44 INFO spark.SecurityManager: Changing modify acls to: root
15/11/05 03:08:44 INFO spark.SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users with view permissions: Set(root); users with 
modify permissions: Set(root)
15/11/05 03:08:45 INFO slf4j.Slf4jLogger: Slf4jLogger started
15/11/05 03:08:45 INFO util.Utils: Successfully started service 'driverClient' 
on port 47454.
15/11/05 03:08:45 INFO deploy.ClientEndpoint: Driver successfully submitted as 
driver-20151105030845-0000                             <------ I think this 
info is very important
15/11/05 03:08:45 INFO deploy.ClientEndpoint: ... waiting before polling master 
for driver state
15/11/05 03:08:50 INFO deploy.ClientEndpoint: ... polling master for driver 
state
15/11/05 03:08:50 INFO deploy.ClientEndpoint: State of 
driver-20151105030845-0000

What do you think? If you have any ideas, please let me know. Thanks.

Jie Hua




> Disallow println
> ----------------
>
>                 Key: SPARK-7977
>                 URL: https://issues.apache.org/jira/browse/SPARK-7977
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Project Infra
>            Reporter: Reynold Xin
>            Assignee: Jon Alter
>              Labels: starter
>             Fix For: 1.5.0
>
>
> Very often we see pull requests that added println from debugging, but the 
> author forgot to remove it before code review.
> We can use the regex checker to disallow println. For legitimate use of 
> println, we can then disable the rule where they are used.
> Add to scalastyle-config.xml file:
> {code}
>   <check customId="println" level="error" 
> class="org.scalastyle.scalariform.TokenChecker" enabled="true">
>     <parameters><parameter name="regex">^println$</parameter></parameters>
>     <customMessage><![CDATA[Are you sure you want to println? If yes, wrap 
> the code block with 
>       // scalastyle:off println
>       println(...)
>       // scalastyle:on println]]></customMessage>
>   </check>
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
