Github user skonto commented on the issue:

    https://github.com/apache/spark/pull/18630
  
    > spark-2.3.0-SNAPSHOT-bin-18630/bin$ ./spark-shell --verbose --master spark://ip-10-10-1-79:7077
    Using properties file: null
    Parsed arguments:
      master                  spark://ip-10-10-1-79:7077
      deployMode              null
      executorMemory          null
      executorCores           null
      totalExecutorCores      null
      propertiesFile          null
      driverMemory            null
      driverCores             null
      driverExtraClassPath    null
      driverExtraLibraryPath  null
      driverExtraJavaOptions  null
      supervise               false
      queue                   null
      numExecutors            null
      files                   null
      pyFiles                 null
      archives                null
      mainClass               org.apache.spark.repl.Main
      primaryResource         spark-shell
      name                    Spark shell
      childArgs               []
      jars                    null
      packages                null
      packagesExclusions      null
      repositories            null
      verbose                 true
    
    Spark properties used, including those specified through
     --conf and those from the properties file null:
    Main class:
    org.apache.spark.repl.Main
    Arguments:
    
    System properties:
    (SPARK_SUBMIT,true)
    (spark.app.name,Spark shell)
    (spark.jars,)
    (spark.submit.deployMode,client)
    (spark.master,spark://ip-10-10-1-79:7077)
    Classpath elements:
    
    
    
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
    17/08/09 23:28:03 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Spark context Web UI available at http://10.10.1.79:4040
    Spark context available as 'sc' (master = spark://ip-10-10-1-79:7077, app id = app-20170809232804-0003).
    Spark session available as 'spark'.
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 2.3.0-SNAPSHOT
          /_/
             
    Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_131)
    Type in expressions to have them evaluated.
    Type :help for more information.
    
    scala> 


