Github user felixcheung commented on the pull request:

    https://github.com/apache/spark/pull/12648#issuecomment-213856851
  
    # spark-shell
    
    ```
    
    # ./bin/spark-shell
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel).
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 2.0.0-SNAPSHOT
          /_/
    
    Using Scala version 2.10.5 (OpenJDK 64-Bit Server VM, Java 1.8.0_45-internal)
    Type in expressions to have them evaluated.
    Type :help for more information.
    16/04/22 23:56:50 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Spark context available as sc (master = local[*], app id = local-1461369413316).
    SQL context available as sqlContext.
    
    scala>
    ```
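
    The `sc.setLogLevel(newLevel)` hint in the banner delegates to log4j's root-level threshold. As a rough analogy only (a sketch using Python's standard `logging` module, not Spark code), the threshold semantics behind the BEFORE/AFTER difference look like this:

    ```python
    import logging
    from io import StringIO

    # Build a logger writing to an in-memory buffer so we can inspect output.
    buf = StringIO()
    handler = logging.StreamHandler(buf)
    handler.setFormatter(logging.Formatter("%(levelname)s %(message)s"))
    log = logging.getLogger("spark-analogy")
    log.addHandler(handler)
    log.propagate = False  # keep output confined to our handler

    # Default threshold WARN: INFO startup chatter is suppressed.
    log.setLevel(logging.WARN)
    log.info("Registering MapOutputTracker")             # dropped
    log.warning("Unable to load native-hadoop library")  # kept

    # Equivalent of sc.setLogLevel("INFO"): INFO messages reappear.
    log.setLevel(logging.INFO)
    log.info("Registering BlockManagerMaster")           # kept

    print(buf.getvalue())
    ```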
    
    # pyspark BEFORE
    
    ```
    # ./bin/pyspark
    Python 2.7.9 (default, Apr  2 2015, 15:33:21)
    [GCC 4.9.2] on linux2
    Type "help", "copyright", "credits" or "license" for more information.
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    16/04/22 23:57:27 INFO SparkContext: Running Spark version 2.0.0-SNAPSHOT
    16/04/22 23:57:28 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    16/04/22 23:57:28 INFO SecurityManager: Changing view acls to: root
    16/04/22 23:57:28 INFO SecurityManager: Changing modify acls to: root
    16/04/22 23:57:28 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
    16/04/22 23:57:29 INFO Utils: Successfully started service 'sparkDriver' on port 38291.
    16/04/22 23:57:30 INFO Slf4jLogger: Slf4jLogger started
    16/04/22 23:57:30 INFO Remoting: Starting remoting
    16/04/22 23:57:30 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.128.1:44013]
    16/04/22 23:57:30 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 44013.
    16/04/22 23:57:30 INFO SparkEnv: Registering MapOutputTracker
    16/04/22 23:57:30 INFO SparkEnv: Registering BlockManagerMaster
    16/04/22 23:57:30 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-d6702ea8-19ef-4c5f-b97c-1044860d5791
    16/04/22 23:57:30 INFO MemoryStore: MemoryStore started with capacity 511.1 MB
    16/04/22 23:57:30 INFO SparkEnv: Registering OutputCommitCoordinator
    16/04/22 23:57:31 INFO Utils: Successfully started service 'SparkUI' on port 4040.
    16/04/22 23:57:31 INFO SparkUI: Started SparkUI at http://192.168.128.1:4040
    16/04/22 23:57:31 INFO Executor: Starting executor ID driver on host localhost
    16/04/22 23:57:31 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 43464.
    16/04/22 23:57:31 INFO NettyBlockTransferService: Server created on 43464
    16/04/22 23:57:31 INFO BlockManagerMaster: Trying to register BlockManager
    16/04/22 23:57:31 INFO BlockManagerMasterEndpoint: Registering block manager localhost:43464 with 511.1 MB RAM, BlockManagerId(driver, localhost, 43464)
    16/04/22 23:57:31 INFO BlockManagerMaster: Registered BlockManager
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /__ / .__/\_,_/_/ /_/\_\   version 2.0.0-SNAPSHOT
          /_/
    
    Using Python version 2.7.9 (default, Apr  2 2015 15:33:21)
    SparkContext available as sc, SQLContext available as sqlContext.
    >>>
    ```
    
    # pyspark AFTER
    ```
    # ./bin/pyspark
    Python 2.7.9 (default, Apr  2 2015, 15:33:21)
    [GCC 4.9.2] on linux2
    Type "help", "copyright", "credits" or "license" for more information.
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel).
    16/04/23 04:17:40 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /__ / .__/\_,_/_/ /_/\_\   version 2.0.0-SNAPSHOT
          /_/
    
    Using Python version 2.7.9 (default, Apr  2 2015 15:33:21)
    SparkContext available as sc, SQLContext available as sqlContext.
    >>>
    ```
    
    # sparkR BEFORE
    ```
    # ./bin/sparkR
    R version 3.2.2 (2015-08-14) -- "Fire Safety"
    Copyright (C) 2015 The R Foundation for Statistical Computing
    Platform: x86_64-pc-linux-gnu (64-bit)
    
    R is free software and comes with ABSOLUTELY NO WARRANTY.
    You are welcome to redistribute it under certain conditions.
    Type 'license()' or 'licence()' for distribution details.
    
      Natural language support but running in an English locale
    
    R is a collaborative project with many contributors.
    Type 'contributors()' for more information and
    'citation()' on how to cite R or R packages in publications.
    
    Type 'demo()' for some demos, 'help()' for on-line help, or
    'help.start()' for an HTML browser interface to help.
    Type 'q()' to quit R.
    
    Launching java with spark-submit command /opt/spark-2.0.0-bin-hadoop2.6/bin/spark-submit   "sparkr-shell" /tmp/Rtmp6xEKLl/backend_port27e158fefeca
    log4j:WARN No appenders could be found for logger (io.netty.util.internal.logging.InternalLoggerFactory).
    log4j:WARN Please initialize the log4j system properly.
    log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    16/04/22 23:58:06 INFO SparkContext: Running Spark version 2.0.0-SNAPSHOT
    16/04/22 23:58:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    16/04/22 23:58:07 INFO SecurityManager: Changing view acls to: root
    16/04/22 23:58:07 INFO SecurityManager: Changing modify acls to: root
    16/04/22 23:58:07 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
    16/04/22 23:58:08 INFO Utils: Successfully started service 'sparkDriver' on port 42734.
    16/04/22 23:58:08 INFO Slf4jLogger: Slf4jLogger started
    16/04/22 23:58:09 INFO Remoting: Starting remoting
    16/04/22 23:58:09 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@192.168.128.1:59726]
    16/04/22 23:58:09 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 59726.
    16/04/22 23:58:09 INFO SparkEnv: Registering MapOutputTracker
    16/04/22 23:58:09 INFO SparkEnv: Registering BlockManagerMaster
    16/04/22 23:58:09 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-09f81543-fb8d-4a7d-bf84-3a1f4473fecc
    16/04/22 23:58:09 INFO MemoryStore: MemoryStore started with capacity 511.1 MB
    16/04/22 23:58:09 INFO SparkEnv: Registering OutputCommitCoordinator
    16/04/22 23:58:10 INFO Utils: Successfully started service 'SparkUI' on port 4040.
    16/04/22 23:58:10 INFO SparkUI: Started SparkUI at http://192.168.128.1:4040
    16/04/22 23:58:10 INFO Executor: Starting executor ID driver on host localhost
    16/04/22 23:58:10 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 40490.
    16/04/22 23:58:10 INFO NettyBlockTransferService: Server created on 40490
    16/04/22 23:58:10 INFO BlockManagerMaster: Trying to register BlockManager
    16/04/22 23:58:10 INFO BlockManagerMasterEndpoint: Registering block manager localhost:40490 with 511.1 MB RAM, BlockManagerId(driver, localhost, 40490)
    16/04/22 23:58:10 INFO BlockManagerMaster: Registered BlockManager
    
     Welcome to
        ____              __
       / __/__  ___ _____/ /__
      _\ \/ _ \/ _ `/ __/  '_/
     /___/ .__/\_,_/_/ /_/\_\   version  2.0.0-SNAPSHOT
        /_/
    
    
     Spark context is available as sc, SQL context is available as sqlContext
    >
    ```
    
    # sparkR AFTER
    ```
    # ./bin/sparkR
    
    R version 3.2.2 (2015-08-14) -- "Fire Safety"
    Copyright (C) 2015 The R Foundation for Statistical Computing
    Platform: x86_64-pc-linux-gnu (64-bit)
    
    R is free software and comes with ABSOLUTELY NO WARRANTY.
    You are welcome to redistribute it under certain conditions.
    Type 'license()' or 'licence()' for distribution details.
    
      Natural language support but running in an English locale
    
    R is a collaborative project with many contributors.
    Type 'contributors()' for more information and
    'citation()' on how to cite R or R packages in publications.
    
    Type 'demo()' for some demos, 'help()' for on-line help, or
    'help.start()' for an HTML browser interface to help.
    Type 'q()' to quit R.
    
    Launching java with spark-submit command /opt/spark-2.0.0-bin-hadoop2.6/bin/spark-submit   "sparkr-shell" /tmp/RtmpeIj9VM/backend_port7f857c59129
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel).
    16/04/23 04:18:50 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    
     Welcome to
        ____              __
       / __/__  ___ _____/ /__
      _\ \/ _ \/ _ `/ __/  '_/
     /___/ .__/\_,_/_/ /_/\_\   version  2.0.0-SNAPSHOT
        /_/
    
    
     Spark context is available as sc, SQL context is available as sqlContext
    >
    ```
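
    For anyone who wants a default other than WARN without calling `sc.setLogLevel`, the defaults shown above come from `org/apache/spark/log4j-defaults.properties` bundled in the assembly; the usual override is to copy `conf/log4j.properties.template` to `conf/log4j.properties` and edit the root level. A sketch (check your distribution's template for the exact appender setup):

    ```
    # conf/log4j.properties -- copied from conf/log4j.properties.template
    # Log to the console; change WARN to INFO or DEBUG for more output.
    log4j.rootCategory=WARN, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
    ```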

