[ 
https://issues.apache.org/jira/browse/SPARK-45200?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Jitin Dominic updated SPARK-45200:
----------------------------------
    Description: 
I've been using Spark Core 3.2.2 and am upgrading to 3.4.0.

When I execute my Java code with 3.4.0, it generates an extra set of logs; I don't face this issue with 3.2.2.

I noticed that the logs say _Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties._

Is this a bug, or is there a configuration to disable the use of the default log4j profile? I didn't see anything in the documentation.

{code:java}
Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
23/09/18 20:05:08 INFO SparkContext: Running Spark version 3.4.0
23/09/18 20:05:08 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
23/09/18 20:05:08 INFO ResourceUtils: ==============================================================
23/09/18 20:05:08 INFO ResourceUtils: No custom resources configured for spark.driver.
23/09/18 20:05:08 INFO ResourceUtils: ==============================================================
23/09/18 20:05:08 INFO SparkContext: Submitted application: XYZ
23/09/18 20:05:08 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
23/09/18 20:05:08 INFO ResourceProfile: Limiting resource is cpu
23/09/18 20:05:08 INFO ResourceProfileManager: Added ResourceProfile id: 0
23/09/18 20:05:08 INFO SecurityManager: Changing view acls to: jd
23/09/18 20:05:08 INFO SecurityManager: Changing modify acls to: jd
23/09/18 20:05:08 INFO SecurityManager: Changing view acls groups to: 
23/09/18 20:05:08 INFO SecurityManager: Changing modify acls groups to: 
23/09/18 20:05:08 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: jd; groups with view permissions: EMPTY; users with modify permissions: jd; groups with modify permissions: EMPTY
23/09/18 20:05:08 INFO Utils: Successfully started service 'sparkDriver' on port 39155.
23/09/18 20:05:08 INFO SparkEnv: Registering MapOutputTracker
23/09/18 20:05:08 INFO SparkEnv: Registering BlockManagerMaster
23/09/18 20:05:08 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
23/09/18 20:05:08 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
23/09/18 20:05:08 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
23/09/18 20:05:08 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-226d599c-1511-4fae-b0e7-aae81b684012
23/09/18 20:05:08 INFO MemoryStore: MemoryStore started with capacity 2004.6 MiB
23/09/18 20:05:08 INFO SparkEnv: Registering OutputCommitCoordinator
23/09/18 20:05:08 INFO JettyUtils: Start Jetty 0.0.0.0:4040 for SparkUI
23/09/18 20:05:09 INFO Utils: Successfully started service 'SparkUI' on port 4040.
23/09/18 20:05:09 INFO Executor: Starting executor ID driver on host jd
23/09/18 20:05:09 INFO Executor: Starting executor with user classpath (userClassPathFirst = false): ''
23/09/18 20:05:09 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 32819.
23/09/18 20:05:09 INFO NettyBlockTransferService: Server created on jd:32819
23/09/18 20:05:09 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
23/09/18 20:05:09 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, jd, 32819, None)
23/09/18 20:05:09 INFO BlockManagerMasterEndpoint: Registering block manager jd:32819 with 2004.6 MiB RAM, BlockManagerId(driver, jd, 32819, None)
23/09/18 20:05:09 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, jd, 32819, None)
23/09/18 20:05:09 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, jd, 32819, None)
{code}
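For context, Spark switched its bundled logging from Log4j 1 to Log4j2 in the 3.3 line, which is why the default profile name now reads log4j2-defaults.properties. A custom Log4j2 configuration can usually be supplied by pointing the standard Log4j2 {{log4j.configurationFile}} system property at your own file before the SparkContext is created; a minimal sketch (the class name and file path are hypothetical):

{code:java}
public class CustomLog4jDemo {
    public static void main(String[] args) {
        // Point Log4j2 at a custom configuration before any Spark class
        // initializes logging; the path below is a placeholder.
        System.setProperty("log4j.configurationFile", "file:/etc/myapp/log4j2.properties");

        // ... create the SparkContext / SparkSession afterwards ...
        System.out.println(System.getProperty("log4j.configurationFile"));
    }
}
{code}

With spark-submit, the same property can be passed via {{--driver-java-options "-Dlog4j.configurationFile=file:/etc/myapp/log4j2.properties"}} or the {{spark.driver.extraJavaOptions}} configuration.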

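If the goal is simply to silence the extra INFO output, a minimal Log4j2 properties file raising the root level to WARN could look like the following sketch (not taken from the Spark distribution):

{code}
# Minimal Log4j2 configuration (sketch): console appender, root logger at WARN
rootLogger.level = warn
rootLogger.appenderRef.stdout.ref = console

appender.console.type = Console
appender.console.name = console
appender.console.target = SYSTEM_OUT
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
{code}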


> Spark 3.4.0 always using default log4j profile
> ----------------------------------------------
>
>                 Key: SPARK-45200
>                 URL: https://issues.apache.org/jira/browse/SPARK-45200
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.4.0
>            Reporter: Jitin Dominic
>            Priority: Major
>



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
