Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22504#discussion_r223853075
  
    --- Diff: core/src/test/scala/org/apache/spark/util/logging/DriverLoggerSuite.scala ---
    @@ -0,0 +1,109 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.util.logging
    +
    +import java.io.{BufferedInputStream, FileInputStream}
    +
    +import org.apache.commons.io.FileUtils
    +
    +import org.apache.spark._
    +import org.apache.spark.{SparkContext, SparkFunSuite}
    +import org.apache.spark.internal.config._
    +import org.apache.spark.launcher.SparkLauncher
    +import org.apache.spark.network.util.JavaUtils
    +import org.apache.spark.util.Utils
    +
    +class DriverLoggerSuite extends SparkFunSuite {
    +
    +  test("driver logs are persisted") {
    +    val sc = getSparkContext()
    +    // Wait for application to start
    +    Thread.sleep(1000)
    +
    +    val app_id = sc.applicationId
    +    // Run a simple spark application
    +    sc.parallelize(1 to 1000).count()
    +
    +    // Assert driver log file exists
    +    val rootDir = Utils.getLocalDir(sc.getConf)
    +    val driverLogsDir = FileUtils.getFile(rootDir, "driver_logs")
    +    assert(driverLogsDir.exists())
    +    val files = driverLogsDir.listFiles()
    +    assert(files.length === 1)
    +    assert(files(0).getName.equals("driver.log"))
    +
    +    sc.stop()
    +    // On application end, file is moved to Hdfs (which is a local dir for this test)
    +    assert(!driverLogsDir.exists())
    +    val hdfsDir = FileUtils.getFile("/tmp/hdfs_logs/", app_id)
    +    assert(hdfsDir.exists())
    +    val hdfsFiles = hdfsDir.listFiles()
    +    assert(hdfsFiles.length > 0)
    +    JavaUtils.deleteRecursively(hdfsDir)
    +    assert(!hdfsDir.exists())
    +  }
    +
    +  test("driver logs are synced to hdfs continuously") {
    +    val sc = getSparkContext()
    +    // Wait for application to start
    +    Thread.sleep(1000)
    +
    +    val app_id = sc.applicationId
    +    // Run a simple spark application
    +    sc.parallelize(1 to 1000).count()
    +
    +    // Assert driver log file exists
    +    val rootDir = Utils.getLocalDir(sc.getConf)
    +    val driverLogsDir = FileUtils.getFile(rootDir, "driver_logs")
    +    assert(driverLogsDir.exists())
    +    val files = driverLogsDir.listFiles()
    +    assert(files.length === 1)
    +    assert(files(0).getName.equals("driver.log"))
    +    for (i <- 1 to 1000) {
    +      logInfo("Log enough data to log file so that it can be flushed")
    +    }
    +
    +    // After 5 secs, file contents are synced to Hdfs (which is a local dir for this test)
    +    Thread.sleep(6000)
    --- End diff ---
    
    I'm not a fan of code that relies on exact timing between threads for things to work.
    
    This would be better if you drove the `HdfsAsyncWriter` class manually from the test, instead of indirectly through a `SparkContext`. Then you can also control the log file flushing explicitly instead of the hack you have above.


---
