Github user merlintang commented on the issue:

    https://github.com/apache/spark/pull/15819
  
    Actually, I do not have a unit test, but the code listed below (the same
    as we posted in the JIRA) reproduces this bug.
    
    The code to reproduce it is:

    // run in spark-shell, where sc is the SparkContext
    val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
    sqlContext.sql("CREATE TABLE IF NOT EXISTS T1 (key INT, value STRING)")
    sqlContext.sql("LOAD DATA LOCAL INPATH '../examples/src/main/resources/kv1.txt' INTO TABLE T1")
    sqlContext.sql("CREATE TABLE IF NOT EXISTS T2 (key INT, value STRING)")
    // the insert into T2 is what exercises the staging-directory code path
    val sparktestdf = sqlContext.table("T1")
    val dfw = sparktestdf.write
    dfw.insertInto("T2")
    val sparktestcopypydfdf = sqlContext.sql("SELECT * FROM T2")
    sparktestcopypydfdf.show()
    
    Our customer and we have also manually reproduced this bug on Spark
    1.6.x and 1.5.x.
    
    Regarding a unit test: because we do not know how to find the Hive
    directory for the related table inside a test case, we cannot verify the
    computed staging directory at the end. A sketch of one possible way to
    find it follows.
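
    One possible way to locate the table directory in a test is to scan the
    output of DESCRIBE FORMATTED. This is only a rough sketch, not code from
    this PR; the column layout of the DESCRIBE output varies across
    versions, so the string handling here is an assumption:

    // Collect the DESCRIBE FORMATTED output and scan it for the row that
    // carries the table's location. Row.mkString flattens each row to a
    // plain string; matching on "Location" follows Hive's output format.
    val rows = sqlContext.sql("DESCRIBE FORMATTED T2").collect()
    val location = rows.map(_.mkString(" ")).find(_.contains("Location"))
    println(location)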
    
    The solution is to reuse three functions from Spark 2.0.2 to create the
    staging directory; with those in place, this bug is fixed. A simplified
    sketch of the idea follows.
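
    Roughly, the staging directory should be created on the same filesystem
    as the table's own path. This is a simplified sketch under that
    assumption; the function name and signature are placeholders, not the
    exact functions ported from 2.0.2:

    import java.io.IOException
    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.Path

    // Create the staging directory under the target table path, so it
    // lives on the same filesystem and commit-time renames stay valid.
    def getStagingDir(inputPath: Path, conf: Configuration,
        stagingDirName: String): Path = {
      val fs = inputPath.getFileSystem(conf)
      val stagingDir = new Path(inputPath, stagingDirName)
      if (!fs.mkdirs(stagingDir)) {
        throw new IOException("Cannot create staging directory: " + stagingDir)
      }
      fs.deleteOnExit(stagingDir) // mark for deletion when the FileSystem is closed
      stagingDir
    }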
    
    
    On Wed, Nov 9, 2016 at 10:26 PM, Wenchen Fan <notificati...@github.com>
    wrote:
    
    > do you have a unit test to reproduce this bug?


