Github user vanzin commented on the issue:

    https://github.com/apache/spark/pull/22608
  
    I haven't reviewed this yet (and I do have concerns about using an external 
image in the long term); I just wanted to offer some ideas for the future.
    
    Livy's integration tests use some helper code that runs MiniDFSCluster 
and MiniYARNCluster:
    
https://github.com/apache/incubator-livy/blob/master/integration-test/src/main/scala/org/apache/livy/test/framework/MiniCluster.scala
    
    (It doesn't use MiniKdc yet.)
    
    It should not be hard to build a Docker image containing all the needed Hadoop 
dependencies using maven-dependency-plugin:copy-dependencies plus a shell script, 
and to write some simple code like Livy's to bring up the servers.
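
    For reference, the copy-dependencies step could look roughly like this in 
the test module's pom.xml. This is only a sketch: the execution id, phase, 
output directory, and group-id filter are assumptions, not something from 
this PR.

```xml
<!-- Sketch: copy the module's Hadoop dependencies into target/hadoop-deps,
     where a Dockerfile or shell script can pick them up when building the
     image. Phase, output path, and the org.apache.hadoop filter are
     illustrative assumptions. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-hadoop-deps</id>
      <phase>package</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <outputDirectory>${project.build.directory}/hadoop-deps</outputDirectory>
        <includeGroupIds>org.apache.hadoop</includeGroupIds>
      </configuration>
    </execution>
  </executions>
</plugin>
```

    A shell script in the image build would then just add the contents of 
target/hadoop-deps to the classpath before launching the mini clusters.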

