It's still dying. It's back to this error (it was spark-2.2.0 before):

java.io.IOException: Cannot run program "./bin/spark-submit" (in directory "/tmp/test-spark/spark-2.1.2"): error=2, No such file or directory

So a mirror is missing that Spark version... I don't understand why nobody else hits these errors, yet I get them every time without fail.
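
For reference, here is a minimal sketch of the kind of check I run by hand (Python 3, standard library only; the mirror/archive URL layout and the -bin-hadoop2.7 package name are my assumptions, not necessarily what the suite requests) to see whether a given base URL actually serves the Spark versions the suite tries to download into /tmp/test-spark:

import urllib.request

# Versions the suite has failed on so far in these builds.
VERSIONS = ["2.1.2", "2.2.1"]

# The Apache archive keeps every release; a mirror may have dropped older ones.
# Add the mirror base URL the suite selected here to compare (hypothetical entry).
BASES = ["https://archive.apache.org/dist/spark"]

def release_available(base, version):
    # Assumed artifact name; adjust if the suite downloads a different package.
    url = "{0}/spark-{1}/spark-{1}-bin-hadoop2.7.tgz".format(base, version)
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status == 200
    except Exception:
        return False

for base in BASES:
    for v in VERSIONS:
        print(base, v, "available" if release_available(base, v) else "MISSING")

If a version shows up as MISSING on the selected mirror but available on the archive, that would explain why /tmp/test-spark/spark-2.1.2/bin/spark-submit never exists when the suite tries to run it.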

Petar


On 6/19/2018 at 2:35 PM, Sean Owen wrote:
Those still appear to be env problems. I don't know why it is so persistent. Does it all pass locally? Retrigger tests again and see what happens.
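For example, something along these lines would re-run the same PySpark test entry point locally (a rough sketch; the checkout path is an assumption, and --parallelism=4 is simply what the Jenkins job passes):

import os
import subprocess

# Assumed location of your Spark checkout; adjust to wherever the PR branch lives.
spark_home = os.path.expanduser("~/spark")

# Same entry point the Jenkins log shows: python/run-tests --parallelism=4
cmd = [os.path.join(spark_home, "python", "run-tests"), "--parallelism=4"]
proc = subprocess.run(cmd, cwd=spark_home)
print("run-tests exited with return code", proc.returncode)

If that passes locally with the same Python executables, that points back at the Jenkins environment.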

On Tue, Jun 19, 2018, 2:53 AM Petar Zecevic <petar.zece...@gmail.com> wrote:


    Thanks, but unfortunately, it died again. Now at pyspark tests:


    ========================================================================
    Running PySpark tests
    ========================================================================
    Running PySpark tests. Output is in /home/jenkins/workspace/SparkPullRequestBuilder@2/python/unit-tests.log
    Will test against the following Python executables: ['python2.7', 'python3.4', 'pypy']
    Will test the following Python modules: ['pyspark-core', 'pyspark-ml', 'pyspark-mllib', 'pyspark-sql', 'pyspark-streaming']
    Will skip PyArrow related features against Python executable 'python2.7' in 'pyspark-sql' module. PyArrow >= 0.8.0 is required; however, PyArrow was not found.
    Will skip Pandas related features against Python executable 'python2.7' in 'pyspark-sql' module. Pandas >= 0.19.2 is required; however, Pandas 0.16.0 was found.
    Will test PyArrow related features against Python executable 'python3.4' in 'pyspark-sql' module.
    Will test Pandas related features against Python executable 'python3.4' in 'pyspark-sql' module.
    Will skip PyArrow related features against Python executable 'pypy' in 'pyspark-sql' module. PyArrow >= 0.8.0 is required; however, PyArrow was not found.
    Will skip Pandas related features against Python executable 'pypy' in 'pyspark-sql' module. Pandas >= 0.19.2 is required; however, Pandas was not found.
    Starting test(python2.7): pyspark.mllib.tests
    Starting test(pypy): pyspark.sql.tests
    Starting test(pypy): pyspark.streaming.tests
    Starting test(pypy): pyspark.tests
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
    ... [Spark stage progress bars and test progress dots omitted] ...
    ......
    cc: no input files
    cc: no input files

    cc: no input files
    cc: no input files
    Exception in thread Thread-1:
    Traceback (most recent call last):
       File "/usr/lib64/pypy-2.5.1/lib-python/2.7/threading.py", line 806, in __bootstrap_inner
         self.run()
       File "/usr/lib64/pypy-2.5.1/lib-python/2.7/threading.py", line 759, in run
         self.__target(*self.__args, **self.__kwargs)
       File "/home/jenkins/workspace/SparkPullRequestBuilder@2/python/pyspark/rdd.py", line 771, in pipe_objs
         out.write(s.encode('utf-8'))
    IOError: [Errno 32] Broken pipe: '<fdopen>'

    cc: no input files
    cc: no input files
    cc: no input files
    Exception in thread Thread-1:
    Traceback (most recent call last):
       File "/usr/lib64/pypy-2.5.1/lib-python/2.7/threading.py", line 806, in __bootstrap_inner
         self.run()
       File "/usr/lib64/pypy-2.5.1/lib-python/2.7/threading.py", line 759, in run
         self.__target(*self.__args, **self.__kwargs)
       File "/home/jenkins/workspace/SparkPullRequestBuilder@2/python/pyspark/rdd.py", line 771, in pipe_objs
         out.write(s.encode('utf-8'))
    IOError: [Errno 32] Broken pipe: '<fdopen>'

    Exception in thread Thread-1:
    Traceback (most recent call last):
       File "/usr/lib64/pypy-2.5.1/lib-python/2.7/threading.py", line 806, in __bootstrap_inner
         self.run()
       File "/usr/lib64/pypy-2.5.1/lib-python/2.7/threading.py", line 759, in run
         self.__target(*self.__args, **self.__kwargs)
       File "/home/jenkins/workspace/SparkPullRequestBuilder@2/python/pyspark/rdd.py", line 771, in pipe_objs
         out.write(s.encode('utf-8'))
    IOError: [Errno 32] Broken pipe: '<fdopen>'

    cc: no input files
    Exception in thread Thread-1:
    Traceback (most recent call last):
       File "/usr/lib64/pypy-2.5.1/lib-python/2.7/threading.py", line 806, in __bootstrap_inner
         self.run()
       File "/usr/lib64/pypy-2.5.1/lib-python/2.7/threading.py", line 759, in run
         self.__target(*self.__args, **self.__kwargs)
       File "/home/jenkins/workspace/SparkPullRequestBuilder@2/python/pyspark/rdd.py", line 771, in pipe_objs
         out.write(s.encode('utf-8'))
    IOError: [Errno 32] Broken pipe: '<fdopen>'

    .......................s..Traceback (most recent call last):
       File "/home/jenkins/workspace/SparkPullRequestBuilder@2/python/pyspark/serializers.py", line 574, in dumps
         return cloudpickle.dumps(obj, 2)
       File "/home/jenkins/workspace/SparkPullRequestBuilder@2/python/pyspark/cloudpickle.py", line 858, in dumps
         cp.dump(obj)
       File "/home/jenkins/workspace/SparkPullRequestBuilder@2/python/pyspark/cloudpickle.py", line 260, in dump
         return Pickler.dump(self, obj)
       File "/usr/lib64/pypy-2.5.1/lib-python/2.7/pickle.py", line 224, in dump
         self.save(obj)
       File "/usr/lib64/pypy-2.5.1/lib-python/2.7/pickle.py", line 306, in save
         rv = reduce(self.proto)
       File "/home/jenkins/workspace/SparkPullRequestBuilder@2/python/pyspark/tests.py", line 275, in __reduce__
         raise Exception("not picklable")
    Exception: not picklable
    .......
    [Stage 0:>                                                          (0 + 4) / 4]
    ....file:/tmp/tmpgO7AIY added as a remote repository with the name: repo-1
    Ivy Default Cache set to: /home/jenkins/.ivy2/cache
    The jars for the packages stored in: /home/jenkins/.ivy2/jars
    :: loading settings :: url = jar:file:/home/jenkins/workspace/SparkPullRequestBuilder@2/assembly/target/scala-2.11/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
    a#mylib added as a dependency
    :: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
        confs: [default]
    :: resolution report :: resolve 1990ms :: artifacts dl 0ms
        :: modules in use:
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   1   |   0   |   0   |   0   ||   0   |   0   |
        ---------------------------------------------------------------------

    :: problems summary ::
    :::: WARNINGS
        io problem while parsing ivy file: file:/tmp/tmpgO7AIY/a/mylib/0.1/mylib-0.1.pom (java.io.FileNotFoundException: /home/jenkins/.ivy2/cache/a/mylib/ivy-0.1.xml.original (No such file or directory))

                module not found: a#mylib;0.1

        ==== local-m2-cache: tried

        
          file:/home/jenkins/workspace/SparkPullRequestBuilder@2/dummy/.m2/repository/a/mylib/0.1/mylib-0.1.pom

          -- artifact a#mylib;0.1!mylib.jar:

          file:/home/jenkins/workspace/SparkPullRequestBuilder@2/dummy/.m2/repository/a/mylib/0.1/mylib-0.1.jar

        ==== local-ivy-cache: tried

          /home/jenkins/.ivy2/local/a/mylib/0.1/ivys/ivy.xml

          -- artifact a#mylib;0.1!mylib.jar:

          /home/jenkins/.ivy2/local/a/mylib/0.1/jars/mylib.jar

        ==== central: tried

        https://repo1.maven.org/maven2/a/mylib/0.1/mylib-0.1.pom

          -- artifact a#mylib;0.1!mylib.jar:

        https://repo1.maven.org/maven2/a/mylib/0.1/mylib-0.1.jar

        ==== spark-packages: tried

        http://dl.bintray.com/spark-packages/maven/a/mylib/0.1/mylib-0.1.pom

          -- artifact a#mylib;0.1!mylib.jar:

        http://dl.bintray.com/spark-packages/maven/a/mylib/0.1/mylib-0.1.jar

        ==== repo-1: tried

        file:/tmp/tmpgO7AIY/a/mylib/0.1/mylib-0.1.pom

                ::::::::::::::::::::::::::::::::::::::::::::::

                ::          UNRESOLVED DEPENDENCIES         ::

                ::::::::::::::::::::::::::::::::::::::::::::::

                :: a#mylib;0.1: not found

                ::::::::::::::::::::::::::::::::::::::::::::::



    :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
    Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: a#mylib;0.1: not found]
        at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1268)
        at org.apache.spark.deploy.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:49)
        at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:348)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:170)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    Ffile:/tmp/tmpwtN2z_ added as a remote repository with the name: repo-1
    Ivy Default Cache set to: /home/jenkins/.ivy2/cache
    The jars for the packages stored in: /home/jenkins/.ivy2/jars
    :: loading settings :: url = jar:file:/home/jenkins/workspace/SparkPullRequestBuilder@2/assembly/target/scala-2.11/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
    a#mylib added as a dependency
    :: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
        confs: [default]
        found a#mylib;0.1 in repo-1
    :: resolution report :: resolve 1378ms :: artifacts dl 4ms
        :: modules in use:
        a#mylib;0.1 from repo-1 in [default]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   1   |   1   |   1   |   0   ||   1   |   0   |
        ---------------------------------------------------------------------
    :: retrieving :: org.apache.spark#spark-submit-parent
        confs: [default]
        0 artifacts copied, 1 already retrieved (0kB/8ms)
    .....
    ... [Spark stage progress bars and test progress dots omitted] ...
    ======================================================================
    FAIL: test_package_dependency (pyspark.tests.SparkSubmitTests)
    Submit and test a script with a dependency on a Spark Package
    ----------------------------------------------------------------------
    Traceback (most recent call last):
       File "/home/jenkins/workspace/SparkPullRequestBuilder@2/python/pyspark/tests.py", line 2093, in test_package_dependency
         self.assertEqual(0, proc.returncode)
    AssertionError: 0 != 1

    ----------------------------------------------------------------------
    Ran 127 tests in 205.547s

    FAILED (failures=1, skipped=2)
    NOTE: Skipping SciPy tests as it does not seem to be installed
    NOTE: Skipping NumPy tests as it does not seem to be installed
        Random listing order was used

    Had test failures in pyspark.tests with pypy; see logs.
    [error] running /home/jenkins/workspace/SparkPullRequestBuilder@2/python/run-tests --parallelism=4 ; received return code 255
    Attempting to post to Github...
      > Post successful.
    Build step 'Execute shell' marked build as failure
    Archiving artifacts
    Recording test results
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed):
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/92038/
    Test FAILed.
    Finished: FAILURE


    On 6/18/2018 at 8:05 PM, shane knapp wrote:
    I triggered another build against your PR, so let's see if this happens again or if it was a transient failure.

    https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/92038/

    shane

    On Mon, Jun 18, 2018 at 5:30 AM, Petar Zecevic <petar.zece...@gmail.com> wrote:

        Hi,
        The Jenkins build for my PR (https://github.com/apache/spark/pull/21109; https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/92023/testReport/org.apache.spark.sql.hive/HiveExternalCatalogVersionsSuite/_It_is_not_a_test_it_is_a_sbt_testing_SuiteSelector_/) keeps failing. First it couldn't download Spark v2.2.0 (indeed, it wasn't available at the mirror it selected); now it's failing with the exception below.

        Can someone explain these errors for me? Is anybody else
        experiencing similar problems?

        Thanks,
        Petar


        Error Message

        java.io.IOException: Cannot run program "./bin/spark-submit" (in directory "/tmp/test-spark/spark-2.2.1"): error=2, No such file or directory

        Stacktrace

        sbt.ForkMain$ForkError: java.io.IOException: Cannot run program "./bin/spark-submit" (in directory "/tmp/test-spark/spark-2.2.1"): error=2, No such file or directory
            at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
            at org.apache.spark.sql.hive.SparkSubmitTestUtils$class.runSparkSubmit(SparkSubmitTestUtils.scala:73)
            at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.runSparkSubmit(HiveExternalCatalogVersionsSuite.scala:43)
            at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite$$anonfun$beforeAll$1.apply(HiveExternalCatalogVersionsSuite.scala:176)
            at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite$$anonfun$beforeAll$1.apply(HiveExternalCatalogVersionsSuite.scala:161)
            at scala.collection.immutable.List.foreach(List.scala:381)
            at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.beforeAll(HiveExternalCatalogVersionsSuite.scala:161)
            at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:212)
            at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
            at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:52)
            at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
            at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:480)
            at sbt.ForkMain$Run$2.call(ForkMain.java:296)
            at sbt.ForkMain$Run$2.call(ForkMain.java:286)
            at java.util.concurrent.FutureTask.run(FutureTask.java:266)
            at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
            at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
            at java.lang.Thread.run(Thread.java:745)
        Caused by: sbt.ForkMain$ForkError: java.io.IOException: error=2, No such file or directory
            at java.lang.UNIXProcess.forkAndExec(Native Method)
            at java.lang.UNIXProcess.<init>(UNIXProcess.java:248)
            at java.lang.ProcessImpl.start(ProcessImpl.java:134)
            at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
            ... 17 more




    --
    Shane Knapp
    UC Berkeley EECS Research / RISELab Staff Technical Lead
    https://rise.cs.berkeley.edu

