[ https://issues.apache.org/jira/browse/SPARK-12426?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15094445#comment-15094445 ]

Mark Grover commented on SPARK-12426:
-------------------------------------

Thanks, Sean. If you could add this, that'd be great.

h2. Running Docker integration tests
In order to run the [Docker integration tests|https://github.com/apache/spark/tree/master/docker-integration-tests], you have to install Docker Engine on your box. Installation instructions can be found at https://docs.docker.com/engine/installation/. Once it is installed, the Docker service needs to be started, if it is not already running; on Linux, this can be done with {{sudo service docker start}}.
These integration tests run as part of a regular Spark unit test run, so Docker Engine must be installed and running if you want all Spark tests to pass. The end-to-end flow is sketched below.
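For reference, here is roughly what that flow looks like on a Linux box. The install step shown is only one option (Docker's convenience script); the exact procedure is platform-specific, so follow the installation docs linked above. The last command is the same one used to reproduce the failure below.
{code}
# Install Docker Engine. The convenience script is one option; see
# https://docs.docker.com/engine/installation/ for platform-specific instructions.
curl -sSL https://get.docker.com/ | sh

# Start the Docker service if it is not already running.
sudo service docker start

# Sanity check: this fails if the daemon's unix socket is not reachable.
sudo docker info

# Run the Docker JDBC integration tests as part of the build.
build/mvn -pl docker-integration-tests package
{code}
When the daemon is not reachable, the suite fails in {{DockerJDBCIntegrationSuite.beforeAll}} with the {{java.io.IOException: No such file or directory}} seen in the trace below, because the client cannot connect to the Docker unix socket.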

> Docker JDBC integration tests are failing again
> -----------------------------------------------
>
>                 Key: SPARK-12426
>                 URL: https://issues.apache.org/jira/browse/SPARK-12426
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL, Tests
>    Affects Versions: 1.6.0
>            Reporter: Mark Grover
>
> The Docker JDBC integration tests were fixed in SPARK-11796, but they seem to be failing again on my machine (Ubuntu Precise). This is the same box I tested my previous commit on. Also, I am not confident this failure has much to do with Spark, since a well-known commit where the tests were passing now fails in the same environment.
> [~sowen] mentioned on the Spark 1.6 voting thread that the tests were failing on his Ubuntu 15 box as well.
> Here's the error, fyi:
> {code}
> 15/12/18 10:12:50 INFO SparkContext: Successfully stopped SparkContext
> 15/12/18 10:12:50 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
> 15/12/18 10:12:50 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
> *** RUN ABORTED ***
>   com.spotify.docker.client.DockerException: java.util.concurrent.ExecutionException: com.spotify.docker.client.shaded.javax.ws.rs.ProcessingException: java.io.IOException: No such file or directory
>   at com.spotify.docker.client.DefaultDockerClient.propagate(DefaultDockerClient.java:1141)
>   at com.spotify.docker.client.DefaultDockerClient.request(DefaultDockerClient.java:1082)
>   at com.spotify.docker.client.DefaultDockerClient.ping(DefaultDockerClient.java:281)
>   at org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.beforeAll(DockerJDBCIntegrationSuite.scala:76)
>   at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
>   at org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.beforeAll(DockerJDBCIntegrationSuite.scala:58)
>   at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
>   at org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.run(DockerJDBCIntegrationSuite.scala:58)
>   at org.scalatest.Suite$class.callExecuteOnSuite$1(Suite.scala:1492)
>   at org.scalatest.Suite$$anonfun$runNestedSuites$1.apply(Suite.scala:1528)
>   ...
>   Cause: java.util.concurrent.ExecutionException: com.spotify.docker.client.shaded.javax.ws.rs.ProcessingException: java.io.IOException: No such file or directory
>   at jersey.repackaged.com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:299)
>   at jersey.repackaged.com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:286)
>   at jersey.repackaged.com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:116)
>   at com.spotify.docker.client.DefaultDockerClient.request(DefaultDockerClient.java:1080)
>   at com.spotify.docker.client.DefaultDockerClient.ping(DefaultDockerClient.java:281)
>   at org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.beforeAll(DockerJDBCIntegrationSuite.scala:76)
>   at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
>   at org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.beforeAll(DockerJDBCIntegrationSuite.scala:58)
>   at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
>   at org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.run(DockerJDBCIntegrationSuite.scala:58)
>   ...
>   Cause: com.spotify.docker.client.shaded.javax.ws.rs.ProcessingException: java.io.IOException: No such file or directory
>   at org.glassfish.jersey.apache.connector.ApacheConnector.apply(ApacheConnector.java:481)
>   at org.glassfish.jersey.apache.connector.ApacheConnector$1.run(ApacheConnector.java:491)
>   at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>   at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>   at jersey.repackaged.com.google.common.util.concurrent.MoreExecutors$DirectExecutorService.execute(MoreExecutors.java:299)
>   at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:110)
>   at jersey.repackaged.com.google.common.util.concurrent.AbstractListeningExecutorService.submit(AbstractListeningExecutorService.java:50)
>   at jersey.repackaged.com.google.common.util.concurrent.AbstractListeningExecutorService.submit(AbstractListeningExecutorService.java:37)
>   at org.glassfish.jersey.apache.connector.ApacheConnector.apply(ApacheConnector.java:487)
> 15/12/18 10:12:50 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
>   at org.glassfish.jersey.client.ClientRuntime$2.run(ClientRuntime.java:177)
>   ...
>   Cause: java.io.IOException: No such file or directory
>   at jnr.unixsocket.UnixSocketChannel.doConnect(UnixSocketChannel.java:94)
>   at jnr.unixsocket.UnixSocketChannel.connect(UnixSocketChannel.java:102)
>   at com.spotify.docker.client.ApacheUnixSocket.connect(ApacheUnixSocket.java:73)
>   at com.spotify.docker.client.UnixConnectionSocketFactory.connectSocket(UnixConnectionSocketFactory.java:74)
>   at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:134)
>   at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:353)
>   at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:380)
>   at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:236)
>   at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:184)
>   at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:88)
>   ...
> 15/12/18 10:12:50 INFO ShutdownHookManager: Shutdown hook called
> 15/12/18 10:12:50 INFO ShutdownHookManager: Deleting directory /root/spark/docker-integration-tests/target/tmp/spark-43a76284-b0c5-4570-9576-d65b21f45058
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 26.787 s
> [INFO] Finished at: 2015-12-18T10:12:50-08:00
> [INFO] Final Memory: 57M/579M
> [INFO] ------------------------------------------------------------------------
> [ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test (test) on project spark-docker-integration-tests_2.10: There are test failures -> [Help 1]
> [ERROR] 
> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR] 
> [ERROR] For more information about the errors and possible solutions, please read the following articles:
> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
> {code}
> And here's how to reproduce it:
> {code}
> build/mvn -pl docker-integration-tests package
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
