On Wed, Jul 17, 2019 at 6:28 AM Tianhua huang <huangtianhua...@gmail.com> wrote:
> Two failed with the reason "Can't find 1 executors before 10000 
> milliseconds elapsed" (see below). After we increased the timeout, the tests 
> passed, so we wonder whether the timeout can be raised. I also have another 
> question about 
> https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/TestUtils.scala#L285:
>  why is the comparison not ">="? Judging by the function's comment, it should be ">=".
>

I think it's ">" because the driver is also counted as an executor, but I'm
not 100% sure. In any event it passes in general.
These errors typically mean "I didn't start successfully" for some
other reason, which may be in the logs.
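If the driver's block manager really is counted alongside the executors (the guess above), the strict ">" behaves like this minimal sketch. The class and method names here are illustrative only, not Spark's actual TestUtils code:

```java
import java.util.List;

public class ExecutorCount {
    // Hypothetical sketch: if the membership list includes the driver's own
    // entry, then requiring size > expected means "driver plus 'expected'
    // real executors", which is why ">" rather than ">=" would be correct.
    static boolean enoughExecutors(List<String> blockManagers, int expected) {
        return blockManagers.size() > expected;
    }

    public static void main(String[] args) {
        List<String> members = List.of("driver", "executor-1");
        System.out.println(enoughExecutors(members, 1)); // prints "true"
    }
}
```

Under that assumption, waiting for 1 executor with ">" succeeds only once the driver entry and one genuine executor have both registered.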

> The other two failed with the reason "2143289344 equaled 2143289344". This 
> is because floatToRawIntBits(0.0f/0.0f) is 2143289344 on the aarch64 
> platform, which equals floatToRawIntBits(Float.NaN). I emailed jdk-dev 
> about this and opened topics with the Scala community 
> (https://users.scala-lang.org/t/the-value-of-floattorawintbits-0-0f-0-0f-is-different-on-x86-64-and-aarch64-platforms/4845
>  and https://github.com/scala/bug/issues/11632). I initially thought it was 
> a JDK or Scala issue, but after discussion it appears to be 
> platform-related, so the following asserts seem inappropriate: 
> https://github.com/apache/spark/blob/master/sql/core/src/test/scala/org/apache/spark/sql/DataFrameWindowFunctionsSuite.scala#L704-L705
>  and 
> https://github.com/apache/spark/blob/master/sql/core/src/test/scala/org/apache/spark/sql/DataFrameAggregateSuite.scala#L732-L733

These tests could special-case execution on ARM, just as some tests
already handle big-endian architectures.
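To make the failure concrete: the raw bit pattern of a NaN produced by 0.0f/0.0f depends on the hardware, while Float.floatToIntBits collapses every NaN to the canonical 0x7fc00000 (2143289344). A minimal sketch on a plain JVM (outside Spark's test harness):

```java
public class NaNBits {
    public static void main(String[] args) {
        // Compute 0.0f / 0.0f at runtime; a non-final local prevents the
        // compiler from constant-folding the division on the build machine.
        float zero = 0.0f;
        float nan = zero / zero;

        // The raw bit pattern of a freshly generated NaN is platform-dependent
        // (the reported x86-64 vs aarch64 difference shows up here).
        System.out.printf("raw bits: 0x%08x%n", Float.floatToRawIntBits(nan));
        System.out.printf("os.arch:  %s%n", System.getProperty("os.arch"));

        // floatToIntBits collapses every NaN to the canonical 0x7fc00000
        // (2143289344), so comparisons based on it are portable.
        assert Float.floatToIntBits(nan) == 0x7fc00000;
        assert Float.isNaN(nan);
    }
}
```

A test that must compare raw bits could guard on System.getProperty("os.arch") and skip or adjust the assertion on "aarch64", in the same spirit as the existing big-endian special cases.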
