[ https://issues.apache.org/jira/browse/BEAM-8933?focusedWorklogId=359618&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-359618 ]
ASF GitHub Bot logged work on BEAM-8933:
----------------------------------------

Author: ASF GitHub Bot
Created on: 13/Dec/19 20:12
Start Date: 13/Dec/19 20:12
Worklog Time Spent: 10m

Work Description: TheNeuralBit commented on issue #10369: [BEAM-8933] BigQueryIO Arrow for read
URL: https://github.com/apache/beam/pull/10369#issuecomment-565591991

My `ArrowUtils` addition seems to have caused some mysterious failures in the Spark runner tests (in Java PreCommit). From [`org.apache.beam.runners.spark.CacheTest.shouldCacheTest`](https://builds.apache.org/job/beam_PreCommit_Java_Commit/9227/testReport/junit/org.apache.beam.runners.spark/CacheTest/shouldCacheTest/):

```
java.lang.NoSuchMethodError: io.netty.util.internal.ReflectionUtil.trySetAccessible(Ljava/lang/reflect/AccessibleObject;)Ljava/lang/Throwable;
	at io.netty.channel.nio.NioEventLoop$5.run(NioEventLoop.java:217)
	at java.security.AccessController.doPrivileged(Native Method)
	at io.netty.channel.nio.NioEventLoop.openSelector(NioEventLoop.java:210)
	at io.netty.channel.nio.NioEventLoop.<init>(NioEventLoop.java:149)
	at io.netty.channel.nio.NioEventLoopGroup.newChild(NioEventLoopGroup.java:127)
	at io.netty.channel.nio.NioEventLoopGroup.newChild(NioEventLoopGroup.java:36)
	at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:84)
	at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:58)
	at io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:47)
	at io.netty.channel.MultithreadEventLoopGroup.<init>(MultithreadEventLoopGroup.java:59)
	at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:77)
	at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:72)
	at io.netty.channel.nio.NioEventLoopGroup.<init>(NioEventLoopGroup.java:59)
	at org.apache.spark.network.util.NettyUtils.createEventLoop(NettyUtils.java:50)
	at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:102)
	at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
	at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:71)
```

I think this must be because I added Arrow as a dependency for `:sdks:java:core`.

@kennknowles do you have any idea why this would happen? Is there something we need to re-run when updating core Java dependencies?

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

Issue Time Tracking
-------------------

Worklog Id: (was: 359618)
Time Spent: 2.5h  (was: 2h 20m)

> BigQuery IO should support read/write in Arrow format
> -----------------------------------------------------
>
>                 Key: BEAM-8933
>                 URL: https://issues.apache.org/jira/browse/BEAM-8933
>             Project: Beam
>          Issue Type: Improvement
>          Components: io-java-gcp
>            Reporter: Kirill Kozlov
>            Assignee: Kirill Kozlov
>            Priority: Major
>          Time Spent: 2.5h
>  Remaining Estimate: 0h
>
> As of right now BigQuery uses Avro format for reading and writing.
> We should add a config to BigQueryIO to specify which format to use (with
> Avro as default).

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
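A `NoSuchMethodError` like the one above usually means the class that won on the runtime classpath comes from an older jar than the one the calling code was compiled against — here, a stale netty picked up via the new Arrow dependency. A minimal, hypothetical probe for this situation is sketched below; the class/method names used in `main` are stdlib stand-ins so the snippet compiles without netty, and in the failing Spark run one would instead probe `io.netty.util.internal.ReflectionUtil` for `trySetAccessible`:

```java
import java.lang.reflect.Method;

public class ClasspathProbe {
    // Report where a class was loaded from and whether it exposes a public
    // method with the given name. Returns true if the method is present.
    // Useful for diagnosing NoSuchMethodError caused by two versions of the
    // same library competing on the classpath.
    static boolean probe(String className, String methodName) {
        try {
            Class<?> cls = Class.forName(className);
            // A null CodeSource means the bootstrap class loader (JDK classes);
            // for a library class this prints the jar the class came from.
            System.out.println(className + " loaded from: "
                + cls.getProtectionDomain().getCodeSource());
            for (Method m : cls.getMethods()) {
                if (m.getName().equals(methodName)) {
                    System.out.println("  method '" + methodName + "' present");
                    return true;
                }
            }
            System.out.println("  method '" + methodName + "' MISSING");
            return false;
        } catch (ClassNotFoundException e) {
            System.out.println(className + " not on classpath");
            return false;
        }
    }

    public static void main(String[] args) {
        // In the failing run one would check the netty class from the trace:
        //   probe("io.netty.util.internal.ReflectionUtil", "trySetAccessible");
        // Stdlib stand-in so this runs without netty on the classpath:
        probe("java.util.ArrayList", "sort");
    }
}
```

To see which netty version actually wins in the Spark runner's resolved graph, Gradle's built-in dependency reports can help, e.g. `./gradlew :runners:spark:dependencyInsight --dependency netty` (task path is illustrative; the exact project path depends on Beam's build layout).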