Build failed in Jenkins: beam_PreCommit_Java_Cron #580

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[aaltay] [BEAM-3612] Add a shim generator tool (#7000)

[markliu] [BEAM-5953] Fix DataflowRunner in Python 3 - type errors

[aaltay] [BEAM-3612] Type specialize stats package (#7002)

[github] [BEAM-5446] SplittableDoFn: Remove "internal" methods for public API

[lcwik] [BEAM-6037] Make Spark runner pipeline translation based on URNs (#7005)

[lcwik] [BEAM-3608] Vendor guava 20.0 (#6809)

--
[...truncated 52.65 MB...]
BeamCalcRel(expr#0..2=[{inputs}], num=[$t2], starttime=[$t1])
  BeamAggregationRel(group=[{0, 1}], num=[COUNT()], window=[SlidingWindows($1, PT5S, PT10S, PT0S)])
    BeamCalcRel(expr#0..4=[{inputs}], auction=[$t0], $f1=[$t3])
      BeamIOSourceRel(table=[[beam, Bid]])
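
The SlidingWindows($1, PT5S, PT10S, PT0S) node in the plan above appears to
encode 10-second windows sliding every 5 seconds over the Bid stream. A minimal
sketch of the equivalent windowing in the Beam Java SDK, assuming a PCollection
named bids (the variable name is illustrative):

    import org.apache.beam.sdk.transforms.windowing.SlidingWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.joda.time.Duration;

    // 10-second windows starting every 5 seconds, matching
    // SlidingWindows($1, PT5S, PT10S, PT0S) in the plan above.
    bids.apply(Window.into(
        SlidingWindows.of(Duration.standardSeconds(10))
            .every(Duration.standardSeconds(5))));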


org.apache.beam.sdk.nexmark.queries.sql.SqlQuery3Test > 
testJoinsPeopleWithAuctions STANDARD_ERROR
Nov 13, 2018 12:17:23 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQL:
SELECT `P`.`name`, `P`.`city`, `P`.`state`, `A`.`id`
FROM `beam`.`Auction` AS `A`
INNER JOIN `beam`.`Person` AS `P` ON `A`.`seller` = `P`.`id`
WHERE `A`.`category` = 10 AND (`P`.`state` = 'OR' OR `P`.`state` = 'ID' OR 
`P`.`state` = 'CA')
Nov 13, 2018 12:17:23 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQLPlan>
LogicalProject(name=[$11], city=[$14], state=[$15], id=[$0])
  LogicalFilter(condition=[AND(=($8, 10), OR(=($15, 'OR'), =($15, 'ID'), =($15, 'CA')))])
    LogicalJoin(condition=[=($7, $10)], joinType=[inner])
      BeamIOSourceRel(table=[[beam, Auction]])
      BeamIOSourceRel(table=[[beam, Person]])

Nov 13, 2018 12:17:23 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: BEAMPlan>
BeamCalcRel(expr#0..17=[{inputs}], name=[$t11], city=[$t14], state=[$t15], id=[$t0])
  BeamJoinRel(condition=[=($7, $10)], joinType=[inner])
    BeamCalcRel(expr#0..9=[{inputs}], expr#10=[10], expr#11=[=($t8, $t10)], proj#0..9=[{exprs}], $condition=[$t11])
      BeamIOSourceRel(table=[[beam, Auction]])
    BeamCalcRel(expr#0..7=[{inputs}], expr#8=['OR'], expr#9=[=($t5, $t8)], expr#10=['ID'], expr#11=[=($t5, $t10)], expr#12=['CA'], expr#13=[=($t5, $t12)], expr#14=[OR($t9, $t11, $t13)], proj#0..7=[{exprs}], $condition=[$t14])
      BeamIOSourceRel(table=[[beam, Person]])
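
For context, a NEXMark SQL query like the one above is typically submitted
through Beam SQL's SqlTransform, with the inputs registered under the table
names the query references. A minimal hedged sketch (auctions and people are
assumed PCollection<Row> inputs; the names are illustrative):

    import org.apache.beam.sdk.extensions.sql.SqlTransform;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionTuple;
    import org.apache.beam.sdk.values.Row;
    import org.apache.beam.sdk.values.TupleTag;

    // Register both inputs, then let the planner produce a BeamJoinRel
    // plan like the BEAMPlan shown above.
    PCollection<Row> results =
        PCollectionTuple.of(new TupleTag<>("Auction"), auctions)
            .and(new TupleTag<>("Person"), people)
            .apply(SqlTransform.query(
                "SELECT P.name, P.city, P.state, A.id "
                    + "FROM Auction A INNER JOIN Person P ON A.seller = P.id "
                    + "WHERE A.category = 10 "
                    + "AND (P.state = 'OR' OR P.state = 'ID' OR P.state = 'CA')"));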


org.apache.beam.sdk.nexmark.queries.sql.SqlQuery7Test > testBids STANDARD_ERROR
Nov 13, 2018 12:17:23 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQL:
SELECT `B`.`auction`, `B`.`price`, `B`.`bidder`, `B`.`dateTime`, `B`.`extra`
FROM (SELECT `B`.`auction`, `B`.`price`, `B`.`bidder`, `B`.`dateTime`, 
`B`.`extra`, TUMBLE_START(`B`.`dateTime`, INTERVAL '10' SECOND) AS `starttime`
FROM `beam`.`Bid` AS `B`
GROUP BY `B`.`auction`, `B`.`price`, `B`.`bidder`, `B`.`dateTime`, 
`B`.`extra`, TUMBLE(`B`.`dateTime`, INTERVAL '10' SECOND)) AS `B`
INNER JOIN (SELECT MAX(`B1`.`price`) AS `maxprice`, 
TUMBLE_START(`B1`.`dateTime`, INTERVAL '10' SECOND) AS `starttime`
FROM `beam`.`Bid` AS `B1`
GROUP BY TUMBLE(`B1`.`dateTime`, INTERVAL '10' SECOND)) AS `B1` ON 
`B`.`starttime` = `B1`.`starttime` AND `B`.`price` = `B1`.`maxprice`
Nov 13, 2018 12:17:23 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQLPlan>
LogicalProject(auction=[$0], price=[$1], bidder=[$2], dateTime=[$3], extra=[$4])
  LogicalJoin(condition=[AND(=($5, $7), =($1, $6))], joinType=[inner])
    LogicalProject(auction=[$0], price=[$1], bidder=[$2], dateTime=[$3], extra=[$4], starttime=[$5])
      LogicalAggregate(group=[{0, 1, 2, 3, 4, 5}])
        LogicalProject(auction=[$0], price=[$2], bidder=[$1], dateTime=[$3], extra=[$4], $f5=[TUMBLE($3, 1)])
          BeamIOSourceRel(table=[[beam, Bid]])
    LogicalProject(maxprice=[$1], starttime=[$0])
      LogicalAggregate(group=[{0}], maxprice=[MAX($1)])
        LogicalProject($f0=[TUMBLE($3, 1)], price=[$2])
          BeamIOSourceRel(table=[[beam, Bid]])

Nov 13, 2018 12:17:23 AM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: BEAMPlan>
BeamCalcRel(expr#0..7=[{inputs}], proj#0..4=[{exprs}])
  BeamJoinRel(condition=[AND(=($5, $7), =($1, $6))], joinType=[inner])
    BeamCalcRel(expr#0..5=[{inputs}], proj#0..5=[{exprs}])
      BeamAggregationRel(group=[{0, 1, 2, 3, 4, 5}], window=[FixedWindows($5, PT10S, PT0S)])
        BeamCalcRel(expr#0..4=[{inputs}], auction=[$t0], price=[$t2], bidder=[$t1], dateTime=[$t3], extra=[$t4], $f5=[$t3])
          BeamIOSourceRel(table=[[beam, Bid]])
    BeamCalcRel(expr#0..1=[{inputs}], maxprice=[$t1], starttime=[$t0])
      BeamAggregationRel(group=[{0}], 
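
In the plan above, Calcite's TUMBLE(dateTime, INTERVAL '10' SECOND) surfaces as
FixedWindows($5, PT10S, PT0S), i.e. non-overlapping 10-second windows. A
minimal sketch of the equivalent windowing in the Java SDK, assuming a
PCollection named bids (illustrative):

    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.joda.time.Duration;

    // Non-overlapping 10-second windows, matching FixedWindows($5, PT10S, PT0S).
    bids.apply(Window.into(FixedWindows.of(Duration.standardSeconds(10))));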

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Apex_Gradle #2078

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[huangry] Update worker container version to most recent release.

--
[...truncated 23.04 MB...]
INFO: Container container-57 terminating.
Nov 13, 2018 1:06:37 AM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-75
Nov 13, 2018 1:06:37 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Nov 13, 2018 1:06:37 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-75 msg: [container-75] Exiting heartbeat loop..
Nov 13, 2018 1:06:37 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-75 terminating.
Nov 13, 2018 1:06:37 AM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-73
Nov 13, 2018 1:06:37 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Nov 13, 2018 1:06:37 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-73 msg: [container-73] Exiting heartbeat loop..
Nov 13, 2018 1:06:37 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-73 terminating.
Nov 13, 2018 1:06:37 AM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-72
Nov 13, 2018 1:06:37 AM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-71
Nov 13, 2018 1:06:37 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Nov 13, 2018 1:06:37 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Nov 13, 2018 1:06:37 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-72 msg: [container-72] Exiting heartbeat loop..
Nov 13, 2018 1:06:37 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-71 msg: [container-71] Exiting heartbeat loop..
Nov 13, 2018 1:06:37 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-72 terminating.
Nov 13, 2018 1:06:37 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-71 terminating.
Nov 13, 2018 1:06:37 AM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-69
Nov 13, 2018 1:06:37 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Nov 13, 2018 1:06:37 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-69 msg: [container-69] Exiting heartbeat loop..
Nov 13, 2018 1:06:37 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-69 terminating.
Nov 13, 2018 1:06:37 AM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-65
Nov 13, 2018 1:06:37 AM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-66
Nov 13, 2018 1:06:37 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Nov 13, 2018 1:06:37 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-65 msg: [container-65] Exiting heartbeat loop..
Nov 13, 2018 1:06:37 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Nov 13, 2018 1:06:37 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-66 msg: [container-66] Exiting heartbeat loop..
Nov 13, 2018 1:06:37 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-65 terminating.
Nov 13, 2018 1:06:37 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-66 terminating.
Nov 13, 2018 1:06:37 AM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-64
Nov 13, 2018 1:06:37 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Nov 

Jenkins build is back to normal : beam_PostCommit_Python_VR_Flink #724

2018-11-12 Thread Apache Jenkins Server
See 






Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Samza_Gradle #1278

2018-11-12 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Dataflow_Gradle #1445

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[markliu] [BEAM-5953] Fix DataflowRunner in Python 3 - type errors

[github] [BEAM-5446] SplittableDoFn: Remove "internal" methods for public API

[lcwik] [BEAM-6037] Make Spark runner pipeline translation based on URNs (#7005)

[lcwik] [BEAM-3608] Vendor guava 20.0 (#6809)

--
[...truncated 19.43 MB...]
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
View.AsSingleton/Combine.GloballyAsSingletonView/CreateDataflowView as step s9
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Create123/Read(CreateSource) as step s10
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding OutputSideInputs as step s11
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/Window.Into()/Window.Assign as step 
s12
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
PAssert$33/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous) as step 
s13
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map 
as step s14
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
PAssert$33/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign as step 
s15
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/GatherAllOutputs/GroupByKey as step 
s16
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/GatherAllOutputs/Values/Values/Map as 
step s17
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/RewindowActuals/Window.Assign as step 
s18
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/KeyForDummy/AddKeys/Map as step s19
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
PAssert$33/GroupGlobally/RemoveActualsTriggering/Flatten.PCollections as step 
s20
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/Create.Values/Read(CreateSource) as 
step s21
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/WindowIntoDummy/Window.Assign as step 
s22
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
PAssert$33/GroupGlobally/RemoveDummyTriggering/Flatten.PCollections as step s23
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/FlattenDummyAndContents as step s24
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/NeverTrigger/Flatten.PCollections as 
step s25
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/GroupDummyAndContents as step s26
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/Values/Values/Map as step s27
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GroupGlobally/ParDo(Concat) as step s28
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/GetPane/Map as step s29
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/RunChecks as step s30
Nov 13, 2018 2:34:12 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$33/VerifyAssertions/ParDo(DefaultConclude) as step s31
Nov 13, 2018 2:34:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to 
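
The PAssert$33/... entries above (steps s12 through s31) are all the expansion
of a single test assertion in the ValidatesRunner suite. A minimal hedged
sketch of the pattern that produces such steps (p, the pipeline, and the
element values are illustrative):

    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;

    // One assertion like this expands into the GroupGlobally, GetPane,
    // RunChecks and VerifyAssertions steps listed above during translation.
    PCollection<Integer> output = p.apply(Create.of(1, 2, 3));
    PAssert.that(output).containsInAnyOrder(1, 2, 3);
    p.run().waitUntilFinish();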

Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Dataflow_Gradle #1446

2018-11-12 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #1005

2018-11-12 Thread Apache Jenkins Server
See 


--
[...truncated 244.45 KB...]
}
at 
com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:432)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
at 
com.google.cloud.hadoop.util.AbstractGoogleAsyncWriteChannel$UploadOperation.call(AbstractGoogleAsyncWriteChannel.java:358)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more

Nov 13, 2018 6:04:04 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-zPXRdXxo4YdlRDpVef8XKw.jar
Nov 13, 2018 6:04:25 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 114 files cached, 29 files newly uploaded
Nov 13, 2018 6:04:25 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7.ReadBounded as step s1
Nov 13, 2018 6:04:25 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Monitor as step s2
Nov 13, 2018 6:04:25 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Snoop as step s3
Nov 13, 2018 6:04:26 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/IsBid/ParDo(Anonymous) as step s4
Nov 13, 2018 6:04:26 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/AsBid as step s5
Nov 13, 2018 6:04:26 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Window.Into()/Window.Assign as step s6
Nov 13, 2018 6:04:26 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/BidToPrice as step s7
Nov 13, 2018 6:04:26 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/WithKeys/AddKeys/Map
 as step s8
Nov 13, 2018 6:04:26 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/AddNonce
 as step s9
Nov 13, 2018 6:04:26 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/GroupByKey
 as step s10
Nov 13, 2018 6:04:26 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/Combine.GroupedValues
 as step s11
Nov 13, 2018 6:04:26 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/StripNonce/Map
 as step s12
Nov 13, 2018 6:04:26 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Window.Remerge/Identity/Map
 as step s13
Nov 13, 2018 6:04:26 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PrepareCold/Map
 as step s14
Nov 13, 2018 6:04:26 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Flatten.PCollections
 as step s15
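
Steps s8 through s15 above are the expansion of a single global combine with
hot-key fanout; AddNonce, PreCombineHot and StripNonce are its internals. A
hedged sketch of the user-level transform behind such a graph, assuming a
PCollection<Long> named prices (the fanout value is illustrative):

    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.Max;
    import org.apache.beam.sdk.values.PCollectionView;

    // Global max exposed as a singleton side input, pre-aggregated across
    // 16 intermediate keys to spread hot-key load.
    PCollectionView<Long> maxPrice =
        prices.apply(
            Combine.globally(Max.ofLongs())
                .withFanout(16)
                .asSingletonView());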

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Apex_Gradle #2077

2018-11-12 Thread Apache Jenkins Server
See 


--
[...truncated 23.06 MB...]
INFO: container-72 msg: [container-72] Exiting heartbeat loop..
Nov 13, 2018 12:33:19 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-72 terminating.
Nov 13, 2018 12:33:19 AM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-70
Nov 13, 2018 12:33:19 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Nov 13, 2018 12:33:19 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-70 msg: [container-70] Exiting heartbeat loop..
Nov 13, 2018 12:33:19 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-70 terminating.
Nov 13, 2018 12:33:19 AM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-65
Nov 13, 2018 12:33:19 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Nov 13, 2018 12:33:19 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-65 msg: [container-65] Exiting heartbeat loop..
Nov 13, 2018 12:33:19 AM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-66
Nov 13, 2018 12:33:19 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-65 terminating.
Nov 13, 2018 12:33:19 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Nov 13, 2018 12:33:19 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-66 msg: [container-66] Exiting heartbeat loop..
Nov 13, 2018 12:33:19 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-66 terminating.
Nov 13, 2018 12:33:19 AM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-74
Nov 13, 2018 12:33:19 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Nov 13, 2018 12:33:19 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-74 msg: [container-74] Exiting heartbeat loop..
Nov 13, 2018 12:33:19 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-74 terminating.
Nov 13, 2018 12:33:19 AM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-54
Nov 13, 2018 12:33:19 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Nov 13, 2018 12:33:19 AM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-53
Nov 13, 2018 12:33:19 AM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-63
Nov 13, 2018 12:33:19 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Nov 13, 2018 12:33:19 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-53 msg: [container-53] Exiting heartbeat loop..
Nov 13, 2018 12:33:19 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-54 msg: [container-54] Exiting heartbeat loop..
Nov 13, 2018 12:33:19 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-53 terminating.
Nov 13, 2018 12:33:19 AM com.datatorrent.stram.engine.StreamingContainer 
processHeartbeatResponse
INFO: Received shutdown request type ABORT
Nov 13, 2018 12:33:19 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-54 terminating.
Nov 13, 2018 12:33:19 AM 
com.datatorrent.stram.StramLocalCluster$UmbilicalProtocolLocalImpl log
INFO: container-63 msg: [container-63] Exiting heartbeat loop..
Nov 13, 2018 12:33:19 AM 
com.datatorrent.stram.StramLocalCluster$LocalStreamingContainerLauncher run
INFO: Container container-63 terminating.
Nov 13, 2018 12:33:19 AM com.datatorrent.stram.StreamingContainerManager 
processHeartbeat
INFO: requesting idle shutdown for container container-55
Nov 13, 2018 12:33:19 AM 

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Gearpump_Gradle #1308

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[huangry] Update worker container version to most recent release.

--
[...truncated 9.80 MB...]
Nov 13, 2018 12:51:59 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 30
Nov 13, 2018 12:51:59 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 33
Nov 13, 2018 12:51:59 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 34
Nov 13, 2018 12:51:59 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 32
Nov 13, 2018 12:51:59 AM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 13, 2018 12:51:59 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 36
Nov 13, 2018 12:51:59 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 35
Nov 13, 2018 12:51:59 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 37
Nov 13, 2018 12:51:59 AM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 13, 2018 12:51:59 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 38
Nov 13, 2018 12:51:59 AM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 13, 2018 12:51:59 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 39
Nov 13, 2018 12:51:59 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 40
Nov 13, 2018 12:51:59 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 41
Nov 13, 2018 12:51:59 AM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 13, 2018 12:51:59 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 42
Nov 13, 2018 12:51:59 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 43
Nov 13, 2018 12:51:59 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 44
Nov 13, 2018 12:51:59 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 45
Nov 13, 2018 12:51:59 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 46
Nov 13, 2018 12:51:59 AM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 13, 2018 12:51:59 AM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 13, 2018 12:51:59 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 47
Nov 13, 2018 12:51:59 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 48
Nov 13, 2018 12:51:59 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 49
Nov 13, 2018 

Build failed in Jenkins: beam_PostCommit_Python_Verify #6537

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-3608] Vendor guava 20.0 (#6809)

--
[...truncated 1.43 MB...]
test_rename_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... SKIP: This test 
still needs to be fixed on Python 3. TODO: BEAM-5627
test_scheme (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_size (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_join (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_split (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_create_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_table_fails_not_found 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_succeeds (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... SKIP: IT is 
skipped because --test-pipeline-options is not specified
get_test_rows (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: 
GCP dependencies are not installed
test_read_from_query (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_query_sql_format 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_query_unflatten_records 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_table_and_job_complete_retry 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_and_multiple_pages 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_as_tablerows 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_table_schema_without_project 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_both_query_and_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_neither_query_nor_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_nested_schema_as_json (apache_beam.io.gcp.bigquery_test.TestBigQuerySink) 
... SKIP: GCP dependencies are not installed
test_parse_schema_descriptor 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySink) ... SKIP: GCP dependencies 
are not installed
test_project_table_display_data 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySink) ... SKIP: GCP dependencies 
are not installed
test_simple_schema_as_json (apache_beam.io.gcp.bigquery_test.TestBigQuerySink) 
... SKIP: GCP dependencies are not installed
test_table_spec_display_data 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySink) ... SKIP: GCP dependencies 
are not installed
test_date_partitioned_table_name 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySource) ... SKIP: GCP 
dependencies are not installed
test_display_data_item_on_validate_true 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySource) ... SKIP: GCP 
dependencies are not installed
test_parse_table_reference 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySource) ... SKIP: GCP 
dependencies are not installed
test_query_only_display_data 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySource) ... SKIP: GCP 
dependencies are not installed
test_specify_query_flattened_records 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySource) ... SKIP: GCP 
dependencies are not installed
test_specify_query_sql_format 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySource) ... SKIP: GCP 
dependencies are not installed
test_specify_query_unflattened_records 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySource) ... SKIP: GCP 
dependencies are not installed
test_specify_query_without_table 

Build failed in Jenkins: beam_PostCommit_Java_PVR_Flink #295

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-3608] Vendor guava 20.0 (#6809)

--
[...truncated 293.20 MB...]
org.apache.beam.vendor.grpc.v1.io.grpc.StatusRuntimeException: CANCELLED: 
call already cancelled
at 
org.apache.beam.vendor.grpc.v1.io.grpc.Status.asRuntimeException(Status.java:517)
at 
org.apache.beam.vendor.grpc.v1.io.grpc.ForwardingServerCallListener.onCancel(ForwardingServerCallListener.java:23)
at 
org.apache.beam.vendor.grpc.v1.io.grpc.ForwardingServerCallListener$SimpleForwardingServerCallListener.onCancel(ForwardingServerCallListener.java:40)
at 
org.apache.beam.vendor.grpc.v1.io.grpc.Contexts$ContextualizedServerCallListener.onCancel(Contexts.java:96)
at 
org.apache.beam.vendor.grpc.v1.io.grpc.stub.ServerCalls$ServerCallStreamObserverImpl.onCompleted(ServerCalls.java:356)
at 
org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.closed(ServerCallImpl.java:293)
at 
org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1Closed.runInContext(ServerImpl.java:738)
at 
org.apache.beam.vendor.grpc.v1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at 
org.apache.beam.vendor.grpc.v1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
at 
org.apache.beam.runners.fnexecution.state.GrpcStateService$Inbound.onError(GrpcStateService.java:145)
at 
org.apache.beam.vendor.grpc.v1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onCancel(ServerCalls.java:269)
at 
org.apache.beam.vendor.grpc.v1.io.grpc.PartialForwardingServerCallListener.onCancel(PartialForwardingServerCallListener.java:40)
at 
org.apache.beam.vendor.grpc.v1.io.grpc.ForwardingServerCallListener.onCancel(ForwardingServerCallListener.java:23)
at 
org.apache.beam.vendor.grpc.v1.io.grpc.ForwardingServerCallListener$SimpleForwardingServerCallListener.onCancel(ForwardingServerCallListener.java:40)
at 
org.apache.beam.vendor.grpc.v1.io.grpc.Contexts$ContextualizedServerCallListener.onCancel(Contexts.java:96)
at 
org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.closed(ServerCallImpl.java:293)
at 
org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1Closed.runInContext(ServerImpl.java:738)
at 
org.apache.beam.vendor.grpc.v1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at 
org.apache.beam.vendor.grpc.v1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

[pool-276-thread-1] WARN 
org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hung up for unknown 
endpoint.

org.apache.beam.sdk.transforms.CombineTest$BasicTests > testCombinePerKeyLambda 
STANDARD_ERROR
[CHAIN MapPartition (MapPartition at 
Create.Values/Read(CreateSource)/Impulse.out/beam:env:docker:v1:0) -> FlatMap 
(FlatMap at 
Create.Values/Read(CreateSource)/Impulse.out/beam:env:docker:v1:0/out.0) -> Map 
(Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: 
Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey) 
-> Map (Key Extractor) (2/16)] INFO 
org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory - 
Still waiting for startup of environment 
jenkins-docker-apache.bintray.io/beam/java for worker id 1

org.apache.beam.sdk.transforms.ParDoTest$TimerCoderInferenceTests > 
testValueStateCoderInference STANDARD_ERROR
[CHAIN MapPartition (MapPartition at 
Create.Values/Read(CreateSource)/Impulse.out/beam:env:docker:v1:0) -> FlatMap 
(FlatMap at 
Create.Values/Read(CreateSource)/Impulse.out/beam:env:docker:v1:0/out.0) -> Map 
(Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: 
Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey) 
-> Map (Key Extractor) (16/16)] INFO 
org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory - 
Still waiting for startup of environment 
jenkins-docker-apache.bintray.io/beam/java for worker id 1

org.apache.beam.sdk.transforms.CombineTest$BasicTests > testCombinePerKeyLambda 
STANDARD_ERROR
[CHAIN MapPartition (MapPartition at 
Create.Values/Read(CreateSource)/Impulse.out/beam:env:docker:v1:0) -> FlatMap 
(FlatMap at 
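
The log above comes from CombineTest.testCombinePerKeyLambda, which exercises
Combine.perKey with a Java lambda. A minimal hedged sketch of that shape (p,
the pipeline, and the element values are illustrative):

    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.PCollection;

    // Combine.perKey accepts a SerializableFunction<Iterable<V>, V> lambda.
    PCollection<KV<String, Integer>> sums =
        p.apply(Create.of(KV.of("a", 1), KV.of("a", 2), KV.of("b", 3)))
            .apply(Combine.perKey(xs -> {
              int sum = 0;
              for (int x : xs) {
                sum += x;
              }
              return sum;
            }));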

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #2183

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[github] Clarify in docstrings that we expect TFRecord values to be bytes

--
[...truncated 29.36 MB...]
[streaming-job-executor-0] INFO org.apache.spark.SparkContext - Starting 
job: foreach at UnboundedDataset.java:79
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Registering RDD 2787 (mapToPair at GroupCombineFunctions.java:57)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Registering RDD 2815 (mapToPair at GroupCombineFunctions.java:57)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Registering RDD 2824 (mapPartitionsToPair at 
SparkGroupAlsoByWindowViaWindowSet.java:564)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Got job 35 (foreach at UnboundedDataset.java:79) with 4 output partitions
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Final stage: ResultStage 839 (foreach at UnboundedDataset.java:79)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Parents of final stage: List(ShuffleMapStage 828, ShuffleMapStage 832, 
ShuffleMapStage 826, ShuffleMapStage 836, ShuffleMapStage 830, ShuffleMapStage 
834, ShuffleMapStage 838, ShuffleMapStage 821, ShuffleMapStage 810)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Missing parents: List(ShuffleMapStage 826)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting ShuffleMapStage 824 (MapPartitionsRDD[2787] at mapToPair at 
GroupCombineFunctions.java:57), which has no missing parents
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_143 stored as values in memory (estimated size 177.8 KB, free 
13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_143_piece0 stored as bytes in memory (estimated size 54.7 KB, 
free 13.5 GB)
[dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - 
Added broadcast_143_piece0 in memory on localhost:36139 (size: 54.7 KB, free: 
13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created 
broadcast 143 from broadcast at DAGScheduler.scala:1039
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting 4 missing tasks from ShuffleMapStage 824 (MapPartitionsRDD[2787] at 
mapToPair at GroupCombineFunctions.java:57) (first 15 tasks are for partitions 
Vector(0, 1, 2, 3))
[dag-scheduler-event-loop] INFO 
org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 824.0 with 4 
tasks
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 0.0 in stage 824.0 (TID 642, localhost, executor driver, 
partition 0, PROCESS_LOCAL, 8165 bytes)
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 1.0 in stage 824.0 (TID 643, localhost, executor driver, 
partition 1, PROCESS_LOCAL, 8165 bytes)
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 2.0 in stage 824.0 (TID 644, localhost, executor driver, 
partition 2, PROCESS_LOCAL, 8165 bytes)
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 3.0 in stage 824.0 (TID 645, localhost, executor driver, 
partition 3, PROCESS_LOCAL, 8165 bytes)
[Executor task launch worker for task 642] INFO 
org.apache.spark.executor.Executor - Running task 0.0 in stage 824.0 (TID 642)
[Executor task launch worker for task 644] INFO 
org.apache.spark.executor.Executor - Running task 2.0 in stage 824.0 (TID 644)
[Executor task launch worker for task 643] INFO 
org.apache.spark.executor.Executor - Running task 1.0 in stage 824.0 (TID 643)
[Executor task launch worker for task 645] INFO 
org.apache.spark.executor.Executor - Running task 3.0 in stage 824.0 (TID 645)
[Executor task launch worker for task 643] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2559_1 locally
[Executor task launch worker for task 645] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2559_3 locally
[Executor task launch worker for task 642] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2559_0 locally
[Executor task launch worker for task 644] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2559_2 locally
[Executor task launch worker for task 645] INFO 
org.apache.spark.executor.Executor - Finished task 3.0 in stage 824.0 (TID 
645). 59509 bytes result sent to driver
[Executor task launch worker for task 643] INFO 
org.apache.spark.executor.Executor - Finished task 1.0 in stage 824.0 (TID 
643). 59509 bytes result sent to driver
[task-result-getter-1] INFO 

Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #1702

2018-11-12 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #2181

2018-11-12 Thread Apache Jenkins Server
See 


--
[...truncated 29.72 MB...]
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Registering RDD 2472 (mapToPair at GroupCombineFunctions.java:57)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Registering RDD 2500 (mapToPair at GroupCombineFunctions.java:57)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Registering RDD 2509 (mapPartitionsToPair at 
SparkGroupAlsoByWindowViaWindowSet.java:564)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Got job 31 (foreach at UnboundedDataset.java:79) with 4 output partitions
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Final stage: ResultStage 667 (foreach at UnboundedDataset.java:79)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Parents of final stage: List(ShuffleMapStage 658, ShuffleMapStage 662, 
ShuffleMapStage 651, ShuffleMapStage 666, ShuffleMapStage 660, ShuffleMapStage 
637, ShuffleMapStage 664, ShuffleMapStage 653)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Missing parents: List(ShuffleMapStage 658)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting ShuffleMapStage 656 (MapPartitionsRDD[2472] at mapToPair at 
GroupCombineFunctions.java:57), which has no missing parents
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_127 stored as values in memory (estimated size 177.5 KB, free 
13.4 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_127_piece0 stored as bytes in memory (estimated size 54.7 KB, 
free 13.4 GB)
[dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - 
Added broadcast_127_piece0 in memory on localhost:33319 (size: 54.7 KB, free: 
13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created 
broadcast 127 from broadcast at DAGScheduler.scala:1039
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting 4 missing tasks from ShuffleMapStage 656 (MapPartitionsRDD[2472] at 
mapToPair at GroupCombineFunctions.java:57) (first 15 tasks are for partitions 
Vector(0, 1, 2, 3))
[dag-scheduler-event-loop] INFO 
org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 656.0 with 4 
tasks
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 0.0 in stage 656.0 (TID 570, localhost, executor driver, 
partition 0, PROCESS_LOCAL, 8127 bytes)
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 1.0 in stage 656.0 (TID 571, localhost, executor driver, 
partition 1, PROCESS_LOCAL, 8127 bytes)
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 2.0 in stage 656.0 (TID 572, localhost, executor driver, 
partition 2, PROCESS_LOCAL, 8127 bytes)
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 3.0 in stage 656.0 (TID 573, localhost, executor driver, 
partition 3, PROCESS_LOCAL, 8127 bytes)
[Executor task launch worker for task 571] INFO 
org.apache.spark.executor.Executor - Running task 1.0 in stage 656.0 (TID 571)
[Executor task launch worker for task 572] INFO 
org.apache.spark.executor.Executor - Running task 2.0 in stage 656.0 (TID 572)
[Executor task launch worker for task 570] INFO 
org.apache.spark.executor.Executor - Running task 0.0 in stage 656.0 (TID 570)
[Executor task launch worker for task 573] INFO 
org.apache.spark.executor.Executor - Running task 3.0 in stage 656.0 (TID 573)
[Executor task launch worker for task 571] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2244_1 locally
[Executor task launch worker for task 573] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2244_3 locally
[Executor task launch worker for task 572] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2244_2 locally
[Executor task launch worker for task 570] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2244_0 locally
[Executor task launch worker for task 571] INFO 
org.apache.spark.executor.Executor - Finished task 1.0 in stage 656.0 (TID 
571). 59509 bytes result sent to driver
[Executor task launch worker for task 573] INFO 
org.apache.spark.executor.Executor - Finished task 3.0 in stage 656.0 (TID 
573). 59509 bytes result sent to driver
[Executor task launch worker for task 572] INFO 
org.apache.spark.executor.Executor - Finished task 2.0 in stage 656.0 (TID 
572). 59509 bytes result sent to driver
[Executor task launch worker for task 570] INFO 
org.apache.spark.executor.Executor - Finished task 0.0 in stage 

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #1701

2018-11-12 Thread Apache Jenkins Server
See 


--
[...truncated 356.59 KB...]
root: INFO: 2018-11-13T00:01:16.437Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert:even/Group/GroupByKey/Write into assert:even/Group/GroupByKey/Reify
root: INFO: 2018-11-13T00:01:16.492Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-11-13T00:01:16.544Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert:even/Group/GroupByKey/Reify into assert:even/Group/pair_with_1
root: INFO: 2018-11-13T00:01:16.585Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert:odd/Group/GroupByKey/Reify into assert:odd/Group/pair_with_1
root: INFO: 2018-11-13T00:01:16.627Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert:even/ToVoidKey into assert:even/WindowInto(WindowIntoFn)
root: INFO: 2018-11-13T00:01:16.657Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert:even/Group/pair_with_1 into assert:even/ToVoidKey
root: INFO: 2018-11-13T00:01:16.706Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-11-13T00:01:16.751Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-11-13T00:01:16.807Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert:even/WindowInto(WindowIntoFn) into ClassifyNumbers/ParDo(SomeDoFn)
root: INFO: 2018-11-13T00:01:16.857Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert:odd/ToVoidKey into assert:odd/WindowInto(WindowIntoFn)
root: INFO: 2018-11-13T00:01:16.902Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/WindowInto(WindowIntoFn) into ClassifyNumbers/ParDo(SomeDoFn)
root: INFO: 2018-11-13T00:01:16.955Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert:odd/Group/pair_with_1 into assert:odd/ToVoidKey
root: INFO: 2018-11-13T00:01:17.002Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert:odd/WindowInto(WindowIntoFn) into ClassifyNumbers/ParDo(SomeDoFn)
root: INFO: 2018-11-13T00:01:17.046Z: JOB_MESSAGE_DETAILED: Fusing consumer 
ClassifyNumbers/ParDo(SomeDoFn) into Some Numbers/Read
root: INFO: 2018-11-13T00:01:17.105Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-11-13T00:01:17.160Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert:even/Group/pair_with_0 into assert:even/Create/Read
root: INFO: 2018-11-13T00:01:17.211Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert:odd/Group/pair_with_0 into assert:odd/Create/Read
root: INFO: 2018-11-13T00:01:17.268Z: JOB_MESSAGE_DEBUG: Workflow config is 
missing a default resource spec.
root: INFO: 2018-11-13T00:01:17.321Z: JOB_MESSAGE_DEBUG: Adding StepResource 
setup and teardown to workflow graph.
root: INFO: 2018-11-13T00:01:17.370Z: JOB_MESSAGE_DEBUG: Adding workflow start 
and stop steps.
root: INFO: 2018-11-13T00:01:17.421Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-11-13T00:01:17.673Z: JOB_MESSAGE_DEBUG: Executing wait step 
start54
root: INFO: 2018-11-13T00:01:17.798Z: JOB_MESSAGE_BASIC: Executing operation 
assert:odd/Group/GroupByKey/Create
root: INFO: 2018-11-13T00:01:17.845Z: JOB_MESSAGE_BASIC: Executing operation 
assert:even/Group/GroupByKey/Create
root: INFO: 2018-11-13T00:01:17.856Z: JOB_MESSAGE_DEBUG: Starting worker pool 
setup.
root: INFO: 2018-11-13T00:01:17.886Z: JOB_MESSAGE_BASIC: Executing operation 
assert_that/Group/GroupByKey/Create
root: INFO: 2018-11-13T00:01:17.897Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-b...
root: INFO: 2018-11-13T00:01:17.998Z: JOB_MESSAGE_DEBUG: Value 
"assert:odd/Group/GroupByKey/Session" materialized.
root: INFO: 2018-11-13T00:01:18.051Z: JOB_MESSAGE_DEBUG: Value 
"assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2018-11-13T00:01:18.093Z: JOB_MESSAGE_DEBUG: Value 
"assert:even/Group/GroupByKey/Session" materialized.
root: INFO: 2018-11-13T00:01:18.131Z: JOB_MESSAGE_BASIC: Executing operation 
assert:odd/Create/Read+assert:odd/Group/pair_with_0+assert:odd/Group/GroupByKey/Reify+assert:odd/Group/GroupByKey/Write
root: INFO: 2018-11-13T00:01:18.207Z: JOB_MESSAGE_BASIC: Executing operation 
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-11-13T00:01:18.355Z: JOB_MESSAGE_BASIC: Executing operation 
Some 
Numbers/Read+ClassifyNumbers/ParDo(SomeDoFn)+assert:even/WindowInto(WindowIntoFn)+assert:even/ToVoidKey+assert:even/Group/pair_with_1+assert:even/Group/GroupByKey/Reify+assert:even/Group/GroupByKey/Write+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write+assert:odd/WindowInto(WindowIntoFn)+assert:odd/ToVoidKey+assert:odd/Group/pair_with_1+assert:odd/Group/GroupByKey/Reify+assert:odd/Group/GroupByKey/Write
root: INFO: 2018-11-13T00:01:18.422Z: JOB_MESSAGE_BASIC: Executing 

Build failed in Jenkins: beam_PostCommit_Python_VR_Flink #723

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[huangry] Update worker container version to most recent release.

--
[...truncated 4.43 MB...]
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Successful registration at 
resource manager 
akka://flink/user/resourcemanager_724b86b5-9c0d-45e8-a8d3-c730cf99627d under 
registration id f1d1832586c464caafef7d5354f326a6.
[flink-runner-job-server] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Rest endpoint 
listening at localhost:46151
[flink-runner-job-server] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint@1580b247 @ 
http://localhost:46151
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster 
- Starting job dispatcher(s) for JobManager
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - 
http://localhost:46151 was granted leadership with 
leaderSessionID=b0e15e94-d24c-4d35-819a-04a02990d136
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader http://localhost:46151 , 
session=b0e15e94-d24c-4d35-819a-04a02990d136
[flink-runner-job-server] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService 
- Starting RPC endpoint for 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher at 
akka://flink/user/dispatcherfa6cdb6d-f8ff-4f63-bb33-8adb0380fa90 .
[flink-runner-job-server] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher@4a578fc4 @ 
akka://flink/user/dispatcherfa6cdb6d-f8ff-4f63-bb33-8adb0380fa90
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster 
- Flink Mini Cluster started successfully
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Dispatcher 
akka://flink/user/dispatcherfa6cdb6d-f8ff-4f63-bb33-8adb0380fa90 was granted 
leadership with fencing token 4e2e5f30-2a01-4c0e-9b81-e5b761733dbd
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Recovering all 
persisted jobs.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader 
akka://flink/user/dispatcherfa6cdb6d-f8ff-4f63-bb33-8adb0380fa90 , 
session=4e2e5f30-2a01-4c0e-9b81-e5b761733dbd
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Submitting job 
338980753c858d90b6297d6ebfe4569a (test_windowing_1542068901.36).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.JobMaster at akka://flink/user/jobmanager_41 
.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Initializing job 
test_windowing_1542068901.36 (338980753c858d90b6297d6ebfe4569a).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Using restart strategy 
NoRestartStrategy for test_windowing_1542068901.36 
(338980753c858d90b6297d6ebfe4569a).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.slotpool.SlotPool at 
akka://flink/user/c50ca5f4-613b-4faa-8087-783f27aa7044 .
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job recovers via 
failover strategy: full graph restart
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Running initialization on master 
for job test_windowing_1542068901.36 (338980753c858d90b6297d6ebfe4569a).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Successfully ran initialization 
on master in 0 ms.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - No state backend has been 
configured, using default (Memory / JobManager) MemoryStateBackend (data in 
heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 
'null', asynchronous: TRUE, maxStateSize: 5242880)
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.jobmaster.JobManagerRunner@7a3d2f8f @ 
akka://flink/user/jobmanager_41
[flink-akka.actor.default-dispatcher-2] INFO 
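The registration, dispatcher, and leader-election lines above come from
Flink's embedded MiniCluster, which the portable Flink runner's job server
starts in-process for each test job. A minimal standalone sketch, assuming
the MiniClusterConfiguration.Builder and MiniCluster start/close methods of
recent Flink releases (exact signatures may differ between versions):

    import org.apache.flink.runtime.minicluster.MiniCluster;
    import org.apache.flink.runtime.minicluster.MiniClusterConfiguration;

    public class MiniClusterSketch {
      public static void main(String[] args) throws Exception {
        // One TaskManager with one slot, mirroring the single-worker setup above.
        MiniClusterConfiguration config = new MiniClusterConfiguration.Builder()
            .setNumTaskManagers(1)
            .setNumSlotsPerTaskManager(1)
            .build();
        try (MiniCluster cluster = new MiniCluster(config)) {
          cluster.start();  // emits dispatcher/leader-election INFO lines like those above
          // ... submit a JobGraph here; the dispatcher then logs "Submitting job ..." ...
        }
      }
    }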

Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #1003

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[huangry] Update worker container version to most recent release.

--
[...truncated 272.84 KB...]
}
at 
com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:432)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
at 
com.google.cloud.hadoop.util.AbstractGoogleAsyncWriteChannel$UploadOperation.call(AbstractGoogleAsyncWriteChannel.java:358)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more

Nov 13, 2018 12:21:45 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-hqGbgtCy_7LiJtbkWLfFXw.jar
Nov 13, 2018 12:21:48 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 114 files cached, 29 files newly uploaded
Nov 13, 2018 12:21:49 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7.ReadBounded as step s1
Nov 13, 2018 12:21:49 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Monitor as step s2
Nov 13, 2018 12:21:49 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Snoop as step s3
Nov 13, 2018 12:21:49 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/IsBid/ParDo(Anonymous) as step s4
Nov 13, 2018 12:21:49 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/AsBid as step s5
Nov 13, 2018 12:21:49 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Window.Into()/Window.Assign as step s6
Nov 13, 2018 12:21:49 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/BidToPrice as step s7
Nov 13, 2018 12:21:49 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/WithKeys/AddKeys/Map
 as step s8
Nov 13, 2018 12:21:49 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/AddNonce
 as step s9
Nov 13, 2018 12:21:49 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/GroupByKey
 as step s10
Nov 13, 2018 12:21:49 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/Combine.GroupedValues
 as step s11
Nov 13, 2018 12:21:49 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/StripNonce/Map
 as step s12
Nov 13, 2018 12:21:49 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Window.Remerge/Identity/Map
 as step s13
Nov 13, 2018 12:21:49 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PrepareCold/Map
 as step s14
Nov 13, 2018 12:21:49 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
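The AddNonce / PreCombineHot / StripNonce / PrepareCold steps being added
above are the expansion Beam generates for a global combine with hot-key
fanout. A minimal sketch of the Java transform that produces this shape
(the input values and the fanout factor are illustrative, not taken from
the Nexmark job above):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.Max;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionView;

    public class FanoutCombineSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        PCollection<Long> prices = p.apply(Create.of(10L, 25L, 7L));  // stand-in for bid prices

        // Combine.globally(...).withFanout(...).asSingletonView() expands into the
        // Combine.perKeyWithFanout / AddNonce / PreCombineHot / StripNonce steps above.
        PCollectionView<Long> maxPrice =
            prices.apply(Combine.globally(Max.ofLongs())
                .withFanout(16)
                .asSingletonView());

        p.run().waitUntilFinish();
      }
    }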

Build failed in Jenkins: beam_PostCommit_Python_VR_Flink #722

2018-11-12 Thread Apache Jenkins Server
See 


--
[...truncated 4.44 MB...]
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster 
- Starting job dispatcher(s) for JobManager
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - 
http://localhost:42985 was granted leadership with 
leaderSessionID=9d1f7a6b-dec1-42a3-82db-47f9dd978735
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader http://localhost:42985 , 
session=9d1f7a6b-dec1-42a3-82db-47f9dd978735
[flink-runner-job-server] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService 
- Starting RPC endpoint for 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher at 
akka://flink/user/dispatchercb00d36c-5e03-4bcb-8f42-b852a239d778 .
[flink-runner-job-server] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher@2f3b0bc6 @ 
akka://flink/user/dispatchercb00d36c-5e03-4bcb-8f42-b852a239d778
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster 
- Flink Mini Cluster started successfully
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Dispatcher 
akka://flink/user/dispatchercb00d36c-5e03-4bcb-8f42-b852a239d778 was granted 
leadership with fencing token e736aa4f-3a2f-4f72-917c-9916a4f10d25
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Recovering all 
persisted jobs.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader 
akka://flink/user/dispatchercb00d36c-5e03-4bcb-8f42-b852a239d778 , 
session=e736aa4f-3a2f-4f72-917c-9916a4f10d25
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Submitting job 
27136b6d885a4430ad81d09a2c68ec1a (test_windowing_1542067906.48).
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.JobMaster at akka://flink/user/jobmanager_41 
.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Initializing job 
test_windowing_1542067906.48 (27136b6d885a4430ad81d09a2c68ec1a).
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Using restart strategy 
NoRestartStrategy for test_windowing_1542067906.48 
(27136b6d885a4430ad81d09a2c68ec1a).
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.slotpool.SlotPool at 
akka://flink/user/b0a3feb5-781c-4262-9139-54579258b42e .
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job recovers via 
failover strategy: full graph restart
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Running initialization on master 
for job test_windowing_1542067906.48 (27136b6d885a4430ad81d09a2c68ec1a).
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Successfully ran initialization 
on master in 0 ms.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - No state backend has been 
configured, using default (Memory / JobManager) MemoryStateBackend (data in 
heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 
'null', asynchronous: TRUE, maxStateSize: 5242880)
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.jobmaster.JobManagerRunner@7fe077a7 @ 
akka://flink/user/jobmanager_41
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobManagerRunner - JobManager runner for job 
test_windowing_1542067906.48 (27136b6d885a4430ad81d09a2c68ec1a) was granted 
leadership with session id ebf542c8-df91-486d-821e-d5060bce5d3b at 
akka://flink/user/jobmanager_41.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Starting execution of job 
test_windowing_1542067906.48 (27136b6d885a4430ad81d09a2c68ec1a)
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job 
test_windowing_1542067906.48 (27136b6d885a4430ad81d09a2c68ec1a) switched from 
state CREATED to RUNNING.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Source: Custom 

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #1877

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[markliu] [BEAM-5953] Fix DataflowRunner in Python 3 - type errors

[github] [BEAM-5446] SplittableDoFn: Remove "internal" methods for public API

[lcwik] [BEAM-6037] Make Spark runner pipeline translation based on URNs (#7005)

[lcwik] [BEAM-3608] Vendor guava 20.0 (#6809)

--
[...truncated 55.18 MB...]
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr-runtime/3.1.2/c4ca32c2be1b22a5553dd3171f51f9b2b04030b/antlr-runtime-3.1.2.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1113010144-79616b1/output/results/staging/antlr-runtime-3.1.2-dpdEvr2kPK4dWHj9fUOv1A.jar
Nov 13, 2018 1:01:46 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.xerial.snappy/snappy-java/1.1.4/d94ae6d7d27242eaa4b6c323f881edbb98e48da6/snappy-java-1.1.4.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1113010144-79616b1/output/results/staging/snappy-java-1.1.4-SFNwbMuGq13aaoKVzeS1Tw.jar
Nov 13, 2018 1:01:46 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.squareup.okhttp/okhttp/2.5.0/4de2b4ed3445c37ec1720a7d214712e845a24636/okhttp-2.5.0.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1113010144-79616b1/output/results/staging/okhttp-2.5.0-64v0X4G_nxfR_PsuymOqpg.jar
Nov 13, 2018 1:01:46 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.ow2.asm/asm/4.0/659add6efc75a4715d738e73f07505246edf4d66/asm-4.0.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1113010144-79616b1/output/results/staging/asm-4.0-Mi2PiMURGvYS34OMAZHNfg.jar
Nov 13, 2018 1:01:46 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.18/e4a441249ade301985cb8d009d4e4a72b85bf68e/snakeyaml-1.18.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1113010144-79616b1/output/results/staging/snakeyaml-1.18-R-oOCdgXK4J6PV-saHyToQ.jar
Nov 13, 2018 1:01:46 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.httpcomponents/httpcore/4.4.6/e3fd8ced1f52c7574af952e2e6da0df8df08eb82/httpcore-4.4.6.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1113010144-79616b1/output/results/staging/httpcore-4.4.6-qfvVA-CAJQfv7q_7VrvfUg.jar
Nov 13, 2018 1:01:46 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/ognl/ognl/3.1.12/a7fa0db32f882cd3bb41ec6c489853b3bfb6aebc/ognl-3.1.12.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1113010144-79616b1/output/results/staging/ognl-3.1.12-aq6HFAi1aHiNX6qcImOsmw.jar
Nov 13, 2018 1:01:46 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.beanshell/bsh/2.0b4/a05f0a0feefa8d8467ac80e16e7de071489f0d9c/bsh-2.0b4.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1113010144-79616b1/output/results/staging/bsh-2.0b4-ocYKqDycmmyyORwcG4XrAA.jar
Nov 13, 2018 1:01:47 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.guava/guava/20.0/89507701249388e1ed5ddcf8c41f4ce1be7831ef/guava-20.0.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1113010144-79616b1/output/results/staging/guava-20.0-8yqKJSRiDb7Mn2v2ogwpPw.jar
Nov 13, 2018 1:01:47 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/io.netty/netty-tcnative-boringssl-static/2.0.8.Final/5c3483dfa33cd04f5469c95abf67e1b69a8f1221/netty-tcnative-boringssl-static-2.0.8.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1113010144-79616b1/output/results/staging/netty-tcnative-boringssl-static-2.0.8.Final-RCm0wU8kBdzNqqi47Z837A.jar
Nov 13, 2018 1:01:47 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.javassist/javassist/3.20.0-GA/a9cbcdfb7e9f86fbc74d3afae65f2248bfbf82a0/javassist-3.20.0-GA.jar
 to 
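The uploads above come from the runner staging every classpath jar to GCS
with a content hash appended to the file name, which is why each jar in the
log ends in a short base64-like suffix. A minimal sketch of the options that
control this, assuming Beam's Java DataflowPipelineOptions (the bucket paths
are illustrative):

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class StagingOptionsSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        // Jars are uploaded under stagingLocation; files already present
        // (matched by hash) are counted as "cached" in the summary line.
        options.setStagingLocation("gs://my-bucket/staging");  // illustrative bucket
        options.setTempLocation("gs://my-bucket/temp");        // illustrative bucket
      }
    }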

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #1878

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[huangry] Update worker container version to most recent release.

--
[...truncated 54.30 MB...]
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.18/e4a441249ade301985cb8d009d4e4a72b85bf68e/snakeyaml-1.18.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/snakeyaml-1.18-R-oOCdgXK4J6PV-saHyToQ.jar
Nov 13, 2018 2:01:08 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.antlr/antlr-runtime/3.1.2/c4ca32c2be1b22a5553dd3171f51f9b2b04030b/antlr-runtime-3.1.2.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/antlr-runtime-3.1.2-dpdEvr2kPK4dWHj9fUOv1A.jar
Nov 13, 2018 2:01:08 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/ognl/ognl/3.1.12/a7fa0db32f882cd3bb41ec6c489853b3bfb6aebc/ognl-3.1.12.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/ognl-3.1.12-aq6HFAi1aHiNX6qcImOsmw.jar
Nov 13, 2018 2:01:08 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/io.netty/netty-tcnative-boringssl-static/2.0.8.Final/5c3483dfa33cd04f5469c95abf67e1b69a8f1221/netty-tcnative-boringssl-static-2.0.8.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/netty-tcnative-boringssl-static-2.0.8.Final-RCm0wU8kBdzNqqi47Z837A.jar
Nov 13, 2018 2:01:08 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.beanshell/bsh/2.0b4/a05f0a0feefa8d8467ac80e16e7de071489f0d9c/bsh-2.0b4.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/bsh-2.0b4-ocYKqDycmmyyORwcG4XrAA.jar
Nov 13, 2018 2:01:08 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/io.netty/netty-common/4.1.25.Final/e17d5c05c101fe14536ce3fb34b36c54e04791f6/netty-common-4.1.25.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/netty-common-4.1.25.Final-cYwMY1_F2pzboFUnBL3CxQ.jar
Nov 13, 2018 2:01:08 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-lrT1R8SMeyI_cgvx4NWfww.jar
Nov 13, 2018 2:01:08 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.testng/testng/6.8.21/15e02d8d7be3c3640b585b97eda56026fdb5bf4d/testng-6.8.21.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/testng-6.8.21-4nE6Pd5l58HFjEGfLzL88Q.jar
Nov 13, 2018 2:01:08 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/beam-vendor-java-grpc-v1-2.9.0-SNAPSHOT-HjrNvMjG9CsQJ1maWDon0w.jar
Nov 13, 2018 2:01:08 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/io.grpc/grpc-netty-shaded/1.13.1/ccdc4f2c2791d93164c574fbfb90d614aa0849ae/grpc-netty-shaded-1.13.1.jar
 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/grpc-netty-shaded-1.13.1-YS-LK7_gIZl0B4Kw_7Rw7A.jar
Nov 13, 2018 2:01:08 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-end-to-end-tests/testpipeline-jenkins-1113020105-15ff7659/output/results/staging/beam-runners-direct-java-2.9.0-SNAPSHOT-vg9cZryWT8X9iF3ZS4VmSQ.jar
Nov 13, 2018 2:01:16 AM org.apache.beam.runners.dataflow.util.PackageUtil 

Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #2184

2018-11-12 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Gearpump_Gradle #1307

2018-11-12 Thread Apache Jenkins Server
See 


--
[...truncated 9.84 MB...]
Nov 13, 2018 12:26:54 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 14
Nov 13, 2018 12:26:54 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 15
Nov 13, 2018 12:26:54 AM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 13, 2018 12:26:54 AM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 13, 2018 12:26:54 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 16
Nov 13, 2018 12:26:54 AM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 13, 2018 12:26:54 AM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 13, 2018 12:26:54 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 17
Nov 13, 2018 12:26:54 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 18
Nov 13, 2018 12:26:54 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 19
Nov 13, 2018 12:26:54 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 20
Nov 13, 2018 12:26:54 AM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 13, 2018 12:26:54 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 21
Nov 13, 2018 12:26:54 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 22
Nov 13, 2018 12:26:54 AM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 13, 2018 12:26:54 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 23
Nov 13, 2018 12:26:54 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 24
Nov 13, 2018 12:26:54 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 25
Nov 13, 2018 12:26:54 AM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 13, 2018 12:26:54 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 27
Nov 13, 2018 12:26:54 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 28
Nov 13, 2018 12:26:54 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 29
Nov 13, 2018 12:26:54 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 26
Nov 13, 2018 12:26:54 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 31
Nov 13, 2018 12:26:54 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 32
Nov 13, 2018 12:26:54 AM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: 
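The clock value -9223372036854775808 repeated in every onStartTask line
above is Long.MIN_VALUE, which Gearpump appears to use as the sentinel for
a watermark clock that has not advanced yet:

    public class ClockSentinel {
      public static void main(String[] args) {
        // Matches the "clock: -9223372036854775808" values in the Gearpump log above.
        System.out.println(Long.MIN_VALUE);
      }
    }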

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow_Gradle #82

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[github] [BEAM-5446] SplittableDoFn: Remove "internal" methods for public API

[lcwik] [BEAM-6037] Make Spark runner pipeline translation based on URNs (#7005)

[huangry] Update worker container version to most recent release.

[lcwik] [BEAM-3608] Vendor guava 20.0 (#6809)

--
[...truncated 17.72 MB...]
INFO: 2018-11-13T02:47:37.624Z: Expanding CollectionToSingleton operations 
into optimizable parts.
Nov 13, 2018 2:47:41 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:47:37.727Z: Expanding CoGroupByKey operations into 
optimizable parts.
Nov 13, 2018 2:47:41 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:47:38.028Z: Expanding GroupByKey operations into 
optimizable parts.
Nov 13, 2018 2:47:41 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:47:38.139Z: Fusing adjacent ParDo, Read, Write, and 
Flatten operations
Nov 13, 2018 2:47:41 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:47:38.184Z: Elided trivial flatten 
Nov 13, 2018 2:47:41 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:47:38.221Z: Elided trivial flatten 
Nov 13, 2018 2:47:41 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:47:38.270Z: Elided trivial flatten 
Nov 13, 2018 2:47:41 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:47:38.321Z: Unzipping flatten s43 for input 
s31.org.apache.beam.sdk.values.PCollection.:402#536e37dc9c742e20
Nov 13, 2018 2:47:41 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:47:38.363Z: Fusing unzipped copy of 
PAssert$27/GroupGlobally/GroupDummyAndContents/Reify, through flatten 
PAssert$27/GroupGlobally/FlattenDummyAndContents, into producer 
PAssert$27/GroupGlobally/KeyForDummy/AddKeys/Map
Nov 13, 2018 2:47:41 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:47:38.395Z: Fusing consumer 
PAssert$27/GroupGlobally/GroupDummyAndContents/GroupByWindow into 
PAssert$27/GroupGlobally/GroupDummyAndContents/Read
Nov 13, 2018 2:47:41 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:47:38.435Z: Fusing consumer PAssert$27/GetPane/Map into 
PAssert$27/GroupGlobally/ParDo(Concat)
Nov 13, 2018 2:47:41 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:47:38.473Z: Fusing consumer 
PAssert$27/VerifyAssertions/ParDo(DefaultConclude) into PAssert$27/RunChecks
Nov 13, 2018 2:47:41 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:47:38.521Z: Fusing consumer PAssert$27/RunChecks into 
PAssert$27/GetPane/Map
Nov 13, 2018 2:47:41 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:47:38.596Z: Fusing consumer 
PAssert$27/GroupGlobally/Values/Values/Map into 
PAssert$27/GroupGlobally/GroupDummyAndContents/GroupByWindow
Nov 13, 2018 2:47:41 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:47:38.647Z: Fusing consumer 
PAssert$27/GroupGlobally/ParDo(Concat) into 
PAssert$27/GroupGlobally/Values/Values/Map
Nov 13, 2018 2:47:41 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:47:38.696Z: Unzipping flatten s43-u80 for input 
s45-reify-value58-c78
Nov 13, 2018 2:47:41 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:47:38.738Z: Fusing unzipped copy of 
PAssert$27/GroupGlobally/GroupDummyAndContents/Write, through flatten s43-u80, 
into producer PAssert$27/GroupGlobally/GroupDummyAndContents/Reify
Nov 13, 2018 2:47:41 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:47:38.788Z: Fusing consumer 
PAssert$27/GroupGlobally/GroupDummyAndContents/Reify into 
PAssert$27/GroupGlobally/WindowIntoDummy/Window.Assign
Nov 13, 2018 2:47:41 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:47:38.837Z: Fusing consumer 
PAssert$27/GroupGlobally/GroupDummyAndContents/Write into 
PAssert$27/GroupGlobally/GroupDummyAndContents/Reify
Nov 13, 2018 2:47:41 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T02:47:38.887Z: Fusing consumer 

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Gearpump_Gradle #1306

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-3608] Vendor guava 20.0 (#6809)

--
[...truncated 9.80 MB...]
Nov 12, 2018 11:22:54 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 11:22:54 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 14
Nov 12, 2018 11:22:54 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 11:22:54 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 15
Nov 12, 2018 11:22:54 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 11:22:54 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 16
Nov 12, 2018 11:22:54 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 17
Nov 12, 2018 11:22:54 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 18
Nov 12, 2018 11:22:54 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 19
Nov 12, 2018 11:22:54 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 20
Nov 12, 2018 11:22:54 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 21
Nov 12, 2018 11:22:54 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 22
Nov 12, 2018 11:22:54 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 11:22:54 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 23
Nov 12, 2018 11:22:54 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 11:22:54 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 25
Nov 12, 2018 11:22:54 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 24
Nov 12, 2018 11:22:54 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 28
Nov 12, 2018 11:22:54 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 26
Nov 12, 2018 11:22:54 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 27
Nov 12, 2018 11:22:54 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 29
Nov 12, 2018 11:22:54 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 11:22:54 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 30
Nov 12, 2018 11:22:54 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 11:22:54 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 35
Nov 12, 2018 11:22:54 PM org.apache.gearpump.streaming.task.TaskActor 

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Samza_Gradle #1276

2018-11-12 Thread Apache Jenkins Server
See 


--
[...truncated 46.64 MB...]
INFO: Starting stores in task instance Partition 0
Nov 13, 2018 12:04:55 AM org.apache.samza.util.Logging$class info
INFO: Got non logged storage partition directory as 
/tmp/beam-samza-test/beamStore/Partition_0
Nov 13, 2018 12:04:55 AM org.apache.samza.util.Logging$class info
INFO: Got logged storage partition directory as 
/tmp/beam-samza-test/beamStore/Partition_0
Nov 13, 2018 12:04:55 AM org.apache.samza.util.Logging$class info
INFO: Deleting logged storage partition directory 
/tmp/beam-samza-test/beamStore/Partition_0.
Nov 13, 2018 12:04:55 AM org.apache.samza.util.Logging$class info
INFO: Using non logged storage partition directory: 
/tmp/beam-samza-test/beamStore/Partition_0 for store: beamStore.
Nov 13, 2018 12:04:55 AM org.apache.samza.util.Logging$class info
INFO: Validating change log streams: Map()
Nov 13, 2018 12:04:55 AM org.apache.samza.util.Logging$class info
INFO: Got change log stream metadata: Map()
Nov 13, 2018 12:04:55 AM org.apache.samza.util.Logging$class info
INFO: Assigning oldest change log offsets for taskName Partition 0: Map()
Nov 13, 2018 12:04:55 AM org.apache.samza.util.Logging$class info
INFO: Starting table manager in task instance Partition 0
Nov 13, 2018 12:04:55 AM org.apache.samza.util.Logging$class info
INFO: Starting host statistics monitor
Nov 13, 2018 12:04:55 AM org.apache.samza.util.Logging$class info
INFO: Registering task instances with producers.
Nov 13, 2018 12:04:55 AM org.apache.samza.util.Logging$class info
INFO: Starting producer multiplexer.
Nov 13, 2018 12:04:55 AM org.apache.samza.util.Logging$class info
INFO: Initializing stream tasks.
Nov 13, 2018 12:04:55 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
0-split0_out__PCollection_. Keys will not be (de)serialized
Nov 13, 2018 12:04:55 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
0-split0_out__PCollection_. Values will not be (de)serialized
Nov 13, 2018 12:04:55 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
1-split1_out__PCollection_. Keys will not be (de)serialized
Nov 13, 2018 12:04:55 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
1-split1_out__PCollection_. Values will not be (de)serialized
Nov 13, 2018 12:04:55 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
2-split2_out__PCollection_. Keys will not be (de)serialized
Nov 13, 2018 12:04:55 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
2-split2_out__PCollection_. Values will not be (de)serialized
Nov 13, 2018 12:04:55 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
3-split3_out__PCollection_. Keys will not be (de)serialized
Nov 13, 2018 12:04:55 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
3-split3_out__PCollection_. Values will not be (de)serialized
Nov 13, 2018 12:04:55 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
4-split4_out__PCollection_. Keys will not be (de)serialized
Nov 13, 2018 12:04:55 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
4-split4_out__PCollection_. Values will not be (de)serialized
Nov 13, 2018 12:04:55 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
5-split5_out__PCollection_. Keys will not be (de)serialized
Nov 13, 2018 12:04:55 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
5-split5_out__PCollection_. Values will not be (de)serialized
Nov 13, 2018 12:04:55 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
6-split6_out__PCollection_. Keys will not be (de)serialized
Nov 13, 2018 12:04:55 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
6-split6_out__PCollection_. Values will not be (de)serialized
Nov 13, 2018 12:04:55 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
7-split7_out__PCollection_. Keys will not be (de)serialized
Nov 13, 2018 12:04:55 AM 
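The NoOpSerde lines above mean keys and values cross those intermediate
streams by reference, without (de)serialization. As an illustration only
(a hand-rolled stand-in, not Samza's actual
org.apache.samza.serializers.NoOpSerde, whose signatures may differ), a
pass-through serde reduces to:

    // Minimal serde contract, declared locally for the sketch.
    interface Serde<T> {
      byte[] toBytes(T obj);
      T fromBytes(byte[] bytes);
    }

    // Pass-through: both directions are unsupported because objects are
    // handed over in memory and never hit the wire.
    final class PassThroughSerde<T> implements Serde<T> {
      @Override
      public byte[] toBytes(T obj) {
        throw new UnsupportedOperationException("pass-through serde does not serialize");
      }

      @Override
      public T fromBytes(byte[] bytes) {
        throw new UnsupportedOperationException("pass-through serde does not deserialize");
      }
    }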

Build failed in Jenkins: beam_PostCommit_Python_Verify #6538

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[huangry] Update worker container version to most recent release.

--
[...truncated 1.38 MB...]
test_delete_error (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_delete_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_exists (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory_trailing_slash 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_match_file_empty 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_limits 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_zero_limit 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs_failed (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_open (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_open_bad_path (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_rename_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... SKIP: This test 
still needs to be fixed on Python 3. TODO: BEAM-5627
test_scheme (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_size (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_join (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_split (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_create_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_table_fails_not_found 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_succeeds (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... SKIP: IT is 
skipped because --test-pipeline-options is not specified
get_test_rows (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: 
GCP dependencies are not installed
test_read_from_query (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_query_sql_format 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_query_unflatten_records 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_table_and_job_complete_retry 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_and_multiple_pages 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_as_tablerows 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_table_schema_without_project 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_both_query_and_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_neither_query_nor_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_nested_schema_as_json (apache_beam.io.gcp.bigquery_test.TestBigQuerySink) 
... SKIP: GCP dependencies are not installed
test_parse_schema_descriptor 

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Samza_Gradle #1277

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[huangry] Update worker container version to most recent release.

--
[...truncated 46.63 MB...]
INFO: Starting stores in task instance Partition 0
Nov 13, 2018 12:20:20 AM org.apache.samza.util.Logging$class info
INFO: Got non logged storage partition directory as 
/tmp/beam-samza-test/beamStore/Partition_0
Nov 13, 2018 12:20:20 AM org.apache.samza.util.Logging$class info
INFO: Got logged storage partition directory as 
/tmp/beam-samza-test/beamStore/Partition_0
Nov 13, 2018 12:20:20 AM org.apache.samza.util.Logging$class info
INFO: Deleting logged storage partition directory 
/tmp/beam-samza-test/beamStore/Partition_0.
Nov 13, 2018 12:20:20 AM org.apache.samza.util.Logging$class info
INFO: Using non logged storage partition directory: 
/tmp/beam-samza-test/beamStore/Partition_0 for store: beamStore.
Nov 13, 2018 12:20:20 AM org.apache.samza.util.Logging$class info
INFO: Validating change log streams: Map()
Nov 13, 2018 12:20:20 AM org.apache.samza.util.Logging$class info
INFO: Got change log stream metadata: Map()
Nov 13, 2018 12:20:20 AM org.apache.samza.util.Logging$class info
INFO: Assigning oldest change log offsets for taskName Partition 0: Map()
Nov 13, 2018 12:20:20 AM org.apache.samza.util.Logging$class info
INFO: Starting table manager in task instance Partition 0
Nov 13, 2018 12:20:20 AM org.apache.samza.util.Logging$class info
INFO: Starting host statistics monitor
Nov 13, 2018 12:20:20 AM org.apache.samza.util.Logging$class info
INFO: Registering task instances with producers.
Nov 13, 2018 12:20:20 AM org.apache.samza.util.Logging$class info
INFO: Starting producer multiplexer.
Nov 13, 2018 12:20:20 AM org.apache.samza.util.Logging$class info
INFO: Initializing stream tasks.
Nov 13, 2018 12:20:20 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
0-split0_out__PCollection_. Keys will not be (de)serialized
Nov 13, 2018 12:20:20 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
0-split0_out__PCollection_. Values will not be (de)serialized
Nov 13, 2018 12:20:20 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
1-split1_out__PCollection_. Keys will not be (de)serialized
Nov 13, 2018 12:20:20 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
1-split1_out__PCollection_. Values will not be (de)serialized
Nov 13, 2018 12:20:20 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
2-split2_out__PCollection_. Keys will not be (de)serialized
Nov 13, 2018 12:20:20 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
2-split2_out__PCollection_. Values will not be (de)serialized
Nov 13, 2018 12:20:20 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
3-split3_out__PCollection_. Keys will not be (de)serialized
Nov 13, 2018 12:20:20 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
3-split3_out__PCollection_. Values will not be (de)serialized
Nov 13, 2018 12:20:20 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
4-split4_out__PCollection_. Keys will not be (de)serialized
Nov 13, 2018 12:20:20 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
4-split4_out__PCollection_. Values will not be (de)serialized
Nov 13, 2018 12:20:20 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
5-split5_out__PCollection_. Keys will not be (de)serialized
Nov 13, 2018 12:20:20 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
5-split5_out__PCollection_. Values will not be (de)serialized
Nov 13, 2018 12:20:20 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
6-split6_out__PCollection_. Keys will not be (de)serialized
Nov 13, 2018 12:20:20 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
6-split6_out__PCollection_. Values will not be (de)serialized
Nov 13, 2018 12:20:20 AM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
7-split7_out__PCollection_. Keys will 

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Flink_Gradle #2192

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[huangry] Update worker container version to most recent release.

--
[...truncated 67.60 MB...]
Nov 13, 2018 12:37:01 AM org.apache.flink.runtime.executiongraph.Execution 
transitionState
INFO: Source: 
GenerateSequence/Read(UnboundedCountingSource)/Create/Read(CreateSource) -> 
GenerateSequence/Read(UnboundedCountingSource)/Split/ParMultiDo(Split) -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Pair with random 
key/ParMultiDo(AssignShard) -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/Window.Into()/Window.Assign.out
 -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> ToKeyedWorkItem (1/1) (a3644361058e582910f01bf3f268fa90) switched from 
CREATED to SCHEDULED.
Nov 13, 2018 12:37:01 AM org.apache.flink.runtime.executiongraph.Execution 
transitionState
INFO: 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/GroupByKey 
-> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)
 -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Values/Values/Map/ParMultiDo(Anonymous)
 -> GenerateSequence/Read(UnboundedCountingSource)/Read/ParMultiDo(Read) -> 
GenerateSequence/Read(UnboundedCountingSource)/StripIds/ParMultiDo(StripIds) -> 
ParDo(Counting)/ParMultiDo(Counting) (1/1) (4bdcf8ce479b7918e1ffb26064e5c57e) 
switched from CREATED to SCHEDULED.
Nov 13, 2018 12:37:01 AM 
org.apache.flink.runtime.jobmaster.slotpool.SlotPool 
stashRequestWaitingForResourceManager
INFO: Cannot serve slot request, no ResourceManager connected. Adding as 
pending request [SlotRequestId{ebbd15d3c822ebb21669a097d6cb8dcd}]
Nov 13, 2018 12:37:01 AM 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
confirmLeader
INFO: Received confirmation of leadership for leader 
akka://flink/user/jobmanager_237 , session=bff6633d-ca63-4521-b98c-9ac389fa764e
Nov 13, 2018 12:37:01 AM org.apache.flink.runtime.jobmaster.JobMaster 
connectToResourceManager
INFO: Connecting to ResourceManager 
akka://flink/user/resourcemanager_b6225048-a202-48b9-9dbd-3a4331c362f5(8a652b825d533aff109b5fec227f48f6)
Nov 13, 2018 12:37:01 AM 
org.apache.flink.runtime.registration.RetryingRegistration 
lambda$startRegistration$0
INFO: Resolved ResourceManager address, beginning registration
Nov 13, 2018 12:37:01 AM 
org.apache.flink.runtime.registration.RetryingRegistration register
INFO: Registration at ResourceManager attempt 1 (timeout=100ms)
Nov 13, 2018 12:37:01 AM 
org.apache.flink.runtime.resourcemanager.ResourceManager registerJobManager
INFO: Registering job manager 
b98c9ac389fa764ebff6633dca634521@akka://flink/user/jobmanager_237 for job 
9c1e8c68ccb2a023393cf42aa3201479.
Nov 13, 2018 12:37:01 AM 
org.apache.flink.runtime.resourcemanager.ResourceManager 
registerJobMasterInternal
INFO: Registered job manager 
b98c9ac389fa764ebff6633dca634521@akka://flink/user/jobmanager_237 for job 
9c1e8c68ccb2a023393cf42aa3201479.
Nov 13, 2018 12:37:01 AM org.apache.flink.runtime.jobmaster.JobMaster 
establishResourceManagerConnection
INFO: JobManager successfully registered at ResourceManager, leader id: 
8a652b825d533aff109b5fec227f48f6.
Nov 13, 2018 12:37:01 AM 
org.apache.flink.runtime.jobmaster.slotpool.SlotPool 
requestSlotFromResourceManager
INFO: Requesting new slot [SlotRequestId{ebbd15d3c822ebb21669a097d6cb8dcd}] 
and profile ResourceProfile{cpuCores=-1.0, heapMemoryInMB=-1, 
directMemoryInMB=0, nativeMemoryInMB=0, networkMemoryInMB=0} from resource 
manager.
Nov 13, 2018 12:37:01 AM 
org.apache.flink.runtime.resourcemanager.ResourceManager requestSlot
INFO: Request slot with profile ResourceProfile{cpuCores=-1.0, 
heapMemoryInMB=-1, directMemoryInMB=0, nativeMemoryInMB=0, networkMemoryInMB=0} 
for job 9c1e8c68ccb2a023393cf42aa3201479 with allocation id 
AllocationID{11e6526b60158a7c824e4e21ca9b1f7d}.
Nov 13, 2018 12:37:01 AM org.apache.flink.runtime.taskexecutor.TaskExecutor 
requestSlot
INFO: Receive slot request AllocationID{11e6526b60158a7c824e4e21ca9b1f7d} 
for job 9c1e8c68ccb2a023393cf42aa3201479 from resource manager with leader id 
8a652b825d533aff109b5fec227f48f6.
Nov 13, 2018 12:37:01 AM org.apache.flink.runtime.taskexecutor.TaskExecutor 
requestSlot
INFO: Allocated slot for AllocationID{11e6526b60158a7c824e4e21ca9b1f7d}.
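The fused source in this execution graph is Beam's GenerateSequence running
as an unbounded counting source on Flink. A minimal sketch of the transform
behind those Read(UnboundedCountingSource)/Split/Reshuffle step names (the
rate is arbitrary):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.GenerateSequence;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    public class GenerateSequenceSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        // With no .to(...) bound this reads as an unbounded counting source and
        // expands into the Split / Reshuffle / StripIds steps named above.
        PCollection<Long> seq =
            p.apply(GenerateSequence.from(0).withRate(10, Duration.standardSeconds(1)));
        p.run();
      }
    }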

Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Flink_Gradle #2193

2018-11-12 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #1879

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[github] Clarify in docstrings that we expect  TFRecord values to be bytes

[github] [BEAM-6050] Use correct type on @ProcessElement method for

--
[...truncated 55.82 MB...]
Nov 13, 2018 3:57:32 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/ru.vyarus/generics-resolver/2.0.1/2182e67f161ddbe3ff8cb055bb54398354fda3f5/generics-resolver-2.0.1.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1113035730-91c95d9c/output/results/staging/generics-resolver-2.0.1-VrpA4CuIscdXm0wF6oMTaQ.jar
Nov 13, 2018 3:57:32 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.esotericsoftware.minlog/minlog/1.2/59bfcd171d82f9981a5e242b9e840191f650e209/minlog-1.2.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1113035730-91c95d9c/output/results/staging/minlog-1.2-98-99jtn3wu_pMfLJgiFvA.jar
Nov 13, 2018 3:57:32 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.squareup.okhttp/okhttp/2.5.0/4de2b4ed3445c37ec1720a7d214712e845a24636/okhttp-2.5.0.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1113035730-91c95d9c/output/results/staging/okhttp-2.5.0-64v0X4G_nxfR_PsuymOqpg.jar
Nov 13, 2018 3:57:32 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.18/e4a441249ade301985cb8d009d4e4a72b85bf68e/snakeyaml-1.18.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1113035730-91c95d9c/output/results/staging/snakeyaml-1.18-R-oOCdgXK4J6PV-saHyToQ.jar
Nov 13, 2018 3:57:32 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.httpcomponents/httpcore/4.4.6/e3fd8ced1f52c7574af952e2e6da0df8df08eb82/httpcore-4.4.6.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1113035730-91c95d9c/output/results/staging/httpcore-4.4.6-qfvVA-CAJQfv7q_7VrvfUg.jar
Nov 13, 2018 3:57:32 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/commons-codec/commons-codec/1.9/9ce04e34240f674bc72680f8b843b1457383161a/commons-codec-1.9.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1113035730-91c95d9c/output/results/staging/commons-codec-1.9-dWFTVmBcgSgBPanjrGKiSQ.jar
Nov 13, 2018 3:57:32 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.javaruntype/javaruntype/1.3/26ba963f4b20c751e07b58b990bb41bf850622d8/javaruntype-1.3.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1113035730-91c95d9c/output/results/staging/javaruntype-1.3-rOez6Sa6vLqndviPQ7_noQ.jar
Nov 13, 2018 3:57:32 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.esotericsoftware.reflectasm/reflectasm/1.07/76f11c94a53ee975a0d9154b325c408b210155bd/reflectasm-1.07-shaded.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1113035730-91c95d9c/output/results/staging/reflectasm-1.07-shaded-IELCIoQPsh49E873w5ZYMQ.jar
Nov 13, 2018 3:57:32 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/io.netty/netty-tcnative-boringssl-static/2.0.8.Final/5c3483dfa33cd04f5469c95abf67e1b69a8f1221/netty-tcnative-boringssl-static-2.0.8.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1113035730-91c95d9c/output/results/staging/netty-tcnative-boringssl-static-2.0.8.Final-RCm0wU8kBdzNqqi47Z837A.jar
Nov 13, 2018 3:57:32 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/io.netty/netty-common/4.1.25.Final/e17d5c05c101fe14536ce3fb34b36c54e04791f6/netty-common-4.1.25.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1113035730-91c95d9c/output/results/staging/netty-common-4.1.25.Final-cYwMY1_F2pzboFUnBL3CxQ.jar
Nov 13, 2018 3:57:32 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
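
Aside on the staged names above: each jar is uploaded as
<base-name>-<hash>.jar, where the suffix is a URL-safe base64 content
digest. A minimal Java sketch of that naming scheme, assuming an MD5
digest (the 22-character suffix matches a 16-byte hash, but the exact
hash function is not visible in this log):

    import com.google.common.hash.Hashing;
    import com.google.common.io.Files;
    import java.io.File;
    import java.util.Base64;

    class StagedNames {
      // Hypothetical helper mirroring names like
      // "snakeyaml-1.18-R-oOCdgXK4J6PV-saHyToQ.jar": base name plus a
      // URL-safe, unpadded base64 encoding of an MD5 content hash.
      static String stagedName(File jar) throws Exception {
        byte[] digest = Files.asByteSource(jar).hash(Hashing.md5()).asBytes();
        String hash =
            Base64.getUrlEncoder().withoutPadding().encodeToString(digest);
        return jar.getName().replaceFirst("\\.jar$", "") + "-" + hash + ".jar";
      }
    }

Content-addressed names let retries and reruns skip jars that are already
staged unchanged.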

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Flink_Gradle #2191

2018-11-12 Thread Apache Jenkins Server
See 


--
[...truncated 67.56 MB...]
Nov 13, 2018 12:13:52 AM org.apache.flink.runtime.executiongraph.Execution 
transitionState
INFO: Source: 
GenerateSequence/Read(UnboundedCountingSource)/Create/Read(CreateSource) -> 
GenerateSequence/Read(UnboundedCountingSource)/Split/ParMultiDo(Split) -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Pair with random 
key/ParMultiDo(AssignShard) -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/Window.Into()/Window.Assign.out
 -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> ToKeyedWorkItem (1/1) (7661eea4ed7f245541ed39acef91d1fb) switched from 
CREATED to SCHEDULED.
Nov 13, 2018 12:13:52 AM org.apache.flink.runtime.executiongraph.Execution 
transitionState
INFO: 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/GroupByKey 
-> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)
 -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Values/Values/Map/ParMultiDo(Anonymous)
 -> GenerateSequence/Read(UnboundedCountingSource)/Read/ParMultiDo(Read) -> 
GenerateSequence/Read(UnboundedCountingSource)/StripIds/ParMultiDo(StripIds) -> 
ParDo(Counting)/ParMultiDo(Counting) (1/1) (5fff4fde72a3eac98017b711d5c0f53b) 
switched from CREATED to SCHEDULED.
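
Aside: the long operator names in this chain are the runner's expansion of
a small unbounded pipeline, not hand-written steps. A sketch of the
user-level shape, assuming a trivial counting DoFn (the rate and the DoFn
body are illustrative and do not come from this log):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.GenerateSequence;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.joda.time.Duration;

    class CountingPipeline {
      public static void main(String[] args) {
        Pipeline p =
            Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply(GenerateSequence.from(0)
                .withRate(1, Duration.standardSeconds(1)))
         // Appears above as ParDo(Counting)/ParMultiDo(Counting); the
         // Reshuffle, Window.Assign and Reify* steps are inserted by the
         // unbounded Read expansion, not written by hand.
         .apply("ParDo(Counting)", ParDo.of(new DoFn<Long, Long>() {
           @ProcessElement
           public void process(@Element Long n, OutputReceiver<Long> out) {
             out.output(n);
           }
         }));
        p.run().waitUntilFinish();
      }
    }
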
Nov 13, 2018 12:13:52 AM 
org.apache.flink.runtime.jobmaster.slotpool.SlotPool 
stashRequestWaitingForResourceManager
INFO: Cannot serve slot request, no ResourceManager connected. Adding as 
pending request [SlotRequestId{0137e075af931a2a8b0bcb5ce5b305bf}]
Nov 13, 2018 12:13:52 AM 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
confirmLeader
INFO: Received confirmation of leadership for leader 
akka://flink/user/jobmanager_237, session=251ad528-1463-4b9a-a3bb-fd9149f515a6
Nov 13, 2018 12:13:52 AM org.apache.flink.runtime.jobmaster.JobMaster 
connectToResourceManager
INFO: Connecting to ResourceManager 
akka://flink/user/resourcemanager_e140dfe4-2cfd-482d-a8ac-5be2e06dd080(9e7b473e972513fafa6023502f044c35)
Nov 13, 2018 12:13:52 AM 
org.apache.flink.runtime.registration.RetryingRegistration 
lambda$startRegistration$0
INFO: Resolved ResourceManager address, beginning registration
Nov 13, 2018 12:13:52 AM 
org.apache.flink.runtime.registration.RetryingRegistration register
INFO: Registration at ResourceManager attempt 1 (timeout=100ms)
Nov 13, 2018 12:13:52 AM 
org.apache.flink.runtime.resourcemanager.ResourceManager registerJobManager
INFO: Registering job manager 
a3bbfd9149f515a6251ad52814634b9a@akka://flink/user/jobmanager_237 for job 
69a2726d337883ad7288f532adebfc60.
Nov 13, 2018 12:13:52 AM 
org.apache.flink.runtime.resourcemanager.ResourceManager 
registerJobMasterInternal
INFO: Registered job manager 
a3bbfd9149f515a6251ad52814634b9a@akka://flink/user/jobmanager_237 for job 
69a2726d337883ad7288f532adebfc60.
Nov 13, 2018 12:13:52 AM org.apache.flink.runtime.jobmaster.JobMaster 
establishResourceManagerConnection
INFO: JobManager successfully registered at ResourceManager, leader id: 
9e7b473e972513fafa6023502f044c35.
Nov 13, 2018 12:13:52 AM 
org.apache.flink.runtime.jobmaster.slotpool.SlotPool 
requestSlotFromResourceManager
INFO: Requesting new slot [SlotRequestId{0137e075af931a2a8b0bcb5ce5b305bf}] 
and profile ResourceProfile{cpuCores=-1.0, heapMemoryInMB=-1, 
directMemoryInMB=0, nativeMemoryInMB=0, networkMemoryInMB=0} from resource 
manager.
Nov 13, 2018 12:13:52 AM 
org.apache.flink.runtime.resourcemanager.ResourceManager requestSlot
INFO: Request slot with profile ResourceProfile{cpuCores=-1.0, 
heapMemoryInMB=-1, directMemoryInMB=0, nativeMemoryInMB=0, networkMemoryInMB=0} 
for job 69a2726d337883ad7288f532adebfc60 with allocation id 
AllocationID{8aaf22608b5b591717a7d29629bbd912}.
Nov 13, 2018 12:13:52 AM org.apache.flink.runtime.taskexecutor.TaskExecutor 
requestSlot
INFO: Receive slot request AllocationID{8aaf22608b5b591717a7d29629bbd912} 
for job 69a2726d337883ad7288f532adebfc60 from resource manager with leader id 
9e7b473e972513fafa6023502f044c35.
Nov 13, 2018 12:13:52 AM org.apache.flink.runtime.taskexecutor.TaskExecutor 
requestSlot
INFO: Allocated slot for AllocationID{8aaf22608b5b591717a7d29629bbd912}.
Nov 13, 2018 12:13:52 AM 
org.apache.flink.runtime.taskexecutor.JobLeaderService addJob

Jenkins build is back to normal : beam_PreCommit_Java_Cron #581

2018-11-12 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Build failed in Jenkins: beam_Release_Gradle_NightlySnapshot #236

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[robertwb] [BEAM-5791] Implement time-based pushback in the dataflow harness 
data

[robertwb] Ensure writer thread is unblocked on errors.

[github] Update instructions to run streaming wordcount.

--
[...truncated 41.57 MB...]
  No history is available.
:beam-vendor-sdks-java-extensions-protobuf:validateShadedJarDoesntLeakNonOrgApacheBeamClasses
 (Thread[Task worker for ':' Thread 7,5,main]) completed. Took 0.059 secs.
:beam-vendor-sdks-java-extensions-protobuf:check (Thread[Task worker for ':' 
Thread 7,5,main]) started.

> Task :beam-vendor-sdks-java-extensions-protobuf:check
Skipping task ':beam-vendor-sdks-java-extensions-protobuf:check' as it has no 
actions.
:beam-vendor-sdks-java-extensions-protobuf:check (Thread[Task worker for ':' 
Thread 7,5,main]) completed. Took 0.0 secs.
:beam-vendor-sdks-java-extensions-protobuf:build (Thread[Task worker for ':' 
Thread 7,5,main]) started.

> Task :beam-vendor-sdks-java-extensions-protobuf:build
Skipping task ':beam-vendor-sdks-java-extensions-protobuf:build' as it has no 
actions.
:beam-vendor-sdks-java-extensions-protobuf:build (Thread[Task worker for ':' 
Thread 7,5,main]) completed. Took 0.0 secs.
:beam-website:assemble (Thread[Task worker for ':' Thread 7,5,main]) started.

> Task :beam-website:assemble UP-TO-DATE
Skipping task ':beam-website:assemble' as it has no actions.
:beam-website:assemble (Thread[Task worker for ':' Thread 7,5,main]) completed. 
Took 0.0 secs.
:beam-website:setupBuildDir (Thread[Task worker for ':' Thread 7,5,main]) 
started.

> Task :beam-website:setupBuildDir
Build cache key for task ':beam-website:setupBuildDir' is 
fb2b365f958e0219d3a886db810dc756
Caching disabled for task ':beam-website:setupBuildDir': Caching has not been 
enabled for the task
Task ':beam-website:setupBuildDir' is not up-to-date because:
  No history is available.
:beam-website:setupBuildDir (Thread[Task worker for ':' Thread 7,5,main]) 
completed. Took 0.005 secs.
:beam-website:buildDockerImage (Thread[Task worker for ':' Thread 7,5,main]) 
started.

> Task :beam-website:buildDockerImage
Caching disabled for task ':beam-website:buildDockerImage': Caching has not 
been enabled for the task
Task ':beam-website:buildDockerImage' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: 

 Command: docker build -t beam-website .
Successfully started process 'command 'docker''
Sending build context to Docker daemon  26.34MB
Step 1/7 : FROM ruby:2.5
2.5: Pulling from library/ruby
Digest: sha256:4fa27d2f60ffb82eb6160d8a2d112d2eea216cc4f1d814d4fd7daea704a6917d
Status: Downloaded newer image for ruby:2.5
 ---> 1dfe66cd651f
Step 2/7 : WORKDIR /ruby
 ---> Using cache
 ---> bec49e663d43
Step 3/7 : RUN gem install bundler
 ---> Using cache
 ---> 380cc7b0fe26
Step 4/7 : ADD Gemfile Gemfile.lock /ruby/
 ---> Using cache
 ---> cf9b1dc8d8cb
Step 5/7 : RUN bundle install --deployment --path $GEM_HOME
 ---> Using cache
 ---> 7178c2f8c9ae
Step 6/7 : ENV LC_ALL C.UTF-8
 ---> Using cache
 ---> 895f003a4dae
Step 7/7 : CMD sleep 3600
 ---> Using cache
 ---> 644f4b7c69fa
Successfully built 644f4b7c69fa
Successfully tagged beam-website:latest
:beam-website:buildDockerImage (Thread[Task worker for ':' Thread 7,5,main]) 
completed. Took 1.047 secs.
:beam-website:createDockerContainer (Thread[Task worker for ':' Thread 
7,5,main]) started.

> Task :beam-website:createDockerContainer
Caching disabled for task ':beam-website:createDockerContainer': Caching has 
not been enabled for the task
Task ':beam-website:createDockerContainer' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command '/bin/bash''. Working directory: 

 Command: /bin/bash -c docker create -v 
:/repo
 -u $(id -u):$(id -g)  beam-website
Successfully started process 'command '/bin/bash''
:beam-website:createDockerContainer (Thread[Task worker for ':' Thread 
7,5,main]) completed. Took 1.09 secs.
:beam-website:startDockerContainer (Thread[Task worker for ':' Thread 
7,5,main]) started.

> Task :beam-website:startDockerContainer
Caching disabled for task ':beam-website:startDockerContainer': Caching has not 
been enabled for the task
Task ':beam-website:startDockerContainer' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: 

 Command: docker start 

Build failed in Jenkins: beam_PostCommit_Python_Verify #6527

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[robertwb] [BEAM-5791] Implement time-based pushback in the dataflow harness 
data

[robertwb] Ensure writer thread is unblocked on errors.

[github] Update instructions to run streaming wordcount.

--
[...truncated 1.38 MB...]
test_delete_error (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_delete_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_exists (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory_trailing_slash 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_match_file_empty 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_limits 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_zero_limit 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs_failed (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_open (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_open_bad_path (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_rename_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... SKIP: This test 
still needs to be fixed on Python 3. TODO: BEAM-5627
test_scheme (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_size (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_join (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_split (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_create_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_table_fails_not_found 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_succeeds (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... SKIP: IT is 
skipped because --test-pipeline-options is not specified
get_test_rows (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: 
GCP dependencies are not installed
test_read_from_query (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_query_sql_format 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_query_unflatten_records 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_table_and_job_complete_retry 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_and_multiple_pages 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_as_tablerows 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_table_schema_without_project 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_both_query_and_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_neither_query_nor_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_nested_schema_as_json 

Build failed in Jenkins: beam_PerformanceTests_XmlIOIT_HDFS #896

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[github] Clarify in docstrings that we expect TFRecord values to be bytes

[github] [BEAM-6050] Use correct type on @ProcessElement method for

--
[...truncated 304.83 KB...]
INFO: 2018-11-13T06:28:07.235Z: Fusing consumer Calculate 
hashcode/Combine.perKey(Hashing)/Combine.GroupedValues into Calculate 
hashcode/Combine.perKey(Hashing)/GroupByKey/Read
Nov 13, 2018 6:28:52 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T06:28:07.337Z: Fusing consumer Calculate 
hashcode/Combine.perKey(Hashing)/GroupByKey/Write into Calculate 
hashcode/Combine.perKey(Hashing)/GroupByKey/Reify
Nov 13, 2018 6:28:52 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T06:28:07.603Z: Fusing consumer Calculate 
hashcode/WithKeys/AddKeys/Map into Map xml records to strings/Map
Nov 13, 2018 6:28:52 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T06:28:07.849Z: Fusing consumer Find 
files/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable into Find 
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
Nov 13, 2018 6:28:52 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T06:28:07.950Z: Fusing consumer Find 
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow into Find 
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
Nov 13, 2018 6:28:52 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T06:28:08.198Z: Fusing consumer Write xml 
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with 
random key into Write xml files/WriteFiles/FinalizeTempFileBundles/Finalize
Nov 13, 2018 6:28:52 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T06:28:08.434Z: Fusing consumer Find 
files/Reshuffle.ViaRandomKey/Values/Values/Map into Find 
files/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
Nov 13, 2018 6:28:52 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T06:28:08.742Z: Fusing consumer Write xml 
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
 into Write xml 
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with 
random key
Nov 13, 2018 6:28:52 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T06:28:08.902Z: Fusing consumer Write xml 
files/WriteFiles/FinalizeTempFileBundles/Finalize into Write xml 
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map
Nov 13, 2018 6:28:52 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T06:28:09.140Z: Fusing consumer Read xml 
files/ReadAllViaFileBasedSource/Split into ranges into Read matched 
files/ParDo(ToReadableFile)
Nov 13, 2018 6:28:52 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T06:28:09.479Z: Fusing consumer Read xml 
files/ReadAllViaFileBasedSource/Read ranges into Read xml 
files/ReadAllViaFileBasedSource/Reshuffle/Values/Values/Map
Nov 13, 2018 6:28:52 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T06:28:09.701Z: Fusing consumer Write xml 
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write into Write 
xml files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify
Nov 13, 2018 6:28:52 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T06:28:10.051Z: Fusing consumer Write xml 
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map 
into Write xml 
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
Nov 13, 2018 6:28:52 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T06:28:10.330Z: Fusing consumer Find 
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write into Find 
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
Nov 13, 2018 6:28:52 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T06:28:10.740Z: Fusing consumer Write xml 
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
 into Write xml 
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
Nov 13, 2018 6:28:52 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-11-13T06:28:10.931Z: Fusing consumer Read xml 
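
Aside: all of these fusion messages belong to XmlIOIT's write-then-read
round trip. A sketch of that shape, assuming a JAXB-annotated Record class
plus illustrative element names and an outputDir path, none of which appear
in this log:

    import org.apache.beam.sdk.io.FileIO;
    import org.apache.beam.sdk.io.xml.XmlIO;

    // Write side: produces the "Write xml files/WriteFiles/..." steps.
    records.apply("Write xml files",
        FileIO.<Record>write()
            .via(XmlIO.sink(Record.class).withRootElement("records"))
            .to(outputDir));

    // Read side: produces the "Find files", "Read matched files" and
    // "Read xml files/ReadAllViaFileBasedSource/..." steps.
    p.apply("Find files", FileIO.match().filepattern(outputDir + "/*"))
     .apply("Read matched files", FileIO.readMatches())
     .apply("Read xml files",
         XmlIO.<Record>readFiles()
             .withRecordClass(Record.class)
             .withRootElement("records")
             .withRecordElement("record"));

The "Calculate hashcode/Combine.perKey(Hashing)" steps then reduce both
sides to a fingerprint so the test can compare written and re-read data.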

Jenkins build is back to normal : beam_PostCommit_Java_PVR_Flink #298

2018-11-12 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Build failed in Jenkins: beam_PostCommit_Python_Verify #6540

2018-11-12 Thread Apache Jenkins Server
See 


--
[...truncated 1.38 MB...]
test_delete_error (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_delete_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_exists (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory_trailing_slash 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_match_file_empty 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_limits 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_zero_limit 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs_failed (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_open (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_open_bad_path (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_rename_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... SKIP: This test 
still needs to be fixed on Python 3. TODO: BEAM-5627
test_scheme (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_size (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_join (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_split (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_create_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_table_fails_not_found 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_succeeds (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... SKIP: IT is 
skipped because --test-pipeline-options is not specified
get_test_rows (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: 
GCP dependencies are not installed
test_read_from_query (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_query_sql_format 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_query_unflatten_records 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_table_and_job_complete_retry 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_and_multiple_pages 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_as_tablerows 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_table_schema_without_project 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_both_query_and_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_neither_query_nor_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_nested_schema_as_json (apache_beam.io.gcp.bigquery_test.TestBigQuerySink) 
... SKIP: GCP dependencies are not installed
test_parse_schema_descriptor 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySink) ... SKIP: GCP dependencies 
are not installed

Jenkins build is back to normal : beam_PostCommit_Java_GradleBuild #1880

2018-11-12 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #1690

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[github] Update built-in.md

--
[...truncated 271.51 KB...]
exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ $PWD != *sdks/python ]]; then
cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
SDK_LOCATION=$(find ${SDK_LOCATION})
  else
python setup.py -q sdist
SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used
  # for running on DirectRunner (some options ignored).
  opts=(
"--runner=$RUNNER"
"--project=$PROJECT"
"--staging_location=$GCS_LOCATION/staging-it"
"--temp_location=$GCS_LOCATION/temp-it"
"--output=$GCS_LOCATION/py-it-cloud/output"
"--sdk_location=$SDK_LOCATION"
"--requirements_file=postcommit_requirements.txt"
"--num_workers=$NUM_WORKERS"
"--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"


###
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
>>> RUNNING integration tests with pipeline options: 
>>> --runner=TestDataflowRunner --project=apache-beam-testing 
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it 
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it 
>>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output 
>>> --sdk_location=build/apache-beam.tar.gz 
>>> --requirements_file=postcommit_requirements.txt --num_workers=1 
>>> --sleep_secs=20 --streaming 
>>> --dataflow_worker_jar=
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
:398:
 UserWarning: Normalizing '2.9.0.dev' to '2.9.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
:800:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to .options will not be supported
  options = pbegin.pipeline.options.view_as(DebugOptions)
:800:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to .options will not be supported
  options = 

Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #1691

2018-11-12 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #990

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[github] Update built-in.md

--
[...truncated 235.98 KB...]
INFO: Adding Query7/Query7.Snoop as step s3
Nov 12, 2018 11:13:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/IsBid/ParDo(Anonymous) as step s4
Nov 12, 2018 11:13:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/AsBid as step s5
Nov 12, 2018 11:13:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Window.Into()/Window.Assign as step s6
Nov 12, 2018 11:13:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/BidToPrice as step s7
Nov 12, 2018 11:13:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/WithKeys/AddKeys/Map
 as step s8
Nov 12, 2018 11:13:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/AddNonce
 as step s9
Nov 12, 2018 11:13:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/GroupByKey
 as step s10
Nov 12, 2018 11:13:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/Combine.GroupedValues
 as step s11
Nov 12, 2018 11:13:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/StripNonce/Map
 as step s12
Nov 12, 2018 11:13:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Window.Remerge/Identity/Map
 as step s13
Nov 12, 2018 11:13:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PrepareCold/Map
 as step s14
Nov 12, 2018 11:13:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Flatten.PCollections
 as step s15
Nov 12, 2018 11:13:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PostCombine/GroupByKey
 as step s16
Nov 12, 2018 11:13:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PostCombine/Combine.GroupedValues
 as step s17
Nov 12, 2018 11:13:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Values/Values/Map
 as step s18
Nov 12, 2018 11:13:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
 as step s19
Nov 12, 2018 11:13:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly
 as step s20
Nov 12, 2018 11:13:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
 as step s21
Nov 12, 2018 11:13:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Combine.GloballyAsSingletonView/CreateDataflowView 
as step s22
Nov 12, 2018 11:13:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Query7.Select as step s23
Nov 12, 2018 11:13:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Debug as 
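
Aside: steps s8-s22 are the standard expansion of a hot-key-fanout global
combine materialized as a side input. A sketch of the user-level transform
that expands into the AddNonce/PreCombineHot/PostCombine steps, assuming a
PCollection<Long> named prices (the output of BidToPrice, s7) and an
illustrative fanout value:

    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.Max;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionView;

    PCollection<Long> prices = ...;  // assumed: bid prices from s7
    PCollectionView<Long> maxPrice = prices.apply(
        "Combine.GloballyAsSingletonView",
        Combine.globally(Max.ofLongs())
            .withFanout(16)  // fanout value is illustrative
            .asSingletonView());

The nonce spreads hot keys across workers for the pre-combine, and the
post-combine merges the partial maxima back into a single value.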

Jenkins build is back to normal : beam_PostCommit_Java_Nexmark_Spark #1037

2018-11-12 Thread Apache Jenkins Server
See 



-
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org



Build failed in Jenkins: beam_PreCommit_Website_Cron #294

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[robertwb] [BEAM-5791] Implement time-based pushback in the dataflow harness 
data

[robertwb] Ensure writer thread is unblocked on errors.

[echauchot] [BEAM-5817] Fix javadoc for BoundedSideInputJoinModel.Simulator and

[github] Update instructions to run streaming wordcount.

[github] Update built-in.md

--
[...truncated 16.35 KB...]
:buildSrc:test (Thread[Task worker for ':buildSrc' Thread 3,5,main]) completed. 
Took 0.004 secs.
:buildSrc:check (Thread[Task worker for ':buildSrc' Thread 3,5,main]) started.

> Task :buildSrc:check
Skipping task ':buildSrc:check' as it has no actions.
:buildSrc:check (Thread[Task worker for ':buildSrc' Thread 3,5,main]) 
completed. Took 0.0 secs.
:buildSrc:build (Thread[Task worker for ':buildSrc' Thread 3,5,main]) started.

> Task :buildSrc:build
Skipping task ':buildSrc:build' as it has no actions.
:buildSrc:build (Thread[Task worker for ':buildSrc' Thread 3,5,main]) 
completed. Took 0.0 secs.
Settings evaluated using settings file 
'
Using local directory build cache for the root build (location = 
/home/jenkins/.gradle/caches/build-cache-1, removeUnusedEntriesAfter = 7 days).
Projects loaded. Root project using build file 
'
Included projects: [root project 'beam', project ':beam-examples-java', project 
':beam-model-fn-execution', project ':beam-model-job-management', project 
':beam-model-pipeline', project ':beam-runners-apex', project 
':beam-runners-core-construction-java', project ':beam-runners-core-java', 
project ':beam-runners-direct-java', project 
':beam-runners-extensions-java-metrics', project ':beam-runners-flink_2.11', 
project ':beam-runners-flink_2.11-job-server', project 
':beam-runners-flink_2.11-job-server-container', project 
':beam-runners-gearpump', project ':beam-runners-google-cloud-dataflow-java', 
project ':beam-runners-google-cloud-dataflow-java-examples', project 
':beam-runners-google-cloud-dataflow-java-examples-streaming', project 
':beam-runners-google-cloud-dataflow-java-fn-api-worker', project 
':beam-runners-google-cloud-dataflow-java-legacy-worker', project 
':beam-runners-google-cloud-dataflow-java-windmill', project 
':beam-runners-java-fn-execution', project ':beam-runners-local-java-core', 
project ':beam-runners-reference-java', project 
':beam-runners-reference-job-server', project ':beam-runners-samza', project 
':beam-runners-spark', project ':beam-sdks-go', project 
':beam-sdks-go-container', project ':beam-sdks-go-examples', project 
':beam-sdks-go-test', project ':beam-sdks-java-build-tools', project 
':beam-sdks-java-container', project ':beam-sdks-java-core', project 
':beam-sdks-java-extensions-euphoria', project 
':beam-sdks-java-extensions-google-cloud-platform-core', project 
':beam-sdks-java-extensions-join-library', project 
':beam-sdks-java-extensions-json-jackson', project 
':beam-sdks-java-extensions-kryo', project 
':beam-sdks-java-extensions-protobuf', project 
':beam-sdks-java-extensions-sketching', project 
':beam-sdks-java-extensions-sorter', project ':beam-sdks-java-extensions-sql', 
project ':beam-sdks-java-extensions-sql-jdbc', project 
':beam-sdks-java-extensions-sql-shell', project ':beam-sdks-java-fn-execution', 
project ':beam-sdks-java-harness', project 
':beam-sdks-java-io-amazon-web-services', project ':beam-sdks-java-io-amqp', 
project ':beam-sdks-java-io-cassandra', project ':beam-sdks-java-io-common', 
project ':beam-sdks-java-io-elasticsearch', project 
':beam-sdks-java-io-elasticsearch-tests-2', project 
':beam-sdks-java-io-elasticsearch-tests-5', project 
':beam-sdks-java-io-elasticsearch-tests-6', project 
':beam-sdks-java-io-elasticsearch-tests-common', project 
':beam-sdks-java-io-file-based-io-tests', project 
':beam-sdks-java-io-google-cloud-platform', project 
':beam-sdks-java-io-hadoop-common', project 
':beam-sdks-java-io-hadoop-file-system', project 
':beam-sdks-java-io-hadoop-input-format', project ':beam-sdks-java-io-hbase', 
project ':beam-sdks-java-io-hcatalog', project ':beam-sdks-java-io-jdbc', 
project ':beam-sdks-java-io-jms', project ':beam-sdks-java-io-kafka', project 
':beam-sdks-java-io-kinesis', project ':beam-sdks-java-io-kudu', project 
':beam-sdks-java-io-mongodb', project ':beam-sdks-java-io-mqtt', project 
':beam-sdks-java-io-parquet', project ':beam-sdks-java-io-rabbitmq', project 
':beam-sdks-java-io-redis', project ':beam-sdks-java-io-solr', project 
':beam-sdks-java-io-synthetic', project ':beam-sdks-java-io-tika', project 
':beam-sdks-java-io-xml', project ':beam-sdks-java-javadoc', project 
':beam-sdks-java-load-tests', project 
':beam-sdks-java-maven-archetypes-examples', project 
':beam-sdks-java-maven-archetypes-starter', project 

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #1872

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[echauchot] [BEAM-5817] Fix javadoc for BoundedSideInputJoinModel.Simulator and

--
[...truncated 57.53 MB...]
Nov 12, 2018 1:01:56 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.javaruntype/javaruntype/1.3/26ba963f4b20c751e07b58b990bb41bf850622d8/javaruntype-1.3.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1112130154-3f8197c0/output/results/staging/javaruntype-1.3-rOez6Sa6vLqndviPQ7_noQ.jar
Nov 12, 2018 1:01:56 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/io.netty/netty-transport/4.1.25.Final/19a6f1f649894b6705aa9d8cbcced188dff133b0/netty-transport-4.1.25.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1112130154-3f8197c0/output/results/staging/netty-transport-4.1.25.Final-3-E9kW1IVWJlpjxjfs5xBA.jar
Nov 12, 2018 1:01:56 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.25.Final/70888d3f2a829541378f68503ddd52c3193df35a/netty-codec-http-4.1.25.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1112130154-3f8197c0/output/results/staging/netty-codec-http-4.1.25.Final-tufaEwMjr6MgPM2gPyDcGA.jar
Nov 12, 2018 1:01:56 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.powermock/powermock-core/1.6.4/85fb32e9ccba748d569fc36aef92e0b9e7f40b87/powermock-core-1.6.4.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1112130154-3f8197c0/output/results/staging/powermock-core-1.6.4-0RprEjI0qIRX-gAeCFZdXQ.jar
Nov 12, 2018 1:01:56 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/io.netty/netty-resolver/4.1.25.Final/dc0965d00746b782b33f419b005cbc130973030d/netty-resolver-4.1.25.Final.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1112130154-3f8197c0/output/results/staging/netty-resolver-4.1.25.Final-P7rE-qrpxYHhZqQizPWgXw.jar
Nov 12, 2018 1:01:56 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.ow2.asm/asm/4.0/659add6efc75a4715d738e73f07505246edf4d66/asm-4.0.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1112130154-3f8197c0/output/results/staging/asm-4.0-Mi2PiMURGvYS34OMAZHNfg.jar
Nov 12, 2018 1:01:56 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.beust/jcommander/1.27/58c9cbf0f1fa296f93c712f2cf46de50471920f9/jcommander-1.27.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1112130154-3f8197c0/output/results/staging/jcommander-1.27-YxYbyOYD5gurYr1hw0nzcA.jar
Nov 12, 2018 1:01:56 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/ognl/ognl/3.1.12/a7fa0db32f882cd3bb41ec6c489853b3bfb6aebc/ognl-3.1.12.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1112130154-3f8197c0/output/results/staging/ognl-3.1.12-aq6HFAi1aHiNX6qcImOsmw.jar
Nov 12, 2018 1:01:56 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.httpcomponents/httpcore/4.4.6/e3fd8ced1f52c7574af952e2e6da0df8df08eb82/httpcore-4.4.6.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1112130154-3f8197c0/output/results/staging/httpcore-4.4.6-qfvVA-CAJQfv7q_7VrvfUg.jar
Nov 12, 2018 1:01:56 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/commons-codec/commons-codec/1.9/9ce04e34240f674bc72680f8b843b1457383161a/commons-codec-1.9.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1112130154-3f8197c0/output/results/staging/commons-codec-1.9-dWFTVmBcgSgBPanjrGKiSQ.jar
Nov 12, 2018 1:01:56 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.18/e4a441249ade301985cb8d009d4e4a72b85bf68e/snakeyaml-1.18.jar
 to 

Build failed in Jenkins: beam_PostCommit_Python_Verify #6530

2018-11-12 Thread Apache Jenkins Server
See 


--
[...truncated 1.38 MB...]
test_delete_error (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_delete_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_exists (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory_trailing_slash 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_match_file_empty 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_limits 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_zero_limit 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs_failed (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_open (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_open_bad_path (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_rename_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... SKIP: This test 
still needs to be fixed on Python 3. TODO: BEAM-5627
test_scheme (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_size (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_join (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_split (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_create_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_table_fails_not_found 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_succeeds (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... SKIP: IT is 
skipped because --test-pipeline-options is not specified
get_test_rows (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: 
GCP dependencies are not installed
test_read_from_query (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_query_sql_format 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_query_unflatten_records 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_table_and_job_complete_retry 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_and_multiple_pages 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_as_tablerows 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_table_schema_without_project 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_both_query_and_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_neither_query_nor_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_nested_schema_as_json (apache_beam.io.gcp.bigquery_test.TestBigQuerySink) 
... SKIP: GCP dependencies are not installed
test_parse_schema_descriptor 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySink) ... SKIP: GCP dependencies 
are not installed

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Gearpump_Gradle #1296

2018-11-12 Thread Apache Jenkins Server
See 


--
[...truncated 10.09 MB...]
INFO: opening data source at -9223372036854775808
Nov 12, 2018 12:26:49 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 15
Nov 12, 2018 12:26:49 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 12:26:49 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 16
Nov 12, 2018 12:26:49 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 12:26:49 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 12:26:49 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 21
Nov 12, 2018 12:26:49 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 18
Nov 12, 2018 12:26:49 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 17
Nov 12, 2018 12:26:49 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 19
Nov 12, 2018 12:26:49 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 22
Nov 12, 2018 12:26:49 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 20
Nov 12, 2018 12:26:49 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 25
Nov 12, 2018 12:26:49 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 23
Nov 12, 2018 12:26:49 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 12:26:49 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 24
Nov 12, 2018 12:26:49 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 35
Nov 12, 2018 12:26:49 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 12:26:49 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 27
Nov 12, 2018 12:26:49 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 26
Nov 12, 2018 12:26:49 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 28
Nov 12, 2018 12:26:49 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 12:26:49 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 12:26:49 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 29
Nov 12, 2018 12:26:49 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 40
Nov 12, 2018 12:26:49 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 30
Nov 12, 2018 12:26:49 PM org.apache.gearpump.streaming.task.TaskActor 

Build failed in Jenkins: beam_PostCommit_Python_Verify #6529

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[github] Update built-in.md

--
[...truncated 1.38 MB...]
test_delete_error (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_delete_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_exists (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory_trailing_slash 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_match_file_empty 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_limits 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_zero_limit 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs_failed (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_open (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_open_bad_path (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_rename_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... SKIP: This test 
still needs to be fixed on Python 3. TODO: BEAM-5627
test_scheme (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_size (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_join (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_split (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_create_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_table_fails_not_found 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_succeeds (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... SKIP: IT is 
skipped because --test-pipeline-options is not specified
get_test_rows (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: 
GCP dependencies are not installed
test_read_from_query (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_query_sql_format 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_query_unflatten_records 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_table_and_job_complete_retry 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_and_multiple_pages 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_as_tablerows 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_table_schema_without_project 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_both_query_and_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_neither_query_nor_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_nested_schema_as_json (apache_beam.io.gcp.bigquery_test.TestBigQuerySink) 
... SKIP: GCP dependencies are not installed
test_parse_schema_descriptor 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySink) ... SKIP: GCP 
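
The SKIP lines above are conditional guards, not failures: each test probes for
its optional dependencies (or for --test-pipeline-options) and opts out when
they are absent. A minimal sketch of the same pattern in JUnit terms (the probe
helper here is hypothetical, not Beam code):

    import static org.junit.Assume.assumeTrue;

    import org.junit.Test;

    public class OptionalDependencyTest {
      private static boolean gcpDependenciesInstalled() {
        try {
          // probe for a class that only the optional GCP extra provides
          Class.forName("com.google.api.client.http.HttpTransport");
          return true;
        } catch (ClassNotFoundException e) {
          return false;
        }
      }

      @Test
      public void readsFromBigQuery() {
        // a failed assumption marks the test as skipped instead of failed
        assumeTrue("GCP dependencies are not installed", gcpDependenciesInstalled());
        // ... body runs only when the optional dependency is present ...
      }
    }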

Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Spark #1036

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[github] Update built-in.md

--
[...truncated 892.23 KB...]
at java.lang.Thread.run(Thread.java:748)
18/11/12 11:27:42 ERROR 
org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter: Error while 
deleting file 
/tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/28/temp_shuffle_dc491e52-da7c-4351-86d1-f798c0dbae5a
18/11/12 11:27:42 ERROR org.apache.spark.storage.DiskBlockObjectWriter: 
Uncaught exception while reverting partial writes to file 
/tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/07/temp_shuffle_7dd78bed-2e4a-43f2-a562-c49f16bf7cb3
java.io.FileNotFoundException: 
/tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/07/temp_shuffle_7dd78bed-2e4a-43f2-a562-c49f16bf7cb3
 (No such file or directory)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
at 
org.apache.spark.storage.DiskBlockObjectWriter$$anonfun$revertPartialWritesAndClose$2.apply$mcV$sp(DiskBlockObjectWriter.scala:217)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1390)
at 
org.apache.spark.storage.DiskBlockObjectWriter.revertPartialWritesAndClose(DiskBlockObjectWriter.scala:214)
at 
org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.stop(BypassMergeSortShuffleWriter.java:237)
at 
org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:102)
at 
org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
at org.apache.spark.scheduler.Task.run(Task.scala:109)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
18/11/12 11:27:42 ERROR 
org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter: Error while 
deleting file 
/tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/07/temp_shuffle_7dd78bed-2e4a-43f2-a562-c49f16bf7cb3
18/11/12 11:27:42 ERROR org.apache.spark.storage.DiskBlockObjectWriter: 
Uncaught exception while reverting partial writes to file 
/tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/15/temp_shuffle_43301066-7eab-4f2b-ba0a-6bdd1345d946
java.io.FileNotFoundException: 
/tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/15/temp_shuffle_43301066-7eab-4f2b-ba0a-6bdd1345d946
 (No such file or directory)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
at 
org.apache.spark.storage.DiskBlockObjectWriter$$anonfun$revertPartialWritesAndClose$2.apply$mcV$sp(DiskBlockObjectWriter.scala:217)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1390)
at 
org.apache.spark.storage.DiskBlockObjectWriter.revertPartialWritesAndClose(DiskBlockObjectWriter.scala:214)
at 
org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.stop(BypassMergeSortShuffleWriter.java:237)
at 
org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:102)
at 
org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
at org.apache.spark.scheduler.Task.run(Task.scala:109)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
18/11/12 11:27:42 ERROR 
org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter: Error while 
deleting file 
/tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/15/temp_shuffle_43301066-7eab-4f2b-ba0a-6bdd1345d946
18/11/12 11:27:42 ERROR org.apache.spark.storage.DiskBlockObjectWriter: 
Uncaught exception while reverting partial writes to file 
/tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/02/temp_shuffle_7c7fa744-d90e-421b-a46e-32c5c59bc220
java.io.FileNotFoundException: 
/tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/02/temp_shuffle_7c7fa744-d90e-421b-a46e-32c5c59bc220
 (No such file or directory)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
at 
org.apache.spark.storage.DiskBlockObjectWriter$$anonfun$revertPartialWritesAndClose$2.apply$mcV$sp(DiskBlockObjectWriter.scala:217)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1390)
at 
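
The repeated DiskBlockObjectWriter and BypassMergeSortShuffleWriter failures
above are the classic symptom of a shuffle temp directory being removed, for
example by executor shutdown cleaning /tmp/blockmgr-*, while a writer still
holds paths inside it. A self-contained repro of the same exception
(hypothetical paths, not Spark code):

    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;

    public class MissingTempDirRepro {
      public static void main(String[] args) throws IOException {
        File dir = new File(System.getProperty("java.io.tmpdir"), "blockmgr-demo/28");
        dir.mkdirs();
        File shuffleFile = new File(dir, "temp_shuffle_demo");
        new FileOutputStream(shuffleFile).close(); // the writer creates the file

        shuffleFile.delete(); // concurrent cleanup removes the file...
        dir.delete();         // ...and its parent directory

        // revertPartialWritesAndClose reopens the file to truncate it; with the
        // parent directory gone, this throws java.io.FileNotFoundException
        // "(No such file or directory)", exactly as in the log above
        new FileOutputStream(shuffleFile, true).close();
      }
    }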

Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #991

2018-11-12 Thread Apache Jenkins Server
See 


--
[...truncated 268.33 KB...]
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-3MEfV2tSumFe00kuEztDXg.jar
Nov 12, 2018 12:04:03 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 114 files cached, 29 files newly uploaded
Nov 12, 2018 12:04:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7.ReadBounded as step s1
Nov 12, 2018 12:04:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Monitor as step s2
Nov 12, 2018 12:04:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Snoop as step s3
Nov 12, 2018 12:04:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/IsBid/ParDo(Anonymous) as step s4
Nov 12, 2018 12:04:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/AsBid as step s5
Nov 12, 2018 12:04:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Window.Into()/Window.Assign as step s6
Nov 12, 2018 12:04:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/BidToPrice as step s7
Nov 12, 2018 12:04:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/WithKeys/AddKeys/Map
 as step s8
Nov 12, 2018 12:04:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/AddNonce
 as step s9
Nov 12, 2018 12:04:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/GroupByKey
 as step s10
Nov 12, 2018 12:04:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/Combine.GroupedValues
 as step s11
Nov 12, 2018 12:04:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/StripNonce/Map
 as step s12
Nov 12, 2018 12:04:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Window.Remerge/Identity/Map
 as step s13
Nov 12, 2018 12:04:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PrepareCold/Map
 as step s14
Nov 12, 2018 12:04:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Flatten.PCollections
 as step s15
Nov 12, 2018 12:04:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PostCombine/GroupByKey
 as step s16
Nov 12, 2018 12:04:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PostCombine/Combine.GroupedValues
 as step s17
Nov 12, 2018 12:04:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Values/Values/Map
 as step s18
Nov 12, 2018 12:04:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
 as step s19
Nov 12, 2018 12:04:03 PM 
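
All of steps s8 through s19 above unfold from a single composite in Query7: a
global maximum over bid prices, computed with hot-key fanout and materialized
as a singleton side input. A sketch of the Beam Java shape that expands into
those step names (the fanout value here is illustrative):

    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.Max;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionView;

    class Query7Shape {
      // the view becomes the side input Query7 compares each bid's price against
      static PCollectionView<Long> buildMaxPriceView(PCollection<Long> prices) {
        return prices.apply(
            Combine.globally(Max.ofLongs()) // the Combine.globally(MaxLong) steps
                .withFanout(16)             // expands into AddNonce / PreCombineHot /
                                            // StripNonce / PrepareCold / PostCombine (s9..s17)
                .asSingletonView());        // the Combine.GloballyAsSingletonView wrapper (s19+)
      }
    }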

Build failed in Jenkins: beam_PostCommit_Python_Verify #6531

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[robertwb] [BEAM-3253] Specify gradle dependencies for Python tox tasks.

--
[...truncated 1.38 MB...]
test_delete_error (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_delete_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_exists (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory_trailing_slash 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_match_file_empty 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_limits 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_zero_limit 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs_failed (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_open (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_open_bad_path (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_rename_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... SKIP: This test 
still needs to be fixed on Python 3. TODO: BEAM-5627
test_scheme (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_size (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_join (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_split (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_create_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_table_fails_not_found 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_succeeds (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... SKIP: IT is 
skipped because --test-pipeline-options is not specified
get_test_rows (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: 
GCP dependencies are not installed
test_read_from_query (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_query_sql_format 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_query_unflatten_records 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_table_and_job_complete_retry 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_and_multiple_pages 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_as_tablerows 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_table_schema_without_project 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_both_query_and_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_neither_query_nor_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_nested_schema_as_json (apache_beam.io.gcp.bigquery_test.TestBigQuerySink) 
... SKIP: GCP dependencies are not installed
test_parse_schema_descriptor 

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #1692

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[robertwb] [BEAM-3253] Specify gradle dependencies for Python tox tasks.

--
[...truncated 269.73 KB...]
test_default_value_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

--
XML: 

--
Ran 16 tests in 1020.263s

OK
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_06_52_03-279055853632004277?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_07_00_39-16157442368899909467?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_06_52_02-7622132847849386286?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_07_00_48-1022471022588102064?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_06_52_01-4751052871804700189?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_06_59_13-2483136345761509918?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_06_52_02-1601910331177415236?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_07_01_02-4371021464321570820?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_06_52_02-5288720495186212331?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_07_00_58-9533669261038584768?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_06_52_03-9288315918119624798?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_07_00_54-17304034483186705607?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_06_52_03-11503454100825696399?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_06_59_19-13530980596054855439?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_06_52_01-11960010504093251312?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_07_00_37-11601547775946287898?project=apache-beam-testing.
:beam-sdks-python:validatesRunnerBatchTests (Thread[Task worker for ':' Thread 
6,5,main]) completed. Took 17 mins 1.391 secs.
:beam-sdks-python:validatesRunnerStreamingTests (Thread[Task worker for ':' 
Thread 6,5,main]) started.

> Task :beam-sdks-python:validatesRunnerStreamingTests
Caching disabled for task ':beam-sdks-python:validatesRunnerStreamingTests': 
Caching has not been enabled for the task
Task ':beam-sdks-python:validatesRunnerStreamingTests' is not up-to-date 
because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'sh''. Working directory: 

 Command: sh -c . 

 && ./scripts/run_integration_test.sh --test_opts "--nocapture --processes=8 
--process-timeout=4500 --attr=ValidatesRunner,!sickbay-streaming" --streaming 
true --worker_jar 

Successfully started process 'command 'sh''


###
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script 

Build failed in Jenkins: beam_PostCommit_Python_VR_Flink #712

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[robertwb] [BEAM-3253] Specify gradle dependencies for Python tox tasks.

--
[...truncated 4.44 MB...]
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Successful registration at 
resource manager 
akka://flink/user/resourcemanager_1ae732df-9f04-4078-8ebe-da8d71559f32 under 
registration id 5453e2ca2e8996bbd36b0f8e4758d2be.
[flink-runner-job-server] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Rest endpoint 
listening at localhost:33705
[flink-runner-job-server] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint@11962542 @ 
http://localhost:33705
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster 
- Starting job dispatcher(s) for JobManager
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - 
http://localhost:33705 was granted leadership with 
leaderSessionID=ce7c65e8-9fa6-4f3d-a3dd-5fa42318a3fb
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader http://localhost:33705 , 
session=ce7c65e8-9fa6-4f3d-a3dd-5fa42318a3fb
[flink-runner-job-server] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService 
- Starting RPC endpoint for 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher at 
akka://flink/user/dispatcherd2cb120d-33e9-4356-b32c-0d52e479c556 .
[flink-runner-job-server] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher@6e999af4 @ 
akka://flink/user/dispatcherd2cb120d-33e9-4356-b32c-0d52e479c556
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster 
- Flink Mini Cluster started successfully
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Dispatcher 
akka://flink/user/dispatcherd2cb120d-33e9-4356-b32c-0d52e479c556 was granted 
leadership with fencing token f5e4bdd6-d9c4-48b3-83ec-68f33e1a80a4
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Recovering all 
persisted jobs.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader 
akka://flink/user/dispatcherd2cb120d-33e9-4356-b32c-0d52e479c556 , 
session=f5e4bdd6-d9c4-48b3-83ec-68f33e1a80a4
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Submitting job 
658fdb8eef8cb52dd831762371a66978 (test_windowing_1542034928.27).
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.JobMaster at akka://flink/user/jobmanager_41 
.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Initializing job 
test_windowing_1542034928.27 (658fdb8eef8cb52dd831762371a66978).
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Using restart strategy 
NoRestartStrategy for test_windowing_1542034928.27 
(658fdb8eef8cb52dd831762371a66978).
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.slotpool.SlotPool at 
akka://flink/user/a67bb2f5-91dc-45f4-8486-a12f9497ed9c .
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job recovers via 
failover strategy: full graph restart
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Running initialization on master 
for job test_windowing_1542034928.27 (658fdb8eef8cb52dd831762371a66978).
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Successfully ran initialization 
on master in 0 ms.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - No state backend has been 
configured, using default (Memory / JobManager) MemoryStateBackend (data in 
heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 
'null', asynchronous: TRUE, maxStateSize: 5242880)
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.jobmaster.JobManagerRunner@781eb61d @ 
akka://flink/user/jobmanager_41
[flink-akka.actor.default-dispatcher-5] INFO 
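
The leadership grants above come from the Flink runner's job server standing up
an embedded MiniCluster, whose dispatcher, resource manager, and REST endpoint
elect leaders among themselves before a job is accepted. A minimal sketch of
that startup (configuration values are assumed; the real job server also wires
in Beam's portability services):

    import org.apache.flink.runtime.minicluster.MiniCluster;
    import org.apache.flink.runtime.minicluster.MiniClusterConfiguration;

    public class EmbeddedFlinkSketch {
      public static void main(String[] args) throws Exception {
        MiniClusterConfiguration cfg =
            new MiniClusterConfiguration.Builder()
                .setNumTaskManagers(1)
                .setNumSlotsPerTaskManager(1)
                .build();
        MiniCluster cluster = new MiniCluster(cfg);
        cluster.start(); // triggers the leader elections logged above
        // jobs submitted now run in-process, like test_windowing_1542034928.27
        cluster.close();
      }
    }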

Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #992

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[robertwb] [BEAM-3253] Specify gradle dependencies for Python tox tasks.

--
[...truncated 269.99 KB...]
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more

Nov 12, 2018 2:55:22 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-jYtIckQZlRIwD26qHxfBUw.jar
Nov 12, 2018 2:55:25 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 114 files cached, 29 files newly uploaded
Nov 12, 2018 2:55:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7.ReadBounded as step s1
Nov 12, 2018 2:55:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Monitor as step s2
Nov 12, 2018 2:55:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Snoop as step s3
Nov 12, 2018 2:55:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/IsBid/ParDo(Anonymous) as step s4
Nov 12, 2018 2:55:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/AsBid as step s5
Nov 12, 2018 2:55:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Window.Into()/Window.Assign as step s6
Nov 12, 2018 2:55:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/BidToPrice as step s7
Nov 12, 2018 2:55:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/WithKeys/AddKeys/Map
 as step s8
Nov 12, 2018 2:55:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/AddNonce
 as step s9
Nov 12, 2018 2:55:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/GroupByKey
 as step s10
Nov 12, 2018 2:55:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/Combine.GroupedValues
 as step s11
Nov 12, 2018 2:55:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/StripNonce/Map
 as step s12
Nov 12, 2018 2:55:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Window.Remerge/Identity/Map
 as step s13
Nov 12, 2018 2:55:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PrepareCold/Map
 as step s14
Nov 12, 2018 2:55:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Flatten.PCollections
 as step s15
Nov 12, 2018 2:55:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PostCombine/GroupByKey
 as step s16
Nov 12, 2018 2:55:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PostCombine/Combine.GroupedValues
 as step s17
Nov 12, 2018 2:55:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Values/Values/Map
 as step s18
Nov 12, 2018 2:55:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 

Jenkins build is back to normal : beam_PostCommit_Java_ValidatesRunner_Gearpump_Gradle #1297

2018-11-12 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #996

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[aaltay] [BEAM-3612] Add a shim generator tool (#7000)

--
[...truncated 275.05 KB...]
INFO: Adding Query7.ReadBounded as step s1
Nov 12, 2018 6:16:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Monitor as step s2
Nov 12, 2018 6:16:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Snoop as step s3
Nov 12, 2018 6:16:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/IsBid/ParDo(Anonymous) as step s4
Nov 12, 2018 6:16:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/AsBid as step s5
Nov 12, 2018 6:16:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Window.Into()/Window.Assign as step s6
Nov 12, 2018 6:16:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/BidToPrice as step s7
Nov 12, 2018 6:16:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/WithKeys/AddKeys/Map
 as step s8
Nov 12, 2018 6:16:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/AddNonce
 as step s9
Nov 12, 2018 6:16:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/GroupByKey
 as step s10
Nov 12, 2018 6:16:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/Combine.GroupedValues
 as step s11
Nov 12, 2018 6:16:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/StripNonce/Map
 as step s12
Nov 12, 2018 6:16:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Window.Remerge/Identity/Map
 as step s13
Nov 12, 2018 6:16:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PrepareCold/Map
 as step s14
Nov 12, 2018 6:16:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Flatten.PCollections
 as step s15
Nov 12, 2018 6:16:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PostCombine/GroupByKey
 as step s16
Nov 12, 2018 6:16:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PostCombine/Combine.GroupedValues
 as step s17
Nov 12, 2018 6:16:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Values/Values/Map
 as step s18
Nov 12, 2018 6:16:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
 as step s19
Nov 12, 2018 6:16:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly
 as step s20
Nov 12, 2018 6:16:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
 as step s21
Nov 12, 2018 6:16:03 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Combine.GloballyAsSingletonView/CreateDataflowView 
as 

Build failed in Jenkins: beam_PostCommit_Python_Verify #6533

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-6018] Don't use getExitingExecutorService (#6994)

--
[...truncated 1.38 MB...]
test_delete_error (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_delete_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_exists (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory_trailing_slash 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_match_file_empty 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_limits 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_zero_limit 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs_failed (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_open (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_open_bad_path (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_rename_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... SKIP: This test 
still needs to be fixed on Python 3. TODO: BEAM-5627
test_scheme (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_size (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_join (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_split (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_create_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_table_fails_not_found 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_succeeds (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... SKIP: IT is 
skipped because --test-pipeline-options is not specified
get_test_rows (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: 
GCP dependencies are not installed
test_read_from_query (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_query_sql_format 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_query_unflatten_records 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_table_and_job_complete_retry 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_and_multiple_pages 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_as_tablerows 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_table_schema_without_project 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_both_query_and_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_neither_query_nor_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_nested_schema_as_json (apache_beam.io.gcp.bigquery_test.TestBigQuerySink) 
... SKIP: GCP dependencies are not installed
test_parse_schema_descriptor 

Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #994

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-6018] Don't use getExitingExecutorService (#6994)

--
[...truncated 254.13 KB...]
{
  "code" : 429,
  "errors" : [ {
"domain" : "usageLimits",
"message" : "The total number of changes to the object 
temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-u62fEVonql0e6N3gVxyklQ.jar
 exceeds the rate limit. Please reduce the rate of create, update, and delete 
requests.",
"reason" : "rateLimitExceeded"
  } ],
  "message" : "The total number of changes to the object 
temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-u62fEVonql0e6N3gVxyklQ.jar
 exceeds the rate limit. Please reduce the rate of create, update, and delete 
requests."
}
at 
com.google.cloud.hadoop.util.AbstractGoogleAsyncWriteChannel.waitForCompletionAndThrowIfUploadFailed(AbstractGoogleAsyncWriteChannel.java:432)
at 
com.google.cloud.hadoop.util.AbstractGoogleAsyncWriteChannel.close(AbstractGoogleAsyncWriteChannel.java:287)
at 
org.apache.beam.runners.dataflow.util.PackageUtil.$closeResource(PackageUtil.java:260)
at 
org.apache.beam.runners.dataflow.util.PackageUtil.tryStagePackage(PackageUtil.java:260)
at 
org.apache.beam.runners.dataflow.util.PackageUtil.tryStagePackageWithRetry(PackageUtil.java:203)
at 
org.apache.beam.runners.dataflow.util.PackageUtil.stagePackageSynchronously(PackageUtil.java:187)
at 
org.apache.beam.runners.dataflow.util.PackageUtil.lambda$stagePackage$1(PackageUtil.java:171)
at 
org.apache.beam.sdk.util.MoreFutures.lambda$supplyAsync$0(MoreFutures.java:104)
at 
java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1626)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 
410 Gone
{
  "code" : 429,
  "errors" : [ {
"domain" : "usageLimits",
"message" : "The total number of changes to the object 
temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-u62fEVonql0e6N3gVxyklQ.jar
 exceeds the rate limit. Please reduce the rate of create, update, and delete 
requests.",
"reason" : "rateLimitExceeded"
  } ],
  "message" : "The total number of changes to the object 
temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-u62fEVonql0e6N3gVxyklQ.jar
 exceeds the rate limit. Please reduce the rate of create, update, and delete 
requests."
}
at 
com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:432)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
at 
com.google.cloud.hadoop.util.AbstractGoogleAsyncWriteChannel$UploadOperation.call(AbstractGoogleAsyncWriteChannel.java:358)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more

Nov 12, 2018 5:18:43 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-u62fEVonql0e6N3gVxyklQ.jar
Nov 12, 2018 5:18:46 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 114 files cached, 28 files newly uploaded
Nov 12, 2018 5:18:47 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding JoinToFiles.ReadBounded as step s1
Nov 12, 2018 5:18:47 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
NexmarkUtils.GenerateSideInputData/GenerateSequence/Read(BoundedCountingSource) 
as step s2
Nov 
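
The cycle above, a 429 rateLimitExceeded followed by tryStagePackage
re-uploading the same jar, is PackageUtil's retry loop at work: the remedy for
this error is to retry with backoff rather than fail the build. A generic
sketch of that pattern (not Beam's actual implementation):

    import java.util.concurrent.ThreadLocalRandom;

    public class BackoffRetry {
      static void uploadWithRetry(Runnable upload) throws InterruptedException {
        long delayMs = 1000;
        for (int attempt = 1; ; attempt++) {
          try {
            upload.run(); // e.g. stage the worker jar to GCS
            return;
          } catch (RuntimeException rateLimited) { // e.g. 429 rateLimitExceeded
            if (attempt == 5) {
              throw rateLimited; // give up after a bounded number of tries
            }
            // exponential backoff with jitter lowers the create/update rate,
            // which is exactly what the error message asks for
            Thread.sleep(delayMs + ThreadLocalRandom.current().nextLong(250));
            delayMs *= 2;
          }
        }
      }
    }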

Build failed in Jenkins: beam_PostCommit_Python_Verify #6532

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[mxm] Fix website precommit

--
[...truncated 1.38 MB...]
test_delete_error (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_delete_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_exists (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory_trailing_slash 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_match_file_empty 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_limits 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_zero_limit 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs_failed (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_open (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_open_bad_path (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_rename_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... SKIP: This test 
still needs to be fixed on Python 3. TODO: BEAM-5627
test_scheme (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_size (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_join (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_split (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_create_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_table_fails_not_found 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_succeeds (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... SKIP: IT is 
skipped because --test-pipeline-options is not specified
get_test_rows (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: 
GCP dependencies are not installed
test_read_from_query (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_query_sql_format 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_query_unflatten_records 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_table_and_job_complete_retry 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_and_multiple_pages 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_as_tablerows 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_table_schema_without_project 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_both_query_and_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_neither_query_nor_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_nested_schema_as_json (apache_beam.io.gcp.bigquery_test.TestBigQuerySink) 
... SKIP: GCP dependencies are not installed
test_parse_schema_descriptor 
(apache_beam.io.gcp.bigquery_test.TestBigQuerySink) ... SKIP: GCP 

Jenkins build is back to normal : beam_PreCommit_Website_Cron #295

2018-11-12 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #995

2018-11-12 Thread Apache Jenkins Server
See 


--
[...truncated 248.09 KB...]
  "errors" : [ {
"domain" : "usageLimits",
"message" : "The total number of changes to the object 
temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-rs_h3GBsZXa-wKYMlsz9Sg.jar
 exceeds the rate limit. Please reduce the rate of create, update, and delete 
requests.",
"reason" : "rateLimitExceeded"
  } ],
  "message" : "The total number of changes to the object 
temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-rs_h3GBsZXa-wKYMlsz9Sg.jar
 exceeds the rate limit. Please reduce the rate of create, update, and delete 
requests."
}
at 
com.google.cloud.hadoop.util.AbstractGoogleAsyncWriteChannel.waitForCompletionAndThrowIfUploadFailed(AbstractGoogleAsyncWriteChannel.java:432)
at 
com.google.cloud.hadoop.util.AbstractGoogleAsyncWriteChannel.close(AbstractGoogleAsyncWriteChannel.java:287)
at 
org.apache.beam.runners.dataflow.util.PackageUtil.$closeResource(PackageUtil.java:260)
at 
org.apache.beam.runners.dataflow.util.PackageUtil.tryStagePackage(PackageUtil.java:260)
at 
org.apache.beam.runners.dataflow.util.PackageUtil.tryStagePackageWithRetry(PackageUtil.java:203)
at 
org.apache.beam.runners.dataflow.util.PackageUtil.stagePackageSynchronously(PackageUtil.java:187)
at 
org.apache.beam.runners.dataflow.util.PackageUtil.lambda$stagePackage$1(PackageUtil.java:171)
at 
org.apache.beam.sdk.util.MoreFutures.lambda$supplyAsync$0(MoreFutures.java:104)
at 
java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1626)
at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 
410 Gone
{
  "code" : 429,
  "errors" : [ {
"domain" : "usageLimits",
"message" : "The total number of changes to the object 
temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-rs_h3GBsZXa-wKYMlsz9Sg.jar
 exceeds the rate limit. Please reduce the rate of create, update, and delete 
requests.",
"reason" : "rateLimitExceeded"
  } ],
  "message" : "The total number of changes to the object 
temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-rs_h3GBsZXa-wKYMlsz9Sg.jar
 exceeds the rate limit. Please reduce the rate of create, update, and delete 
requests."
}
at 
com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:432)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
at 
com.google.cloud.hadoop.util.AbstractGoogleAsyncWriteChannel$UploadOperation.call(AbstractGoogleAsyncWriteChannel.java:358)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more

Nov 12, 2018 6:04:43 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-rs_h3GBsZXa-wKYMlsz9Sg.jar
Nov 12, 2018 6:04:43 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-rs_h3GBsZXa-wKYMlsz9Sg.jar
Nov 12, 2018 6:04:46 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 114 files cached, 29 files newly 

Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #993

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[mxm] Fix website precommit

--
[...truncated 236.29 KB...]
INFO: Adding Query7/Query7.Monitor as step s2
Nov 12, 2018 4:16:12 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Snoop as step s3
Nov 12, 2018 4:16:12 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/IsBid/ParDo(Anonymous) as step s4
Nov 12, 2018 4:16:12 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/AsBid as step s5
Nov 12, 2018 4:16:12 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Window.Into()/Window.Assign as step s6
Nov 12, 2018 4:16:12 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/BidToPrice as step s7
Nov 12, 2018 4:16:12 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/WithKeys/AddKeys/Map
 as step s8
Nov 12, 2018 4:16:12 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/AddNonce
 as step s9
Nov 12, 2018 4:16:12 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/GroupByKey
 as step s10
Nov 12, 2018 4:16:12 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/Combine.GroupedValues
 as step s11
Nov 12, 2018 4:16:12 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/StripNonce/Map
 as step s12
Nov 12, 2018 4:16:12 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Window.Remerge/Identity/Map
 as step s13
Nov 12, 2018 4:16:12 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PrepareCold/Map
 as step s14
Nov 12, 2018 4:16:12 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Flatten.PCollections
 as step s15
Nov 12, 2018 4:16:12 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PostCombine/GroupByKey
 as step s16
Nov 12, 2018 4:16:12 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PostCombine/Combine.GroupedValues
 as step s17
Nov 12, 2018 4:16:12 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Values/Values/Map
 as step s18
Nov 12, 2018 4:16:12 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
 as step s19
Nov 12, 2018 4:16:12 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly
 as step s20
Nov 12, 2018 4:16:12 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
 as step s21
Nov 12, 2018 4:16:12 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Combine.GloballyAsSingletonView/CreateDataflowView 
as step s22
Nov 12, 2018 4:16:12 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Query7.Select as step s23
Nov 12, 
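
Editor's note: steps s8 through s22 above are all the translation of a single composite transform, the Query7 max-price side input. As a hedged sketch (the fanout value and surrounding names are illustrative, not read from this build), the shape that the translator unrolls into the WithKeys/AddNonce/PreCombineHot/StripNonce/PostCombine steps is roughly:

```java
import org.apache.beam.sdk.transforms.Combine;
import org.apache.beam.sdk.transforms.Max;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.PCollectionView;

class MaxPriceView {
  static PCollectionView<Long> maxPriceView(PCollection<Long> prices) {
    // Combine.globally(...).withFanout(...).asSingletonView() expands into
    // the numbered hot/cold combine steps logged above.
    return prices.apply(
        Combine.globally(Max.ofLongs())
            .withFanout(16) // illustrative hot-key fanout
            .asSingletonView());
  }
}
```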

Jenkins build is back to normal : beam_PostCommit_Python_VR_Flink #713

2018-11-12 Thread Apache Jenkins Server
See 






Jenkins build is back to normal : beam_PostCommit_Python_VR_Flink #715

2018-11-12 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PreCommit_Java_Cron #579

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[robertwb] [BEAM-3253] Specify gradle dependencies for Python tox tasks.

[mxm] Fix website precommit

[lcwik] [BEAM-6018] Don't use getExitingExecutorService (#6994)

--
[...truncated 53.03 MB...]
GROUP BY `B2`.`auction`, HOP(`B2`.`dateTime`, INTERVAL '5' SECOND, INTERVAL 
'10' SECOND)) AS `CountBids`
GROUP BY `CountBids`.`starttime`) AS `MaxBids` ON `AuctionBids`.`starttime` 
= `MaxBids`.`starttime` AND `AuctionBids`.`num` >= `MaxBids`.`maxnum`
Nov 12, 2018 6:18:20 PM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQLPlan>
LogicalProject(auction=[$0], num=[$1])
  LogicalJoin(condition=[AND(=($2, $4), >=($1, $3))], joinType=[inner])
LogicalProject(auction=[$0], num=[$2], starttime=[$1])
  LogicalAggregate(group=[{0, 1}], num=[COUNT()])
LogicalProject(auction=[$0], $f1=[HOP($3, 5000, 1)])
  BeamIOSourceRel(table=[[beam, Bid]])
LogicalProject(maxnum=[$1], starttime=[$0])
  LogicalAggregate(group=[{0}], maxnum=[MAX($1)])
LogicalProject(starttime=[$1], num=[$0])
  LogicalProject(num=[$2], starttime=[$1])
LogicalAggregate(group=[{0, 1}], num=[COUNT()])
  LogicalProject(auction=[$0], $f1=[HOP($3, 5000, 1)])
BeamIOSourceRel(table=[[beam, Bid]])

Nov 12, 2018 6:18:20 PM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: BEAMPlan>
BeamCalcRel(expr#0..4=[{inputs}], proj#0..1=[{exprs}])
  BeamJoinRel(condition=[AND(=($2, $4), >=($1, $3))], joinType=[inner])
BeamCalcRel(expr#0..2=[{inputs}], auction=[$t0], num=[$t2], 
starttime=[$t1])
  BeamAggregationRel(group=[{0, 1}], num=[COUNT()], 
window=[SlidingWindows($1, PT5S, PT10S, PT0S)])
BeamCalcRel(expr#0..4=[{inputs}], auction=[$t0], $f1=[$t3])
  BeamIOSourceRel(table=[[beam, Bid]])
BeamCalcRel(expr#0..1=[{inputs}], maxnum=[$t1], starttime=[$t0])
  BeamAggregationRel(group=[{1}], maxnum=[MAX($0)])
BeamCalcRel(expr#0..2=[{inputs}], num=[$t2], starttime=[$t1])
  BeamAggregationRel(group=[{0, 1}], num=[COUNT()], 
window=[SlidingWindows($1, PT5S, PT10S, PT0S)])
BeamCalcRel(expr#0..4=[{inputs}], auction=[$t0], $f1=[$t3])
  BeamIOSourceRel(table=[[beam, Bid]])
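
Editor's note: in the plan above, the Calcite HOP call is lowered to SlidingWindows($1, PT5S, PT10S, PT0S), i.e. a 10-second window sliding every 5 seconds. A hedged non-SQL equivalent of the inner per-auction bid count (the element type is an assumption for brevity; the real pipeline operates on Rows):

```java
import org.apache.beam.sdk.transforms.Count;
import org.apache.beam.sdk.transforms.windowing.SlidingWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;

class SlidingBidCounts {
  static PCollection<KV<Long, Long>> countPerAuction(PCollection<Long> auctionIds) {
    return auctionIds
        .apply(
            Window.into(
                SlidingWindows.of(Duration.standardSeconds(10)) // window size, PT10S
                    .every(Duration.standardSeconds(5))))       // slide period, PT5S
        .apply(Count.perElement());
  }
}
```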


org.apache.beam.sdk.nexmark.queries.sql.SqlQuery3Test > 
testJoinsPeopleWithAuctions STANDARD_ERROR
Nov 12, 2018 6:18:20 PM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQL:
SELECT `P`.`name`, `P`.`city`, `P`.`state`, `A`.`id`
FROM `beam`.`Auction` AS `A`
INNER JOIN `beam`.`Person` AS `P` ON `A`.`seller` = `P`.`id`
WHERE `A`.`category` = 10 AND (`P`.`state` = 'OR' OR `P`.`state` = 'ID' OR 
`P`.`state` = 'CA')
Nov 12, 2018 6:18:20 PM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQLPlan>
LogicalProject(name=[$11], city=[$14], state=[$15], id=[$0])
  LogicalFilter(condition=[AND(=($8, 10), OR(=($15, 'OR'), =($15, 'ID'), 
=($15, 'CA')))])
LogicalJoin(condition=[=($7, $10)], joinType=[inner])
  BeamIOSourceRel(table=[[beam, Auction]])
  BeamIOSourceRel(table=[[beam, Person]])

Nov 12, 2018 6:18:20 PM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: BEAMPlan>
BeamCalcRel(expr#0..17=[{inputs}], name=[$t11], city=[$t14], state=[$t15], 
id=[$t0])
  BeamJoinRel(condition=[=($7, $10)], joinType=[inner])
BeamCalcRel(expr#0..9=[{inputs}], expr#10=[10], expr#11=[=($t8, $t10)], 
proj#0..9=[{exprs}], $condition=[$t11])
  BeamIOSourceRel(table=[[beam, Auction]])
BeamCalcRel(expr#0..7=[{inputs}], expr#8=['OR'], expr#9=[=($t5, $t8)], 
expr#10=['ID'], expr#11=[=($t5, $t10)], expr#12=['CA'], expr#13=[=($t5, $t12)], 
expr#14=[OR($t9, $t11, $t13)], proj#0..7=[{exprs}], $condition=[$t14])
  BeamIOSourceRel(table=[[beam, Person]])
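
Editor's note: this plan is produced from the query by Beam SQL. A hedged sketch of wiring the same join through SqlTransform, where the PCollectionTuple tag names double as the table names (the `beam` catalog prefix in the log becomes implicit; the inputs' schemas are elided):

```java
import org.apache.beam.sdk.extensions.sql.SqlTransform;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.PCollectionTuple;
import org.apache.beam.sdk.values.Row;
import org.apache.beam.sdk.values.TupleTag;

class Query3Sql {
  static PCollection<Row> query3(PCollection<Row> auctions, PCollection<Row> people) {
    return PCollectionTuple.of(new TupleTag<>("Auction"), auctions)
        .and(new TupleTag<>("Person"), people)
        .apply(
            SqlTransform.query(
                "SELECT P.name, P.city, P.state, A.id "
                    + "FROM Auction A INNER JOIN Person P ON A.seller = P.id "
                    + "WHERE A.category = 10 "
                    + "AND (P.state = 'OR' OR P.state = 'ID' OR P.state = 'CA')"));
  }
}
```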


org.apache.beam.sdk.nexmark.queries.sql.SqlQuery7Test > testBids STANDARD_ERROR
Nov 12, 2018 6:18:21 PM 
org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQL:
SELECT `B`.`auction`, `B`.`price`, `B`.`bidder`, `B`.`dateTime`, `B`.`extra`
FROM (SELECT `B`.`auction`, `B`.`price`, `B`.`bidder`, `B`.`dateTime`, 
`B`.`extra`, TUMBLE_START(`B`.`dateTime`, INTERVAL '10' SECOND) AS `starttime`
FROM `beam`.`Bid` AS `B`
GROUP BY `B`.`auction`, `B`.`price`, `B`.`bidder`, `B`.`dateTime`, 
`B`.`extra`, TUMBLE(`B`.`dateTime`, INTERVAL '10' SECOND)) AS `B`
INNER JOIN (SELECT MAX(`B1`.`price`) AS `maxprice`, 
TUMBLE_START(`B1`.`dateTime`, INTERVAL '10' SECOND) AS `starttime`
FROM `beam`.`Bid` AS `B1`
GROUP BY TUMBLE(`B1`.`dateTime`, INTERVAL '10' SECOND)) AS 

Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #998

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[markliu] [BEAM-5953] Fix DataflowRunner in Python 3 - type errors

--
[...truncated 225.76 KB...]
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-sdks-java-extensions-google-cloud-platform-core-2.9.0-SNAPSHOT-22tmbONvb1fvKR1uqU91bQ.jar
Nov 12, 2018 9:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-direct-java-2.9.0-SNAPSHOT-DODZibgTPzRJCvrMBrVfGA.jar
Nov 12, 2018 9:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-sdks-java-core-2.9.0-SNAPSHOT-EhZ2AuwNSXQSQuvp-E9tIg.jar
Nov 12, 2018 9:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-model-pipeline-2.9.0-SNAPSHOT-J0qQnskC9AzviSOEPpyHQQ.jar
Nov 12, 2018 9:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-model-job-management-2.9.0-SNAPSHOT-xw4tpz6SoXKE2v1H6IQg8Q.jar
Nov 12, 2018 9:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-UFa_4F6_7Gluje8BPGPIDw.jar
Nov 12, 2018 9:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-UFa_4F6_7Gluje8BPGPIDw.jar
Nov 12, 2018 9:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-vendor-java-grpc-v1-2.9.0-SNAPSHOT-CMUXbyM7EZTK1p7PkgsHRQ.jar
Nov 12, 2018 9:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-UFa_4F6_7Gluje8BPGPIDw.jar
Nov 12, 2018 9:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-UFa_4F6_7Gluje8BPGPIDw.jar
Nov 12, 2018 9:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-UFa_4F6_7Gluje8BPGPIDw.jar
Nov 12, 2018 9:45:15 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

Build failed in Jenkins: beam_PostCommit_Python_VR_Flink #718

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[markliu] [BEAM-5953] Fix DataflowRunner in Python 3 - type errors

--
[...truncated 4.44 MB...]
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.taskexecutor.TaskExecutor - Successful registration at 
resource manager 
akka://flink/user/resourcemanager_80a0fe4d-06bb-41a1-bddc-3236a03da4bb under 
registration id 7f75f6ac84102b019e4b031a8f78d776.
[flink-runner-job-server] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Rest endpoint 
listening at localhost:34383
[flink-runner-job-server] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint@52a5e4c9 @ 
http://localhost:34383
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster 
- Starting job dispatcher(s) for JobManager
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - 
http://localhost:34383 was granted leadership with 
leaderSessionID=062673d2-8348-4ac9-8883-9effc2629f84
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader http://localhost:34383 , 
session=062673d2-8348-4ac9-8883-9effc2629f84
[flink-runner-job-server] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService 
- Starting RPC endpoint for 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher at 
akka://flink/user/dispatcher9387120b-859a-4006-82b1-24b2e8fb1ee1 .
[flink-runner-job-server] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher@dba23f8 @ 
akka://flink/user/dispatcher9387120b-859a-4006-82b1-24b2e8fb1ee1
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster 
- Flink Mini Cluster started successfully
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Dispatcher 
akka://flink/user/dispatcher9387120b-859a-4006-82b1-24b2e8fb1ee1 was granted 
leadership with fencing token f3d12a38-4456-4a39-85b9-565acd3d2d5e
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Recovering all 
persisted jobs.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader 
akka://flink/user/dispatcher9387120b-859a-4006-82b1-24b2e8fb1ee1 , 
session=f3d12a38-4456-4a39-85b9-565acd3d2d5e
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Submitting job 
7b0b06069814f741531c54f5e6bf58f1 (test_windowing_1542059669.08).
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.JobMaster at akka://flink/user/jobmanager_41 
.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Initializing job 
test_windowing_1542059669.08 (7b0b06069814f741531c54f5e6bf58f1).
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Using restart strategy 
NoRestartStrategy for test_windowing_1542059669.08 
(7b0b06069814f741531c54f5e6bf58f1).
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.slotpool.SlotPool at 
akka://flink/user/86d16a22-9f5d-44ae-b830-84d4e7be5b55 .
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job recovers via 
failover strategy: full graph restart
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Running initialization on master 
for job test_windowing_1542059669.08 (7b0b06069814f741531c54f5e6bf58f1).
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Successfully ran initialization 
on master in 0 ms.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - No state backend has been 
configured, using default (Memory / JobManager) MemoryStateBackend (data in 
heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 
'null', asynchronous: TRUE, maxStateSize: 5242880)
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.jobmaster.JobManagerRunner@4d103588 @ 
akka://flink/user/jobmanager_41
[flink-akka.actor.default-dispatcher-5] INFO 
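
Editor's note: the MiniCluster above falls back to the default MemoryStateBackend, which keeps state on the heap and checkpoints to the JobManager; that is fine for these tests but not for durable jobs. A minimal sketch against the Flink 1.5-era API in use here (the path and interval are placeholders, not values from this build):

```java
import org.apache.flink.runtime.state.filesystem.FsStateBackend;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class ExplicitStateBackend {
  public static void main(String[] args) throws Exception {
    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
    // Checkpoint to a filesystem instead of the JobManager heap.
    env.setStateBackend(new FsStateBackend("file:///tmp/flink-checkpoints")); // placeholder path
    env.enableCheckpointing(10_000L); // every 10s, illustrative
    // ... build sources/transforms, then env.execute("job-name");
  }
}
```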

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Samza_Gradle #1273

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[github] [BEAM-5446] SplittableDoFn: Remove "internal" methods for public API

--
[...truncated 46.63 MB...]
INFO: Starting stores in task instance Partition 0
Nov 12, 2018 10:10:34 PM org.apache.samza.util.Logging$class info
INFO: Got non logged storage partition directory as 
/tmp/beam-samza-test/beamStore/Partition_0
Nov 12, 2018 10:10:34 PM org.apache.samza.util.Logging$class info
INFO: Got logged storage partition directory as 
/tmp/beam-samza-test/beamStore/Partition_0
Nov 12, 2018 10:10:34 PM org.apache.samza.util.Logging$class info
INFO: Deleting logged storage partition directory 
/tmp/beam-samza-test/beamStore/Partition_0.
Nov 12, 2018 10:10:34 PM org.apache.samza.util.Logging$class info
INFO: Using non logged storage partition directory: 
/tmp/beam-samza-test/beamStore/Partition_0 for store: beamStore.
Nov 12, 2018 10:10:34 PM org.apache.samza.util.Logging$class info
INFO: Validating change log streams: Map()
Nov 12, 2018 10:10:34 PM org.apache.samza.util.Logging$class info
INFO: Got change log stream metadata: Map()
Nov 12, 2018 10:10:34 PM org.apache.samza.util.Logging$class info
INFO: Assigning oldest change log offsets for taskName Partition 0: Map()
Nov 12, 2018 10:10:34 PM org.apache.samza.util.Logging$class info
INFO: Starting table manager in task instance Partition 0
Nov 12, 2018 10:10:34 PM org.apache.samza.util.Logging$class info
INFO: Starting host statistics monitor
Nov 12, 2018 10:10:34 PM org.apache.samza.util.Logging$class info
INFO: Registering task instances with producers.
Nov 12, 2018 10:10:34 PM org.apache.samza.util.Logging$class info
INFO: Starting producer multiplexer.
Nov 12, 2018 10:10:34 PM org.apache.samza.util.Logging$class info
INFO: Initializing stream tasks.
Nov 12, 2018 10:10:34 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
0-split0_out__PCollection_. Keys will not be (de)serialized
Nov 12, 2018 10:10:34 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
0-split0_out__PCollection_. Values will not be (de)serialized
Nov 12, 2018 10:10:34 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
1-split1_out__PCollection_. Keys will not be (de)serialized
Nov 12, 2018 10:10:34 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
1-split1_out__PCollection_. Values will not be (de)serialized
Nov 12, 2018 10:10:34 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
2-split2_out__PCollection_. Keys will not be (de)serialized
Nov 12, 2018 10:10:34 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
2-split2_out__PCollection_. Values will not be (de)serialized
Nov 12, 2018 10:10:34 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
3-split3_out__PCollection_. Keys will not be (de)serialized
Nov 12, 2018 10:10:34 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
3-split3_out__PCollection_. Values will not be (de)serialized
Nov 12, 2018 10:10:34 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
4-split4_out__PCollection_. Keys will not be (de)serialized
Nov 12, 2018 10:10:34 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
4-split4_out__PCollection_. Values will not be (de)serialized
Nov 12, 2018 10:10:34 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
5-split5_out__PCollection_. Keys will not be (de)serialized
Nov 12, 2018 10:10:34 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
5-split5_out__PCollection_. Values will not be (de)serialized
Nov 12, 2018 10:10:34 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
6-split6_out__PCollection_. Keys will not be (de)serialized
Nov 12, 2018 10:10:34 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
6-split6_out__PCollection_. Values will not be (de)serialized
Nov 12, 2018 10:10:34 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
7-split7_out__PCollection_. 
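
Editor's note: the wall of NoOpSerde lines above is expected, presumably because the Beam Samza runner applies Beam coders itself and hands Samza already-encoded elements, so Samza-level key/value (de)serialization is skipped. For contrast, a hedged sketch of the interface a real Samza serde implements (the pass-through behavior mirrors what NoOpSerde does):

```java
import org.apache.samza.serializers.Serde;

/** Pass-through serde: bytes in, bytes out, mirroring NoOpSerde's behavior. */
class PassThroughSerde implements Serde<byte[]> {
  @Override
  public byte[] fromBytes(byte[] bytes) {
    return bytes;
  }

  @Override
  public byte[] toBytes(byte[] object) {
    return object;
  }
}
```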

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Samza_Gradle #1274

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-6037] Make Spark runner pipeline translation based on URNs (#7005)

--
[...truncated 46.62 MB...]
INFO: Starting stores in task instance Partition 0
Nov 12, 2018 10:13:56 PM org.apache.samza.util.Logging$class info
INFO: Got non logged storage partition directory as 
/tmp/beam-samza-test/beamStore/Partition_0
Nov 12, 2018 10:13:56 PM org.apache.samza.util.Logging$class info
INFO: Got logged storage partition directory as 
/tmp/beam-samza-test/beamStore/Partition_0
Nov 12, 2018 10:13:56 PM org.apache.samza.util.Logging$class info
INFO: Deleting logged storage partition directory 
/tmp/beam-samza-test/beamStore/Partition_0.
Nov 12, 2018 10:13:56 PM org.apache.samza.util.Logging$class info
INFO: Using non logged storage partition directory: 
/tmp/beam-samza-test/beamStore/Partition_0 for store: beamStore.
Nov 12, 2018 10:13:56 PM org.apache.samza.util.Logging$class info
INFO: Validating change log streams: Map()
Nov 12, 2018 10:13:56 PM org.apache.samza.util.Logging$class info
INFO: Got change log stream metadata: Map()
Nov 12, 2018 10:13:56 PM org.apache.samza.util.Logging$class info
INFO: Assigning oldest change log offsets for taskName Partition 0: Map()
Nov 12, 2018 10:13:56 PM org.apache.samza.util.Logging$class info
INFO: Starting table manager in task instance Partition 0
Nov 12, 2018 10:13:56 PM org.apache.samza.util.Logging$class info
INFO: Starting host statistics monitor
Nov 12, 2018 10:13:56 PM org.apache.samza.util.Logging$class info
INFO: Registering task instances with producers.
Nov 12, 2018 10:13:56 PM org.apache.samza.util.Logging$class info
INFO: Starting producer multiplexer.
Nov 12, 2018 10:13:56 PM org.apache.samza.util.Logging$class info
INFO: Initializing stream tasks.
Nov 12, 2018 10:13:56 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
0-split0_out__PCollection_. Keys will not be (de)serialized
Nov 12, 2018 10:13:56 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
0-split0_out__PCollection_. Values will not be (de)serialized
Nov 12, 2018 10:13:56 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
1-split1_out__PCollection_. Keys will not be (de)serialized
Nov 12, 2018 10:13:56 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
1-split1_out__PCollection_. Values will not be (de)serialized
Nov 12, 2018 10:13:56 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
2-split2_out__PCollection_. Keys will not be (de)serialized
Nov 12, 2018 10:13:56 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
2-split2_out__PCollection_. Values will not be (de)serialized
Nov 12, 2018 10:13:56 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
3-split3_out__PCollection_. Keys will not be (de)serialized
Nov 12, 2018 10:13:56 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
3-split3_out__PCollection_. Values will not be (de)serialized
Nov 12, 2018 10:13:56 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
4-split4_out__PCollection_. Keys will not be (de)serialized
Nov 12, 2018 10:13:56 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
4-split4_out__PCollection_. Values will not be (de)serialized
Nov 12, 2018 10:13:56 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
5-split5_out__PCollection_. Keys will not be (de)serialized
Nov 12, 2018 10:13:56 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
5-split5_out__PCollection_. Values will not be (de)serialized
Nov 12, 2018 10:13:56 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
6-split6_out__PCollection_. Keys will not be (de)serialized
Nov 12, 2018 10:13:56 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
6-split6_out__PCollection_. Values will not be (de)serialized
Nov 12, 2018 10:13:56 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 

Build failed in Jenkins: beam_PostCommit_Python_Verify #6535

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[aaltay] [BEAM-3612] Type specialize stats package (#7002)

--
[...truncated 1.38 MB...]
test_delete_error (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_delete_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_exists (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory_trailing_slash 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_match_file_empty 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_limits 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_zero_limit 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs_failed (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_open (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_open_bad_path (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_rename_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... SKIP: This test 
still needs to be fixed on Python 3. TODO: BEAM-5627
test_scheme (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_size (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_join (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_split (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_create_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_table_fails_not_found 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_succeeds (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... SKIP: IT is 
skipped because --test-pipeline-options is not specified
get_test_rows (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: 
GCP dependencies are not installed
test_read_from_query (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_query_sql_format 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_query_unflatten_records 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_table_and_job_complete_retry 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_and_multiple_pages 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_as_tablerows 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_table_schema_without_project 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_both_query_and_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_neither_query_nor_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_nested_schema_as_json (apache_beam.io.gcp.bigquery_test.TestBigQuerySink) 
... SKIP: GCP dependencies are not installed
test_parse_schema_descriptor 

Jenkins build is back to normal : beam_PostCommit_Python_VR_Flink #719

2018-11-12 Thread Apache Jenkins Server
See 






Jenkins build is back to normal : beam_PostCommit_Java_GradleBuild #1875

2018-11-12 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #997

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[aaltay] [BEAM-3612] Type specialize stats package (#7002)

--
[...truncated 241.38 KB...]
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-l-DC39qFDvm1MV0J9A1CXg.jar
Nov 12, 2018 8:33:34 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 114 files cached, 29 files newly uploaded
Nov 12, 2018 8:33:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7.ReadBounded as step s1
Nov 12, 2018 8:33:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Monitor as step s2
Nov 12, 2018 8:33:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Snoop as step s3
Nov 12, 2018 8:33:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/IsBid/ParDo(Anonymous) as step s4
Nov 12, 2018 8:33:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/AsBid as step s5
Nov 12, 2018 8:33:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Window.Into()/Window.Assign as step s6
Nov 12, 2018 8:33:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/BidToPrice as step s7
Nov 12, 2018 8:33:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/WithKeys/AddKeys/Map
 as step s8
Nov 12, 2018 8:33:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/AddNonce
 as step s9
Nov 12, 2018 8:33:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/GroupByKey
 as step s10
Nov 12, 2018 8:33:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/Combine.GroupedValues
 as step s11
Nov 12, 2018 8:33:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/StripNonce/Map
 as step s12
Nov 12, 2018 8:33:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Window.Remerge/Identity/Map
 as step s13
Nov 12, 2018 8:33:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PrepareCold/Map
 as step s14
Nov 12, 2018 8:33:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Flatten.PCollections
 as step s15
Nov 12, 2018 8:33:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PostCombine/GroupByKey
 as step s16
Nov 12, 2018 8:33:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PostCombine/Combine.GroupedValues
 as step s17
Nov 12, 2018 8:33:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Values/Values/Map
 as step s18
Nov 12, 2018 8:33:35 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
 as step s19
Nov 12, 2018 8:33:35 PM 

Build failed in Jenkins: beam_PostCommit_Python_Verify #6534

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[aaltay] [BEAM-3612] Add a shim generator tool (#7000)

--
[...truncated 1.38 MB...]
test_delete_error (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_delete_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_exists (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_directory_trailing_slash 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_match_file_empty 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_limits 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_match_file_with_zero_limit 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_mkdirs_failed (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_open (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_open_bad_path (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_directory 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_rename_file (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) 
... ok
test_rename_file_error 
(apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... SKIP: This test 
still needs to be fixed on Python 3. TODO: BEAM-5627
test_scheme (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_size (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_join (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... ok
test_url_split (apache_beam.io.hadoopfilesystem_test.HadoopFileSystemTest) ... 
ok
test_create_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_bq_dataset (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_delete_table_fails_not_found 
(apache_beam.io.gcp.tests.utils_test.UtilsTest) ... SKIP: Bigquery dependencies 
are not installed.
test_delete_table_succeeds (apache_beam.io.gcp.tests.utils_test.UtilsTest) ... 
SKIP: Bigquery dependencies are not installed.
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: IT is skipped because --test-pipeline-options is not specified
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... SKIP: IT is 
skipped because --test-pipeline-options is not specified
get_test_rows (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: 
GCP dependencies are not installed
test_read_from_query (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_query_sql_format 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_query_unflatten_records 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table (apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... 
SKIP: GCP dependencies are not installed
test_read_from_table_and_job_complete_retry 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_and_multiple_pages 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_read_from_table_as_tablerows 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_table_schema_without_project 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_both_query_and_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_using_neither_query_nor_table_fails 
(apache_beam.io.gcp.bigquery_test.TestBigQueryReader) ... SKIP: GCP 
dependencies are not installed
test_nested_schema_as_json (apache_beam.io.gcp.bigquery_test.TestBigQuerySink) 
... SKIP: GCP dependencies are not installed
test_parse_schema_descriptor 

Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow #1693

2018-11-12 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Python_VR_Flink #714

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-6018] Don't use getExitingExecutorService (#6994)

--
[...truncated 4.44 MB...]
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - 
http://localhost:3 was granted leadership with 
leaderSessionID=7694344f-35b7-4208-83d5-4842eb25e3b1
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader http://localhost:3 , 
session=7694344f-35b7-4208-83d5-4842eb25e3b1
[flink-runner-job-server] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService 
- Starting RPC endpoint for 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher at 
akka://flink/user/dispatcherb78907d7-b0cd-40a7-ba4a-cedd3becbaff .
[flink-runner-job-server] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher@cacf1e2 @ 
akka://flink/user/dispatcherb78907d7-b0cd-40a7-ba4a-cedd3becbaff
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster 
- Flink Mini Cluster started successfully
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Dispatcher 
akka://flink/user/dispatcherb78907d7-b0cd-40a7-ba4a-cedd3becbaff was granted 
leadership with fencing token 285c8552-ff27-45d1-9edf-8d836b61eb26
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Recovering all 
persisted jobs.
[flink-akka.actor.default-dispatcher-5] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Received confirmation of leadership for leader 
akka://flink/user/dispatcherb78907d7-b0cd-40a7-ba4a-cedd3becbaff , 
session=285c8552-ff27-45d1-9edf-8d836b61eb26
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Submitting job 
141e994e60eed48cdedfc5caf110f604 (test_windowing_1542043451.92).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.JobMaster at akka://flink/user/jobmanager_41 
.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Initializing job 
test_windowing_1542043451.92 (141e994e60eed48cdedfc5caf110f604).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Using restart strategy 
NoRestartStrategy for test_windowing_1542043451.92 
(141e994e60eed48cdedfc5caf110f604).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for 
org.apache.flink.runtime.jobmaster.slotpool.SlotPool at 
akka://flink/user/28ddabe9-defb-454f-beec-7d78af25619b .
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job recovers via 
failover strategy: full graph restart
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Running initialization on master 
for job test_windowing_1542043451.92 (141e994e60eed48cdedfc5caf110f604).
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Successfully ran initialization 
on master in 0 ms.
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - No state backend has been 
configured, using default (Memory / JobManager) MemoryStateBackend (data in 
heap memory / checkpoints to JobManager) (checkpoints: 'null', savepoints: 
'null', asynchronous: TRUE, maxStateSize: 5242880)
[flink-akka.actor.default-dispatcher-4] INFO 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
- Proposing leadership to contender 
org.apache.flink.runtime.jobmaster.JobManagerRunner@4187689d @ 
akka://flink/user/jobmanager_41
[flink-akka.actor.default-dispatcher-2] INFO 
org.apache.flink.runtime.jobmaster.JobManagerRunner - JobManager runner for job 
test_windowing_1542043451.92 (141e994e60eed48cdedfc5caf110f604) was granted 
leadership with session id 88926b26-ca43-4953-bb91-9f6436ba1bac at 
akka://flink/user/jobmanager_41.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.jobmaster.JobMaster - Starting execution of job 
test_windowing_1542043451.92 (141e994e60eed48cdedfc5caf110f604)
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Job 
test_windowing_1542043451.92 (141e994e60eed48cdedfc5caf110f604) switched from 
state CREATED to RUNNING.
[flink-akka.actor.default-dispatcher-3] INFO 
org.apache.flink.runtime.executiongraph.ExecutionGraph - Source: Custom Source 
-> 

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow #1694

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-6018] Don't use getExitingExecutorService (#6994)

--
[...truncated 265.29 KB...]
test_default_value_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

--
XML: 

--
Ran 16 tests in 1037.722s

OK
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_09_13_39-10123796515196475800?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_09_22_45-11303520332891248215?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_09_13_39-18069374393973980900?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_09_22_49-10757569805158977788?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_09_13_39-764152738579281400?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_09_22_27-16933378686762060470?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_09_13_39-9931524202176754925?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_09_22_40-14581732704161458093?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_09_13_39-12734393124555981371?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_09_22_35-2599031185132129665?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_09_13_38-6224997237959279023?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_09_21_09-11633967687181712455?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_09_13_39-10351759034732533311?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_09_22_05-8675338138044290841?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_09_13_39-6984464713774975653?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-12_09_22_35-1228153871926877441?project=apache-beam-testing.
:beam-sdks-python:validatesRunnerBatchTests (Thread[Task worker for 
':',5,main]) completed. Took 17 mins 18.567 secs.
:beam-sdks-python:validatesRunnerStreamingTests (Thread[Task worker for 
':',5,main]) started.

> Task :beam-sdks-python:validatesRunnerStreamingTests
Caching disabled for task ':beam-sdks-python:validatesRunnerStreamingTests': 
Caching has not been enabled for the task
Task ':beam-sdks-python:validatesRunnerStreamingTests' is not up-to-date 
because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'sh''. Working directory: 

 Command: sh -c . 

 && ./scripts/run_integration_test.sh --test_opts "--nocapture --processes=8 
--process-timeout=4500 --attr=ValidatesRunner,!sickbay-streaming" --streaming 
true --worker_jar 

Successfully started process 'command 'sh''


###
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known 

Build failed in Jenkins: beam_PostCommit_Java_GradleBuild #1874

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[robertwb] [BEAM-3253] Specify gradle dependencies for Python tox tasks.

--
[...truncated 57.52 MB...]
Nov 12, 2018 5:58:35 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.apache.httpcomponents/httpcore/4.4.6/e3fd8ced1f52c7574af952e2e6da0df8df08eb82/httpcore-4.4.6.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1112175832-4238f490/output/results/staging/httpcore-4.4.6-qfvVA-CAJQfv7q_7VrvfUg.jar
Nov 12, 2018 5:58:35 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.squareup.okhttp/okhttp/2.5.0/4de2b4ed3445c37ec1720a7d214712e845a24636/okhttp-2.5.0.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1112175832-4238f490/output/results/staging/okhttp-2.5.0-64v0X4G_nxfR_PsuymOqpg.jar
Nov 12, 2018 5:58:35 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1112175832-4238f490/output/results/staging/beam-vendor-java-grpc-v1-2.9.0-SNAPSHOT-9iKsCj5AIeLUgi0xZAzQww.jar
Nov 12, 2018 5:58:35 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.protobuf.nano/protobuf-javanano/3.0.0-alpha-5/357e60f95cebb87c72151e49ba1f570d899734f8/protobuf-javanano-3.0.0-alpha-5.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1112175832-4238f490/output/results/staging/protobuf-javanano-3.0.0-alpha-5-SsS7fZe7brffHbAGUgRrnw.jar
Nov 12, 2018 5:58:35 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.esotericsoftware.minlog/minlog/1.2/59bfcd171d82f9981a5e242b9e840191f650e209/minlog-1.2.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1112175832-4238f490/output/results/staging/minlog-1.2-98-99jtn3wu_pMfLJgiFvA.jar
Nov 12, 2018 5:58:35 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.xerial.snappy/snappy-java/1.1.4/d94ae6d7d27242eaa4b6c323f881edbb98e48da6/snappy-java-1.1.4.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1112175832-4238f490/output/results/staging/snappy-java-1.1.4-SFNwbMuGq13aaoKVzeS1Tw.jar
Nov 12, 2018 5:58:35 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/commons-codec/commons-codec/1.9/9ce04e34240f674bc72680f8b843b1457383161a/commons-codec-1.9.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1112175832-4238f490/output/results/staging/commons-codec-1.9-dWFTVmBcgSgBPanjrGKiSQ.jar
Nov 12, 2018 5:58:35 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.esotericsoftware.reflectasm/reflectasm/1.07/76f11c94a53ee975a0d9154b325c408b210155bd/reflectasm-1.07-shaded.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1112175832-4238f490/output/results/staging/reflectasm-1.07-shaded-IELCIoQPsh49E873w5ZYMQ.jar
Nov 12, 2018 5:58:35 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/ru.vyarus/generics-resolver/2.0.1/2182e67f161ddbe3ff8cb055bb54398354fda3f5/generics-resolver-2.0.1.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1112175832-4238f490/output/results/staging/generics-resolver-2.0.1-VrpA4CuIscdXm0wF6oMTaQ.jar
Nov 12, 2018 5:58:35 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.beust/jcommander/1.27/58c9cbf0f1fa296f93c712f2cf46de50471920f9/jcommander-1.27.jar
 to 
gs://temp-storage-for-end-to-end-tests/spannerwriteit0testreportfailures-jenkins-1112175832-4238f490/output/results/staging/jcommander-1.27-YxYbyOYD5gurYr1hw0nzcA.jar
Nov 12, 2018 5:58:35 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/org.javaruntype/javaruntype/1.3/26ba963f4b20c751e07b58b990bb41bf850622d8/javaruntype-1.3.jar
 to 

Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #999

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[github] [BEAM-5446] SplittableDoFn: Remove "internal" methods for public API

--
[...truncated 279.00 KB...]
INFO: Adding Query7/Query7.Snoop as step s3
Nov 12, 2018 10:13:10 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/IsBid/ParDo(Anonymous) as step s4
Nov 12, 2018 10:13:10 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/AsBid as step s5
Nov 12, 2018 10:13:10 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Window.Into()/Window.Assign as step s6
Nov 12, 2018 10:13:10 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/BidToPrice as step s7
Nov 12, 2018 10:13:10 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/WithKeys/AddKeys/Map
 as step s8
Nov 12, 2018 10:13:10 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/AddNonce
 as step s9
Nov 12, 2018 10:13:10 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/GroupByKey
 as step s10
Nov 12, 2018 10:13:10 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/Combine.GroupedValues
 as step s11
Nov 12, 2018 10:13:10 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/StripNonce/Map
 as step s12
Nov 12, 2018 10:13:10 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Window.Remerge/Identity/Map
 as step s13
Nov 12, 2018 10:13:10 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PrepareCold/Map
 as step s14
Nov 12, 2018 10:13:10 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Flatten.PCollections
 as step s15
Nov 12, 2018 10:13:10 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PostCombine/GroupByKey
 as step s16
Nov 12, 2018 10:13:10 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PostCombine/Combine.GroupedValues
 as step s17
Nov 12, 2018 10:13:10 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Values/Values/Map
 as step s18
Nov 12, 2018 10:13:10 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
 as step s19
Nov 12, 2018 10:13:10 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly
 as step s20
Nov 12, 2018 10:13:10 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
 as step s21
Nov 12, 2018 10:13:10 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Combine.GloballyAsSingletonView/CreateDataflowView 
as step s22
Nov 12, 2018 10:13:10 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Query7.Select as step s23
Nov 12, 2018 10:13:10 PM 

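Steps s8 through s22 above appear to be the expansion of a single global max-price combine with hot-key fanout, materialized as a singleton side input: the AddNonce/PreCombineHot/StripNonce/PostCombine steps come from the fanout, and CreateDataflowView from the singleton view. A runnable sketch (direct runner, made-up bid prices and fanout) that expands into the same step structure:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.Max;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.beam.sdk.values.PCollectionView;

    public class MaxPriceSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        PCollection<Long> prices = p.apply(Create.of(100L, 250L, 75L)); // made-up bids
        // Combine.globally(MaxLong) with fanout expands into the
        // AddNonce/PreCombineHot/StripNonce/PostCombine steps seen above;
        // asSingletonView() adds the Combine.GloballyAsSingletonView wrapper.
        PCollectionView<Long> maxPrice =
            prices.apply(Combine.globally(Max.ofLongs())
                .withFanout(16) // fanout value is a guess; Nexmark picks its own
                .asSingletonView());
        p.run().waitUntilFinish();
      }
    }
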
Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #2178

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[github] [BEAM-5446] SplittableDoFn: Remove "internal" methods for public API

--
[...truncated 28.58 MB...]
[streaming-job-executor-0] INFO org.apache.spark.SparkContext - Starting 
job: foreach at UnboundedDataset.java:79
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Registering RDD 2787 (mapToPair at GroupCombineFunctions.java:57)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Registering RDD 2815 (mapToPair at GroupCombineFunctions.java:57)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Registering RDD 2824 (mapPartitionsToPair at 
SparkGroupAlsoByWindowViaWindowSet.java:564)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Got job 35 (foreach at UnboundedDataset.java:79) with 4 output partitions
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Final stage: ResultStage 839 (foreach at UnboundedDataset.java:79)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Parents of final stage: List(ShuffleMapStage 832, ShuffleMapStage 826, 
ShuffleMapStage 836, ShuffleMapStage 830, ShuffleMapStage 822, ShuffleMapStage 
834, ShuffleMapStage 838, ShuffleMapStage 824, ShuffleMapStage 828)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Missing parents: List(ShuffleMapStage 822)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting ShuffleMapStage 818 (MapPartitionsRDD[2787] at mapToPair at 
GroupCombineFunctions.java:57), which has no missing parents
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_143 stored as values in memory (estimated size 177.8 KB, free 
13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_143_piece0 stored as bytes in memory (estimated size 54.7 KB, 
free 13.5 GB)
[dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - 
Added broadcast_143_piece0 in memory on localhost:35693 (size: 54.7 KB, free: 
13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created 
broadcast 143 from broadcast at DAGScheduler.scala:1039
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting 4 missing tasks from ShuffleMapStage 818 (MapPartitionsRDD[2787] at 
mapToPair at GroupCombineFunctions.java:57) (first 15 tasks are for partitions 
Vector(0, 1, 2, 3))
[dag-scheduler-event-loop] INFO 
org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 818.0 with 4 
tasks
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 0.0 in stage 818.0 (TID 642, localhost, executor driver, 
partition 0, PROCESS_LOCAL, 8165 bytes)
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 1.0 in stage 818.0 (TID 643, localhost, executor driver, 
partition 1, PROCESS_LOCAL, 8165 bytes)
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 2.0 in stage 818.0 (TID 644, localhost, executor driver, 
partition 2, PROCESS_LOCAL, 8165 bytes)
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 3.0 in stage 818.0 (TID 645, localhost, executor driver, 
partition 3, PROCESS_LOCAL, 8165 bytes)
[Executor task launch worker for task 644] INFO 
org.apache.spark.executor.Executor - Running task 2.0 in stage 818.0 (TID 644)
[Executor task launch worker for task 643] INFO 
org.apache.spark.executor.Executor - Running task 1.0 in stage 818.0 (TID 643)
[Executor task launch worker for task 642] INFO 
org.apache.spark.executor.Executor - Running task 0.0 in stage 818.0 (TID 642)
[Executor task launch worker for task 645] INFO 
org.apache.spark.executor.Executor - Running task 3.0 in stage 818.0 (TID 645)
[Executor task launch worker for task 644] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2559_2 locally
[Executor task launch worker for task 643] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2559_1 locally
[Executor task launch worker for task 645] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2559_3 locally
[Executor task launch worker for task 642] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2559_0 locally
[Executor task launch worker for task 643] INFO 
org.apache.spark.executor.Executor - Finished task 1.0 in stage 818.0 (TID 
643). 59466 bytes result sent to driver
[Executor task launch worker for task 644] INFO 
org.apache.spark.executor.Executor - Finished task 2.0 in stage 818.0 (TID 
644). 59466 bytes result sent to driver
[Executor task launch worker for task 642] 

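The Spark ValidatesRunner excerpt above shows Beam GroupByKey/Combine transforms translated into Spark shuffle stages (the mapToPair calls in GroupCombineFunctions.java), running on an embedded local master with four partitions per stage. A sketch of the options that select this runner; setSparkMaster("local[4]") is an assumption chosen to match the four tasks per stage in the log:

    import org.apache.beam.runners.spark.SparkPipelineOptions;
    import org.apache.beam.runners.spark.SparkRunner;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class SparkRunnerSketch {
      public static void main(String[] args) {
        SparkPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(SparkPipelineOptions.class);
        options.setRunner(SparkRunner.class);
        // local[4] gives four executor threads, matching the
        // "Submitting 4 missing tasks ... partitions Vector(0, 1, 2, 3)" above.
        options.setSparkMaster("local[4]");
        Pipeline pipeline = Pipeline.create(options);
        pipeline.run().waitUntilFinish();
      }
    }
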
Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Flink_Gradle #2188

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[github] [BEAM-5446] SplittableDoFn: Remove "internal" methods for public API

--
[...truncated 67.51 MB...]
Nov 12, 2018 10:23:41 PM 
org.apache.flink.runtime.executiongraph.ExecutionGraph transitionState
INFO: Job metricspushertest0test-jenkins-111340-72a4f897 
(a614683a7f21a4fd5c84fb13af5d07ea) switched from state CREATED to RUNNING.
Nov 12, 2018 10:23:41 PM org.apache.flink.runtime.executiongraph.Execution 
transitionState
INFO: Source: 
GenerateSequence/Read(UnboundedCountingSource)/Create/Read(CreateSource) -> 
GenerateSequence/Read(UnboundedCountingSource)/Split/ParMultiDo(Split) -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Pair with random 
key/ParMultiDo(AssignShard) -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/Window.Into()/Window.Assign.out
 -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> ToKeyedWorkItem (1/1) (1df8fbe273dd610a36351ccae377e423) switched from 
CREATED to SCHEDULED.
Nov 12, 2018 10:23:41 PM org.apache.flink.runtime.executiongraph.Execution 
transitionState
INFO: 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/GroupByKey 
-> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)
 -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Values/Values/Map/ParMultiDo(Anonymous)
 -> GenerateSequence/Read(UnboundedCountingSource)/Read/ParMultiDo(Read) -> 
GenerateSequence/Read(UnboundedCountingSource)/StripIds/ParMultiDo(StripIds) -> 
ParDo(Counting)/ParMultiDo(Counting) (1/1) (5fb80d7d718f748f61be1572b1fbe957) 
switched from CREATED to SCHEDULED.
Nov 12, 2018 10:23:41 PM 
org.apache.flink.runtime.jobmaster.slotpool.SlotPool 
stashRequestWaitingForResourceManager
INFO: Cannot serve slot request, no ResourceManager connected. Adding as 
pending request [SlotRequestId{b1ec168ba8da853c3cf20aa80d773298}]
Nov 12, 2018 10:23:41 PM 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
confirmLeader
INFO: Received confirmation of leadership for leader 
akka://flink/user/jobmanager_237 , session=31655701-0d10-4aed-b593-5f2ccb51ceb0
Nov 12, 2018 10:23:41 PM org.apache.flink.runtime.jobmaster.JobMaster 
connectToResourceManager
INFO: Connecting to ResourceManager 
akka://flink/user/resourcemanager_3e136402-b3f5-42e3-99b9-578796816a83(bfd6b52427bac9225ff09642e29d4ce9)
Nov 12, 2018 10:23:41 PM 
org.apache.flink.runtime.registration.RetryingRegistration 
lambda$startRegistration$0
INFO: Resolved ResourceManager address, beginning registration
Nov 12, 2018 10:23:41 PM 
org.apache.flink.runtime.registration.RetryingRegistration register
INFO: Registration at ResourceManager attempt 1 (timeout=100ms)
Nov 12, 2018 10:23:41 PM 
org.apache.flink.runtime.resourcemanager.ResourceManager registerJobManager
INFO: Registering job manager 
b5935f2ccb51ceb0316557010d104aed@akka://flink/user/jobmanager_237 for job 
a614683a7f21a4fd5c84fb13af5d07ea.
Nov 12, 2018 10:23:41 PM 
org.apache.flink.runtime.resourcemanager.ResourceManager 
registerJobMasterInternal
INFO: Registered job manager 
b5935f2ccb51ceb0316557010d104aed@akka://flink/user/jobmanager_237 for job 
a614683a7f21a4fd5c84fb13af5d07ea.
Nov 12, 2018 10:23:41 PM org.apache.flink.runtime.jobmaster.JobMaster 
establishResourceManagerConnection
INFO: JobManager successfully registered at ResourceManager, leader id: 
bfd6b52427bac9225ff09642e29d4ce9.
Nov 12, 2018 10:23:41 PM 
org.apache.flink.runtime.jobmaster.slotpool.SlotPool 
requestSlotFromResourceManager
INFO: Requesting new slot [SlotRequestId{b1ec168ba8da853c3cf20aa80d773298}] 
and profile ResourceProfile{cpuCores=-1.0, heapMemoryInMB=-1, 
directMemoryInMB=0, nativeMemoryInMB=0, networkMemoryInMB=0} from resource 
manager.
Nov 12, 2018 10:23:41 PM 
org.apache.flink.runtime.resourcemanager.ResourceManager requestSlot
INFO: Request slot with profile ResourceProfile{cpuCores=-1.0, 
heapMemoryInMB=-1, directMemoryInMB=0, nativeMemoryInMB=0, networkMemoryInMB=0} 
for job a614683a7f21a4fd5c84fb13af5d07ea with allocation id 
AllocationID{dd6a77fc1616f36f616baa2fe181832a}.
Nov 12, 2018 10:23:41 PM org.apache.flink.runtime.taskexecutor.TaskExecutor 
requestSlot
INFO: Receive slot request AllocationID{dd6a77fc1616f36f616baa2fe181832a} 
for job 

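The Flink operator chain above (Read(UnboundedCountingSource) -> Reshuffle -> ParDo(Counting)) appears to be the expansion of the MetricsPusher test pipeline: an unbounded GenerateSequence followed by a counting ParDo. A hedged sketch of a pipeline with the same shape; the five-second read limit is an assumption added so the example terminates:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.GenerateSequence;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.joda.time.Duration;

    public class CountingSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply(GenerateSequence.from(0)                       // Read(UnboundedCountingSource)
                .withMaxReadTime(Duration.standardSeconds(5))) // assumption: bound the read
         .apply("Counting", ParDo.of(new DoFn<Long, Long>() {  // ParDo(Counting) in the log
           @ProcessElement
           public void processElement(ProcessContext c) {
             c.output(c.element());
           }
         }));
        p.run().waitUntilFinish();
      }
    }
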
Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Dataflow #1000

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-6037] Make Spark runner pipeline translation based on URNs (#7005)

--
[...truncated 247.22 KB...]
at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
at 
com.google.cloud.hadoop.util.AbstractGoogleAsyncWriteChannel$UploadOperation.call(AbstractGoogleAsyncWriteChannel.java:358)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more

Nov 12, 2018 10:26:10 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 

 to 
gs://temp-storage-for-perf-tests/nexmark/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.9.0-SNAPSHOT-utFw88TMDEqgz1d4XD5K-g.jar
Nov 12, 2018 10:26:13 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 114 files cached, 29 files newly uploaded
Nov 12, 2018 10:26:13 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7.ReadBounded as step s1
Nov 12, 2018 10:26:13 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Monitor as step s2
Nov 12, 2018 10:26:13 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7.Snoop as step s3
Nov 12, 2018 10:26:13 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/IsBid/ParDo(Anonymous) as step s4
Nov 12, 2018 10:26:13 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/justBids/AsBid as step s5
Nov 12, 2018 10:26:13 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/Window.Into()/Window.Assign as step s6
Nov 12, 2018 10:26:13 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Query7/Query7/BidToPrice as step s7
Nov 12, 2018 10:26:13 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/WithKeys/AddKeys/Map
 as step s8
Nov 12, 2018 10:26:13 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/AddNonce
 as step s9
Nov 12, 2018 10:26:13 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/GroupByKey
 as step s10
Nov 12, 2018 10:26:13 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PreCombineHot/Combine.GroupedValues
 as step s11
Nov 12, 2018 10:26:13 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/StripNonce/Map
 as step s12
Nov 12, 2018 10:26:13 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Window.Remerge/Identity/Map
 as step s13
Nov 12, 2018 10:26:13 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PrepareCold/Map
 as step s14
Nov 12, 2018 10:26:13 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/Flatten.PCollections
 as step s15
Nov 12, 2018 10:26:13 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PostCombine/GroupByKey
 as step s16
Nov 12, 2018 10:26:13 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding 
Query7/Query7/Combine.GloballyAsSingletonView/Combine.globally(MaxLong)/Combine.perKeyWithFanout(MaxLong)/PostCombine/Combine.GroupedValues
 as step s17
Nov 12, 2018 10:26:13 PM 

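Step s6 above (Window.Into()/Window.Assign) is a plain window assignment applied to the bid stream before the per-window maximum is taken. A sketch of the transform that produces such a step, with a made-up 10-second fixed window:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.windowing.FixedWindows;
    import org.apache.beam.sdk.transforms.windowing.Window;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;

    public class WindowSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        PCollection<Long> prices = p.apply(Create.of(100L, 250L, 75L)); // made-up bids
        // Translates to a "Window.Into()/Window.Assign" step as in the logs above.
        prices.apply(Window.<Long>into(FixedWindows.of(Duration.standardSeconds(10))));
        p.run().waitUntilFinish();
      }
    }
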
Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Gearpump_Gradle #1304

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[github] [BEAM-5446] SplittableDoFn: Remove "internal" methods for public API

--
[...truncated 9.82 MB...]
INFO: received start, clock: -9223372036854775808, sessionId: 12
Nov 12, 2018 10:32:44 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 10:32:44 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 13
Nov 12, 2018 10:32:44 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 19
Nov 12, 2018 10:32:44 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 14
Nov 12, 2018 10:32:44 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 15
Nov 12, 2018 10:32:44 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 10:32:44 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 16
Nov 12, 2018 10:32:44 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 10:32:44 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 17
Nov 12, 2018 10:32:44 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 18
Nov 12, 2018 10:32:44 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 10:32:44 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 20
Nov 12, 2018 10:32:44 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 10:32:44 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 21
Nov 12, 2018 10:32:44 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 22
Nov 12, 2018 10:32:44 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 23
Nov 12, 2018 10:32:44 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 24
Nov 12, 2018 10:32:44 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 26
Nov 12, 2018 10:32:44 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 27
Nov 12, 2018 10:32:44 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 25
Nov 12, 2018 10:32:44 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 10:32:44 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 31
Nov 12, 2018 10:32:44 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 29
Nov 12, 2018 10:32:44 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 30
Nov 12, 2018 10:32:44 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: 

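The clock value -9223372036854775808 repeated throughout the Gearpump logs above is simply Long.MIN_VALUE, presumably the runner's sentinel for "no watermark established yet" at task start; it is not itself an error. For reference:

    public class ClockSentinel {
      public static void main(String[] args) {
        // The Gearpump "clock: -9223372036854775808" lines report Long.MIN_VALUE.
        System.out.println(Long.MIN_VALUE); // prints -9223372036854775808
      }
    }
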
Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #2179

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-6037] Make Spark runner pipeline translation based on URNs (#7005)

--
[...truncated 29.79 MB...]
[streaming-job-executor-0] INFO org.apache.spark.SparkContext - Starting 
job: foreach at UnboundedDataset.java:79
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Registering RDD 2787 (mapToPair at GroupCombineFunctions.java:57)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Registering RDD 2815 (mapToPair at GroupCombineFunctions.java:57)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Registering RDD 2824 (mapPartitionsToPair at 
SparkGroupAlsoByWindowViaWindowSet.java:564)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Got job 35 (foreach at UnboundedDataset.java:79) with 4 output partitions
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Final stage: ResultStage 839 (foreach at UnboundedDataset.java:79)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Parents of final stage: List(ShuffleMapStage 832, ShuffleMapStage 818, 
ShuffleMapStage 836, ShuffleMapStage 822, ShuffleMapStage 816, ShuffleMapStage 
834, ShuffleMapStage 820, ShuffleMapStage 838, ShuffleMapStage 824)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Missing parents: List(ShuffleMapStage 832)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting ShuffleMapStage 829 (MapPartitionsRDD[2787] at mapToPair at 
GroupCombineFunctions.java:57), which has no missing parents
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_143 stored as values in memory (estimated size 177.8 KB, free 
13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore 
- Block broadcast_143_piece0 stored as bytes in memory (estimated size 54.6 KB, 
free 13.5 GB)
[dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - 
Added broadcast_143_piece0 in memory on localhost:44567 (size: 54.6 KB, free: 
13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created 
broadcast 143 from broadcast at DAGScheduler.scala:1039
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - 
Submitting 4 missing tasks from ShuffleMapStage 829 (MapPartitionsRDD[2787] at 
mapToPair at GroupCombineFunctions.java:57) (first 15 tasks are for partitions 
Vector(0, 1, 2, 3))
[dag-scheduler-event-loop] INFO 
org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 829.0 with 4 
tasks
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 0.0 in stage 829.0 (TID 642, localhost, executor driver, 
partition 0, PROCESS_LOCAL, 8165 bytes)
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 1.0 in stage 829.0 (TID 643, localhost, executor driver, 
partition 1, PROCESS_LOCAL, 8165 bytes)
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 2.0 in stage 829.0 (TID 644, localhost, executor driver, 
partition 2, PROCESS_LOCAL, 8165 bytes)
[dispatcher-event-loop-0] INFO org.apache.spark.scheduler.TaskSetManager - 
Starting task 3.0 in stage 829.0 (TID 645, localhost, executor driver, 
partition 3, PROCESS_LOCAL, 8165 bytes)
[Executor task launch worker for task 643] INFO 
org.apache.spark.executor.Executor - Running task 1.0 in stage 829.0 (TID 643)
[Executor task launch worker for task 644] INFO 
org.apache.spark.executor.Executor - Running task 2.0 in stage 829.0 (TID 644)
[Executor task launch worker for task 642] INFO 
org.apache.spark.executor.Executor - Running task 0.0 in stage 829.0 (TID 642)
[Executor task launch worker for task 645] INFO 
org.apache.spark.executor.Executor - Running task 3.0 in stage 829.0 (TID 645)
[Executor task launch worker for task 645] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2559_3 locally
[Executor task launch worker for task 642] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2559_0 locally
[Executor task launch worker for task 643] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2559_1 locally
[Executor task launch worker for task 644] INFO 
org.apache.spark.storage.BlockManager - Found block rdd_2559_2 locally
[Executor task launch worker for task 645] INFO 
org.apache.spark.executor.Executor - Finished task 3.0 in stage 829.0 (TID 
645). 59466 bytes result sent to driver
[Executor task launch worker for task 644] INFO 
org.apache.spark.executor.Executor - Finished task 2.0 in stage 829.0 (TID 
644). 59466 bytes result sent to driver
[Executor task launch worker for task 

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Flink_Gradle #2189

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-6037] Make Spark runner pipeline translation based on URNs (#7005)

--
[...truncated 67.56 MB...]
Nov 12, 2018 10:37:37 PM 
org.apache.flink.runtime.executiongraph.ExecutionGraph transitionState
INFO: Job metricspushertest0test-jenkins-1112223736-9d906a17 
(28ccbfe9a25d4856afa3f699cca0721b) switched from state CREATED to RUNNING.
Nov 12, 2018 10:37:37 PM org.apache.flink.runtime.executiongraph.Execution 
transitionState
INFO: Source: 
GenerateSequence/Read(UnboundedCountingSource)/Create/Read(CreateSource) -> 
GenerateSequence/Read(UnboundedCountingSource)/Split/ParMultiDo(Split) -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Pair with random 
key/ParMultiDo(AssignShard) -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/Window.Into()/Window.Assign.out
 -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/ReifyOriginalTimestamps/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> ToKeyedWorkItem (1/1) (36afd1ea66cb30cc949fd8a53d00b4cc) switched from 
CREATED to SCHEDULED.
Nov 12, 2018 10:37:37 PM org.apache.flink.runtime.executiongraph.Execution 
transitionState
INFO: 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/GroupByKey 
-> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/ExpandIterable/ParMultiDo(Anonymous)
 -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/RestoreOriginalTimestamps/ReifyTimestamps.RemoveWildcard/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous)
 -> 
GenerateSequence/Read(UnboundedCountingSource)/Reshuffle/Values/Values/Map/ParMultiDo(Anonymous)
 -> GenerateSequence/Read(UnboundedCountingSource)/Read/ParMultiDo(Read) -> 
GenerateSequence/Read(UnboundedCountingSource)/StripIds/ParMultiDo(StripIds) -> 
ParDo(Counting)/ParMultiDo(Counting) (1/1) (111616757c89f90148cfe803f6bfdf66) 
switched from CREATED to SCHEDULED.
Nov 12, 2018 10:37:37 PM 
org.apache.flink.runtime.jobmaster.slotpool.SlotPool 
stashRequestWaitingForResourceManager
INFO: Cannot serve slot request, no ResourceManager connected. Adding as 
pending request [SlotRequestId{9379c955da57d89ed064eb4166cfc01c}]
Nov 12, 2018 10:37:37 PM 
org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService 
confirmLeader
INFO: Received confirmation of leadership for leader 
akka://flink/user/jobmanager_237 , session=5d5d1f75-dfe9-4a94-8d0a-f4862d2f48f4
Nov 12, 2018 10:37:37 PM org.apache.flink.runtime.jobmaster.JobMaster 
connectToResourceManager
INFO: Connecting to ResourceManager 
akka://flink/user/resourcemanager_430e9d46-33c6-49ca-b74a-8d6e34b138dc(9123a28a8bf6410025daca39f7344c39)
Nov 12, 2018 10:37:37 PM 
org.apache.flink.runtime.registration.RetryingRegistration 
lambda$startRegistration$0
INFO: Resolved ResourceManager address, beginning registration
Nov 12, 2018 10:37:37 PM 
org.apache.flink.runtime.registration.RetryingRegistration register
INFO: Registration at ResourceManager attempt 1 (timeout=100ms)
Nov 12, 2018 10:37:37 PM 
org.apache.flink.runtime.resourcemanager.ResourceManager registerJobManager
INFO: Registering job manager 
8d0af4862d2f48f45d5d1f75dfe94a94@akka://flink/user/jobmanager_237 for job 
28ccbfe9a25d4856afa3f699cca0721b.
Nov 12, 2018 10:37:37 PM 
org.apache.flink.runtime.resourcemanager.ResourceManager 
registerJobMasterInternal
INFO: Registered job manager 
8d0af4862d2f48f45d5d1f75dfe94a94@akka://flink/user/jobmanager_237 for job 
28ccbfe9a25d4856afa3f699cca0721b.
Nov 12, 2018 10:37:37 PM org.apache.flink.runtime.jobmaster.JobMaster 
establishResourceManagerConnection
INFO: JobManager successfully registered at ResourceManager, leader id: 
9123a28a8bf6410025daca39f7344c39.
Nov 12, 2018 10:37:37 PM 
org.apache.flink.runtime.jobmaster.slotpool.SlotPool 
requestSlotFromResourceManager
INFO: Requesting new slot [SlotRequestId{9379c955da57d89ed064eb4166cfc01c}] 
and profile ResourceProfile{cpuCores=-1.0, heapMemoryInMB=-1, 
directMemoryInMB=0, nativeMemoryInMB=0, networkMemoryInMB=0} from resource 
manager.
Nov 12, 2018 10:37:37 PM 
org.apache.flink.runtime.resourcemanager.ResourceManager requestSlot
INFO: Request slot with profile ResourceProfile{cpuCores=-1.0, 
heapMemoryInMB=-1, directMemoryInMB=0, nativeMemoryInMB=0, networkMemoryInMB=0} 
for job 28ccbfe9a25d4856afa3f699cca0721b with allocation id 
AllocationID{8ab67a11c1489f9ee3143a4f4f4e720d}.
Nov 12, 2018 10:37:37 PM org.apache.flink.runtime.taskexecutor.TaskExecutor 
requestSlot
INFO: Receive slot request AllocationID{8ab67a11c1489f9ee3143a4f4f4e720d} 
for job 

Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Samza_Gradle #1275

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-3608] Vendor guava 20.0 (#6809)

--
[...truncated 46.54 MB...]
INFO: Got non logged storage partition directory as 
/tmp/beam-samza-test/beamStore/Partition_0
Nov 12, 2018 10:47:16 PM org.apache.samza.util.Logging$class info
INFO: Got logged storage partition directory as 
/tmp/beam-samza-test/beamStore/Partition_0
Nov 12, 2018 10:47:16 PM org.apache.samza.util.Logging$class info
INFO: Deleting logged storage partition directory 
/tmp/beam-samza-test/beamStore/Partition_0.
Nov 12, 2018 10:47:16 PM org.apache.samza.util.Logging$class info
INFO: Using non logged storage partition directory: 
/tmp/beam-samza-test/beamStore/Partition_0 for store: beamStore.
Nov 12, 2018 10:47:16 PM org.apache.samza.util.Logging$class info
INFO: Validating change log streams: Map()
Nov 12, 2018 10:47:16 PM org.apache.samza.util.Logging$class info
INFO: Got change log stream metadata: Map()
Nov 12, 2018 10:47:16 PM org.apache.samza.util.Logging$class info
INFO: Assigning oldest change log offsets for taskName Partition 0: Map()
Nov 12, 2018 10:47:16 PM org.apache.samza.util.Logging$class info
INFO: Starting table manager in task instance Partition 0
Nov 12, 2018 10:47:16 PM org.apache.samza.util.Logging$class info
INFO: Starting host statistics monitor
Nov 12, 2018 10:47:16 PM org.apache.samza.util.Logging$class info
INFO: Registering task instances with producers.
Nov 12, 2018 10:47:16 PM org.apache.samza.util.Logging$class info
INFO: Starting producer multiplexer.
Nov 12, 2018 10:47:16 PM org.apache.samza.util.Logging$class info
INFO: Initializing stream tasks.
Nov 12, 2018 10:47:16 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
0-split0_out__PCollection_. Keys will not be (de)serialized
Nov 12, 2018 10:47:16 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
0-split0_out__PCollection_. Values will not be (de)serialized
Nov 12, 2018 10:47:16 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
1-split1_out__PCollection_. Keys will not be (de)serialized
Nov 12, 2018 10:47:16 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
1-split1_out__PCollection_. Values will not be (de)serialized
Nov 12, 2018 10:47:16 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
2-split2_out__PCollection_. Keys will not be (de)serialized
Nov 12, 2018 10:47:16 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
2-split2_out__PCollection_. Values will not be (de)serialized
Nov 12, 2018 10:47:16 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
3-split3_out__PCollection_. Keys will not be (de)serialized
Nov 12, 2018 10:47:16 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
3-split3_out__PCollection_. Values will not be (de)serialized
Nov 12, 2018 10:47:16 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
4-split4_out__PCollection_. Keys will not be (de)serialized
Nov 12, 2018 10:47:16 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
4-split4_out__PCollection_. Values will not be (de)serialized
Nov 12, 2018 10:47:16 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
5-split5_out__PCollection_. Keys will not be (de)serialized
Nov 12, 2018 10:47:16 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
5-split5_out__PCollection_. Values will not be (de)serialized
Nov 12, 2018 10:47:16 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
6-split6_out__PCollection_. Keys will not be (de)serialized
Nov 12, 2018 10:47:16 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value serde for stream 
6-split6_out__PCollection_. Values will not be (de)serialized
Nov 12, 2018 10:47:16 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the key serde for stream 
7-split7_out__PCollection_. Keys will not be (de)serialized
Nov 12, 2018 10:47:16 PM org.apache.samza.operators.StreamGraphImpl 
getKVSerdes
INFO: Using NoOpSerde as the value 

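The Samza excerpt above shows the runner wiring each intermediate PCollection stream with NoOpSerde: Beam elements are encoded with Beam's own coders, so Samza-level (de)serialization is deliberately bypassed. A sketch of selecting the Samza runner, assuming the standard runner classes from the Beam Samza module:

    import org.apache.beam.runners.samza.SamzaPipelineOptions;
    import org.apache.beam.runners.samza.SamzaRunner;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class SamzaRunnerSketch {
      public static void main(String[] args) {
        SamzaPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(SamzaPipelineOptions.class);
        options.setRunner(SamzaRunner.class);
        // Element encoding stays with Beam coders; Samza streams get NoOpSerde,
        // as the "Using NoOpSerde as the key/value serde" lines above show.
        Pipeline.create(options).run().waitUntilFinish();
      }
    }
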
Jenkins build is back to normal : beam_PostCommit_Python_VR_Flink #721

2018-11-12 Thread Apache Jenkins Server
See 






Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Gearpump_Gradle #1305

2018-11-12 Thread Apache Jenkins Server
See 


Changes:

[lcwik] [BEAM-6037] Make Spark runner pipeline translation based on URNs (#7005)

--
[...truncated 9.79 MB...]
INFO: received start, clock: -9223372036854775808, sessionId: 13
Nov 12, 2018 10:57:37 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 12
Nov 12, 2018 10:57:37 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 10:57:37 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 10:57:37 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 10:57:37 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 16
Nov 12, 2018 10:57:37 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 10:57:37 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 17
Nov 12, 2018 10:57:37 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 10:57:37 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 18
Nov 12, 2018 10:57:37 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 19
Nov 12, 2018 10:57:37 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 21
Nov 12, 2018 10:57:37 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at -9223372036854775808
Nov 12, 2018 10:57:37 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 20
Nov 12, 2018 10:57:37 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 24
Nov 12, 2018 10:57:37 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 23
Nov 12, 2018 10:57:37 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 25
Nov 12, 2018 10:57:37 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 22
Nov 12, 2018 10:57:37 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 28
Nov 12, 2018 10:57:37 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 33
Nov 12, 2018 10:57:37 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 26
Nov 12, 2018 10:57:37 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 27
Nov 12, 2018 10:57:37 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 29
Nov 12, 2018 10:57:37 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 30
Nov 12, 2018 10:57:37 PM org.apache.gearpump.streaming.task.TaskActor 
org$apache$gearpump$streaming$task$TaskActor$$onStartTask
INFO: received start, clock: -9223372036854775808, sessionId: 31
Nov 12, 2018 10:57:37 PM 
org.apache.gearpump.streaming.source.DataSourceTask onStart
INFO: opening data source at