Jenkins build is back to normal : beam_PerformanceTests_JDBC #12

2017-03-14 Thread Apache Jenkins Server
See 




Build failed in Jenkins: beam_PerformanceTests_JDBC #11

2017-03-14 Thread Apache Jenkins Server
See 


--
GitHub pull request #2196 of commit df36b7a8d0b75ab1438010150333bb1ff7b99984, 
no merge conflicts.
Setting status of df36b7a8d0b75ab1438010150333bb1ff7b99984 to PENDING with url 
https://builds.apache.org/job/beam_PerformanceTests_JDBC/11/ and message: 
'Build started sha1 is merged.'
Using context: Jenkins: Dataflow Runner JDBC Performance Tests
[EnvInject] - Loading node environment variables.
Building remotely on beam1 (beam) in workspace 

 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git 
 > +refs/heads/*:refs/remotes/origin/* +refs/pull/*:refs/remotes/origin/pr/*
 > git rev-parse refs/remotes/origin/pr/2196/merge^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/pr/2196/merge^{commit} # timeout=10
Checking out Revision 7716a23fc67d513381d0581d9a189329ed399f95 
(refs/remotes/origin/pr/2196/merge)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 7716a23fc67d513381d0581d9a189329ed399f95
First time build. Skipping changelog.
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the 
SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_JDBC] $ /bin/bash -xe /tmp/hudson7184358260618019774.sh
+ git clone -b beam-pkb --single-branch 
https://github.com/jasonkuster/PerfKitBenchmarker.git
fatal: destination path 'PerfKitBenchmarker' already exists and is not an empty 
directory.
Build step 'Execute shell' marked build as failure


Build failed in Jenkins: beam_PerformanceTests_JDBC #10

2017-03-14 Thread Apache Jenkins Server
See 


--
[...truncated 53.54 KB...]
containers in a privileged mode. Note that some benchmarks execute commands
which are only allowed in privileged mode.
(default: 'true')
  --kubeconfig: Path to kubeconfig to be used by kubectl
(default: '')
  --kubectl: Path to kubectl tool
(default: 'kubectl')
  --kubernetes_nodes: IP addresses of Kubernetes Nodes. These need to be
accessible from the machine running Perfkit benchmarker. Example:
"10.20.30.40,10.20.30.41"
(default: '')
(a comma separated list)
  --rbd_pool: Name of RBD pool for Ceph volumes.
(default: 'rbd')
  --rbd_user: Name of RADOS user.
(default: 'admin')
  --username: User name that Perfkit will attempt to use in order to SSH into
Docker instance.
(default: 'root')

perfkitbenchmarker.providers.mesos.flags:
  --docker_cpus: CPU limit for docker containers.
(default: '1.0')
(a number)
  --docker_memory_mb: Memory limit for docker containers.
(default: '2048')
(an integer)
  --marathon_address: Marathon IP address and port.
(default: 'localhost:8080')
  --[no]mesos_privileged_docker: If set to True, will attempt to create Docker
containers in a privileged mode. Note that some benchmarks execute commands
which are only allowed in privileged mode.
(default: 'false')

perfkitbenchmarker.providers.openstack.flags:
  --openstack_additional_flags: Additional flags to pass to every OpenStack CLI
command. See "openstack --help" for more.
(default: '[]')
  --[no]openstack_boot_from_volume: Boot from volume instead of an image
(default: 'false')
  --openstack_cli_path: The path to the OpenStack CLI binary.
(default: 'openstack')
  --[no]openstack_config_drive: Add possibilities to get metadata from external
drive
(default: 'false')
  --openstack_floating_ip_pool: Name of OpenStack floating IP-address pool. If
set, a floating-ip address from this pool will be associated to each instance
and will be used for communicating with it. To use this flag, an internally
routable network must also be specified via the openstack_network flag.
  --openstack_image_username: Ssh username for cloud image
(default: 'ubuntu')
  --openstack_network: Name of OpenStack network. This network provides
automatically allocated fixed-IP addresses to attached instances. Typically,
this network is used for internal communication between instances. If
openstack_floating_ip_pool is not set then this network will be used to
communicate with the instance.
(default: 'private')
  --openstack_neutron_path: The path to the Neutron CLI binary.
(default: 'neutron')
  --openstack_nova_path: The path to the Nova CLI binary.
(default: 'nova')
  --openstack_private_network: (DEPRECATED: Use openstack_network) Name of
OpenStack private network.
(default: 'private')
  --openstack_public_network: (DEPRECATED: Use openstack_floating_ip_pool) Name
of OpenStack public network.
  --openstack_scheduler_policy: : Add possibility
to use affinity or anti-affinity policy in scheduling process
(default: 'None')
  --openstack_volume_size: Size of the volume (GB)
(an integer)
  --openstack_volume_type: Optional Cinder volume type to use.

perfkitbenchmarker.providers.profitbricks.flags:
  --profitbricks_boot_volume_size: Choose the boot volume size in GB.
(default: '10')
(an integer)
  --profitbricks_boot_volume_type: : Choose between HDD or SSD boot
volume types.
(default: 'HDD')
  --profitbricks_config: Path to config file containing your email and password.
Can also be set via $PROFITBRICKS_CONFIG environment variable.
(File format: email:password)
(default: '~/.config/profitbricks-auth.cfg')
  --profitbricks_location: : Location of data center to be
provisioned (us/las, de/fkb, de/fra)
(default: 'us/las')

perfkitbenchmarker.providers.rackspace.flags:
  --additional_rackspace_flags: Additional global flags to pass to every RackCLI
command. See "rack --help" for more.
(default: '')
(a comma separated list)
  --rack_path: The path for the rack CLI binary.
(default: 'rack')
  --rack_profile: A string indicating which RackCLI profile to use. If none is
specified default profile is used (see https://developer.rackspace.com/docs
/rack-cli/configuration/#config-file)
  --[no]rackspace_boot_from_cbs_volume: When flag is included the instance will
use a remote disk as its boot disk, if machine_type supports it.
(default: 'false')
  --rackspace_network_id: (EXPERIMENTAL) The ID of an already created network to
use instead of creating a new one. Must have a subnet already associated
with the network.
  --rackspace_region: A string indicating which Rackspace region to use.
(default: 'IAD')
  

Jenkins build is back to normal : beam_PostCommit_Python_Verify #1510

2017-03-14 Thread Apache Jenkins Server
See 




[jira] [Commented] (BEAM-1691) Dynamic properties supported in PipelineOptions

2017-03-14 Thread Xu Mingmin (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1691?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15925479#comment-15925479
 ] 

Xu Mingmin commented on BEAM-1691:
--

Sure, take as an example one job that may be run on any runner, like Flink or 
Spark -- this is the case when I was testing different runners.
Right now my PipelineOptions interface extends both SparkPipelineOptions and 
FlinkPipelineOptions, and I package the two runners together. *So far it's workable*, 
however it may break in these scenarios:
1). when SparkPipelineOptions and FlinkPipelineOptions have a parameter with the 
same name, it results in an IllegalArgumentException;
2). suppose SparkPipelineOptions has an option without a default value; it is then 
required even when running with the FlinkRunner.
Another effect is that the packaged jar is large.
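
For illustration, a minimal sketch of the setup described above (the interface 
and class names here are made up, not taken from the actual job):

{code}
import org.apache.beam.runners.flink.FlinkPipelineOptions;
import org.apache.beam.runners.spark.SparkPipelineOptions;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class MultiRunnerJob {
  // One options interface pulling in both runners' options, so a single jar can
  // be submitted with either --runner=SparkRunner or --runner=FlinkRunner.
  // This is the setup that breaks if the two interfaces declare the same option.
  public interface MyOptions extends SparkPipelineOptions, FlinkPipelineOptions {}

  public static void main(String[] args) {
    MyOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(MyOptions.class);
    Pipeline pipeline = Pipeline.create(options);
    pipeline.run();
  }
}
{code}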

> Dynamic properties supported in PipelineOptions
> ---
>
> Key: BEAM-1691
> URL: https://issues.apache.org/jira/browse/BEAM-1691
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core
>Reporter: Xu Mingmin
>Assignee: Davor Bonaci
>
> Usually the two lines to create a new Beam pipeline are:
> {code}
> Options options = 
> PipelineOptionsFactory.fromArgs(args).withValidation().as(Options.class);
> Pipeline pipeline = Pipeline.create(options);
> {code} 
> As each runner has its own PipelineOptions, one piece of code can hardly run 
> on different runners without a code change -- at least Options needs to be 
> updated.
> A dynamic property could be a choice, similar to:
> {code}
> -D property1=value1 -D property2=value2 ...
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (BEAM-1530) BigQueryIO should support value-dependent windows

2017-03-14 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1530?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15925455#comment-15925455
 ] 

ASF GitHub Bot commented on BEAM-1530:
--

Github user reuvenlax closed the pull request at:

https://github.com/apache/beam/pull/2070


> BigQueryIO should support value-dependent windows
> -
>
> Key: BEAM-1530
> URL: https://issues.apache.org/jira/browse/BEAM-1530
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core
>Reporter: Reuven Lax
>Assignee: Reuven Lax
>




--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[GitHub] beam pull request #2070: [BEAM-1530] Support record-determined output tables

2017-03-14 Thread reuvenlax
Github user reuvenlax closed the pull request at:

https://github.com/apache/beam/pull/2070


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Commented] (BEAM-1676) SdkCoreApiSurfaceTest Failed in JDK7&8 and OpenJDK7&8 on Jenkins

2017-03-14 Thread Kenneth Knowles (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1676?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15925449#comment-15925449
 ] 

Kenneth Knowles commented on BEAM-1676:
---

Is it true that a Maven command is run as such? The symptoms made me wonder, so 
I've done some browsing through Jenkins' Maven support (native and plugin), and 
I don't have an answer yet.

On the other hand, this test is inspecting a static (not behavioral) property 
of our code, so there's no value added by checking it in different environments 
other than making sure a user/dev in that environment can successfully build it 
from source. So perhaps it isn't important for this matrix. I would rather not 
introduce exceptions, just to keep the tests simple and clean, but I would not 
rule it out as an option.

> SdkCoreApiSurfaceTest Failed in JDK7&8 and OpenJDK7&8 on Jenkins
> 
>
> Key: BEAM-1676
> URL: https://issues.apache.org/jira/browse/BEAM-1676
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Mark Liu
>Assignee: Mark Liu
>
> After running beam_PostCommit_Java_MavenInstall in different JDK versions 
> (JDK7&8, OpenJDK7&8) on Jenkins, all failed due to following error:
> {code}
> Failed tests: 
>   SdkCoreApiSurfaceTest.testSdkApiSurface:59 
> Expected: API surface to include only:
>   Classes in package "org.apache.beam"
>   Classes in package "com.google.api.client"
>   Classes in package "com.google.api.services.bigquery"
>   Classes in package "com.google.api.services.cloudresourcemanager"
>   Classes in package "com.google.api.services.pubsub"
>   Classes in package "com.google.api.services.storage"
>   Classes in package "com.google.auth"
>   Classes in package "com.google.protobuf"
>   Classes in package "com.fasterxml.jackson.annotation"
>   Classes in package "com.fasterxml.jackson.core"
>   Classes in package "com.fasterxml.jackson.databind"
>   Classes in package "org.apache.avro"
>   Classes in package "org.hamcrest"
>   Classes in package "org.codehaus.jackson"
>   Classes in package "org.joda.time"
>   Classes in package "org.junit"
>   
>  but: The following white-listed scopes did not have matching classes on 
> the API surface:
>   No Classes in package "com.fasterxml.jackson.annotation"
>   No Classes in package "com.fasterxml.jackson.core"
>   No Classes in package "com.fasterxml.jackson.databind"
>   No Classes in package "com.google.api.client"
>   No Classes in package "com.google.api.services.bigquery"
>   No Classes in package "com.google.api.services.cloudresourcemanager"
>   No Classes in package "com.google.api.services.pubsub"
>   No Classes in package "com.google.api.services.storage"
>   No Classes in package "com.google.auth"
>   No Classes in package "com.google.protobuf"
>   No Classes in package "org.apache.avro"
>   No Classes in package "org.apache.beam"
>   No Classes in package "org.codehaus.jackson"
>   No Classes in package "org.hamcrest"
>   No Classes in package "org.joda.time"
>   No Classes in package "org.junit"
> {code}
> Job link:
> https://builds.apache.org/job/beam_PostCommit_Java_Version_Test/14/
> Multi-JDK version test is based on this PR:
> https://github.com/apache/beam/pull/2204/files
> Our beam_PostCommit_Java_MavenInstall is using JDK 1.8 (latest), which is in 
> good health. And the Maven command in the version test is the same as in 
> beam_PostCommit_Java_MavenInstall.
> Any ideas?



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


Jenkins build is back to stable : beam_PostCommit_Java_RunnableOnService_Flink #1938

2017-03-14 Thread Apache Jenkins Server
See 




[jira] [Commented] (BEAM-1718) Returning Duration.millis(Long.MAX_VALUE) in DoFn.getAllowedTimestampSkew() causes Overflow/Underflow

2017-03-14 Thread Kenneth Knowles (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1718?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15925440#comment-15925440
 ] 

Kenneth Knowles commented on BEAM-1718:
---

Is this a basic overflow error where we should do the checking, or is it 
https://bugs.openjdk.java.net/browse/JDK-8074032?
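
For concreteness, a small standalone Joda-Time sketch of the overflow path being 
asked about (the timestamp value is made up, and this is not the runner's actual 
code path):

{code}
import org.joda.time.Duration;
import org.joda.time.Instant;

public class SkewOverflowSketch {
  public static void main(String[] args) {
    Duration skew = Duration.millis(Long.MAX_VALUE);
    // Any timestamp before the epoch makes the subtraction fall below
    // Long.MIN_VALUE millis; Joda-Time's safe arithmetic then throws
    // ArithmeticException ("The calculation caused an overflow").
    Instant elementTimestamp = new Instant(-1000L);
    Instant earliestAllowed = elementTimestamp.minus(skew); // throws
    System.out.println(earliestAllowed);
  }
}
{code}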

> Returning Duration.millis(Long.MAX_VALUE) in DoFn.getAllowedTimestampSkew() 
> causes Overflow/Underflow
> -
>
> Key: BEAM-1718
> URL: https://issues.apache.org/jira/browse/BEAM-1718
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Affects Versions: 0.5.0
>Reporter: Tobias Feldhaus
>Assignee: Thomas Groh
>
> Overriding getAllowedTimestampSkew() in DoFn and returning 
> Duration.millis(Long.MAX_VALUE) (as suggested in the JavaDoc for allowing 
> infinite skew) causes an Overflow/Underflow



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


Build failed in Jenkins: beam_PostCommit_Python_Verify #1509

2017-03-14 Thread Apache Jenkins Server
See 


--
[...truncated 453.30 KB...]
"label": "Transform Function", 
"namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
"type": "STRING", 
"value": ""
  }, 
  {
"key": "fn", 
"label": "Transform Function", 
"namespace": "apache_beam.transforms.core.ParDo", 
"shortValue": "CallableWrapperDoFn", 
"type": "STRING", 
"value": "apache_beam.transforms.core.CallableWrapperDoFn"
  }
], 
"non_parallel_inputs": {}, 
"output_info": [
  {
"encoding": {
  "@type": "kind:windowed_value", 
  "component_encodings": [
{
  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
  "component_encodings": [
{
  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
  "component_encodings": []
}, 
{
  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
  "component_encodings": []
}
  ], 
  "is_pair_like": true
}, 
{
  "@type": "kind:global_window"
}
  ], 
  "is_wrapper": true
}, 
"output_name": "out", 
"user_name": "write/Write/WriteImpl/Extract.out"
  }
], 
"parallel_input": {
  "@type": "OutputReference", 
  "output_name": "out", 
  "step_name": "s13"
}, 
"serialized_fn": "", 
"user_name": "write/Write/WriteImpl/Extract"
  }
}, 
{
  "kind": "CollectionToSingleton", 
  "name": "s15", 
  "properties": {
"display_data": [], 
"output_info": [
  {
"encoding": {
  "@type": "kind:windowed_value", 
  "component_encodings": [
{
  "@type": "kind:windowed_value", 
  "component_encodings": [
{
  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
  "component_encodings": [
{
  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
  "component_encodings": []
}, 
{
  "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
 
  "component_encodings": []
}
  ], 
  "is_pair_like": true
}, 
{
  "@type": "kind:global_window"
}
  ], 
  "is_wrapper": true
}
  ]
}, 
"output_name": "out", 
"user_name": 
"write/Write/WriteImpl/ViewAsIterable(write|Write|WriteImpl|Extract.None)/CreatePCollectionView.out"
  }
], 
"parallel_input": {
  "@type": "OutputReference", 
  "output_name": "out", 
  "step_name": "s14"
}, 
"user_name": 
"write/Write/WriteImpl/ViewAsIterable(write|Write|WriteImpl|Extract.None)/CreatePCollectionView"
  }
}, 
{
  "kind": "ParallelDo", 
  "name": "s16", 
  "properties": {
"display_data": [
  {
"key": "fn", 
"label": "Transform Function", 
"namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
"type": "STRING", 
"value": "_finalize_write"
  }, 
  {
"key": "fn", 
"label": "Transform Function", 
"namespace": "apache_beam.transforms.core.ParDo", 
"shortValue": "CallableWrapperDoFn", 
"type": "STRING", 
"value": "apache_beam.transforms.core.CallableWrapperDoFn"
  }
], 
"non_parallel_inputs": {
  "s15": {
"@type": "OutputReference", 
  

[jira] [Commented] (BEAM-1691) Dynamic properties supported in PipelineOptions

2017-03-14 Thread Davor Bonaci (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1691?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15925408#comment-15925408
 ] 

Davor Bonaci commented on BEAM-1691:


[~mingmxu], can you say more please?

Generally speaking, runners may require different pipeline options. Options are 
provided by individual runners, and a runner has to be on the classpath to be 
able to set its options. If a user wants multiple runners on the classpath, this 
will just work -- options for multiple runners can be set.

However -- users need to tweak at least one option: --runner. Tweaking any 
runner-specific option along the way isn't that different.
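
Concretely, with both runners on the classpath the same two lines work unchanged 
and only the arguments differ -- a sketch, using the generic PipelineOptions 
rather than any runner-specific interface:

{code}
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class RunnerFromArgs {
  public static void main(String[] args) {
    // Invoked with e.g. --runner=FlinkRunner or --runner=SparkRunner plus any
    // runner-specific options; nothing in the code itself names a runner.
    PipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().create();
    Pipeline pipeline = Pipeline.create(options);
    pipeline.run();
  }
}
{code}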

> Dynamic properties supported in PipelineOptions
> ---
>
> Key: BEAM-1691
> URL: https://issues.apache.org/jira/browse/BEAM-1691
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core
>Reporter: Xu Mingmin
>Assignee: Davor Bonaci
>
> Usually the two lines to create a new Beam pipeline are:
> {code}
> Options options = 
> PipelineOptionsFactory.fromArgs(args).withValidation().as(Options.class);
> Pipeline pipeline = Pipeline.create(options);
> {code} 
> As each runner has its own PipelineOptions, one piece of code can hardly run 
> on different runners without a code change -- at least Options needs to be 
> updated.
> A dynamic property could be a choice, similar to:
> {code}
> -D property1=value1 -D property2=value2 ...
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (BEAM-1676) SdkCoreApiSurfaceTest Failed in JDK7&8 and OpenJDK7&8 on Jenkins

2017-03-14 Thread Davor Bonaci (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1676?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15925403#comment-15925403
 ] 

Davor Bonaci commented on BEAM-1676:


[~markflyhigh], I'm sorry, but I don't follow. As Kenn said above, we should 
understand the environment in which this happens.

At some point, a Maven command is invoked. We need to figure out what that 
command is. Then, there's the environment -- in what directory is it run, what 
are the contents of that directory at the time of invocation, and what are the 
environment variables. This can be captured easily, e.g., by running pwd && ls 
-alR && export before running Maven. At that point, we have a pretty hermetic 
view of this problem, which can be reproduced and debugged accordingly. Even 
further, we can log into the machine executing this and inspect it *locally*.

> SdkCoreApiSurfaceTest Failed in JDK7&8 and OpenJDK7&8 on Jenkins
> 
>
> Key: BEAM-1676
> URL: https://issues.apache.org/jira/browse/BEAM-1676
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Mark Liu
>Assignee: Mark Liu
>
> After running beam_PostCommit_Java_MavenInstall in different JDK versions 
> (JDK7&8, OpenJDK7&8) on Jenkins, all failed due to following error:
> {code}
> Failed tests: 
>   SdkCoreApiSurfaceTest.testSdkApiSurface:59 
> Expected: API surface to include only:
>   Classes in package "org.apache.beam"
>   Classes in package "com.google.api.client"
>   Classes in package "com.google.api.services.bigquery"
>   Classes in package "com.google.api.services.cloudresourcemanager"
>   Classes in package "com.google.api.services.pubsub"
>   Classes in package "com.google.api.services.storage"
>   Classes in package "com.google.auth"
>   Classes in package "com.google.protobuf"
>   Classes in package "com.fasterxml.jackson.annotation"
>   Classes in package "com.fasterxml.jackson.core"
>   Classes in package "com.fasterxml.jackson.databind"
>   Classes in package "org.apache.avro"
>   Classes in package "org.hamcrest"
>   Classes in package "org.codehaus.jackson"
>   Classes in package "org.joda.time"
>   Classes in package "org.junit"
>   
>  but: The following white-listed scopes did not have matching classes on 
> the API surface:
>   No Classes in package "com.fasterxml.jackson.annotation"
>   No Classes in package "com.fasterxml.jackson.core"
>   No Classes in package "com.fasterxml.jackson.databind"
>   No Classes in package "com.google.api.client"
>   No Classes in package "com.google.api.services.bigquery"
>   No Classes in package "com.google.api.services.cloudresourcemanager"
>   No Classes in package "com.google.api.services.pubsub"
>   No Classes in package "com.google.api.services.storage"
>   No Classes in package "com.google.auth"
>   No Classes in package "com.google.protobuf"
>   No Classes in package "org.apache.avro"
>   No Classes in package "org.apache.beam"
>   No Classes in package "org.codehaus.jackson"
>   No Classes in package "org.hamcrest"
>   No Classes in package "org.joda.time"
>   No Classes in package "org.junit"
> {code}
> Job link:
> https://builds.apache.org/job/beam_PostCommit_Java_Version_Test/14/
> Multi-JDK version test is based on this PR:
> https://github.com/apache/beam/pull/2204/files
> Our beam_PostCommit_Java_MavenInstall is using JDK 1.8 (latest), which is in 
> good health. And the Maven command in the version test is the same as in 
> beam_PostCommit_Java_MavenInstall.
> Any ideas?



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


Jenkins build is still unstable: beam_PostCommit_Java_RunnableOnService_Flink #1937

2017-03-14 Thread Apache Jenkins Server
See 




Jenkins build became unstable: beam_PostCommit_Java_RunnableOnService_Flink #1936

2017-03-14 Thread Apache Jenkins Server
See 




[jira] [Created] (BEAM-1725) SparkRunner should deduplicate when an UnboundedSource requires Deduping

2017-03-14 Thread Thomas Groh (JIRA)
Thomas Groh created BEAM-1725:
-

 Summary: SparkRunner should deduplicate when an UnboundedSource 
requires Deduping
 Key: BEAM-1725
 URL: https://issues.apache.org/jira/browse/BEAM-1725
 Project: Beam
  Issue Type: Bug
  Components: runner-spark
Reporter: Thomas Groh


The implementation of an Unbounded Read does not inspect the requiresDeduping 
property of the source, and as such does not appropriately deduplicate sources 
that require it.

https://github.com/apache/beam/blob/master/runners/spark/src/main/java/org/apache/beam/runners/spark/io/SparkUnboundedSource.java
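
For this and the sibling Apex/Flink issues filed alongside it, a rough sketch of 
what honoring the flag means at the UnboundedSource API level (illustration only, 
not any runner's actual translation; a real runner would bound how long record 
ids are retained):

{code}
import java.io.IOException;
import java.util.Base64;
import java.util.HashSet;
import java.util.Set;
import org.apache.beam.sdk.io.UnboundedSource;

public class DedupeSketch {
  // Drain a reader and drop duplicate deliveries when the source declares
  // requiresDeduping(); record ids are only meaningful when it does.
  public static <T, C extends UnboundedSource.CheckpointMark> void drain(
      UnboundedSource<T, C> source, UnboundedSource.UnboundedReader<T> reader)
      throws IOException {
    Set<String> seenRecordIds = new HashSet<>();
    for (boolean more = reader.start(); more; more = reader.advance()) {
      if (source.requiresDeduping()) {
        String id = Base64.getEncoder().encodeToString(reader.getCurrentRecordId());
        if (!seenRecordIds.add(id)) {
          continue; // duplicate delivery; skip it
        }
      }
      System.out.println(reader.getCurrent()); // hand off to the rest of the pipeline
    }
  }
}
{code}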



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Created] (BEAM-1724) ApexRunner should deduplicate when an UnboundedSource requires Deduping

2017-03-14 Thread Thomas Groh (JIRA)
Thomas Groh created BEAM-1724:
-

 Summary: ApexRunner should deduplicate when an UnboundedSource 
requires Deduping
 Key: BEAM-1724
 URL: https://issues.apache.org/jira/browse/BEAM-1724
 Project: Beam
  Issue Type: Bug
  Components: runner-apex
Reporter: Thomas Groh


The Apex implementation of UnboundedRead does not inspect the requiresDeduping 
property of an Unbounded Source. It should do so.

https://github.com/apache/beam/blob/master/runners/apex/src/main/java/org/apache/beam/runners/apex/translation/operators/ApexReadUnboundedInputOperator.java



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Created] (BEAM-1723) FlinkRunner should deduplicate when an UnboundedSource requires Deduping

2017-03-14 Thread Thomas Groh (JIRA)
Thomas Groh created BEAM-1723:
-

 Summary: FlinkRunner should deduplicate when an UnboundedSource 
requires Deduping
 Key: BEAM-1723
 URL: https://issues.apache.org/jira/browse/BEAM-1723
 Project: Beam
  Issue Type: Bug
  Components: runner-flink
Reporter: Thomas Groh


UnboundedSource implementations can require deduping, and the FlinkRunner 
currently logs a warning that this is not supported.

https://github.com/apache/beam/blob/master/runners/flink/runner/src/main/java/org/apache/beam/runners/flink/translation/wrappers/streaming/io/UnboundedSourceWrapper.java#L139



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


Jenkins build is still unstable: beam_PostCommit_Java_MavenInstall #2908

2017-03-14 Thread Apache Jenkins Server
See 




Jenkins build is back to stable : beam_PostCommit_Java_RunnableOnService_Flink #1935

2017-03-14 Thread Apache Jenkins Server
See 




Jenkins build is still unstable: beam_PostCommit_Java_RunnableOnService_Flink #1934

2017-03-14 Thread Apache Jenkins Server
See 




Jenkins build is still unstable: beam_PostCommit_Java_RunnableOnService_Flink #1933

2017-03-14 Thread Apache Jenkins Server
See 




[jira] [Updated] (BEAM-1722) Get PubsubIO and the remaining parts of the GCP IOs out of the core SDK

2017-03-14 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/BEAM-1722?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ismaël Mejía updated BEAM-1722:
---
Affects Version/s: (was: First stable release)

> Get PubsubIO and the remaining parts of the GCP IOs out of the core SDK
> ---
>
> Key: BEAM-1722
> URL: https://issues.apache.org/jira/browse/BEAM-1722
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core, sdk-java-gcp
>Reporter: Ismaël Mejía
> Fix For: First stable release
>
>
> As discussed with Davor, we still have to wait for some ongoing changes for 
> this, but I am going to create the JIRA to track this as an issue for the 
> first stable release.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (BEAM-1722) Get PubsubIO and the remaining parts of the GCP IOs out of the core SDK

2017-03-14 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/BEAM-1722?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ismaël Mejía updated BEAM-1722:
---
Fix Version/s: First stable release

> Get PubsubIO and the remaining parts of the GCP IOs out of the core SDK
> ---
>
> Key: BEAM-1722
> URL: https://issues.apache.org/jira/browse/BEAM-1722
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core, sdk-java-gcp
>Reporter: Ismaël Mejía
> Fix For: First stable release
>
>
> As discussed with Davor, we still have to wait for some ongoing changes for 
> this, but I am going to create the JIRA to track this as an issue for the 
> first stable release.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (BEAM-911) Most of IOs API should be flagged by @Experimental

2017-03-14 Thread JIRA

 [ 
https://issues.apache.org/jira/browse/BEAM-911?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Ismaël Mejía updated BEAM-911:
--
Fix Version/s: First stable release

> Most of IOs API should be flagged by @Experimental
> --
>
> Key: BEAM-911
> URL: https://issues.apache.org/jira/browse/BEAM-911
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-extensions
>Reporter: Jean-Baptiste Onofré
>Assignee: Jean-Baptiste Onofré
> Fix For: First stable release
>
>
> As the IO APIs can still change (for instance, we changed {{JdbcIO}}, and 
> {{JmsIO}} will also change, ...), even though the IOs themselves can already 
> be used, we should indicate to users that the API can change.
> I propose to flag those IOs with {{@Experimental}}.
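
A minimal sketch of what the proposal looks like, using {{JdbcIO}} as the example 
from the description (illustration only; the actual change would annotate each 
IO's entry class with the SDK's existing {{@Experimental}} annotation):

{code}
import org.apache.beam.sdk.annotations.Experimental;
import org.apache.beam.sdk.annotations.Experimental.Kind;

// Marking the IO's public entry point signals that its API may still change.
@Experimental(Kind.SOURCE_SINK)
public class JdbcIO {
  private JdbcIO() {}
}
{code}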



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[1/3] beam-site git commit: Add htaccess back

2017-03-14 Thread davor
Repository: beam-site
Updated Branches:
  refs/heads/asf-site 4490a1f9b -> 4b64a0306


Add htaccess back


Project: http://git-wip-us.apache.org/repos/asf/beam-site/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam-site/commit/dd81235b
Tree: http://git-wip-us.apache.org/repos/asf/beam-site/tree/dd81235b
Diff: http://git-wip-us.apache.org/repos/asf/beam-site/diff/dd81235b

Branch: refs/heads/asf-site
Commit: dd81235bd29faa14168e73ccf5f230de4b916540
Parents: 4490a1f
Author: Sourabh Bajaj 
Authored: Tue Mar 14 16:17:26 2017 -0700
Committer: Sourabh Bajaj 
Committed: Tue Mar 14 16:17:26 2017 -0700

--
 _config.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/beam-site/blob/dd81235b/_config.yml
--
diff --git a/_config.yml b/_config.yml
index 2305c77..7d36162 100644
--- a/_config.yml
+++ b/_config.yml
@@ -36,7 +36,7 @@ collections:
 - beam_team
 
 # Things to include that are subdirectories created by sphinx
-include: ['_static', '_modules', '_sources']
+include: ['_static', '_modules', '_sources', '.htaccess']
 
 # Things to ignore in the build
 exclude: ['README.md', 'Gemfile.lock', 'Gemfile', 'Rakefile', 'vendor/']



svn commit: r18748 - /dev/beam/0.6.0/

2017-03-14 Thread altay
Author: altay
Date: Tue Mar 14 22:59:09 2017
New Revision: 18748

Log:
Remove RC2



Removed:
dev/beam/0.6.0/



[GitHub] beam pull request #2248: [BEAM-1709] Implement Single-Output ParDo as a comp...

2017-03-14 Thread tgroh
GitHub user tgroh opened a pull request:

https://github.com/apache/beam/pull/2248

[BEAM-1709] Implement Single-Output ParDo as a composite

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-<Jira issue #>] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `<Jira issue #>` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.txt).

---
This rolls-forwards after commit 8766b03.

Reset back to spark translations, and re-merge

Implement a Primitive ParDoSingle override in the Dataflow Runner.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/tgroh/beam no_par_do_single

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2248.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2248


commit 9e8e5c9b3b7d449485f2f0f0c69e82e18337b5a5
Author: Thomas Groh 
Date:   2017-03-14T19:39:57Z

Migrate ForwardingPTransform to core-construction

commit 0145825e38f11f6730bf5d0c02aca1b9b9370c28
Author: Thomas Groh 
Date:   2017-03-14T18:02:15Z

Implement Single-Output ParDo as a composite

This rolls-forwards after commit 8766b03eb31b4f16de8fbf5a6902378a2c1151e0.

Reset back to spark translations, and re-merge

Implement a Primitive ParDoSingle override in the Dataflow Runner.




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Closed] (BEAM-1427) BigQueryIO should comply with PTransform style guide

2017-03-14 Thread Eugene Kirpichov (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1427?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eugene Kirpichov closed BEAM-1427.
--
   Resolution: Fixed
Fix Version/s: (was: First stable release)
   Not applicable

> BigQueryIO should comply with PTransform style guide
> 
>
> Key: BEAM-1427
> URL: https://issues.apache.org/jira/browse/BEAM-1427
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-gcp
>Reporter: Eugene Kirpichov
>Assignee: Eugene Kirpichov
>  Labels: backward-incompatible
> Fix For: Not applicable
>
>
> Suggested changes:
> - Remove Read/Write.Bound classes - Read and Write themselves should be the 
> transform classes
> - Remove static builder-like .withBlah() methods
> - (optional) use AutoValue



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[07/10] beam git commit: Simplify configuration of StreamWithDedup

2017-03-14 Thread tgroh
Simplify configuration of StreamWithDedup


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/1adcbaea
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/1adcbaea
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/1adcbaea

Branch: refs/heads/master
Commit: 1adcbaea799e83016ec91f7b7155c3a25804ce6c
Parents: 5c71589
Author: Eugene Kirpichov 
Authored: Thu Mar 2 18:19:51 2017 -0800
Committer: Thomas Groh 
Committed: Tue Mar 14 15:54:32 2017 -0700

--
 .../beam/sdk/io/gcp/bigquery/BigQueryIO.java| 73 +++-
 1 file changed, 26 insertions(+), 47 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/1adcbaea/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
--
diff --git 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
index d2f6ba6..e039c8c 100644
--- 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
+++ 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
@@ -40,7 +40,6 @@ import com.google.auto.value.AutoValue;
 import com.google.cloud.hadoop.util.ApiErrorExtractor;
 import com.google.common.annotations.VisibleForTesting;
 import com.google.common.base.Function;
-import com.google.common.base.MoreObjects;
 import com.google.common.base.Strings;
 import com.google.common.collect.ImmutableList;
 import com.google.common.collect.Lists;
@@ -462,7 +461,7 @@ public class BigQueryIO {
 abstract boolean getValidate();
 @Nullable abstract Boolean getFlattenResults();
 @Nullable abstract Boolean getUseLegacySql();
-@Nullable abstract BigQueryServices getBigQueryServices();
+abstract BigQueryServices getBigQueryServices();
 
 abstract Builder toBuilder();
 
@@ -645,8 +644,6 @@ public class BigQueryIO {
   jobUuid, new BeamJobUuidToBigQueryJobUuid());
 
   BoundedSource source;
-  final BigQueryServices bqServices =
-  MoreObjects.firstNonNull(getBigQueryServices(), new 
BigQueryServicesImpl());
 
   final String extractDestinationDir;
   String tempLocation = bqOptions.getTempLocation();
@@ -670,11 +667,11 @@ public class BigQueryIO {
 getFlattenResults(),
 getUseLegacySql(),
 extractDestinationDir,
-bqServices);
+getBigQueryServices());
   } else {
 ValueProvider inputTable = 
getTableWithDefaultProject(bqOptions);
 source = BigQueryTableSource.create(
-jobIdToken, inputTable, extractDestinationDir, bqServices,
+jobIdToken, inputTable, extractDestinationDir, 
getBigQueryServices(),
 StaticValueProvider.of(executingProject));
   }
   PassThroughThenCleanup.CleanupOperation cleanupOperation =
@@ -687,7 +684,7 @@ public class BigQueryIO {
   .setProjectId(executingProject)
   .setJobId(getExtractJobId(jobIdToken));
 
-  Job extractJob = bqServices.getJobService(bqOptions)
+  Job extractJob = getBigQueryServices().getJobService(bqOptions)
   .getJob(jobRef);
 
   Collection extractFiles = null;
@@ -1390,7 +1387,7 @@ public class BigQueryIO {
 @Nullable abstract String getTableDescription();
 /** An option to indicate if table validation is desired. Default is true. 
*/
 abstract boolean getValidate();
-@Nullable abstract BigQueryServices getBigQueryServices();
+abstract BigQueryServices getBigQueryServices();
 
 abstract Builder toBuilder();
 
@@ -1650,12 +1647,10 @@ public class BigQueryIO {
   "CreateDisposition is CREATE_IF_NEEDED, however no schema was 
provided.");
 
   // The user specified a table.
-  BigQueryServices bqServices =
-  MoreObjects.firstNonNull(getBigQueryServices(), new 
BigQueryServicesImpl());
   if (getJsonTableRef() != null && getValidate()) {
 TableReference table = getTableWithDefaultProject(options).get();
 
-DatasetService datasetService = bqServices.getDatasetService(options);
+DatasetService datasetService = 
getBigQueryServices().getDatasetService(options);
 // Check for destination table presence and emptiness for early 
failure notification.
 // Note that a presence check can fail when the table or dataset is 
created by an earlier
 // stage of the pipeline. For these cases the #withoutValidation 
method can be used to
@@ -1693,7 +1688,7 @@ public class 

[06/10] beam git commit: Use AutoValue for BigQueryIO.Write

2017-03-14 Thread tgroh
Use AutoValue for BigQueryIO.Write


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/5c715896
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/5c715896
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/5c715896

Branch: refs/heads/master
Commit: 5c715896e683f7bec9f150ea17c78d1dae00ee4c
Parents: 32cba32
Author: Eugene Kirpichov 
Authored: Thu Mar 2 18:14:39 2017 -0800
Committer: Thomas Groh 
Committed: Tue Mar 14 15:54:30 2017 -0700

--
 .../beam/sdk/io/gcp/bigquery/BigQueryIO.java| 314 ++-
 .../sdk/io/gcp/bigquery/BigQueryIOTest.java |   8 +-
 2 files changed, 108 insertions(+), 214 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/5c715896/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
--
diff --git 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
index e5db60e..d2f6ba6 100644
--- 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
+++ 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
@@ -471,9 +471,9 @@ public class BigQueryIO {
   abstract Builder setJsonTableRef(ValueProvider jsonTableRef);
   abstract Builder setQuery(ValueProvider query);
   abstract Builder setValidate(boolean validate);
-  abstract Builder setFlattenResults(@Nullable Boolean flattenResults);
-  abstract Builder setUseLegacySql(@Nullable Boolean useLegacySql);
-  abstract Builder setBigQueryServices(@Nullable BigQueryServices 
bigQueryServices);
+  abstract Builder setFlattenResults(Boolean flattenResults);
+  abstract Builder setUseLegacySql(Boolean useLegacySql);
+  abstract Builder setBigQueryServices(BigQueryServices bigQueryServices);
 
   abstract Read build();
 }
@@ -1364,10 +1364,13 @@ public class BigQueryIO {
*   }
* }}
*/
-  public static class Write extends PTransform {
+  @AutoValue
+  public abstract static class Write extends PTransform {
+@VisibleForTesting
 // Maximum number of files in a single partition.
 static final int MAX_NUM_FILES = 1;
 
+@VisibleForTesting
 // Maximum number of bytes in a single partition -- 11 TiB just under BQ's 
12 TiB limit.
 static final long MAX_SIZE_BYTES = 11 * (1L << 40);
 
@@ -1378,28 +1381,41 @@ public class BigQueryIO {
 // It sets to {@code Integer.MAX_VALUE} to block until the BigQuery job 
finishes.
 private static final int LOAD_JOB_POLL_MAX_RETRIES = Integer.MAX_VALUE;
 
-@Nullable private final ValueProvider jsonTableRef;
-
-@Nullable private final SerializableFunction tableRefFunction;
-
-// Table schema. The schema is required only if the table does not exist.
-@Nullable private final ValueProvider jsonSchema;
-
-// Options for creating the table. Valid values are CREATE_IF_NEEDED and
-// CREATE_NEVER.
-final CreateDisposition createDisposition;
+@Nullable abstract ValueProvider getJsonTableRef();
+@Nullable abstract SerializableFunction 
getTableRefFunction();
+/** Table schema. The schema is required only if the table does not exist. 
*/
+@Nullable abstract ValueProvider getJsonSchema();
+abstract CreateDisposition getCreateDisposition();
+abstract WriteDisposition getWriteDisposition();
+@Nullable abstract String getTableDescription();
+/** An option to indicate if table validation is desired. Default is true. 
*/
+abstract boolean getValidate();
+@Nullable abstract BigQueryServices getBigQueryServices();
 
-// Options for writing to the table. Valid values are WRITE_TRUNCATE,
-// WRITE_APPEND and WRITE_EMPTY.
-final WriteDisposition writeDisposition;
+abstract Builder toBuilder();
 
-@Nullable
-final String tableDescription;
+@AutoValue.Builder
+abstract static class Builder {
+  abstract Builder setJsonTableRef(ValueProvider jsonTableRef);
+  abstract Builder setTableRefFunction(
+  SerializableFunction 
tableRefFunction);
+  abstract Builder setJsonSchema(ValueProvider jsonSchema);
+  abstract Builder setCreateDisposition(CreateDisposition 
createDisposition);
+  abstract Builder setWriteDisposition(WriteDisposition writeDisposition);
+  abstract Builder setTableDescription(String tableDescription);
+  abstract Builder 

[05/10] beam git commit: Use AutoValue for BigQueryIO.Read

2017-03-14 Thread tgroh
Use AutoValue for BigQueryIO.Read


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/32cba321
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/32cba321
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/32cba321

Branch: refs/heads/master
Commit: 32cba321c04c5e3fb18856c84ea10b3513264dd5
Parents: d6b3e11
Author: Eugene Kirpichov 
Authored: Thu Mar 2 17:51:04 2017 -0800
Committer: Thomas Groh 
Committed: Tue Mar 14 15:54:27 2017 -0700

--
 .../beam/sdk/io/gcp/bigquery/BigQueryIO.java| 215 +++
 .../sdk/io/gcp/bigquery/BigQueryIOTest.java |  21 +-
 2 files changed, 84 insertions(+), 152 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/32cba321/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
--
diff --git 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
index 90d7f67..e5db60e 100644
--- 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
+++ 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
@@ -36,9 +36,11 @@ import com.google.api.services.bigquery.model.Table;
 import com.google.api.services.bigquery.model.TableReference;
 import com.google.api.services.bigquery.model.TableRow;
 import com.google.api.services.bigquery.model.TableSchema;
+import com.google.auto.value.AutoValue;
 import com.google.cloud.hadoop.util.ApiErrorExtractor;
 import com.google.common.annotations.VisibleForTesting;
 import com.google.common.base.Function;
+import com.google.common.base.MoreObjects;
 import com.google.common.base.Strings;
 import com.google.common.collect.ImmutableList;
 import com.google.common.collect.Lists;
@@ -453,97 +455,86 @@ public class BigQueryIO {
*   }
* }}
*/
-  public static class Read extends PTransform {
-@Nullable final ValueProvider jsonTableRef;
-@Nullable final ValueProvider query;
+  @AutoValue
+  public abstract static class Read extends PTransform {
+@Nullable abstract ValueProvider getJsonTableRef();
+@Nullable abstract ValueProvider getQuery();
+abstract boolean getValidate();
+@Nullable abstract Boolean getFlattenResults();
+@Nullable abstract Boolean getUseLegacySql();
+@Nullable abstract BigQueryServices getBigQueryServices();
+
+abstract Builder toBuilder();
+
+@AutoValue.Builder
+abstract static class Builder {
+  abstract Builder setJsonTableRef(ValueProvider jsonTableRef);
+  abstract Builder setQuery(ValueProvider query);
+  abstract Builder setValidate(boolean validate);
+  abstract Builder setFlattenResults(@Nullable Boolean flattenResults);
+  abstract Builder setUseLegacySql(@Nullable Boolean useLegacySql);
+  abstract Builder setBigQueryServices(@Nullable BigQueryServices 
bigQueryServices);
+
+  abstract Read build();
+}
+
+private static Builder builder() {
+  return new AutoValue_BigQueryIO_Read.Builder()
+  .setValidate(true)
+  .setBigQueryServices(new BigQueryServicesImpl());
+}
 
 /**
  * Reads a BigQuery table specified as {@code 
"[project_id]:[dataset_id].[table_id]"} or
  * {@code "[dataset_id].[table_id]"} for tables within the current project.
  */
 public static Read from(String tableSpec) {
-  return new Read().from(StaticValueProvider.of(tableSpec));
+  return from(StaticValueProvider.of(tableSpec));
 }
 
 /**
  * Same as {@code from(String)}, but with a {@link ValueProvider}.
  */
 public static Read from(ValueProvider tableSpec) {
-  return new Read().from(tableSpec);
+  return builder()
+  .setJsonTableRef(
+  NestedValueProvider.of(
+  NestedValueProvider.of(tableSpec, new TableSpecToTableRef()),
+  new TableRefToJson())).build();
 }
 
 /**
  * Reads results received after executing the given query.
  */
 public static Read fromQuery(String query) {
-  return new Read().fromQuery(StaticValueProvider.of(query));
+  return fromQuery(StaticValueProvider.of(query));
 }
 
 /**
  * Same as {@code from(String)}, but with a {@link ValueProvider}.
  */
 public static Read fromQuery(ValueProvider query) {
-  return new Read().fromQuery(query);
+  return 
builder().setQuery(query).setFlattenResults(true).setUseLegacySql(true).build();
 }
 
 /**
  * Reads a BigQuery table 

[02/10] beam git commit: Condense BigQueryIO.Read.Bound into BigQueryIO.Read

2017-03-14 Thread tgroh
Condense BigQueryIO.Read.Bound into BigQueryIO.Read


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/825338aa
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/825338aa
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/825338aa

Branch: refs/heads/master
Commit: 825338aaa5d7e5ead1afa13f63c65fb316e1aa6a
Parents: 30f3634
Author: Eugene Kirpichov 
Authored: Thu Mar 2 17:15:47 2017 -0800
Committer: Thomas Groh 
Committed: Tue Mar 14 15:54:22 2017 -0700

--
 .../beam/sdk/io/gcp/bigquery/BigQueryIO.java| 690 +--
 .../sdk/io/gcp/bigquery/BigQueryIOTest.java | 117 ++--
 2 files changed, 359 insertions(+), 448 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/825338aa/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
--
diff --git 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
index 2902c2b..f6c8575 100644
--- 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
+++ 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
@@ -39,7 +39,6 @@ import com.google.api.services.bigquery.model.TableSchema;
 import com.google.cloud.hadoop.util.ApiErrorExtractor;
 import com.google.common.annotations.VisibleForTesting;
 import com.google.common.base.Function;
-import com.google.common.base.MoreObjects;
 import com.google.common.base.Strings;
 import com.google.common.collect.ImmutableList;
 import com.google.common.collect.Lists;
@@ -454,448 +453,379 @@ public class BigQueryIO {
*   }
* }}
*/
-  public static class Read {
+  public static class Read extends PTransform {
+@Nullable final ValueProvider jsonTableRef;
+@Nullable final ValueProvider query;
 
 /**
  * Reads a BigQuery table specified as {@code 
"[project_id]:[dataset_id].[table_id]"} or
  * {@code "[dataset_id].[table_id]"} for tables within the current project.
  */
-public static Bound from(String tableSpec) {
-  return new Bound().from(StaticValueProvider.of(tableSpec));
+public static Read from(String tableSpec) {
+  return new Read().from(StaticValueProvider.of(tableSpec));
 }
 
 /**
  * Same as {@code from(String)}, but with a {@link ValueProvider}.
  */
-public static Bound from(ValueProvider tableSpec) {
-  return new Bound().from(tableSpec);
+public static Read from(ValueProvider tableSpec) {
+  return new Read().from(tableSpec);
 }
 
 /**
  * Reads results received after executing the given query.
  */
-public static Bound fromQuery(String query) {
-  return new Bound().fromQuery(StaticValueProvider.of(query));
+public static Read fromQuery(String query) {
+  return new Read().fromQuery(StaticValueProvider.of(query));
 }
 
 /**
  * Same as {@code from(String)}, but with a {@link ValueProvider}.
  */
-public static Bound fromQuery(ValueProvider query) {
-  return new Bound().fromQuery(query);
+public static Read fromQuery(ValueProvider query) {
+  return new Read().fromQuery(query);
 }
 
 /**
  * Reads a BigQuery table specified as a {@link TableReference} object.
  */
-public static Bound from(TableReference table) {
-  return new Bound().from(table);
+public static Read from(TableReference table) {
+  return new Read().from(table);
 }
 
 /**
- * Disables BigQuery table validation, which is enabled by default.
+ * Disable validation that the table exists or the query succeeds prior to 
pipeline
+ * submission. Basic validation (such as ensuring that a query or table is 
specified) still
+ * occurs.
  */
-public static Bound withoutValidation() {
-  return new Bound().withoutValidation();
+final boolean validate;
+@Nullable final Boolean flattenResults;
+@Nullable final Boolean useLegacySql;
+@Nullable BigQueryServices bigQueryServices;
+
+@VisibleForTesting @Nullable String stepUuid;
+@VisibleForTesting @Nullable ValueProvider jobUuid;
+
+private static final String QUERY_VALIDATION_FAILURE_ERROR =
+"Validation of query \"%1$s\" failed. If the query depends on an 
earlier stage of the"
++ " pipeline, This validation can be disabled using 
#withoutValidation.";
+
+private Read() {
+  this(
+  null /* name */,
+  null /* query */,
+  null /* jsonTableRef */,
+  true /* validate 

[03/10] beam git commit: Condense BigQueryIO.Write.Bound into BigQueryIO.Write

2017-03-14 Thread tgroh
Condense BigQueryIO.Write.Bound into BigQueryIO.Write


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/7d1f4400
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/7d1f4400
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/7d1f4400

Branch: refs/heads/master
Commit: 7d1f4400ab844c7b4e636482891be55174390431
Parents: 825338a
Author: Eugene Kirpichov 
Authored: Thu Mar 2 17:26:09 2017 -0800
Committer: Thomas Groh 
Committed: Tue Mar 14 15:54:24 2017 -0700

--
 .../beam/sdk/io/gcp/bigquery/BigQueryIO.java| 1080 +-
 .../sdk/io/gcp/bigquery/BigQueryIOTest.java |  116 +-
 2 files changed, 552 insertions(+), 644 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/7d1f4400/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
--
diff --git 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
index f6c8575..90d7f67 100644
--- 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
+++ 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
@@ -1415,7 +1415,43 @@ public class BigQueryIO {
*   }
* }}
*/
-  public static class Write {
+  public static class Write extends PTransform {
+// Maximum number of files in a single partition.
+static final int MAX_NUM_FILES = 1;
+
+// Maximum number of bytes in a single partition -- 11 TiB just under BQ's 
12 TiB limit.
+static final long MAX_SIZE_BYTES = 11 * (1L << 40);
+
+// The maximum number of retry jobs.
+private static final int MAX_RETRY_JOBS = 3;
+
+// The maximum number of retries to poll the status of a job.
+// It sets to {@code Integer.MAX_VALUE} to block until the BigQuery job 
finishes.
+private static final int LOAD_JOB_POLL_MAX_RETRIES = Integer.MAX_VALUE;
+
+@Nullable private final ValueProvider jsonTableRef;
+
+@Nullable private final SerializableFunction tableRefFunction;
+
+// Table schema. The schema is required only if the table does not exist.
+@Nullable private final ValueProvider jsonSchema;
+
+// Options for creating the table. Valid values are CREATE_IF_NEEDED and
+// CREATE_NEVER.
+final CreateDisposition createDisposition;
+
+// Options for writing to the table. Valid values are WRITE_TRUNCATE,
+// WRITE_APPEND and WRITE_EMPTY.
+final WriteDisposition writeDisposition;
+
+@Nullable
+final String tableDescription;
+
+// An option to indicate if table validation is desired. Default is true.
+final boolean validate;
+
+@Nullable private BigQueryServices bigQueryServices;
+
 /**
  * An enumeration type for the BigQuery create disposition strings.
  *
@@ -1488,18 +1524,18 @@ public class BigQueryIO {
  *
  * Refer to {@link #parseTableSpec(String)} for the specification 
format.
  */
-public static Bound to(String tableSpec) {
-  return new Bound().to(tableSpec);
+public static Write to(String tableSpec) {
+  return new Write().withTableSpec(tableSpec);
 }
 
 /** Creates a write transformation for the given table. */
-public static Bound to(ValueProvider tableSpec) {
-  return new Bound().to(tableSpec);
+public static Write to(ValueProvider tableSpec) {
+  return new Write().withTableSpec(tableSpec);
 }
 
 /** Creates a write transformation for the given table. */
-public static Bound to(TableReference table) {
-  return new Bound().to(table);
+public static Write to(TableReference table) {
+  return new Write().withTableRef(table);
 }
 
 /**
@@ -1513,8 +1549,8 @@ public class BigQueryIO {
  * {@code tableSpecFunction} should be deterministic. When given the 
same window, it should
  * always return the same table specification.
  */
-public static Bound to(SerializableFunction 
tableSpecFunction) {
-  return new Bound().to(tableSpecFunction);
+public static Write to(SerializableFunction 
tableSpecFunction) {
+  return new Write().withTableSpec(tableSpecFunction);
 }
 
 /**
@@ -1524,634 +1560,547 @@ public class BigQueryIO {
  * {@code tableRefFunction} should be deterministic. When given the 
same window, it should
  * always return the same table reference.
  */
-public static Bound toTableReference(
+private static Write toTableReference(

[09/10] beam git commit: Replace BigQueryIO.Write.to() with BigQueryIO.write().to()

2017-03-14 Thread tgroh
Replace BigQueryIO.Write.to() with BigQueryIO.write().to()


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/101715a7
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/101715a7
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/101715a7

Branch: refs/heads/master
Commit: 101715a788509aa9bdfdefd54eaceca35feca485
Parents: 1a252a7
Author: Eugene Kirpichov 
Authored: Mon Mar 13 16:21:39 2017 -0700
Committer: Thomas Groh 
Committed: Tue Mar 14 15:54:39 2017 -0700

--
 .../beam/examples/complete/AutoComplete.java|  2 +-
 .../examples/complete/StreamingWordExtract.java |  2 +-
 .../examples/complete/TrafficMaxLaneFlow.java   |  2 +-
 .../beam/examples/complete/TrafficRoutes.java   |  2 +-
 .../examples/cookbook/BigQueryTornadoes.java|  2 +-
 .../cookbook/CombinePerKeyExamples.java |  2 +-
 .../beam/examples/cookbook/FilterExamples.java  |  2 +-
 .../examples/cookbook/MaxPerKeyExamples.java|  2 +-
 .../beam/examples/cookbook/TriggerExample.java  |  2 +-
 .../complete/game/utils/WriteToBigQuery.java|  2 +-
 .../game/utils/WriteWindowedToBigQuery.java |  2 +-
 .../beam/sdk/io/gcp/bigquery/BigQueryIO.java| 46 
 .../sdk/io/gcp/bigquery/BigQueryIOTest.java | 46 ++--
 13 files changed, 62 insertions(+), 52 deletions(-)
--
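For readers tracking the rename, here is a minimal before/after sketch of the write entry point, matching the example diffs below. It assumes a PCollection of TableRow named rows and a TableSchema named schema; the class and method names of the sketch itself are illustrative, not part of the patch.

import com.google.api.services.bigquery.model.TableRow;
import com.google.api.services.bigquery.model.TableSchema;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.values.PCollection;

public class WriteApiRenameSketch {
  // Writes 'rows' to a BigQuery table, creating the table if needed.
  static void writeRows(PCollection<TableRow> rows, TableSchema schema) {
    // Old spelling, removed by this commit:
    //   rows.apply(BigQueryIO.Write.to("project:dataset.table").withSchema(schema));
    // New spelling introduced by this commit: a write() factory plus fluent setters.
    rows.apply(BigQueryIO.write()
        .to("project:dataset.table")
        .withSchema(schema)
        .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED));
  }
}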


http://git-wip-us.apache.org/repos/asf/beam/blob/101715a7/examples/java/src/main/java/org/apache/beam/examples/complete/AutoComplete.java
--
diff --git 
a/examples/java/src/main/java/org/apache/beam/examples/complete/AutoComplete.java
 
b/examples/java/src/main/java/org/apache/beam/examples/complete/AutoComplete.java
index 861a292..fba3dc0 100644
--- 
a/examples/java/src/main/java/org/apache/beam/examples/complete/AutoComplete.java
+++ 
b/examples/java/src/main/java/org/apache/beam/examples/complete/AutoComplete.java
@@ -491,7 +491,7 @@ public class AutoComplete {
 
   toWrite
 .apply(ParDo.of(new FormatForBigquery()))
-.apply(BigQueryIO.Write
+.apply(BigQueryIO.write()
.to(tableRef)
.withSchema(FormatForBigquery.getSchema())

.withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)

http://git-wip-us.apache.org/repos/asf/beam/blob/101715a7/examples/java/src/main/java/org/apache/beam/examples/complete/StreamingWordExtract.java
--
diff --git 
a/examples/java/src/main/java/org/apache/beam/examples/complete/StreamingWordExtract.java
 
b/examples/java/src/main/java/org/apache/beam/examples/complete/StreamingWordExtract.java
index e8d8950..2e7d451 100644
--- 
a/examples/java/src/main/java/org/apache/beam/examples/complete/StreamingWordExtract.java
+++ 
b/examples/java/src/main/java/org/apache/beam/examples/complete/StreamingWordExtract.java
@@ -136,7 +136,7 @@ public class StreamingWordExtract {
 .apply(ParDo.of(new ExtractWords()))
 .apply(ParDo.of(new Uppercase()))
 .apply(ParDo.of(new StringToRowConverter()))
-.apply(BigQueryIO.Write.to(tableSpec)
+.apply(BigQueryIO.write().to(tableSpec)
 .withSchema(StringToRowConverter.getSchema()));
 
 PipelineResult result = pipeline.run();

http://git-wip-us.apache.org/repos/asf/beam/blob/101715a7/examples/java/src/main/java/org/apache/beam/examples/complete/TrafficMaxLaneFlow.java
--
diff --git 
a/examples/java/src/main/java/org/apache/beam/examples/complete/TrafficMaxLaneFlow.java
 
b/examples/java/src/main/java/org/apache/beam/examples/complete/TrafficMaxLaneFlow.java
index 412f7fb..c9508eb 100644
--- 
a/examples/java/src/main/java/org/apache/beam/examples/complete/TrafficMaxLaneFlow.java
+++ 
b/examples/java/src/main/java/org/apache/beam/examples/complete/TrafficMaxLaneFlow.java
@@ -348,7 +348,7 @@ public class TrafficMaxLaneFlow {
 Duration.standardMinutes(options.getWindowDuration())).
 every(Duration.standardMinutes(options.getWindowSlideEvery()
 .apply(new MaxLaneFlow())
-.apply(BigQueryIO.Write.to(tableRef)
+.apply(BigQueryIO.write().to(tableRef)
 .withSchema(FormatMaxesFn.getSchema()));
 
 // Run the pipeline.

http://git-wip-us.apache.org/repos/asf/beam/blob/101715a7/examples/java/src/main/java/org/apache/beam/examples/complete/TrafficRoutes.java
--
diff --git 
a/examples/java/src/main/java/org/apache/beam/examples/complete/TrafficRoutes.java
 
b/examples/java/src/main/java/org/apache/beam/examples/complete/TrafficRoutes.java
index 50d3ae4..fc5eb89 

[jira] [Commented] (BEAM-1427) BigQueryIO should comply with PTransform style guide

2017-03-14 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1427?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15925203#comment-15925203
 ] 

ASF GitHub Bot commented on BEAM-1427:
--

Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2149


> BigQueryIO should comply with PTransform style guide
> 
>
> Key: BEAM-1427
> URL: https://issues.apache.org/jira/browse/BEAM-1427
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-gcp
>Reporter: Eugene Kirpichov
>Assignee: Eugene Kirpichov
>  Labels: backward-incompatible
> Fix For: First stable release
>
>
> Suggested changes:
> - Remove Read/Write.Bound classes - Read and Write themselves should be the 
> transform classes
> - Remove static builder-like .withBlah() methods
> - (optional) use AutoValue



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
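To make the suggested changes above concrete, here is a schematic sketch of the target transform shape: no nested Bound class and no static builder-like .withBlah() methods, just a factory method plus immutable fluent setters. The IO name and setter are invented for illustration; this is not Beam code.

public class FooIO {
  // Entry point: a factory method instead of exposing a nested Bound class.
  public static Write write() {
    return new Write(null);
  }

  public static class Write {
    private final String tableSpec;  // configuration held by the transform itself

    private Write(String tableSpec) {
      this.tableSpec = tableSpec;
    }

    // Instance setter returning a modified copy, rather than a static .withBlah() method.
    public Write to(String tableSpec) {
      return new Write(tableSpec);
    }
  }
}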


[08/10] beam git commit: Replace BigQueryIO.Read.from() with BigQueryIO.read().from()

2017-03-14 Thread tgroh
Replace BigQueryIO.Read.from() with BigQueryIO.read().from()


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/1a252a77
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/1a252a77
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/1a252a77

Branch: refs/heads/master
Commit: 1a252a771127febe551fda5d499c7ecb3b95cf23
Parents: 1adcbae
Author: Eugene Kirpichov 
Authored: Mon Mar 13 16:15:21 2017 -0700
Committer: Thomas Groh 
Committed: Tue Mar 14 15:54:36 2017 -0700

--
 .../examples/cookbook/BigQueryTornadoes.java|  2 +-
 .../cookbook/CombinePerKeyExamples.java |  2 +-
 .../beam/examples/cookbook/FilterExamples.java  |  2 +-
 .../beam/examples/cookbook/JoinExamples.java|  4 +--
 .../examples/cookbook/MaxPerKeyExamples.java|  2 +-
 .../org/apache/beam/sdk/io/package-info.java|  2 +-
 .../beam/sdk/io/gcp/bigquery/BigQueryIO.java| 36 +---
 .../sdk/io/gcp/bigquery/BigQueryIOTest.java | 36 ++--
 8 files changed, 48 insertions(+), 38 deletions(-)
--
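The read side follows the same pattern. A minimal sketch matching the example diffs below, with an illustrative pipeline and table spec that are not taken from the patch:

import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.values.PCollection;

public class ReadApiRenameSketch {
  // Reads every row of the given table into a PCollection of TableRow.
  static PCollection<TableRow> readRows(Pipeline p, String tableSpec) {
    // Old spelling, removed by this commit:
    //   return p.apply(BigQueryIO.Read.from(tableSpec));
    // New spelling introduced by this commit:
    return p.apply(BigQueryIO.read().from(tableSpec));
  }
}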


http://git-wip-us.apache.org/repos/asf/beam/blob/1a252a77/examples/java/src/main/java/org/apache/beam/examples/cookbook/BigQueryTornadoes.java
--
diff --git 
a/examples/java/src/main/java/org/apache/beam/examples/cookbook/BigQueryTornadoes.java
 
b/examples/java/src/main/java/org/apache/beam/examples/cookbook/BigQueryTornadoes.java
index 079674a..d3c9167 100644
--- 
a/examples/java/src/main/java/org/apache/beam/examples/cookbook/BigQueryTornadoes.java
+++ 
b/examples/java/src/main/java/org/apache/beam/examples/cookbook/BigQueryTornadoes.java
@@ -156,7 +156,7 @@ public class BigQueryTornadoes {
fields.add(new TableFieldSchema().setName("tornado_count").setType("INTEGER"));
 TableSchema schema = new TableSchema().setFields(fields);
 
-p.apply(BigQueryIO.Read.from(options.getInput()))
+p.apply(BigQueryIO.read().from(options.getInput()))
  .apply(new CountTornadoes())
  .apply(BigQueryIO.Write
 .to(options.getOutput())

http://git-wip-us.apache.org/repos/asf/beam/blob/1a252a77/examples/java/src/main/java/org/apache/beam/examples/cookbook/CombinePerKeyExamples.java
--
diff --git 
a/examples/java/src/main/java/org/apache/beam/examples/cookbook/CombinePerKeyExamples.java
 
b/examples/java/src/main/java/org/apache/beam/examples/cookbook/CombinePerKeyExamples.java
index 37f9d79..fc54b13 100644
--- 
a/examples/java/src/main/java/org/apache/beam/examples/cookbook/CombinePerKeyExamples.java
+++ 
b/examples/java/src/main/java/org/apache/beam/examples/cookbook/CombinePerKeyExamples.java
@@ -200,7 +200,7 @@ public class CombinePerKeyExamples {
 fields.add(new TableFieldSchema().setName("all_plays").setType("STRING"));
 TableSchema schema = new TableSchema().setFields(fields);
 
-p.apply(BigQueryIO.Read.from(options.getInput()))
+p.apply(BigQueryIO.read().from(options.getInput()))
  .apply(new PlaysForWord())
  .apply(BigQueryIO.Write
 .to(options.getOutput())

http://git-wip-us.apache.org/repos/asf/beam/blob/1a252a77/examples/java/src/main/java/org/apache/beam/examples/cookbook/FilterExamples.java
--
diff --git 
a/examples/java/src/main/java/org/apache/beam/examples/cookbook/FilterExamples.java
 
b/examples/java/src/main/java/org/apache/beam/examples/cookbook/FilterExamples.java
index fb6b507..714a8f2 100644
--- 
a/examples/java/src/main/java/org/apache/beam/examples/cookbook/FilterExamples.java
+++ 
b/examples/java/src/main/java/org/apache/beam/examples/cookbook/FilterExamples.java
@@ -238,7 +238,7 @@ public class FilterExamples {
 
 TableSchema schema = buildWeatherSchemaProjection();
 
-p.apply(BigQueryIO.Read.from(options.getInput()))
+p.apply(BigQueryIO.read().from(options.getInput()))
  .apply(ParDo.of(new ProjectionFn()))
  .apply(new BelowGlobalMean(options.getMonthFilter()))
  .apply(BigQueryIO.Write

http://git-wip-us.apache.org/repos/asf/beam/blob/1a252a77/examples/java/src/main/java/org/apache/beam/examples/cookbook/JoinExamples.java
--
diff --git 
a/examples/java/src/main/java/org/apache/beam/examples/cookbook/JoinExamples.java
 
b/examples/java/src/main/java/org/apache/beam/examples/cookbook/JoinExamples.java
index 7cf0942..05a3ad3 100644
--- 
a/examples/java/src/main/java/org/apache/beam/examples/cookbook/JoinExamples.java
+++ 
b/examples/java/src/main/java/org/apache/beam/examples/cookbook/JoinExamples.java
@@ -166,8 +166,8 @@ public class JoinExamples {
 Pipeline p = Pipeline.create(options);

[01/10] beam git commit: Remove two unused fields

2017-03-14 Thread tgroh
Repository: beam
Updated Branches:
  refs/heads/master cc12fd378 -> 806c53c18


Remove two unused fields


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/30f36344
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/30f36344
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/30f36344

Branch: refs/heads/master
Commit: 30f363444679293727171f2417c9d991a4bf7852
Parents: cc12fd3
Author: Eugene Kirpichov 
Authored: Thu Mar 2 17:54:55 2017 -0800
Committer: Thomas Groh 
Committed: Tue Mar 14 15:54:19 2017 -0700

--
 .../java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java  | 7 ---
 1 file changed, 7 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/beam/blob/30f36344/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
--
diff --git 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
index 0e1c6fc..2902c2b 100644
--- 
a/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
+++ 
b/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java
@@ -130,7 +130,6 @@ import org.apache.beam.sdk.values.PCollectionView;
 import org.apache.beam.sdk.values.PDone;
 import org.apache.beam.sdk.values.TupleTag;
 import org.apache.beam.sdk.values.TupleTagList;
-import org.joda.time.Duration;
 import org.joda.time.Instant;
 import org.slf4j.Logger;
 import org.slf4j.LoggerFactory;
@@ -1189,15 +1188,9 @@ public class BigQueryIO {
* ...
*/
  private abstract static class BigQuerySourceBase extends BoundedSource {
-// The maximum number of retries to verify temp files.
-private static final int MAX_FILES_VERIFY_RETRIES = 9;
-
 // The maximum number of retries to poll a BigQuery job.
 protected static final int JOB_POLL_MAX_RETRIES = Integer.MAX_VALUE;
 
-// The initial backoff for verifying temp files.
-private static final Duration INITIAL_FILES_VERIFY_BACKOFF = Duration.standardSeconds(1);
-
 protected final ValueProvider jobIdToken;
 protected final String extractDestinationDir;
 protected final BigQueryServices bqServices;



[10/10] beam git commit: This closes #2149

2017-03-14 Thread tgroh
This closes #2149


Project: http://git-wip-us.apache.org/repos/asf/beam/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam/commit/806c53c1
Tree: http://git-wip-us.apache.org/repos/asf/beam/tree/806c53c1
Diff: http://git-wip-us.apache.org/repos/asf/beam/diff/806c53c1

Branch: refs/heads/master
Commit: 806c53c18793b43e7ccd28f99662ee73a97237b4
Parents: cc12fd3 101715a
Author: Thomas Groh 
Authored: Tue Mar 14 15:54:42 2017 -0700
Committer: Thomas Groh 
Committed: Tue Mar 14 15:54:42 2017 -0700

--
 .../beam/examples/complete/AutoComplete.java|2 +-
 .../examples/complete/StreamingWordExtract.java |2 +-
 .../examples/complete/TrafficMaxLaneFlow.java   |2 +-
 .../beam/examples/complete/TrafficRoutes.java   |2 +-
 .../examples/cookbook/BigQueryTornadoes.java|4 +-
 .../cookbook/CombinePerKeyExamples.java |4 +-
 .../beam/examples/cookbook/FilterExamples.java  |4 +-
 .../beam/examples/cookbook/JoinExamples.java|4 +-
 .../examples/cookbook/MaxPerKeyExamples.java|4 +-
 .../beam/examples/cookbook/TriggerExample.java  |2 +-
 .../complete/game/utils/WriteToBigQuery.java|2 +-
 .../game/utils/WriteWindowedToBigQuery.java |2 +-
 .../org/apache/beam/sdk/io/package-info.java|2 +-
 .../org/apache/beam/sdk/util/NameUtils.java |5 +
 .../org/apache/beam/sdk/util/NameUtilsTest.java |   12 +
 .../beam/sdk/io/gcp/bigquery/BigQueryIO.java| 1707 --
 .../sdk/io/gcp/bigquery/BigQueryIOTest.java |  288 ++-
 17 files changed, 851 insertions(+), 1197 deletions(-)
--




[GitHub] beam pull request #2149: [BEAM-1427] BigQueryIO should comply with PTransfor...

2017-03-14 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam/pull/2149


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[jira] [Updated] (BEAM-1709) Implement Single-output ParDo as Multi-output ParDo

2017-03-14 Thread Davor Bonaci (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1709?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Davor Bonaci updated BEAM-1709:
---
Fix Version/s: (was: 0.6.0)
   First stable release

> Implement Single-output ParDo as Multi-output ParDo
> ---
>
> Key: BEAM-1709
> URL: https://issues.apache.org/jira/browse/BEAM-1709
> Project: Beam
>  Issue Type: Bug
>  Components: runner-core
>Reporter: Thomas Groh
>Assignee: Thomas Groh
>Priority: Minor
>  Labels: starter
> Fix For: First stable release
>
>
> Runners who want a primitive Single-output ParDo should implement an override 
> to do so.
> Initially implemented in https://github.com/apache/beam/pull/2145, rolled 
> back in https://github.com/apache/beam/pull/2170



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Updated] (BEAM-1645) Display data not populated on Window.Assign

2017-03-14 Thread Davor Bonaci (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1645?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Davor Bonaci updated BEAM-1645:
---
Fix Version/s: (was: 0.6.0)
   First stable release

> Display data not populated on Window.Assign
> ---
>
> Key: BEAM-1645
> URL: https://issues.apache.org/jira/browse/BEAM-1645
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Kenneth Knowles
>Assignee: Ben Chambers
> Fix For: First stable release
>
>
> In 
> https://github.com/apache/beam/commit/eaf9b9b36dec1cc421335b27f225663ce42d0cca
>  the display data was put only on the composite, where no runner actually 
> locates it today.
> As a mitigation we can populate it on the {{Window.Assign}} transform, though 
> the DataflowRunner should likely override this to surface it at the top level.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[GitHub] beam-site pull request #175: Documentation changes for 0.6.0 release

2017-03-14 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/beam-site/pull/175


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[38/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/DataflowRunner.html
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/DataflowRunner.html b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/DataflowRunner.html
new file mode 100644
[...generated javadoc page for org.apache.beam.runners.dataflow.DataflowRunner: a PipelineRunner that first translates the pipeline to the Dataflow representation using DataflowPipelineTranslator and then submits it to the Dataflow service for execution; the page documents PROJECT_ID_REGEXP, detectClassPathResourcesToStage(ClassLoader), fromOptions(PipelineOptions), getTranslator(), run(Pipeline), setHooks(DataflowRunnerHooks), and toString(). Stripped HTML navigation markup omitted; the message is truncated in the archive...]

[44/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/index-all.html
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/index-all.html b/content/documentation/sdks/javadoc/0.6.0/index-all.html
new file mode 100644
[...generated javadoc master index (index-all.html, roughly 19,962 lines of alphabetical entries for the 0.6.0 SDK). Stripped HTML markup omitted; the message is truncated in the archive...]
[22/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/ModelEnforcement.html
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/ModelEnforcement.html b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/ModelEnforcement.html
new file mode 100644
[...generated javadoc page for the org.apache.beam.runners.direct.ModelEnforcement interface: per-element and per-bundle enforcement hooks (beforeElement, afterElement, afterFinish) wrapped around a TransformEvaluator, used to verify that executing code conforms to the model. Stripped HTML navigation markup omitted; the message continues with the generated page for ModelEnforcementFactory.html and is truncated in the archive...]

[02/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/util/SparkSideInputReader.html
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/util/SparkSideInputReader.html b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/util/SparkSideInputReader.html
new file mode 100644
[...generated javadoc page for org.apache.beam.runners.spark.util.SparkSideInputReader: a SideInputReader for the SparkRunner, with get(PCollectionView, BoundedWindow), contains(PCollectionView), and isEmpty(). Stripped HTML navigation markup omitted; the message continues with the generated package-frame.html for the same package and is truncated in the archive...]

[25/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/util/package-tree.html
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/util/package-tree.html b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/util/package-tree.html
new file mode 100644
[...generated class-hierarchy page for the org.apache.beam.runners.dataflow.util package, followed by the generated page for org.apache.beam.runners.direct.AggregatorContainer.AggregatorKey. Stripped HTML navigation markup omitted; the message is truncated in the archive...]

[42/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/apex/ApexRunnerRegistrar.html
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/apex/ApexRunnerRegistrar.html b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/apex/ApexRunnerRegistrar.html
new file mode 100644
[...generated javadoc pages for org.apache.beam.runners.apex.ApexRunnerRegistrar (registers the ApexRunner and ApexPipelineOptions as pipeline runner services via AutoService) and org.apache.beam.runners.apex.ApexRunnerResult (the PipelineResult of executing a Pipeline with Apex in embedded mode). Stripped HTML navigation markup omitted; the message is truncated in the archive...]
[35/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/DataflowPipelineWorkerPoolOptions.html
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/DataflowPipelineWorkerPoolOptions.html b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/DataflowPipelineWorkerPoolOptions.html
new file mode 100644
[...generated javadoc page for org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions: options that configure the Dataflow worker pool (autoscaling algorithm, disk size and type, files to stage, number and maximum number of workers, network/subnetwork/zone, machine type, public IP usage, worker harness container image). Stripped HTML navigation markup omitted; the message is truncated in the archive...]

[08/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/io/hadoop/HadoopIO.Write.html
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/io/hadoop/HadoopIO.Write.html b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/io/hadoop/HadoopIO.Write.html
new file mode 100644
[...generated javadoc pages for org.apache.beam.runners.spark.io.hadoop.HadoopIO.Write (a write operation on HDFS with a static to(filenamePrefix, FileOutputFormat, keyClass, valueClass) factory and a nested Bound PTransform) and for the enclosing HadoopIO class. Stripped HTML navigation markup omitted; the message is truncated in the archive...]

[20/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/WatermarkManager.TimerUpdate.html
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/WatermarkManager.TimerUpdate.html b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/WatermarkManager.TimerUpdate.html
new file mode 100644
[...generated javadoc page for org.apache.beam.runners.direct.WatermarkManager.TimerUpdate: a collection of newly set, deleted, and completed timers for an executed step, with empty(), builder(StructuralKey), withCompletedTimers(Iterable), and a nested TimerUpdateBuilder. Stripped HTML navigation markup omitted; the message is truncated in the archive...]

[13/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/aggregators/NamedAggregators.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/aggregators/NamedAggregators.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/aggregators/NamedAggregators.html
new file mode 100644
index 000..218733a
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/aggregators/NamedAggregators.html
@@ -0,0 +1,401 @@
+http://www.w3.org/TR/html4/loose.dtd;>
+
+
+
+
+NamedAggregators
+
+
+
+
+
+
+var methods = {"i0":10,"i1":10,"i2":10,"i3":10};
+var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+PrevClass
+NextClass
+
+
+Frames
+NoFrames
+
+
+AllClasses
+
+
+
+
+
+
+
+Summary:
+Nested|
+Field|
+Constr|
+Method
+
+
+Detail:
+Field|
+Constr|
+Method
+
+
+
+
+
+
+
+
+org.apache.beam.runners.spark.aggregators
+Class NamedAggregators
+
+java.lang.Object
+  org.apache.beam.runners.spark.aggregators.NamedAggregators
+
+All Implemented Interfaces:
+java.io.Serializable
+
+public class NamedAggregators
+extends java.lang.Object
+implements java.io.Serializable
+This class wraps a map of named aggregators. Spark expects that all accumulators be declared
+ before a job is launched. Beam allows aggregators to be used and incremented on the fly.
+ We create a map of named aggregators and instantiate it in the Spark context before the job
+ is launched. We can then add aggregators on the fly in Spark.
+
+See Also:
+Serialized Form
+
+Nested Class Summary
+
+static class       NamedAggregators.CombineFunctionState<InputT,InterT,OutputT>
+static interface   NamedAggregators.State<InputT,InterT,OutputT>
+
+Constructor Summary
+
+NamedAggregators()
+Constructs a new NamedAggregators instance.
+
+NamedAggregators(java.lang.String name, NamedAggregators.State<?,?,?> state)
+Constructs a new named aggregators instance that contains a mapping from the specified
+ name to the associated initial state.
+
+Method Summary
+
+All Methods / Instance Methods / Concrete Methods:
+<T> T   getValue(java.lang.String name, java.lang.Class<T> typeClass)
+NamedAggregators   merge(NamedAggregators other)
+Merges another NamedAggregators instance with this instance.
+java.util.Map<java.lang.String,?>   renderAll()
+java.lang.String   toString()
+
+Methods inherited from class java.lang.Object:
+clone, equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait
+
+Constructor Detail
+
+NamedAggregators
+public NamedAggregators()
+Constructs a new NamedAggregators instance.
+
+NamedAggregators
+public NamedAggregators(java.lang.String name, NamedAggregators.State<?,?,?> state)
+Constructs a new named aggregators instance that contains a mapping from the specified
+ name to the associated initial state.
+
+Parameters:
+name - Name of aggregator.
+state - Associated State.
+
+Method Detail
+
+getValue
+public <T> T getValue(java.lang.String name, java.lang.Class<T> typeClass)
+
+Type Parameters:
+T - Type to be returned.
+Parameters:
+name - Name of aggregator to retrieve.
+typeClass - Type class to cast the value to.
+Returns:
+the value of the aggregator associated with the specified name,
+ or null if the specified aggregator could not be found.
+
+renderAll
+public java.util.Map<java.lang.String,?> renderAll()
+
+Returns:
+a map of all the aggregator names and their rendered values
+
+merge
+public NamedAggregators merge(NamedAggregators other)
+Merges another NamedAggregators instance with this instance.
+
+Parameters:
+other - The other instance of named aggregators to merge.
+Returns:
+This instance of NamedAggregators with associated states updated to reflect the
+ other instance's aggregators.
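As a rough, hedged illustration of how the members documented above fit together: the aggregator name "recordCount" is a placeholder, and someState stands in for whatever NamedAggregators.State implementation the runner supplies.

import java.util.Map;
import org.apache.beam.runners.spark.aggregators.NamedAggregators;

public class NamedAggregatorsSketch {
  static NamedAggregators buildAggregators(NamedAggregators.State<?, ?, ?> someState) {
    // Map an aggregator name to its initial state.
    NamedAggregators first = new NamedAggregators("recordCount", someState);
    // merge() folds the other instance's states into this one and returns it.
    NamedAggregators merged = first.merge(new NamedAggregators());
    // renderAll() exposes the rendered value of every named aggregator.
    Map<String, ?> rendered = merged.renderAll();
    System.out.println(rendered);
    return merged;
  }
}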

[09/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/io/SparkUnboundedSource.Metadata.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/io/SparkUnboundedSource.Metadata.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/io/SparkUnboundedSource.Metadata.html
new file mode 100644
index 000..a105f90
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/io/SparkUnboundedSource.Metadata.html
@@ -0,0 +1,313 @@
+org.apache.beam.runners.spark.io
+Class SparkUnboundedSource.Metadata
+
+java.lang.Object
+  org.apache.beam.runners.spark.io.SparkUnboundedSource.Metadata
+
+All Implemented Interfaces:
+java.io.Serializable
+
+Enclosing class:
+SparkUnboundedSource
+
+public static class SparkUnboundedSource.Metadata
+extends java.lang.Object
+implements java.io.Serializable
+A metadata holder for an input stream partition.
+
+See Also:
+Serialized Form
+
+Constructor Summary
+
+Metadata(long numRecords, org.joda.time.Instant lowWatermark, org.joda.time.Instant highWatermark)
+
+Method Summary
+
+All Methods / Instance Methods / Concrete Methods:
+org.joda.time.Instant   getHighWatermark()
+org.joda.time.Instant   getLowWatermark()
+long   getNumRecords()
+
+Methods inherited from class java.lang.Object:
+clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
+
+Constructor Detail
+
+Metadata
+public Metadata(long numRecords, org.joda.time.Instant lowWatermark, org.joda.time.Instant highWatermark)
+
+Method Detail
+
+getNumRecords
+public long getNumRecords()
+
+getLowWatermark
+public org.joda.time.Instant getLowWatermark()
+
+getHighWatermark
+public org.joda.time.Instant getHighWatermark()
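Taken together, the constructor and getters above describe a simple value holder; the sketch below uses made-up record counts and watermark instants to show the round trip.

import org.apache.beam.runners.spark.io.SparkUnboundedSource;
import org.joda.time.Instant;

public class MetadataSketch {
  public static void main(String[] args) {
    // Hypothetical per-partition figures: 42 records between the epoch and now.
    SparkUnboundedSource.Metadata metadata =
        new SparkUnboundedSource.Metadata(42L, new Instant(0L), Instant.now());
    System.out.println(metadata.getNumRecords());    // 42
    System.out.println(metadata.getLowWatermark());  // 1970-01-01T00:00:00.000Z
    System.out.println(metadata.getHighWatermark()); // the current time
  }
}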

http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/io/SparkUnboundedSource.html
--
diff --git 

[17/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/flink/package-summary.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/flink/package-summary.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/flink/package-summary.html
new file mode 100644
index 000..d16fc74
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/flink/package-summary.html
@@ -0,0 +1,214 @@
+Package org.apache.beam.runners.flink
+
+Internal implementation of the Beam runner for Apache Flink.
+
+See: Description
+
+Interface Summary
+
+FlinkPipelineOptions
+Options which can be used to configure a Flink PipelineRunner.
+
+Class Summary
+
+DefaultParallelismFactory
+DefaultValueFactory for getting a default value for the parallelism option
+ on FlinkPipelineOptions.
+
+FlinkDetachedRunnerResult
+Result of a detached execution of a Pipeline with Flink.
+
+FlinkRunner
+A PipelineRunner that executes the operations in the
+ pipeline by first translating them to a Flink Plan and then executing them either locally
+ or on a Flink cluster, depending on the configuration.
+
+FlinkRunnerRegistrar
+AutoService registrar - will register FlinkRunner and FlinkOptions
+ as possible pipeline runner services.
+
+FlinkRunnerRegistrar.Options
+Pipeline options registrar.
+
+FlinkRunnerRegistrar.Runner
+Pipeline runner registrar.
+
+FlinkRunnerResult
+Result of executing a Pipeline with Flink.
+
+TestFlinkRunner
+Test Flink runner.
+
+Package org.apache.beam.runners.flink Description
+Internal implementation of the Beam runner for Apache Flink.
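A hedged sketch of how these classes are usually wired together follows; the pipeline contents are elided and only the runner selection is being illustrated.

import org.apache.beam.runners.flink.FlinkPipelineOptions;
import org.apache.beam.runners.flink.FlinkRunner;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class FlinkRunnerSketch {
  public static void main(String[] args) {
    // FlinkPipelineOptions extends PipelineOptions, so it can be built via the factory.
    FlinkPipelineOptions options = PipelineOptionsFactory.as(FlinkPipelineOptions.class);
    // Route execution through the Flink runner described above; whether it runs locally
    // or on a cluster depends on the rest of the configuration.
    options.setRunner(FlinkRunner.class);
    Pipeline pipeline = Pipeline.create(options);
    // ... apply transforms here ...
    pipeline.run();
  }
}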

http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/flink/package-tree.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/flink/package-tree.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/flink/package-tree.html
new file mode 100644
index 000..eee9050
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/flink/package-tree.html
@@ -0,0 +1,173 @@
+Hierarchy For Package org.apache.beam.runners.flink
+Package Hierarchies:
+All Packages
+
+Class Hierarchy
+
+java.lang.Object
+  org.apache.beam.runners.flink.DefaultParallelismFactory (implements org.apache.beam.sdk.options.DefaultValueFactory<T>)
+  org.apache.beam.runners.flink.FlinkDetachedRunnerResult (implements org.apache.beam.sdk.PipelineResult)
+  org.apache.beam.runners.flink.FlinkRunnerRegistrar
+  org.apache.beam.runners.flink.FlinkRunnerRegistrar.Options (implements org.apache.beam.sdk.options.PipelineOptionsRegistrar)
+  org.apache.beam.runners.flink.FlinkRunnerRegistrar.Runner

[16/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/SparkPipelineOptions.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/SparkPipelineOptions.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/SparkPipelineOptions.html
new file mode 100644
index 000..afb20b9
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/SparkPipelineOptions.html
@@ -0,0 +1,543 @@
+org.apache.beam.runners.spark
+Interface SparkPipelineOptions
+
+All Superinterfaces:
+ApplicationNameOptions, HasDisplayData, PipelineOptions, StreamingOptions
+
+All Known Subinterfaces:
+SparkContextOptions, TestSparkPipelineOptions
+
+public interface SparkPipelineOptions
+extends PipelineOptions, StreamingOptions, ApplicationNameOptions
+Spark runner PipelineOptions handles Spark execution-related configurations,
+ such as the master address, batch-interval, and other user-related knobs.
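The knobs listed further down this page are ordinary getter/setter pairs, so a hedged configuration sketch looks like this; the master URL matches the documented default and the application name is a placeholder.

import org.apache.beam.runners.spark.SparkPipelineOptions;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class SparkOptionsSketch {
  public static void main(String[] args) {
    SparkPipelineOptions options = PipelineOptionsFactory.as(SparkPipelineOptions.class);
    options.setSparkMaster("local[4]");          // documented default master address
    options.setBatchIntervalMillis(500L);        // streaming micro-batch interval
    options.setAppName("spark-options-sketch");  // inherited from ApplicationNameOptions
    Pipeline pipeline = Pipeline.create(options);
    // ... apply transforms and run ...
  }
}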
+
+Nested Class Summary
+
+static class   SparkPipelineOptions.TmpCheckpointDirFactory
+Returns the default checkpoint directory of /tmp/${job.name}.
+
+Nested classes/interfaces inherited from interface org.apache.beam.sdk.options.PipelineOptions:
+PipelineOptions.AtomicLongFactory, PipelineOptions.CheckEnabled, PipelineOptions.DirectRunner, PipelineOptions.JobNameFactory
+
+Method Summary
+
+All Methods / Instance Methods / Abstract Methods:
+java.lang.Long   getBatchIntervalMillis()
+java.lang.String   getCheckpointDir()
+java.lang.Long   getCheckpointDurationMillis()
+java.lang.Boolean   getEnableSparkMetricSinks()
+java.lang.Long   getMaxRecordsPerBatch()
+java.lang.Long   getMinReadTimeMillis()
+java.lang.Double   getReadTimePercentage()
+java.lang.String   getSparkMaster()
+java.lang.String   getStorageLevel()
+boolean   getUsesProvidedSparkContext()
+void   setBatchIntervalMillis(java.lang.Long batchInterval)
+void   setCheckpointDir(java.lang.String checkpointDir)
+void   setCheckpointDurationMillis(java.lang.Long durationMillis)
+void   setEnableSparkMetricSinks(java.lang.Boolean enableSparkMetricSinks)
+void   setMaxRecordsPerBatch(java.lang.Long maxRecordsPerBatch)
+void   setMinReadTimeMillis(java.lang.Long minReadTimeMillis)
+void   setReadTimePercentage(java.lang.Double readTimePercentage)
+void   setSparkMaster(java.lang.String master)
+void   setStorageLevel(java.lang.String storageLevel)
+void   setUsesProvidedSparkContext(boolean value)
+
+Methods inherited from interface org.apache.beam.sdk.options.StreamingOptions:
+isStreaming, setStreaming
+
+Methods inherited from interface org.apache.beam.sdk.options.ApplicationNameOptions:
+getAppName, setAppName
+
+Methods inherited from interface org.apache.beam.sdk.options.PipelineOptions:
+as, getJobName, getOptionsId, getRunner, getStableUniqueNames, getTempLocation,
+ outputRuntimeOptions, setJobName, setOptionsId, setRunner, setStableUniqueNames, setTempLocation
+
+Methods inherited from interface org.apache.beam.sdk.transforms.display.HasDisplayData:
+populateDisplayData
+
+Method Detail
+
+getSparkMaster
+@Default.String(value="local[4]")
+java.lang.String getSparkMaster()
+
+setSparkMaster
+void setSparkMaster(java.lang.String master)
+
+getBatchIntervalMillis
+@Default.Long(value=500L)
+java.lang.Long getBatchIntervalMillis()

[33/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/DataflowWorkerLoggingOptions.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/DataflowWorkerLoggingOptions.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/DataflowWorkerLoggingOptions.html
new file mode 100644
index 000..bd67116
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/DataflowWorkerLoggingOptions.html
@@ -0,0 +1,398 @@
+org.apache.beam.runners.dataflow.options
+Interface DataflowWorkerLoggingOptions
+
+All Superinterfaces:
+HasDisplayData, PipelineOptions
+
+All Known Subinterfaces:
+DataflowPipelineOptions, DataflowWorkerHarnessOptions, TestDataflowPipelineOptions
+
+public interface DataflowWorkerLoggingOptions
+extends PipelineOptions
+Options that are used to control logging configuration on the Dataflow worker.
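Because these are plain PipelineOptions properties, worker logging can be tuned in code before job submission. The sketch below assumes the Level enum exposes a DEBUG constant; check the nested Level class for the exact constants.

import org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class WorkerLoggingSketch {
  public static void main(String[] args) {
    DataflowWorkerLoggingOptions options =
        PipelineOptionsFactory.as(DataflowWorkerLoggingOptions.class);
    // Raise the default worker log level from the documented default of INFO.
    // Level.DEBUG is assumed to exist for this illustration.
    options.setDefaultWorkerLogLevel(DataflowWorkerLoggingOptions.Level.DEBUG);
    // Route System.out messages through the same level.
    options.setWorkerSystemOutMessageLevel(DataflowWorkerLoggingOptions.Level.DEBUG);
  }
}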
+
+Nested Class Summary
+
+static class   DataflowWorkerLoggingOptions.Level
+The set of log levels that can be used on the Dataflow worker.
+
+static class   DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
+Defines a log level override for a specific class, package, or name.
+
+Nested classes/interfaces inherited from interface org.apache.beam.sdk.options.PipelineOptions:
+PipelineOptions.AtomicLongFactory, PipelineOptions.CheckEnabled, PipelineOptions.DirectRunner, PipelineOptions.JobNameFactory
+
+Method Summary
+
+All Methods / Instance Methods / Abstract Methods:
+DataflowWorkerLoggingOptions.Level   getDefaultWorkerLogLevel()
+This option controls the default log level of all loggers without a log level override.
+DataflowWorkerLoggingOptions.WorkerLogLevelOverrides   getWorkerLogLevelOverrides()
+This option controls the log levels for specifically named loggers.
+DataflowWorkerLoggingOptions.Level   getWorkerSystemErrMessageLevel()
+Controls the log level given to messages printed to System.err.
+DataflowWorkerLoggingOptions.Level   getWorkerSystemOutMessageLevel()
+Controls the log level given to messages printed to System.out.
+void   setDefaultWorkerLogLevel(DataflowWorkerLoggingOptions.Level level)
+void   setWorkerLogLevelOverrides(DataflowWorkerLoggingOptions.WorkerLogLevelOverrides value)
+void   setWorkerSystemErrMessageLevel(DataflowWorkerLoggingOptions.Level level)
+void   setWorkerSystemOutMessageLevel(DataflowWorkerLoggingOptions.Level level)
+
+Methods inherited from interface org.apache.beam.sdk.options.PipelineOptions:
+as, getJobName, getOptionsId, getRunner, getStableUniqueNames, getTempLocation,
+ outputRuntimeOptions, setJobName, setOptionsId, setRunner, setStableUniqueNames, setTempLocation
+
+Methods inherited from interface org.apache.beam.sdk.transforms.display.HasDisplayData:
+populateDisplayData
+
+Method Detail
+
+getDefaultWorkerLogLevel
+@Default.Enum(value="INFO")
+DataflowWorkerLoggingOptions.Level getDefaultWorkerLogLevel()
+This option controls the default log level of all loggers without a log level override.
+
+setDefaultWorkerLogLevel
+void setDefaultWorkerLogLevel(DataflowWorkerLoggingOptions.Level level)
+
+getWorkerSystemOutMessageLevel
+@Default.Enum(value="INFO")
+DataflowWorkerLoggingOptions.Level getWorkerSystemOutMessageLevel()
+Controls the log level given to messages printed to System.out.
+
+ Note that the message may be filtered depending on the defaultWorkerLogLevel or if a

[39/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/DataflowPipelineRegistrar.Runner.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/DataflowPipelineRegistrar.Runner.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/DataflowPipelineRegistrar.Runner.html
new file mode 100644
index 000..dcda6a3
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/DataflowPipelineRegistrar.Runner.html
@@ -0,0 +1,288 @@
+org.apache.beam.runners.dataflow
+Class DataflowPipelineRegistrar.Runner
+
+java.lang.Object
+  org.apache.beam.runners.dataflow.DataflowPipelineRegistrar.Runner
+
+All Implemented Interfaces:
+PipelineRunnerRegistrar
+
+Enclosing class:
+DataflowPipelineRegistrar
+
+@AutoService(value=PipelineRunnerRegistrar.class)
+public static class DataflowPipelineRegistrar.Runner
+extends java.lang.Object
+implements PipelineRunnerRegistrar
+Register the DataflowRunner.
+
+Constructor Summary
+
+Runner()
+
+Method Summary
+
+All Methods / Instance Methods / Concrete Methods:
+java.lang.Iterable<java.lang.Class<? extends PipelineRunner<?>>>   getPipelineRunners()
+Get the set of PipelineRunners to register.
+
+Methods inherited from class java.lang.Object:
+clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
+
+Constructor Detail
+
+Runner
+public Runner()
+
+Method Detail
+
+getPipelineRunners
+public java.lang.Iterable<java.lang.Class<? extends PipelineRunner<?>>> getPipelineRunners()
+Description copied from interface: PipelineRunnerRegistrar
+Get the set of PipelineRunners to register.
+
+Specified by:
+getPipelineRunners in interface PipelineRunnerRegistrar

http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/DataflowPipelineRegistrar.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/DataflowPipelineRegistrar.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/DataflowPipelineRegistrar.html
new file mode 100644
index 000..3fa3014
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/DataflowPipelineRegistrar.html
@@ -0,0 +1,224 @@
+org.apache.beam.runners.dataflow.options
+Interface DataflowPipelineOptions
+
+All Superinterfaces:
+ApplicationNameOptions, BigQueryOptions, CloudDebuggerOptions, DataflowPipelineDebugOptions,
+ DataflowPipelineWorkerPoolOptions, DataflowProfilingOptions, DataflowWorkerLoggingOptions,
+ GcpOptions, GcsOptions, GoogleApiDebugOptions, HasDisplayData, PipelineOptions, PubsubOptions, StreamingOptions
+
+All Known Subinterfaces:
+DataflowWorkerHarnessOptions, TestDataflowPipelineOptions
+
+public interface DataflowPipelineOptions
+extends PipelineOptions, GcpOptions, ApplicationNameOptions, DataflowPipelineDebugOptions,
+ DataflowPipelineWorkerPoolOptions, BigQueryOptions, GcsOptions, StreamingOptions,
+ CloudDebuggerOptions, DataflowWorkerLoggingOptions, DataflowProfilingOptions, PubsubOptions
+Options that can be used to configure the DataflowRunner.
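A hedged configuration sketch follows; the project id, region, and staging bucket are placeholders, and the setters are assumed to follow the usual getter/setter pairing that PipelineOptions interfaces use.

import org.apache.beam.runners.dataflow.DataflowRunner;
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class DataflowOptionsSketch {
  public static void main(String[] args) {
    DataflowPipelineOptions options = PipelineOptionsFactory.as(DataflowPipelineOptions.class);
    options.setProject("my-gcp-project");                  // placeholder project id
    options.setRegion("us-central1");                      // placeholder Compute Engine region
    options.setStagingLocation("gs://my-bucket/staging");  // placeholder GCS staging path
    options.setRunner(DataflowRunner.class);               // submit to the Dataflow service
    Pipeline pipeline = Pipeline.create(options);
    // ... apply transforms and call pipeline.run() ...
  }
}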
+
+Nested Class Summary
+
+static class   DataflowPipelineOptions.StagingLocationFactory
+Returns a default staging location under GcpOptions.getGcpTempLocation().
+
+Nested classes/interfaces inherited from interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions:
+DataflowPipelineDebugOptions.DataflowClientFactory, DataflowPipelineDebugOptions.StagerFactory
+
+Nested classes/interfaces inherited from interface org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions:
+DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType, DataflowPipelineWorkerPoolOptions.WorkerHarnessContainerImageFactory
+
+Nested classes/interfaces inherited from interface org.apache.beam.sdk.options.GcsOptions:
+GcsOptions.ExecutorServiceFactory, GcsOptions.PathValidatorFactory
+
+Nested classes/interfaces inherited from interface org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions:
+DataflowWorkerLoggingOptions.Level, DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
+
+Nested classes/interfaces inherited from interface org.apache.beam.runners.dataflow.options.DataflowProfilingOptions:
+DataflowProfilingOptions.DataflowProfilingAgentConfiguration
+
+Nested classes/interfaces inherited from interface org.apache.beam.sdk.options.GcpOptions:
+GcpOptions.DefaultProjectFactory, GcpOptions.GcpTempLocationFactory, GcpOptions.GcpUserCredentialsFactory
+
+Nested classes/interfaces inherited from interface org.apache.beam.sdk.options.GoogleApiDebugOptions:
+GoogleApiDebugOptions.GoogleApiTracer
+
+Method Summary
+
+All Methods / Instance Methods / Abstract Methods:
+java.lang.String   getProject()
+Project id to use when launching jobs.
+java.lang.String   getRegion()
+The Google Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones)
+ for creating Dataflow jobs.
+java.lang.String   getServiceAccount()
+Run the job as a specific service account, instead of the default GCE robot.
+java.lang.String   getStagingLocation()
+GCS path for staging local files, e.g.
+java.lang.String
[06/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/metrics/AggregatorMetric.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/metrics/AggregatorMetric.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/metrics/AggregatorMetric.html
new file mode 100644
index 000..bc7b511
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/metrics/AggregatorMetric.html
@@ -0,0 +1,241 @@
+org.apache.beam.runners.spark.metrics
+Class AggregatorMetric
+
+java.lang.Object
+  org.apache.beam.runners.spark.metrics.AggregatorMetric
+
+All Implemented Interfaces:
+com.codahale.metrics.Metric
+
+public class AggregatorMetric
+extends java.lang.Object
+implements com.codahale.metrics.Metric
+An adapter between the NamedAggregators and Codahale's Metric interface.
+
+Method Summary
+
+All Methods / Static Methods / Concrete Methods:
+static AggregatorMetric   of(NamedAggregators namedAggregators)
+
+Methods inherited from class java.lang.Object:
+clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
+
+Method Detail
+
+of
+public static AggregatorMetric of(NamedAggregators namedAggregators)
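Since the static factory above is the only entry point, a hedged sketch of registering the adapter with a Codahale metric registry looks like this; the registry and the metric name are illustrative.

import com.codahale.metrics.MetricRegistry;
import org.apache.beam.runners.spark.aggregators.NamedAggregators;
import org.apache.beam.runners.spark.metrics.AggregatorMetric;

public class AggregatorMetricSketch {
  public static void main(String[] args) {
    NamedAggregators aggregators = new NamedAggregators();
    // Wrap the Beam aggregators so a Codahale registry can track them.
    AggregatorMetric metric = AggregatorMetric.of(aggregators);
    MetricRegistry registry = new MetricRegistry();
    registry.register("beam.aggregators", metric); // illustrative metric name
  }
}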

http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/metrics/AggregatorMetricSource.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/metrics/AggregatorMetricSource.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/metrics/AggregatorMetricSource.html
new file mode 100644
index 000..16d296a
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/metrics/AggregatorMetricSource.html
@@ -0,0 +1,299 @@
+org.apache.beam.runners.spark.metrics
+Class AggregatorMetricSource
+
+java.lang.Object
+  org.apache.beam.runners.spark.metrics.AggregatorMetricSource
+
+All Implemented Interfaces:

[29/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/testing/package-tree.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/testing/package-tree.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/testing/package-tree.html
new file mode 100644
index 000..ca62251
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/testing/package-tree.html
@@ -0,0 +1,398 @@
+Hierarchy For Package org.apache.beam.runners.dataflow.testing
+Package Hierarchies:
+All Packages
+
+Class Hierarchy
+
+java.lang.Object
+  org.apache.beam.sdk.runners.PipelineRunner<ResultT>
+    org.apache.beam.runners.dataflow.testing.TestDataflowRunner
+
+Interface Hierarchy
+
+org.apache.beam.runners.dataflow.options.CloudDebuggerOptions
+  org.apache.beam.runners.dataflow.options.DataflowPipelineOptions (also extends
+   org.apache.beam.sdk.options.ApplicationNameOptions, org.apache.beam.sdk.options.BigQueryOptions,
+   org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions,
+   org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions,
+   org.apache.beam.runners.dataflow.options.DataflowProfilingOptions,
+   org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions,
+   org.apache.beam.sdk.options.GcpOptions, org.apache.beam.sdk.options.GcsOptions,
+   org.apache.beam.sdk.options.PipelineOptions, org.apache.beam.sdk.options.PubsubOptions,
+   org.apache.beam.sdk.options.StreamingOptions)
+    org.apache.beam.runners.dataflow.testing.TestDataflowPipelineOptions (also extends
+     org.apache.beam.sdk.testing.TestPipelineOptions)
+
+org.apache.beam.runners.dataflow.options.DataflowProfilingOptions
+  org.apache.beam.runners.dataflow.options.DataflowPipelineOptions (also extends
+   org.apache.beam.sdk.options.ApplicationNameOptions, org.apache.beam.sdk.options.BigQueryOptions,
+   org.apache.beam.runners.dataflow.options.CloudDebuggerOptions,
+   org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions,
+   org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions,
+   org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions,
+   org.apache.beam.sdk.options.GcpOptions, org.apache.beam.sdk.options.GcsOptions,
+   org.apache.beam.sdk.options.PipelineOptions, org.apache.beam.sdk.options.PubsubOptions,
+   org.apache.beam.sdk.options.StreamingOptions)
+    org.apache.beam.runners.dataflow.testing.TestDataflowPipelineOptions (also extends
+     org.apache.beam.sdk.testing.TestPipelineOptions)
+
+org.apache.beam.sdk.transforms.display.HasDisplayData
+  org.apache.beam.sdk.options.PipelineOptions
+    org.apache.beam.sdk.options.ApplicationNameOptions
+      org.apache.beam.sdk.options.BigQueryOptions (also extends org.apache.beam.sdk.options.GcpOptions,
+       org.apache.beam.sdk.options.PipelineOptions, org.apache.beam.sdk.options.StreamingOptions)
+        org.apache.beam.runners.dataflow.options.DataflowPipelineOptions (also extends
+         org.apache.beam.sdk.options.ApplicationNameOptions,
+         org.apache.beam.runners.dataflow.options.CloudDebuggerOptions,
+         org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions,
+         org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions,
+         org.apache.beam.runners.dataflow.options.DataflowProfilingOptions,
+         org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions,
+         org.apache.beam.sdk.options.GcpOptions, org.apache.beam.sdk.options.GcsOptions,
+         org.apache.beam.sdk.options.PipelineOptions, org.apache.beam.sdk.options.PubsubOptions,
+         org.apache.beam.sdk.options.StreamingOptions)
+          org.apache.beam.runners.dataflow.testing.TestDataflowPipelineOptions (also extends
+           org.apache.beam.sdk.testing.TestPipelineOptions)
+      org.apache.beam.runners.dataflow.options.DataflowPipelineOptions (also extends
+       org.apache.beam.sdk.options.BigQueryOptions,
+       org.apache.beam.runners.dataflow.options.CloudDebuggerOptions,
+       org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions,

[45/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/deprecated-list.html
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/deprecated-list.html 
b/content/documentation/sdks/javadoc/0.6.0/deprecated-list.html
new file mode 100644
index 000..872556e
--- /dev/null
+++ b/content/documentation/sdks/javadoc/0.6.0/deprecated-list.html
@@ -0,0 +1,459 @@
+Deprecated API
+Contents
+
+Deprecated Interfaces
+Deprecated Classes
+Deprecated Fields
+Deprecated Methods
+Deprecated Constructors
+Deprecated Enum Constants
+
+Deprecated Interfaces
+
+org.apache.beam.sdk.util.IOChannelFactory
+This is under redesign, see: https://issues.apache.org/jira/browse/BEAM-59.
+
+org.apache.beam.sdk.testing.StreamingIT
+
+Deprecated Classes
+
+org.apache.beam.sdk.util.AttemptAndTimeBoundedExponentialBackOff
+
+org.apache.beam.sdk.util.AttemptBoundedExponentialBackOff
+
+org.apache.beam.sdk.transforms.Combine.SimpleCombineFn
+
+org.apache.beam.runners.dataflow.DataflowRunner.StreamingPCollectionViewWriterFn
+
+org.apache.beam.sdk.util.IntervalBoundedExponentialBackOff
+
+org.apache.beam.sdk.util.PCollectionViews.IterablePCollectionView
+org.apache.beam.sdk.util.PCollectionViews.ListPCollectionView
+org.apache.beam.sdk.util.PCollectionViews.MapPCollectionView
+org.apache.beam.sdk.util.PCollectionViews.MultimapPCollectionView
+org.apache.beam.sdk.util.PCollectionViews.SingletonPCollectionView
+(The five PCollectionViews entries above share the same note:) Runners should not inspect
+ the PCollectionView subclass, as it is an implementation detail. To specialize a side input,
+ a runner should inspect the language-independent metadata of the ViewFn.
+
+Deprecated Fields
+
+org.apache.beam.sdk.util.AppEngineEnvironment.IS_APP_ENGINE
+
+Deprecated Methods
+
+org.apache.beam.sdk.io.PubsubIO.PubsubSubscription.asV1Beta1Path()
+the v1beta1 API for Cloud Pub/Sub is deprecated.
+
+org.apache.beam.sdk.io.PubsubIO.PubsubTopic.asV1Beta1Path()
+the v1beta1 API for Cloud Pub/Sub is deprecated.
+
+org.apache.beam.sdk.io.PubsubIO.PubsubSubscription.asV1Beta2Path()
+the v1beta2 API for Cloud Pub/Sub is deprecated.
+
+org.apache.beam.sdk.io.PubsubIO.PubsubTopic.asV1Beta2Path()
+the v1beta2 API for Cloud Pub/Sub is deprecated.
+
+org.apache.beam.sdk.coders.AvroCoder.createDatumReader()
+For AvroCoder internal use only.
+
+org.apache.beam.sdk.coders.AvroCoder.createDatumWriter()
+For AvroCoder internal use only.
+
+org.apache.beam.sdk.values.PValue.expand()
+
+org.apache.beam.sdk.values.POutput.finishSpecifyingOutput(PInput, PTransform<?, ?>)
+see BEAM-1199
+
+org.apache.beam.runners.dataflow.util.DoFnInfo.forFn(Serializable, WindowingStrategy<?, ?>,
+ Iterable<PCollectionView<?>>, Coder<InputT>, long, Map<Long, TupleTag<?>>)
+
+org.apache.beam.sdk.values.PCollectionView.getCoderInternal()
+this method will be removed entirely. The PCollection underlying a side
+ input, including its Coder, is part of the side input's specification with a ParDo transform,
+ which will obtain that information via a package-private channel.
+
+org.apache.beam.runners.dataflow.util.DoFnInfo.getFn()
+
+org.apache.beam.sdk.util.IdentityWindowFn.getOutputTime(Instant,
 

[31/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/package-frame.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/package-frame.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/package-frame.html
new file mode 100644
index 000..b35065e
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/package-frame.html
@@ -0,0 +1,39 @@
+org.apache.beam.runners.dataflow
+
+Classes
+BatchStatefulParDoOverrides
+BatchStatefulParDoOverrides.BatchStatefulDoFn
+DataflowClient
+DataflowPipelineJob
+DataflowPipelineRegistrar
+DataflowPipelineRegistrar.Options
+DataflowPipelineRegistrar.Runner
+DataflowPipelineTranslator
+DataflowPipelineTranslator.JobSpecification
+DataflowRunner
+DataflowRunner.StreamingPCollectionViewWriterFn
+DataflowRunnerHooks
+DataflowRunnerInfo
+
+Exceptions
+DataflowJobAlreadyExistsException
+DataflowJobAlreadyUpdatedException
+DataflowJobException
+DataflowServiceException

http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/package-summary.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/package-summary.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/package-summary.html
new file mode 100644
index 000..9764bed
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/package-summary.html
@@ -0,0 +1,268 @@
+http://www.w3.org/TR/html4/loose.dtd;>
+
+
+
+
+org.apache.beam.runners.dataflow
+
+
+
+
+
+
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview
+Package
+Class
+Tree
+Deprecated
+Index
+Help
+
+
+
+
+PrevPackage
+NextPackage
+
+
+Frames
+NoFrames
+
+
+AllClasses
+
+
+
+
+
+
+
+
+
+
+Package org.apache.beam.runners.dataflow
+
+Provides a Beam runner that executes pipelines on the Google Cloud Dataflow service.
+
+See: Description
+
+Class Summary
+
+BatchStatefulParDoOverrides
+PTransformOverrideFactories that expands to correctly implement
+ stateful ParDo using window-unaware BatchViewOverrides.GroupByKeyAndSortValuesOnly to linearize
+ processing per key.
+
+BatchStatefulParDoOverrides.BatchStatefulDoFn<K,V,OutputT>
+A key-preserving DoFn that explodes an iterable that has been grouped by key and window.
+
+DataflowClient
+Wrapper around the generated Dataflow client to provide common functionality.
+
+DataflowPipelineJob
+A DataflowPipelineJob represents a job submitted to Dataflow using DataflowRunner.
+
+DataflowPipelineRegistrar
+Contains the PipelineOptionsRegistrar and PipelineRunnerRegistrar for the DataflowRunner.
+
+DataflowPipelineRegistrar.Options
+Register the DataflowPipelineOptions.
+
+DataflowPipelineRegistrar.Runner
+Register the DataflowRunner.
+
+DataflowPipelineTranslator
+DataflowPipelineTranslator knows how to translate Pipeline objects
+ into Cloud Dataflow Service API Jobs.
+
+DataflowPipelineTranslator.JobSpecification
+The result of a job translation.
+
+DataflowRunner
+A PipelineRunner that executes the operations in the pipeline by first translating them
+ to the Dataflow representation using the DataflowPipelineTranslator and then submitting
+ them to a Dataflow service for execution.
+
+DataflowRunner.StreamingPCollectionViewWriterFn<T>
+Deprecated
+
+DataflowRunnerHooks
+An instance of this class can be passed to the DataflowRunner to add user defined hooks to be
+ invoked at various times during pipeline execution.
+
+DataflowRunnerInfo
+Populates versioning and other information for DataflowRunner.
+
+Exception Summary
+
+DataflowJobAlreadyExistsException
+An exception that is thrown if the unique job name constraint of the Dataflow
+ service is broken because an existing job with the same job name is currently active.
+
+DataflowJobAlreadyUpdatedException
+An exception that is thrown if the existing job has already been updated within the Dataflow
+ service and is no longer able to be updated.

[19/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/package-tree.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/package-tree.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/package-tree.html
new file mode 100644
index 000..4ea6124
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/package-tree.html
@@ -0,0 +1,186 @@
+Hierarchy For Package org.apache.beam.runners.direct
+Package Hierarchies:
+All Packages
+
+Class Hierarchy
+
+java.lang.Object
+  org.apache.beam.runners.direct.AggregatorContainer
+  org.apache.beam.runners.direct.AggregatorContainer.AggregatorKey
+  org.apache.beam.runners.direct.AggregatorContainer.Mutator (implements org.apache.beam.runners.core.AggregatorFactory)
+  org.apache.beam.runners.direct.CopyOnAccessInMemoryStateInternals<K> (implements org.apache.beam.runners.core.StateInternals<K>)
+  org.apache.beam.runners.direct.DirectOptions.AvailableParallelismFactory (implements org.apache.beam.sdk.options.DefaultValueFactory<T>)
+  org.apache.beam.runners.direct.DirectRegistrar
+  org.apache.beam.runners.direct.DirectRegistrar.Options (implements org.apache.beam.sdk.options.PipelineOptionsRegistrar)
+  org.apache.beam.runners.direct.DirectRegistrar.Runner (implements org.apache.beam.sdk.runners.PipelineRunnerRegistrar)
+  org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult (implements org.apache.beam.sdk.PipelineResult)
+  org.apache.beam.runners.direct.NanosOffsetClock (implements org.apache.beam.runners.direct.Clock)
+  org.apache.beam.sdk.runners.PipelineRunner<ResultT>
+    org.apache.beam.runners.direct.DirectRunner
+  org.apache.beam.sdk.transforms.PTransform<InputT,OutputT> (implements org.apache.beam.sdk.transforms.display.HasDisplayData, java.io.Serializable)
+    org.apache.beam.runners.direct.ForwardingPTransform<InputT,OutputT>
+  org.apache.beam.runners.direct.StepTransformResult<InputT> (implements org.apache.beam.runners.direct.TransformResult<InputT>)
+  org.apache.beam.runners.direct.StepTransformResult.Builder<InputT>
+  org.apache.beam.runners.direct.WatermarkManager
+  org.apache.beam.runners.direct.WatermarkManager.FiredTimers
+  org.apache.beam.runners.direct.WatermarkManager.TimerUpdate
+  org.apache.beam.runners.direct.WatermarkManager.TimerUpdate.TimerUpdateBuilder
+  org.apache.beam.runners.direct.WatermarkManager.TransformWatermarks
+
+Interface Hierarchy
+
+org.apache.beam.runners.direct.BundleFactory
+org.apache.beam.runners.direct.Clock
+org.apache.beam.runners.direct.ExecutorServiceFactory
+org.apache.beam.sdk.transforms.display.HasDisplayData
+  org.apache.beam.sdk.options.PipelineOptions
+    org.apache.beam.sdk.options.ApplicationNameOptions
+      org.apache.beam.runners.direct.DirectOptions (also extends org.apache.beam.sdk.options.PipelineOptions)
+    org.apache.beam.runners.direct.DirectOptions (also extends org.apache.beam.sdk.options.ApplicationNameOptions)
+org.apache.beam.runners.direct.ModelEnforcement<T>
+org.apache.beam.runners.direct.ModelEnforcementFactory
+org.apache.beam.runners.direct.TransformEvaluator<InputT>
+org.apache.beam.runners.direct.TransformEvaluatorFactory
+org.apache.beam.runners.direct.TransformResult<InputT>

http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/flink/DefaultParallelismFactory.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/flink/DefaultParallelismFactory.html
 

[12/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/coders/BeamSparkRunnerRegistrator.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/coders/BeamSparkRunnerRegistrator.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/coders/BeamSparkRunnerRegistrator.html
new file mode 100644
index 000..4341ca3
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/coders/BeamSparkRunnerRegistrator.html
@@ -0,0 +1,279 @@
+org.apache.beam.runners.spark.coders
+Class BeamSparkRunnerRegistrator
+
+java.lang.Object
+  org.apache.beam.runners.spark.coders.BeamSparkRunnerRegistrator
+
+All Implemented Interfaces:
+org.apache.spark.serializer.KryoRegistrator
+
+public class BeamSparkRunnerRegistrator
+extends java.lang.Object
+implements org.apache.spark.serializer.KryoRegistrator
+Custom KryoRegistrators for Beam's Spark runner needs.
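Spark discovers a KryoRegistrator through its configuration, so a hedged sketch of wiring this class in looks like the following; the spark.kryo.registrator and spark.serializer properties are standard Spark configuration, and the application name is a placeholder.

import org.apache.beam.runners.spark.coders.BeamSparkRunnerRegistrator;
import org.apache.spark.SparkConf;

public class RegistratorSketch {
  public static void main(String[] args) {
    SparkConf conf = new SparkConf()
        .setAppName("registrator-sketch") // placeholder application name
        // Use Kryo and register Beam's classes through this registrator.
        .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
        .set("spark.kryo.registrator", BeamSparkRunnerRegistrator.class.getName());
  }
}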
+
+Constructor Summary
+
+BeamSparkRunnerRegistrator()
+
+Method Summary
+
+All Methods / Instance Methods / Concrete Methods:
+void   registerClasses(com.esotericsoftware.kryo.Kryo kryo)
+
+Methods inherited from class java.lang.Object:
+clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
+
+Constructor Detail
+
+BeamSparkRunnerRegistrator
+public BeamSparkRunnerRegistrator()
+
+Method Detail
+
+registerClasses
+public void registerClasses(com.esotericsoftware.kryo.Kryo kryo)
+
+Specified by:
+registerClasses in interface org.apache.spark.serializer.KryoRegistrator

http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/coders/CoderHelpers.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/coders/CoderHelpers.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/coders/CoderHelpers.html
new file mode 100644
index 000..631b65b
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/coders/CoderHelpers.html
@@ -0,0 +1,467 @@
+org.apache.beam.runners.direct
+Interface TransformEvaluator<InputT>
+
+Type Parameters:
+InputT - the type of elements that will be passed to processElement(org.apache.beam.sdk.util.WindowedValue<InputT>)
+
+public interface TransformEvaluator<InputT>
+An evaluator of a specific application of a transform. Will be used for at least one
+ DirectRunner.CommittedBundle.
+
+Method Summary
+
+All Methods / Instance Methods / Abstract Methods:
+TransformResult<InputT>   finishBundle()
+Finish processing the bundle of this TransformEvaluator.
+void   processElement(WindowedValue<InputT> element)
+Process an element in the input DirectRunner.CommittedBundle.
+
+Method Detail
+
+processElement
+void processElement(WindowedValue<InputT> element)
+     throws java.lang.Exception
+Process an element in the input DirectRunner.CommittedBundle.
+
+Parameters:
+element - the element to process
+Throws:
+java.lang.Exception
+
+finishBundle
+TransformResult<InputT> finishBundle()
+     throws java.lang.Exception
+Finish processing the bundle of this TransformEvaluator.
+
+ After finishBundle() is called, the TransformEvaluator will not be reused,
+ and no more elements will be processed.
+
+Returns:
+a TransformResult containing the results of this bundle evaluation.
+Throws:
+java.lang.Exception
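The two methods above define the bundle-processing contract. A hedged sketch of the calling convention, with a made-up helper name, is:

import org.apache.beam.runners.direct.TransformEvaluator;
import org.apache.beam.runners.direct.TransformResult;
import org.apache.beam.sdk.util.WindowedValue;

public class EvaluatorLoopSketch {
  // Hypothetical driver: feed every element of one bundle to the evaluator, then
  // finish it. Per the Javadoc above, the evaluator must not be reused afterwards.
  static <InputT> TransformResult<InputT> evaluateBundle(
      TransformEvaluator<InputT> evaluator,
      Iterable<WindowedValue<InputT>> bundleElements) throws Exception {
    for (WindowedValue<InputT> element : bundleElements) {
      evaluator.processElement(element);
    }
    return evaluator.finishBundle();
  }
}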

http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/TransformEvaluatorFactory.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/TransformEvaluatorFactory.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/TransformEvaluatorFactory.html
new file mode 100644
index 000..b68e2b2
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/TransformEvaluatorFactory.html
@@ -0,0 +1,266 @@
[27/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/util/MonitoringUtil.LoggingHandler.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/util/MonitoringUtil.LoggingHandler.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/util/MonitoringUtil.LoggingHandler.html
new file mode 100644
index 000..5c56fd6
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/util/MonitoringUtil.LoggingHandler.html
@@ -0,0 +1,287 @@
+http://www.w3.org/TR/html4/loose.dtd;>
+
+
+
+
+MonitoringUtil.LoggingHandler
+
+
+
+
+
+
+var methods = {"i0":10};
+var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+
+
+
+
+
+
+
+
+org.apache.beam.runners.dataflow.util
+Class 
MonitoringUtil.LoggingHandler
+
+
+
+java.lang.Object
+
+
+org.apache.beam.runners.dataflow.util.MonitoringUtil.LoggingHandler
+
+
+
+
+
+
+
+All Implemented Interfaces:
+MonitoringUtil.JobMessagesHandler
+
+
+Enclosing class:
+MonitoringUtil
+
+
+
+public static class MonitoringUtil.LoggingHandler
+extends java.lang.Object
+implements MonitoringUtil.JobMessagesHandler
+A handler that logs monitoring messages.
+
+
+
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors
+
+Constructor and Description
+
+
+LoggingHandler()
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All MethodsInstance MethodsConcrete Methods
+
+Modifier and Type
+Method and Description
+
+
+void
+process(java.util.List&lt;com.google.api.services.dataflow.model.JobMessage&gt; messages)
+Process the rows.
+
+
+
+
+
+
+
+Methods inherited from classjava.lang.Object
+clone, equals, finalize, getClass, hashCode, notify, notifyAll, 
toString, wait, wait, wait
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Detail
+
+
+
+
+
+LoggingHandler
+publicLoggingHandler()
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+process
+public void process(java.util.List&lt;com.google.api.services.dataflow.model.JobMessage&gt; messages)
+Description copied from interface: MonitoringUtil.JobMessagesHandler
+Process the rows.
+
+Specified by:
+process in interface MonitoringUtil.JobMessagesHandler
+
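A small usage sketch follows; the message text and importance value are hypothetical, and it assumes the generated Dataflow client classes are on the classpath.

    import java.util.Arrays;
    import com.google.api.services.dataflow.model.JobMessage;
    import org.apache.beam.runners.dataflow.util.MonitoringUtil;

    public class LoggingHandlerExample {
      public static void main(String[] args) {
        MonitoringUtil.JobMessagesHandler handler = new MonitoringUtil.LoggingHandler();
        // Build one fake job message and hand it to the handler, which logs it.
        JobMessage message = new JobMessage()
            .setMessageText("Worker pool started.")
            .setMessageImportance("JOB_MESSAGE_BASIC");
        handler.process(Arrays.asList(message));
      }
    }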
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+

http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/util/MonitoringUtil.TimeStampComparator.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/util/MonitoringUtil.TimeStampComparator.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/util/MonitoringUtil.TimeStampComparator.html
new file mode 100644
index 000..f3730fa
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/util/MonitoringUtil.TimeStampComparator.html
@@ -0,0 +1,292 @@
+http://www.w3.org/TR/html4/loose.dtd;>
+
+
+
+
+MonitoringUtil.TimeStampComparator
+
+
+
+
+
+
+var methods = {"i0":10};
+var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+Skip navigation links
+
+
+
+
+Overview

[43/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/index.html
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/index.html 
b/content/documentation/sdks/javadoc/0.6.0/index.html
new file mode 100644
index 000..9a22d6c
--- /dev/null
+++ b/content/documentation/sdks/javadoc/0.6.0/index.html
@@ -0,0 +1,75 @@
+http://www.w3.org/TR/html4/frameset.dtd;>
+
+
+
+
+Generated Documentation (Untitled)
+
+tmpTargetPage = "" + window.location.search;
+if (tmpTargetPage != "" && tmpTargetPage != "undefined")
+tmpTargetPage = tmpTargetPage.substring(1);
+if (tmpTargetPage.indexOf(":") != -1 || (tmpTargetPage != "" && 
!validURL(tmpTargetPage)))
+tmpTargetPage = "undefined";
+targetPage = tmpTargetPage;
+function validURL(url) {
+try {
+url = decodeURIComponent(url);
+}
+catch (error) {
+return false;
+}
+var pos = url.indexOf(".html");
+if (pos == -1 || pos != url.length - 5)
+return false;
+var allowNumber = false;
+var allowSep = false;
+var seenDot = false;
+for (var i = 0; i < url.length - 5; i++) {
+var ch = url.charAt(i);
+if ('a' <= ch && ch <= 'z' ||
+'A' <= ch && ch <= 'Z' ||
+ch == '$' ||
+ch == '_' ||
+ch.charCodeAt(0) > 127) {
+allowNumber = true;
+allowSep = true;
+} else if ('0' <= ch && ch <= '9'
+|| ch == '-') {
+if (!allowNumber)
+ return false;
+} else if (ch == '/' || ch == '.') {
+if (!allowSep)
+return false;
+allowNumber = false;
+allowSep = false;
+if (ch == '.')
+ seenDot = true;
+if (ch == '/' && seenDot)
+ return false;
+} else {
+return false;
+}
+}
+return true;
+}
+function loadFrames() {
+if (targetPage != "" && targetPage != "undefined")
+ top.classFrame.location = top.targetPage;
+}
+
+
+
+
+
+
+
+
+
+
+JavaScript is disabled on your browser.
+
+Frame Alert
+This document is designed to be viewed using the frames feature. If you see 
this message, you are using a non-frame-capable web client. Link to Non-frame version.
+
+
+

http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/apex/ApexPipelineOptions.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/apex/ApexPipelineOptions.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/apex/ApexPipelineOptions.html
new file mode 100644
index 000..f3f5e40
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/apex/ApexPipelineOptions.html
@@ -0,0 +1,402 @@
+http://www.w3.org/TR/html4/loose.dtd;>
+
+
+
+
+ApexPipelineOptions
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+org.apache.beam.runners.apex
+Interface ApexPipelineOptions
+
+
+
+
+
+
+All Superinterfaces:
+HasDisplayData, PipelineOptions, java.io.Serializable
+
+public interface ApexPipelineOptions
+extends PipelineOptions, java.io.Serializable
+Options that configure the Apex pipeline.
+
+
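A minimal sketch of obtaining these options and selecting the Apex runner; the pipeline contents are elided.

    import org.apache.beam.runners.apex.ApexPipelineOptions;
    import org.apache.beam.runners.apex.ApexRunner;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class ApexOptionsExample {
      public static void main(String[] args) {
        // Any PipelineOptions sub-interface is obtained through PipelineOptionsFactory.
        ApexPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(ApexPipelineOptions.class);
        options.setRunner(ApexRunner.class);

        Pipeline pipeline = Pipeline.create(options);
        // ... apply transforms here ...
        pipeline.run();
      }
    }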
+
+
+
+
+
+
+
+
+
+Nested Class Summary
+
+
+
+
+Nested classes/interfaces inherited from 
interfaceorg.apache.beam.sdk.options.PipelineOptions
+PipelineOptions.AtomicLongFactory, PipelineOptions.CheckEnabled, 
PipelineOptions.DirectRunner, 
PipelineOptions.JobNameFactory
+
+
+
+
+
+
+
+
+Method Summary
+
+All MethodsInstance MethodsAbstract Methods
+

[30/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/testing/package-summary.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/testing/package-summary.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/testing/package-summary.html
new file mode 100644
index 000..a7af44b
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/testing/package-summary.html
@@ -0,0 +1,171 @@
+http://www.w3.org/TR/html4/loose.dtd;>
+
+
+
+
+org.apache.beam.runners.dataflow.testing
+
+
+
+
+
+
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Package org.apache.beam.runners.dataflow.testing
+
+Provides utilities for integration testing and RunnableOnService tests of the Google Cloud Dataflow
+ runner.
+
+See: Description
+
+
+
+
+
+Interface Summary
+
+Interface
+Description
+
+
+
+TestDataflowPipelineOptions
+
+A set of options used to configure the TestPipeline.
+
+
+
+
+
+
+
+Class Summary
+
+Class
+Description
+
+
+
+TestDataflowRunner
+
+TestDataflowRunner is a pipeline runner that wraps a
+ DataflowRunner when running tests against the TestPipeline.
+
+
+
+
+
+
+
+
+
+Package org.apache.beam.runners.dataflow.testing Description
+Provides utilities for integration testing and RunnableOnService tests of the 
Google Cloud Dataflow
+ runner.
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+



[07/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/io/hadoop/TemplatedSequenceFileOutputFormat.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/io/hadoop/TemplatedSequenceFileOutputFormat.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/io/hadoop/TemplatedSequenceFileOutputFormat.html
new file mode 100644
index 000..de75fe0
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/io/hadoop/TemplatedSequenceFileOutputFormat.html
@@ -0,0 +1,360 @@
+http://www.w3.org/TR/html4/loose.dtd;>
+
+
+
+
+TemplatedSequenceFileOutputFormat
+
+
+
+
+
+
+var methods = {"i0":10,"i1":10};
+var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+
+
+
+
+
+
+
+
+org.apache.beam.runners.spark.io.hadoop
+Class TemplatedSequenceFileOutputFormat&lt;K,V&gt;
+
+
+
+java.lang.Object
+  org.apache.hadoop.mapreduce.OutputFormat&lt;K,V&gt;
+    org.apache.hadoop.mapreduce.lib.output.FileOutputFormat&lt;K,V&gt;
+      org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat&lt;K,V&gt;
+        org.apache.beam.runners.spark.io.hadoop.TemplatedSequenceFileOutputFormat&lt;K,V&gt;
+
+
+
+
+
+
+
+
+
+
+
+
+
+All Implemented Interfaces:
+ShardNameTemplateAware
+
+
+
+public class TemplatedSequenceFileOutputFormat&lt;K,V&gt;
+extends org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat&lt;K,V&gt;
+implements ShardNameTemplateAware
+Templated sequence file output format.
+
+
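The Spark runner installs this format itself when writing templated sequence files; the sketch below only illustrates, with hypothetical key/value types and a placeholder output path, how any Hadoop job would be pointed at it.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
    import org.apache.beam.runners.spark.io.hadoop.TemplatedSequenceFileOutputFormat;

    public class TemplatedOutputExample {
      public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "templated-sequence-file-output");
        job.setOutputKeyClass(LongWritable.class);   // hypothetical key type
        job.setOutputValueClass(Text.class);         // hypothetical value type
        // The templated variant makes shard file names honour Beam's shard name template.
        job.setOutputFormatClass(TemplatedSequenceFileOutputFormat.class);
        FileOutputFormat.setOutputPath(job, new Path("/tmp/templated-output")); // placeholder
      }
    }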
+
+
+
+
+
+
+
+
+
+Nested Class Summary
+
+
+
+
+Nested classes/interfaces inherited from 
classorg.apache.hadoop.mapreduce.lib.output.FileOutputFormat
+org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.Counter
+
+
+
+
+
+
+
+
+Field Summary
+
+
+
+
+Fields inherited from 
classorg.apache.hadoop.mapreduce.lib.output.FileOutputFormat
+BASE_OUTPUT_NAME, COMPRESS, COMPRESS_CODEC, COMPRESS_TYPE, OUTDIR, 
PART
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors
+
+Constructor and Description
+
+
+TemplatedSequenceFileOutputFormat()
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All MethodsInstance MethodsConcrete Methods
+
+Modifier and Type
+Method and Description
+
+
+void
+checkOutputSpecs(org.apache.hadoop.mapreduce.JobContext job)
+
+
+org.apache.hadoop.fs.Path
+getDefaultWorkFile(org.apache.hadoop.mapreduce.TaskAttemptContext context,
+  java.lang.String extension)
+
+
+
+
+
+
+Methods inherited from 
classorg.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat
+getOutputCompressionType, getRecordWriter, getSequenceWriter, 
setOutputCompressionType
+
+
+
+
+
+Methods inherited from 
classorg.apache.hadoop.mapreduce.lib.output.FileOutputFormat
+getCompressOutput, getOutputCommitter, getOutputCompressorClass, 
getOutputName, getOutputPath, getPathForWorkFile, getUniqueFile, 
getWorkOutputPath, setCompressOutput, setOutputCompressorClass, setOutputName, 
setOutputPath
+
+
+
+
+
+Methods inherited from classjava.lang.Object
+clone, equals, finalize, getClass, hashCode, notify, notifyAll, 
toString, wait, wait, wait
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Detail
+
+
+
+
+
+TemplatedSequenceFileOutputFormat
+publicTemplatedSequenceFileOutputFormat()
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+checkOutputSpecs
+public void checkOutputSpecs(org.apache.hadoop.mapreduce.JobContext job)
+
+Overrides:
+checkOutputSpecs in class org.apache.hadoop.mapreduce.lib.output.FileOutputFormat&lt;K,V&gt;
+
+
+
+getDefaultWorkFile
+public org.apache.hadoop.fs.Path getDefaultWorkFile(org.apache.hadoop.mapreduce.TaskAttemptContext context,
+                                                    java.lang.String extension)
+ throws java.io.IOException
+
+Overrides:
+getDefaultWorkFile in class org.apache.hadoop.mapreduce.lib.output.FileOutputFormat&lt;K,V&gt;
+Throws:
+java.io.IOException
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+

[47/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/allclasses-noframe.html
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/allclasses-noframe.html 
b/content/documentation/sdks/javadoc/0.6.0/allclasses-noframe.html
new file mode 100644
index 000..165a043
--- /dev/null
+++ b/content/documentation/sdks/javadoc/0.6.0/allclasses-noframe.html
@@ -0,0 +1,914 @@
+http://www.w3.org/TR/html4/loose.dtd;>
+
+
+
+
+All Classes
+
+
+
+
+
+AllClasses
+
+
+AccumulatorCombiningState
+AfterAll
+AfterEach
+AfterFirst
+AfterPane
+AfterProcessingTime
+AfterSynchronizedProcessingTime
+AfterWatermark
+AfterWatermark.AfterWatermarkEarlyAndLate
+AfterWatermark.FromEndOfWindow
+AggAccumParam
+Aggregator
+AggregatorContainer
+AggregatorContainer.AggregatorKey
+AggregatorContainer.Mutator
+AggregatorMetric
+AggregatorMetricSource
+AggregatorRetrievalException
+AggregatorRetriever
+AggregatorsAccumulator
+AggregatorsAccumulator.AccumulatorCheckpointingSparkListener
+AggregatorValues
+ApexPipelineOptions
+ApexRunner
+ApexRunner.CreateApexPCollectionView
+ApexRunnerRegistrar
+ApexRunnerRegistrar.Options
+ApexRunnerRegistrar.Runner
+ApexRunnerResult
+ApexYarnLauncher
+ApexYarnLauncher.LaunchParams
+ApexYarnLauncher.ProcessWatcher
+ApiSurface
+AppEngineEnvironment
+ApplicationNameOptions
+AppliedCombineFn
+AppliedPTransform
+ApproximateQuantiles
+ApproximateQuantiles.ApproximateQuantilesCombineFn
+ApproximateUnique
+ApproximateUnique.ApproximateUniqueCombineFn
+ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique
+AsJsons
+AtomicCoder
+AttemptAndTimeBoundedExponentialBackOff
+AttemptAndTimeBoundedExponentialBackOff.ResetPolicy
+AttemptBoundedExponentialBackOff
+AvroCoder
+AvroIO
+AvroIO.Read
+AvroIO.Read.Bound
+AvroIO.Write
+AvroIO.Write.Bound
+AvroSource
+AvroSource.AvroReader
+AvroUtils
+AvroUtils.AvroMetadata
+BagState
+BatchStatefulParDoOverrides
+BatchStatefulParDoOverrides.BatchStatefulDoFn
+BeamSparkRunnerRegistrator
+BigDecimalCoder
+BigEndianIntegerCoder
+BigEndianLongCoder
+BigIntegerCoder
+BigQueryIO
+BigQueryIO.Read
+BigQueryIO.Read.Bound
+BigQueryIO.Write
+BigQueryIO.Write.Bound
+BigQueryIO.Write.CreateDisposition
+BigQueryIO.Write.WriteDisposition
+BigqueryMatcher
+BigQueryOptions
+BigtableIO
+BigtableIO.Read
+BigtableIO.Write
+BitSetCoder
+BlockBasedSource
+BlockBasedSource.Block
+BlockBasedSource.BlockBasedReader
+BoundedReadFromUnboundedSource
+BoundedSource
+BoundedSource.BoundedReader
+BoundedWindow
+BucketingFunction
+BufferedElementCountingOutputStream
+BufferedExternalSorter
+BufferedExternalSorter.Options
+BundleFactory
+ByteArray
+ByteArrayCoder
+ByteBuddyDoFnInvokerFactory
+ByteBuddyDoFnInvokerFactory.DefaultRestrictionCoder
+ByteBuddyDoFnInvokerFactory.DefaultSplitRestriction
+ByteBuddyDoFnInvokerFactory.DoFnInvokerBase
+ByteCoder
+ByteKey
+ByteKeyRange
+ByteKeyRangeTracker
+ByteStringCoder
+CalendarWindows
+CalendarWindows.DaysWindows
+CalendarWindows.MonthsWindows
+CalendarWindows.YearsWindows
+CannotProvideCoderException
+CannotProvideCoderException.ReasonCode
+Clock
+CloudDebuggerOptions
+CloudObject
+CloudResourceManagerOptions
+Coder
+Coder.Context
+Coder.NonDeterministicException
+CoderException
+CoderFactories
+CoderFactory
+CoderHelpers
+CoderProperties
+CoderProperties.TestElementByteSizeObserver
+CoderProvider
+CoderProviders
+CoderRegistry
+CoderUtils
+CoGbkResult
+CoGbkResult.CoGbkResultCoder
+CoGbkResultSchema
+CoGroupByKey
+CollectionCoder
+Combine
+Combine.AccumulatingCombineFn
+Combine.AccumulatingCombineFn.Accumulator
+Combine.BinaryCombineDoubleFn
+Combine.BinaryCombineFn
+Combine.BinaryCombineIntegerFn
+Combine.BinaryCombineLongFn
+Combine.CombineFn
+Combine.Globally
+Combine.GloballyAsSingletonView
+Combine.GroupedValues
+Combine.Holder
+Combine.IterableCombineFn
+Combine.KeyedCombineFn
+Combine.PerKey
+Combine.PerKeyWithHotKeyFanout
+Combine.SimpleCombineFn
+CombineContextFactory
+CombineFnBase
+CombineFnBase.GlobalCombineFn
+CombineFnBase.PerKeyCombineFn
+CombineFns
+CombineFns.CoCombineResult
+CombineFns.ComposeCombineFnBuilder
+CombineFns.ComposedCombineFn
+CombineFns.ComposedCombineFnWithContext
+CombineFns.ComposedKeyedCombineFn
+CombineFns.ComposedKeyedCombineFnWithContext
+CombineFns.ComposeKeyedCombineFnBuilder
+CombineFnUtil
+CombineWithContext
+CombineWithContext.CombineFnWithContext
+CombineWithContext.Context
+CombineWithContext.KeyedCombineFnWithContext
+CombineWithContext.RequiresContextInternal
+CombiningState
+CompositeSource
+CompressedSource
+CompressedSource.CompressedReader
+CompressedSource.CompressionMode
+CompressedSource.DecompressingChannelFactory
+ConsoleIO
+ConsoleIO.Write
+ConsoleIO.Write.Unbound
+CopyOnAccessInMemoryStateInternals
+Count
+Counter
+CounterCell
+CountingInput
+CountingInput.BoundedCountingInput
+CountingInput.UnboundedCountingInput
+CountingSource
+CountingSource.CounterMark
+CrashingRunner
+Create

[48/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/allclasses-frame.html
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/allclasses-frame.html 
b/content/documentation/sdks/javadoc/0.6.0/allclasses-frame.html
new file mode 100644
index 000..bed15b7
--- /dev/null
+++ b/content/documentation/sdks/javadoc/0.6.0/allclasses-frame.html
@@ -0,0 +1,914 @@
+http://www.w3.org/TR/html4/loose.dtd;>
+
+
+
+
+All Classes
+
+
+
+
+
+AllClasses
+
+
+AccumulatorCombiningState
+AfterAll
+AfterEach
+AfterFirst
+AfterPane
+AfterProcessingTime
+AfterSynchronizedProcessingTime
+AfterWatermark
+AfterWatermark.AfterWatermarkEarlyAndLate
+AfterWatermark.FromEndOfWindow
+AggAccumParam
+Aggregator
+AggregatorContainer
+AggregatorContainer.AggregatorKey
+AggregatorContainer.Mutator
+AggregatorMetric
+AggregatorMetricSource
+AggregatorRetrievalException
+AggregatorRetriever
+AggregatorsAccumulator
+AggregatorsAccumulator.AccumulatorCheckpointingSparkListener
+AggregatorValues
+ApexPipelineOptions
+ApexRunner
+ApexRunner.CreateApexPCollectionView
+ApexRunnerRegistrar
+ApexRunnerRegistrar.Options
+ApexRunnerRegistrar.Runner
+ApexRunnerResult
+ApexYarnLauncher
+ApexYarnLauncher.LaunchParams
+ApexYarnLauncher.ProcessWatcher
+ApiSurface
+AppEngineEnvironment
+ApplicationNameOptions
+AppliedCombineFn
+AppliedPTransform
+ApproximateQuantiles
+ApproximateQuantiles.ApproximateQuantilesCombineFn
+ApproximateUnique
+ApproximateUnique.ApproximateUniqueCombineFn
+ApproximateUnique.ApproximateUniqueCombineFn.LargestUnique
+AsJsons
+AtomicCoder
+AttemptAndTimeBoundedExponentialBackOff
+AttemptAndTimeBoundedExponentialBackOff.ResetPolicy
+AttemptBoundedExponentialBackOff
+AvroCoder
+AvroIO
+AvroIO.Read
+AvroIO.Read.Bound
+AvroIO.Write
+AvroIO.Write.Bound
+AvroSource
+AvroSource.AvroReader
+AvroUtils
+AvroUtils.AvroMetadata
+BagState
+BatchStatefulParDoOverrides
+BatchStatefulParDoOverrides.BatchStatefulDoFn
+BeamSparkRunnerRegistrator
+BigDecimalCoder
+BigEndianIntegerCoder
+BigEndianLongCoder
+BigIntegerCoder
+BigQueryIO
+BigQueryIO.Read
+BigQueryIO.Read.Bound
+BigQueryIO.Write
+BigQueryIO.Write.Bound
+BigQueryIO.Write.CreateDisposition
+BigQueryIO.Write.WriteDisposition
+BigqueryMatcher
+BigQueryOptions
+BigtableIO
+BigtableIO.Read
+BigtableIO.Write
+BitSetCoder
+BlockBasedSource
+BlockBasedSource.Block
+BlockBasedSource.BlockBasedReader
+BoundedReadFromUnboundedSource
+BoundedSource
+BoundedSource.BoundedReader
+BoundedWindow
+BucketingFunction
+BufferedElementCountingOutputStream
+BufferedExternalSorter
+BufferedExternalSorter.Options
+BundleFactory
+ByteArray
+ByteArrayCoder
+ByteBuddyDoFnInvokerFactory
+ByteBuddyDoFnInvokerFactory.DefaultRestrictionCoder
+ByteBuddyDoFnInvokerFactory.DefaultSplitRestriction
+ByteBuddyDoFnInvokerFactory.DoFnInvokerBase
+ByteCoder
+ByteKey
+ByteKeyRange
+ByteKeyRangeTracker
+ByteStringCoder
+CalendarWindows
+CalendarWindows.DaysWindows
+CalendarWindows.MonthsWindows
+CalendarWindows.YearsWindows
+CannotProvideCoderException
+CannotProvideCoderException.ReasonCode
+Clock
+CloudDebuggerOptions
+CloudObject
+CloudResourceManagerOptions
+Coder
+Coder.Context
+Coder.NonDeterministicException
+CoderException
+CoderFactories
+CoderFactory
+CoderHelpers
+CoderProperties
+CoderProperties.TestElementByteSizeObserver
+CoderProvider
+CoderProviders
+CoderRegistry
+CoderUtils
+CoGbkResult
+CoGbkResult.CoGbkResultCoder
+CoGbkResultSchema
+CoGroupByKey
+CollectionCoder
+Combine
+Combine.AccumulatingCombineFn
+Combine.AccumulatingCombineFn.Accumulator
+Combine.BinaryCombineDoubleFn
+Combine.BinaryCombineFn
+Combine.BinaryCombineIntegerFn
+Combine.BinaryCombineLongFn
+Combine.CombineFn
+Combine.Globally
+Combine.GloballyAsSingletonView
+Combine.GroupedValues
+Combine.Holder
+Combine.IterableCombineFn
+Combine.KeyedCombineFn
+Combine.PerKey
+Combine.PerKeyWithHotKeyFanout
+Combine.SimpleCombineFn
+CombineContextFactory
+CombineFnBase
+CombineFnBase.GlobalCombineFn
+CombineFnBase.PerKeyCombineFn
+CombineFns
+CombineFns.CoCombineResult
+CombineFns.ComposeCombineFnBuilder
+CombineFns.ComposedCombineFn
+CombineFns.ComposedCombineFnWithContext
+CombineFns.ComposedKeyedCombineFn
+CombineFns.ComposedKeyedCombineFnWithContext
+CombineFns.ComposeKeyedCombineFnBuilder
+CombineFnUtil
+CombineWithContext
+CombineWithContext.CombineFnWithContext
+CombineWithContext.Context
+CombineWithContext.KeyedCombineFnWithContext
+CombineWithContext.RequiresContextInternal
+CombiningState
+CompositeSource
+CompressedSource
+CompressedSource.CompressedReader
+CompressedSource.CompressionMode
+CompressedSource.DecompressingChannelFactory
+ConsoleIO
+ConsoleIO.Write
+ConsoleIO.Write.Unbound
+CopyOnAccessInMemoryStateInternals
+Count
+Counter
+CounterCell
+CountingInput
+CountingInput.BoundedCountingInput
+CountingInput.UnboundedCountingInput
+CountingSource
+CountingSource.CounterMark
+CrashingRunner
+Create

[49/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/runners/spark/index.html
--
diff --git a/content/documentation/runners/spark/index.html 
b/content/documentation/runners/spark/index.html
index 9314203..29fce06 100644
--- a/content/documentation/runners/spark/index.html
+++ b/content/documentation/runners/spark/index.html
@@ -80,15 +80,15 @@
   
  SDKs
  Java 
SDK
- Java SDK API Reference Java SDK API Reference 
 
 Python SDK
-
+
  
  Runners
  Capability Matrix
@@ -177,7 +177,7 @@ The Spark Runner can execute Spark pipelines just like a 
native Spark applicatio
 <dependency>
   <groupId>org.apache.beam</groupId>
   <artifactId>beam-runners-spark</artifactId>
-  <version>0.5.0</version>
+  <version>0.6.0</version>
 </dependency>
 
 

http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/java/index.html
--
diff --git a/content/documentation/sdks/java/index.html 
b/content/documentation/sdks/java/index.html
index fc3123e..4901fb3 100644
--- a/content/documentation/sdks/java/index.html
+++ b/content/documentation/sdks/java/index.html
@@ -80,15 +80,15 @@
   
  SDKs
  Java 
SDK
- Java SDK API Reference Java SDK API Reference 
 
 Python SDK
-
+
  
  Runners
  Capability Matrix

http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/META-INF/MANIFEST.MF
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/META-INF/MANIFEST.MF 
b/content/documentation/sdks/javadoc/0.6.0/META-INF/MANIFEST.MF
new file mode 100644
index 000..7a2861f
--- /dev/null
+++ b/content/documentation/sdks/javadoc/0.6.0/META-INF/MANIFEST.MF
@@ -0,0 +1,5 @@
+Manifest-Version: 1.0
+Ant-Version: Apache Ant 1.9.4
+Created-By: 1.8.0_112-google-v7-146844476-143772575 (Oracle Corporatio
+ n)
+



[10/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/io/MicrobatchSource.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/io/MicrobatchSource.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/io/MicrobatchSource.html
new file mode 100644
index 000..4346268
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/io/MicrobatchSource.html
@@ -0,0 +1,487 @@
+http://www.w3.org/TR/html4/loose.dtd;>
+
+
+
+
+MicrobatchSource
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+org.apache.beam.runners.spark.io
+Class MicrobatchSource&lt;T,CheckpointMarkT extends UnboundedSource.CheckpointMark&gt;
+
+
+
+java.lang.Object
+  org.apache.beam.sdk.io.Source&lt;T&gt;
+    org.apache.beam.sdk.io.BoundedSource&lt;T&gt;
+      org.apache.beam.runners.spark.io.MicrobatchSource&lt;T,CheckpointMarkT&gt;
+
+
+
+
+
+
+
+
+
+
+
+All Implemented Interfaces:
+java.io.Serializable, HasDisplayData
+
+
+
+public class MicrobatchSource&lt;T,CheckpointMarkT extends UnboundedSource.CheckpointMark&gt;
+extends BoundedSource&lt;T&gt;
+Mostly based on BoundedReadFromUnboundedSource,
+ with some adjustments for this specific use-case.
+
+ A BoundedSource wrapping an UnboundedSource to complement Spark's micro-batch
+ nature.
+
+ By design, Spark's micro-batches are bounded by their duration. Spark also provides a
+ back-pressure mechanism that may signal a bound by max records.
+
+See Also:
+Serialized Form
+
+
+
+
+
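For context, this is the generic way any BoundedSource (including a MicrobatchSource built by the Spark runner) is consumed through the BoundedReader contract; a sketch, not runner code.

    import org.apache.beam.sdk.io.BoundedSource;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class BoundedReadLoop {
      // Reads every element of a bounded source and prints it.
      public static <T> void readAll(BoundedSource<T> source) throws Exception {
        PipelineOptions options = PipelineOptionsFactory.create();
        try (BoundedSource.BoundedReader<T> reader = source.createReader(options)) {
          for (boolean more = reader.start(); more; more = reader.advance()) {
            T element = reader.getCurrent();
            System.out.println(element);
          }
        }
      }
    }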
+
+
+
+
+
+
+
+Nested Class Summary
+
+Nested Classes
+
+Modifier and Type
+Class and Description
+
+
+class
+MicrobatchSource.Reader
+A BoundedSource.BoundedReader
+ wrapping an UnboundedSource.UnboundedReader.
+
+
+
+
+
+
+
+Nested classes/interfaces inherited from 
classorg.apache.beam.sdk.io.BoundedSource
+BoundedSource.BoundedReaderT
+
+
+
+
+
+
+
+
+Method Summary
+
+All MethodsInstance MethodsConcrete Methods
+
+Modifier and Type
+Method and Description
+
+
+BoundedSource.BoundedReader&lt;T&gt;
+createReader(PipelineOptions options)
+Returns a new BoundedSource.BoundedReader that reads from this source.
+
+BoundedSource.BoundedReader&lt;T&gt;
+createReader(PipelineOptions options, CheckpointMarkT checkpointMark)
+
+boolean
+equals(java.lang.Object o)
+
+Coder&lt;CheckpointMarkT&gt;
+getCheckpointMarkCoder()
+
+Coder&lt;T&gt;
+getDefaultOutputCoder()
+Returns the default Coder to use for the data read from this source.
+
+long
+getEstimatedSizeBytes(PipelineOptions options)
+An estimate of the total size (in bytes) of the data that would be read from this source.
+
+java.lang.String
+getId()
+
+int
+hashCode()
+
+java.util.List&lt;? extends BoundedSource&lt;T&gt;&gt;
+splitIntoBundles(long desiredBundleSizeBytes, PipelineOptions options)
+Splits the source into bundles of approximately desiredBundleSizeBytes.
+
+void
+validate()
+Checks that this source is valid, before it can be used in a pipeline.
+
+
+
+
+
+
+
+Methods inherited from classorg.apache.beam.sdk.io.Source
+populateDisplayData
+
+
+
+
+
+Methods inherited from classjava.lang.Object
+clone, finalize, getClass, notify, notifyAll, toString, wait, wait, 
wait
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+splitIntoBundles
+public java.util.List&lt;? extends BoundedSource&lt;T&gt;&gt; splitIntoBundles(long desiredBundleSizeBytes,
+   PipelineOptions options)
+throws java.lang.Exception
+Description copied from class: BoundedSource
+Splits the source into bundles of approximately desiredBundleSizeBytes.
+
+Specified by:
+splitIntoBundles in class BoundedSource&lt;T&gt;
+Throws:
+java.lang.Exception
+
+
+getEstimatedSizeBytes
+public long getEstimatedSizeBytes(PipelineOptions options)
+   throws java.lang.Exception

[28/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/util/DataflowTemplateJob.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/util/DataflowTemplateJob.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/util/DataflowTemplateJob.html
new file mode 100644
index 000..2c638f9
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/util/DataflowTemplateJob.html
@@ -0,0 +1,414 @@
+http://www.w3.org/TR/html4/loose.dtd;>
+
+
+
+
+DataflowTemplateJob
+
+
+
+
+
+
+var methods = {"i0":10,"i1":10,"i2":10,"i3":10,"i4":10};
+var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+
+
+
+
+
+
+
+
+org.apache.beam.runners.dataflow.util
+Class 
DataflowTemplateJob
+
+
+
+java.lang.Object
+
+
+org.apache.beam.runners.dataflow.DataflowPipelineJob
+
+
+org.apache.beam.runners.dataflow.util.DataflowTemplateJob
+
+
+
+
+
+
+
+
+
+All Implemented Interfaces:
+PipelineResult
+
+
+
+public class DataflowTemplateJob
+extends DataflowPipelineJob
+A DataflowPipelineJob that is returned when --templateRunner is set.
+
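A sketch of the call site; when the runner is only materializing a template, the PipelineResult handed back is this class, and some result operations may not be supported on it.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;

    public class TemplateRunExample {
      // Runs the pipeline; with --templateRunner the returned result is a DataflowTemplateJob.
      static PipelineResult run(Pipeline pipeline) {
        PipelineResult result = pipeline.run();
        System.out.println("Result class: " + result.getClass().getName());
        return result;
      }
    }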
+
+
+
+
+
+
+
+
+
+
+Nested Class Summary
+
+
+
+
+Nested classes/interfaces inherited from 
interfaceorg.apache.beam.sdk.PipelineResult
+PipelineResult.State
+
+
+
+
+
+
+
+
+Field Summary
+
+
+
+
+Fields inherited from classorg.apache.beam.runners.dataflow.DataflowPipelineJob
+STATUS_BACKOFF_FACTORY
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors
+
+Constructor and Description
+
+
+DataflowTemplateJob()
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All MethodsInstance MethodsConcrete Methods
+
+Modifier and Type
+Method and Description
+
+
+PipelineResult.State
+cancel()
+Cancels the pipeline execution.
+
+
+
+java.lang.String
+getJobId()
+Get the id of this job.
+
+
+
+java.lang.String
+getProjectId()
+Get the project this job exists in.
+
+
+
+DataflowPipelineJob
+getReplacedByJob()
+Returns a new DataflowPipelineJob for the 
job that replaced this one, if applicable.
+
+
+
+PipelineResult.State
+getState()
+Retrieves the current state of the pipeline execution.
+
+
+
+
+
+
+
+Methods inherited from classorg.apache.beam.runners.dataflow.DataflowPipelineJob
+getAggregatorValues,
 metrics,
 waitUntilFinish,
 waitUntilFinish,
 waitUntilFinish
+
+
+
+
+
+Methods inherited from classjava.lang.Object
+clone, equals, finalize, getClass, hashCode, notify, notifyAll, 
toString, wait, wait, wait
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Detail
+
+
+
+
+
+DataflowTemplateJob
+publicDataflowTemplateJob()
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+getJobId
+public java.lang.String getJobId()
+Description copied from class: DataflowPipelineJob
+Get the id of this job.
+
+Overrides:
+getJobId in class DataflowPipelineJob
+
+
+getProjectId
+public java.lang.String getProjectId()
+Description copied from class: DataflowPipelineJob
+Get the project this job exists in.
+
+Overrides:
+getProjectId in class DataflowPipelineJob
+
+
+getReplacedByJob
+public DataflowPipelineJob getReplacedByJob()
+Description copied from class: DataflowPipelineJob
+Returns a new DataflowPipelineJob for the job that replaced this one, if applicable.
+
+Overrides:
+getReplacedByJob in class DataflowPipelineJob
+
+
+cancel
+public PipelineResult.State cancel()
+Description copied from interface: PipelineResult
+Cancels the pipeline execution.
+
+Specified by:
+cancel in interface PipelineResult
+Overrides:
+cancel in class DataflowPipelineJob
+
+
+getState
+public PipelineResult.State getState()
+Description copied from interface: PipelineResult
+Retrieves the current state of the pipeline execution.
+
+Specified by:
+getState in interface PipelineResult
+Overrides:
+getState in class DataflowPipelineJob
+Returns:
+the PipelineResult.State representing the state of this pipeline.
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+

[41/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/apex/package-frame.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/apex/package-frame.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/apex/package-frame.html
new file mode 100644
index 000..b9d24a3
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/apex/package-frame.html
@@ -0,0 +1,33 @@
+http://www.w3.org/TR/html4/loose.dtd;>
+
+
+
+
+org.apache.beam.runners.apex
+
+
+
+
+
+org.apache.beam.runners.apex
+
+Interfaces
+
+ApexPipelineOptions
+
+Classes
+
+ApexRunner
+ApexRunner.CreateApexPCollectionView
+ApexRunnerRegistrar
+ApexRunnerRegistrar.Options
+ApexRunnerRegistrar.Runner
+ApexRunnerResult
+ApexYarnLauncher
+ApexYarnLauncher.LaunchParams
+ApexYarnLauncher.ProcessWatcher
+TestApexRunner
+
+
+
+

http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/apex/package-summary.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/apex/package-summary.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/apex/package-summary.html
new file mode 100644
index 000..837d83b
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/apex/package-summary.html
@@ -0,0 +1,225 @@
+http://www.w3.org/TR/html4/loose.dtd;>
+
+
+
+
+org.apache.beam.runners.apex
+
+
+
+
+
+
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Package org.apache.beam.runners.apex
+
+Implementation of the Beam runner for Apache Apex.
+
+See: Description
+
+
+
+
+
+Interface Summary
+
+Interface
+Description
+
+
+
+ApexPipelineOptions
+
+Options that configure the Apex pipeline.
+
+
+
+
+
+
+
+Class Summary
+
+Class
+Description
+
+
+
+ApexRunner
+
+A PipelineRunner 
that translates the
+ pipeline to an Apex DAG and executes it on an Apex cluster.
+
+
+
+ApexRunner.CreateApexPCollectionView&lt;ElemT,ViewT&gt;
+
+Creates a primitive PCollectionView.
+
+
+
+ApexRunnerRegistrar
+
+Contains the PipelineRunnerRegistrar and PipelineOptionsRegistrar for the
+ ApexRunner.
+
+
+
+ApexRunnerRegistrar.Options
+
+Registers the ApexPipelineOptions.
+
+
+
+ApexRunnerRegistrar.Runner
+
+Registers the ApexRunner.
+
+
+
+ApexRunnerResult
+
+Result of executing a Pipeline with Apex in embedded mode.
+
+
+
+ApexYarnLauncher
+
+Proxy to launch the YARN application through the hadoop 
script to run in the
+ pre-configured environment (class path, configuration, native libraries 
etc.).
+
+
+
+ApexYarnLauncher.LaunchParams
+
+Launch parameters that will be serialized and passed to the 
child process.
+
+
+
+ApexYarnLauncher.ProcessWatcher
+
+Starts a command and waits for it to complete.
+
+
+
+TestApexRunner
+
+Apex PipelineRunner 
for testing.
+
+
+
+
+
+
+
+
+
+Package 
org.apache.beam.runners.apex Description
+Implementation of the Beam runner for Apache Apex.
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+

http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/apex/package-tree.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/apex/package-tree.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/apex/package-tree.html
new file mode 100644
index 000..bb05c4f
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/apex/package-tree.html
@@ -0,0 +1,169 @@
+http://www.w3.org/TR/html4/loose.dtd;>
+
+
+
+
+org.apache.beam.runners.apex Class Hierarchy
+
+
+
+
+
+
+var methods = {"i0":10,"i1":10,"i2":10,"i3":10};
+var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+
+
+
+
+
+
+
+
+org.apache.beam.runners.spark.util
+Class 
GlobalWatermarkHolder.SparkWatermarks
+
+
+
+java.lang.Object
+
+
+org.apache.beam.runners.spark.util.GlobalWatermarkHolder.SparkWatermarks
+
+
+
+
+
+
+
+All Implemented Interfaces:
+java.io.Serializable
+
+
+Enclosing class:
+GlobalWatermarkHolder
+
+
+
+public static class GlobalWatermarkHolder.SparkWatermarks
+extends java.lang.Object
+implements java.io.Serializable
+A GlobalWatermarkHolder.SparkWatermarks holds the watermarks and batch time
+ relevant to a micro-batch input from a specific source.
+
+See Also:
+Serialized Form
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors
+
+Constructor and Description
+
+
+SparkWatermarks(org.joda.time.Instant lowWatermark,
+   org.joda.time.Instant highWatermark,
+   org.joda.time.Instant synchronizedProcessingTime)
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All MethodsInstance MethodsConcrete Methods
+
+Modifier and Type
+Method and Description
+
+
+org.joda.time.Instant
+getHighWatermark()
+
+org.joda.time.Instant
+getLowWatermark()
+
+org.joda.time.Instant
+getSynchronizedProcessingTime()
+
+
+java.lang.String
+toString()
+
+
+
+
+
+
+Methods inherited from classjava.lang.Object
+clone, equals, finalize, getClass, hashCode, notify, notifyAll, wait, 
wait, wait
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Detail
+
+
+
+
+
+SparkWatermarks
+public SparkWatermarks(org.joda.time.Instant lowWatermark,
+   org.joda.time.Instant highWatermark,
+   org.joda.time.Instant synchronizedProcessingTime)
+
+
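A construction sketch with hypothetical instants; the Spark runner normally creates these itself once per micro-batch.

    import org.joda.time.Instant;
    import org.apache.beam.runners.spark.util.GlobalWatermarkHolder;

    public class SparkWatermarksExample {
      public static void main(String[] args) {
        Instant batchTime = Instant.now();
        GlobalWatermarkHolder.SparkWatermarks watermarks =
            new GlobalWatermarkHolder.SparkWatermarks(
                batchTime,              // lowWatermark
                batchTime.plus(1000L),  // highWatermark, one second later
                batchTime);             // synchronizedProcessingTime
        System.out.println(watermarks.getLowWatermark() + " .. " + watermarks.getHighWatermark());
      }
    }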
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+getLowWatermark
+public org.joda.time.Instant getLowWatermark()
+
+
+
+
+
+
+
+getHighWatermark
+public org.joda.time.Instant getHighWatermark()
+
+
+
+
+
+
+
+getSynchronizedProcessingTime
+public org.joda.time.Instant getSynchronizedProcessingTime()

[32/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/package-tree.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/package-tree.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/package-tree.html
new file mode 100644
index 000..89ff088
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/package-tree.html
@@ -0,0 +1,415 @@
+http://www.w3.org/TR/html4/loose.dtd;>
+
+
+
+
+org.apache.beam.runners.dataflow.options Class Hierarchy
+
+
+
+
+
+
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Hierarchy For Package 
org.apache.beam.runners.dataflow.options
+Package Hierarchies:
+
+All Packages
+
+
+
+Class Hierarchy
+
+java.lang.Object
+  java.util.AbstractMap&lt;K,V&gt; (implements java.util.Map&lt;K,V&gt;)
+    java.util.HashMap&lt;K,V&gt; (implements java.lang.Cloneable, java.util.Map&lt;K,V&gt;, java.io.Serializable)
+      org.apache.beam.runners.dataflow.options.DataflowProfilingOptions.DataflowProfilingAgentConfiguration
+      org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions.WorkerLogLevelOverrides
+  org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.DataflowClientFactory (implements org.apache.beam.sdk.options.DefaultValueFactory&lt;T&gt;)
+  org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.StagerFactory (implements org.apache.beam.sdk.options.DefaultValueFactory&lt;T&gt;)
+  org.apache.beam.runners.dataflow.options.DataflowPipelineOptions.StagingLocationFactory (implements org.apache.beam.sdk.options.DefaultValueFactory&lt;T&gt;)
+  org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.WorkerHarnessContainerImageFactory (implements org.apache.beam.sdk.options.DefaultValueFactory&lt;T&gt;)
+
+
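As orientation before the interface hierarchy below: DataflowPipelineOptions combines the option interfaces listed there, so a single handle exposes GCP, worker-pool, logging and debug settings. A minimal sketch, where the project id and bucket are placeholders:

    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DataflowOptionsExample {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        options.setRunner(DataflowRunner.class);
        options.setProject("my-gcp-project");           // placeholder project id
        options.setTempLocation("gs://my-bucket/tmp");  // placeholder staging path
      }
    }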
+
+Interface Hierarchy
+
+org.apache.beam.runners.dataflow.options.CloudDebuggerOptions
+
+org.apache.beam.runners.dataflow.options.DataflowPipelineOptions (also extends 
org.apache.beam.sdk.options.ApplicationNameOptions, 
org.apache.beam.sdk.options.BigQueryOptions, 
org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions, 
org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions,
 org.apache.beam.runners.dataflow.options.DataflowProfilingOptions, 
org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions, 
org.apache.beam.sdk.options.GcpOptions, 
org.apache.beam.sdk.options.GcsOptions, 
org.apache.beam.sdk.options.PipelineOptions, 
org.apache.beam.sdk.options.PubsubOptions, 
org.apache.beam.sdk.options.StreamingOptions)
+
+org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
+
+
+
+
+org.apache.beam.runners.dataflow.options.DataflowProfilingOptions
+
+org.apache.beam.runners.dataflow.options.DataflowPipelineOptions (also extends 
org.apache.beam.sdk.options.ApplicationNameOptions, 
org.apache.beam.sdk.options.BigQueryOptions, 
org.apache.beam.runners.dataflow.options.CloudDebuggerOptions, 
org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions, 
org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions,
 org.apache.beam.runners.dataflow.options.DataflowWorkerLoggingOptions, 
org.apache.beam.sdk.options.GcpOptions, 
org.apache.beam.sdk.options.GcsOptions, 
org.apache.beam.sdk.options.PipelineOptions, 
org.apache.beam.sdk.options.PubsubOptions, 
org.apache.beam.sdk.options.StreamingOptions)
+
+org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions
+
+
+
+
+org.apache.beam.sdk.transforms.display.HasDisplayData
+
+org.apache.beam.sdk.options.PipelineOptions
+
+org.apache.beam.sdk.options.ApplicationNameOptions
+
+org.apache.beam.sdk.options.BigQueryOptions (also extends 
org.apache.beam.sdk.options.GcpOptions, 
org.apache.beam.sdk.options.PipelineOptions, 
org.apache.beam.sdk.options.StreamingOptions)
+
+org.apache.beam.runners.dataflow.options.DataflowPipelineOptions (also extends 
org.apache.beam.sdk.options.ApplicationNameOptions, 
org.apache.beam.runners.dataflow.options.CloudDebuggerOptions, 
org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions, 
org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions,
 

[04/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/package-summary.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/package-summary.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/package-summary.html
new file mode 100644
index 000..828fb52
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/package-summary.html
@@ -0,0 +1,232 @@
+http://www.w3.org/TR/html4/loose.dtd;>
+
+
+
+
+org.apache.beam.runners.spark
+
+
+
+
+
+
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Package org.apache.beam.runners.spark
+
+Internal implementation of the Beam runner for Apache Spark.
+
+See: Description
+
+
+
+
+
+Interface Summary
+
+Interface
+Description
+
+
+
+SparkContextOptions
+
+A custom PipelineOptions to work with 
properties related to JavaSparkContext.
+
+
+
+SparkPipelineOptions
+
+Spark runner PipelineOptions handles Spark 
execution-related configurations,
+ such as the master address, batch-interval, and other user-related 
knobs.
+
+
+
+TestSparkPipelineOptions
+
+A SparkPipelineOptions for 
tests.
+
+
+
+
+
+
+
+Class Summary
+
+Class
+Description
+
+
+
+SparkContextOptions.EmptyListenersList
+
+Returns an empty list, top avoid handling null.
+
+
+
+SparkPipelineOptions.TmpCheckpointDirFactory
+
+Returns the default checkpoint directory of 
/tmp/${job.name}.
+
+
+
+SparkPipelineResult
+
+Represents a Spark pipeline execution result.
+
+
+
+SparkRunner
+
+The SparkRunner translate operations defined on a pipeline 
to a representation
+ executable by Spark, and then submitting the job to Spark to be 
executed.
+
+
+
+SparkRunner.Evaluator
+
+Evaluator on the pipeline.
+
+
+
+SparkRunnerRegistrar
+
+Contains the PipelineRunnerRegistrar and PipelineOptionsRegistrar for the
+ SparkRunner.
+
+
+
+SparkRunnerRegistrar.Options
+
+Registers the SparkPipelineOptions.
+
+
+
+SparkRunnerRegistrar.Runner
+
+Registers the SparkRunner.
+
+
+
+TestSparkRunner
+
+The SparkRunner translates operations defined on a pipeline to a representation executable
+ by Spark, and then submits the job to Spark to be executed.
+
+
+
+
+
+
+
+
+
+Package 
org.apache.beam.runners.spark Description
+Internal implementation of the Beam runner for Apache 
Spark.
+
+
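A minimal sketch of wiring a pipeline to the Spark runner through these options; the local master string is just an example value.

    import org.apache.beam.runners.spark.SparkPipelineOptions;
    import org.apache.beam.runners.spark.SparkRunner;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class SparkRunnerExample {
      public static void main(String[] args) {
        SparkPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(SparkPipelineOptions.class);
        options.setRunner(SparkRunner.class);
        options.setSparkMaster("local[4]"); // run against a local embedded Spark master
        Pipeline pipeline = Pipeline.create(options);
        // ... apply transforms here ...
        pipeline.run().waitUntilFinish();
      }
    }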
+
+
+
+
+
+
+
+
+
+
+
+
+
+

http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/package-tree.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/package-tree.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/package-tree.html
new file mode 100644
index 000..59527d8
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/package-tree.html
@@ -0,0 +1,203 @@
+http://www.w3.org/TR/html4/loose.dtd;>
+
+
+
+
+org.apache.beam.runners.spark Class Hierarchy
+
+
+
+
+
+
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Hierarchy For Package org.apache.beam.runners.spark
+Package Hierarchies:
+
+All Packages
+
+
+
+Class Hierarchy
+
+java.lang.Object
+
+org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults (implements 

[23/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/DirectRegistrar.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/DirectRegistrar.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/DirectRegistrar.html
new file mode 100644
index 000..7488ddc
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/DirectRegistrar.html
@@ -0,0 +1,224 @@
+http://www.w3.org/TR/html4/loose.dtd;>
+
+
+
+
+DirectRegistrar
+
+
+
+
+
+
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+
+
+
+
+
+
+
+
+org.apache.beam.runners.direct
+Class DirectRegistrar
+
+
+
+java.lang.Object
+
+
+org.apache.beam.runners.direct.DirectRegistrar
+
+
+
+
+
+
+
+
+public class DirectRegistrar
+extends java.lang.Object
+Contains the PipelineRunnerRegistrar and PipelineOptionsRegistrar for the
+ DirectRunner.
+
+
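Because the registrar is discovered via ServiceLoader, the runner can be picked purely from command-line flags; a small sketch (run with, for example, --runner=DirectRunner):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class RunnerFromArgsExample {
      public static void main(String[] args) {
        // --runner=DirectRunner resolves through the registered DirectRegistrar.Runner.
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
        Pipeline pipeline = Pipeline.create(options);
        // ... apply transforms here ...
        pipeline.run();
      }
    }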
+
+
+
+
+
+
+
+
+
+Nested Class Summary
+
+Nested Classes
+
+Modifier and Type
+Class and Description
+
+
+static class
+DirectRegistrar.Options
+Registers the DirectOptions.
+
+
+
+static class
+DirectRegistrar.Runner
+Registers the DirectRunner.
+
+
+
+
+
+
+
+
+
+
+Method Summary
+
+
+
+
+Methods inherited from classjava.lang.Object
+clone, equals, finalize, getClass, hashCode, notify, notifyAll, 
toString, wait, wait, wait
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+

http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/DirectRunner.DirectPipelineResult.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/DirectRunner.DirectPipelineResult.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/DirectRunner.DirectPipelineResult.html
new file mode 100644
index 000..0f11e31
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/DirectRunner.DirectPipelineResult.html
@@ -0,0 +1,408 @@
+http://www.w3.org/TR/html4/loose.dtd;>
+
+
+
+
+DirectRunner.DirectPipelineResult
+
+
+
+
+
+
+var methods = {"i0":10,"i1":10,"i2":10,"i3":10,"i4":10,"i5":10};
+var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+
+
+
+
+
+
+
+
+org.apache.beam.runners.direct
+Class 
DirectRunner.DirectPipelineResult
+
+
+
+java.lang.Object
+
+
+org.apache.beam.runners.direct.DirectRunner.DirectPipelineResult
+
+
+
+
+
+
+
+All Implemented Interfaces:
+PipelineResult
+
+
+Enclosing class:
+DirectRunner
+
+
+
+public static class DirectRunner.DirectPipelineResult
+extends java.lang.Object
+implements PipelineResult
+The result of running a Pipeline with the DirectRunner.
+
+ Throws UnsupportedOperationException for all methods.
+
+
+
+
+
+
+
+
+
+
+
+Nested Class Summary
+
+
+
+
+Nested 

[01/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
Repository: beam-site
Updated Branches:
  refs/heads/asf-site 7407f66b5 -> 4490a1f9b


http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/sdk/Pipeline.PipelineVisitor.Defaults.html
--
diff --git 
a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/sdk/Pipeline.PipelineVisitor.Defaults.html
 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/sdk/Pipeline.PipelineVisitor.Defaults.html
new file mode 100644
index 000..8d043e8
--- /dev/null
+++ 
b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/sdk/Pipeline.PipelineVisitor.Defaults.html
@@ -0,0 +1,382 @@
+http://www.w3.org/TR/html4/loose.dtd;>
+
+
+
+
+Pipeline.PipelineVisitor.Defaults
+
+
+
+
+
+
+var methods = {"i0":10,"i1":10,"i2":10,"i3":10};
+var tabs = {65535:["t0","All Methods"],2:["t2","Instance 
Methods"],8:["t4","Concrete Methods"]};
+var altColor = "altColor";
+var rowColor = "rowColor";
+var tableTab = "tableTab";
+var activeTableTab = "activeTableTab";
+
+
+JavaScript is disabled on your browser.
+
+
+
+
+
+
+
+
+
+
+
+
+
+org.apache.beam.sdk
+Class 
Pipeline.PipelineVisitor.Defaults
+
+
+
+java.lang.Object
+
+
+org.apache.beam.sdk.Pipeline.PipelineVisitor.Defaults
+
+
+
+
+
+
+
+All Implemented Interfaces:
+Pipeline.PipelineVisitor
+
+
+Direct Known Subclasses:
+SparkRunner.Evaluator
+
+
+Enclosing interface:
+Pipeline.PipelineVisitor
+
+
+
+public static class Pipeline.PipelineVisitor.Defaults
+extends java.lang.Object
+implements Pipeline.PipelineVisitor
+Default no-op Pipeline.PipelineVisitor that enters all 
composite transforms.
+ User implementations can override just those methods they are interested 
in.
+
+
+
+
+
+
+
+
+
+
+
+Nested Class Summary
+
+
+
+
+Nested classes/interfaces inherited from 
interfaceorg.apache.beam.sdk.Pipeline.PipelineVisitor
+Pipeline.PipelineVisitor.CompositeBehavior, Pipeline.PipelineVisitor.Defaults
+
+
+
+
+
+
+
+
+Constructor Summary
+
+Constructors
+
+Constructor and Description
+
+
+Defaults()
+
+
+
+
+
+
+
+
+
+Method Summary
+
+All MethodsInstance MethodsConcrete Methods
+
+Modifier and Type
+Method and Description
+
+
+Pipeline.PipelineVisitor.CompositeBehavior
+enterCompositeTransform(TransformHierarchy.Nodenode)
+Called for each composite transform after all topological 
predecessors have been visited
+ but before any of its component transforms.
+
+
+
+void
+leaveCompositeTransform(TransformHierarchy.Nodenode)
+Called for each composite transform after all of its 
component transforms and their outputs
+ have been visited.
+
+
+
+void
+visitPrimitiveTransform(TransformHierarchy.Nodenode)
+Called for each primitive transform after all of its 
topological predecessors
+ and inputs have been visited.
+
+
+
+void
+visitValue(PValuevalue,
+  TransformHierarchy.Nodeproducer)
+Called for each value after the transform that produced the 
value has been
+ visited.
+
+
+
+
+
+
+
+Methods inherited from classjava.lang.Object
+clone, equals, finalize, getClass, hashCode, notify, notifyAll, 
toString, wait, wait, wait
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+Constructor Detail
+
+
+
+
+
+Defaults
+publicDefaults()
+
+
+
+
+
+
+
+
+
+Method Detail
+
+
+
+
+
+enterCompositeTransform
+publicPipeline.PipelineVisitor.CompositeBehaviorenterCompositeTransform(TransformHierarchy.Nodenode)
+Description copied from 
interface:Pipeline.PipelineVisitor
+Called for each composite transform after all topological 
predecessors have been visited
+ but before any of its component transforms.
+
+ The return value controls whether or not child transforms are 
visited.
+
+Specified by:
+enterCompositeTransformin
 interfacePipeline.PipelineVisitor
+
+
+
+
+
+
+
+
+leaveCompositeTransform
+publicvoidleaveCompositeTransform(TransformHierarchy.Nodenode)
+Description copied from 
interface:Pipeline.PipelineVisitor
+Called for each composite transform after all of its 
component transforms and their outputs
+ have been visited.
+
+Specified by:
+leaveCompositeTransformin
 interfacePipeline.PipelineVisitor
+
+
+
+
+
+
+
+
+visitPrimitiveTransform
+publicvoidvisitPrimitiveTransform(TransformHierarchy.Nodenode)
+Description copied from 
interface:Pipeline.PipelineVisitor
+Called for each primitive transform after all of its 
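A minimal sketch of overriding only one callback of Defaults (Java; it assumes a
Pipeline `p` built elsewhere, the 0.6.0 package layout shown above with
TransformHierarchy.Node under org.apache.beam.sdk.runners, and that
Pipeline#traverseTopologically drives the visitor):

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.runners.TransformHierarchy;

// Counts primitive transforms while leaving every other callback at its
// no-op default, as the Defaults Javadoc above suggests.
class CountingVisitor extends Pipeline.PipelineVisitor.Defaults {
  int primitiveTransforms = 0;

  @Override
  public void visitPrimitiveTransform(TransformHierarchy.Node node) {
    primitiveTransforms++;
  }
}

// Usage sketch: p.traverseTopologically(new CountingVisitor());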

[37/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/DataflowPipelineDebugOptions.StagerFactory.html
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/DataflowPipelineDebugOptions.StagerFactory.html b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/DataflowPipelineDebugOptions.StagerFactory.html
new file mode 100644
index 000..e0ccff0
--- /dev/null
+++ b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/DataflowPipelineDebugOptions.StagerFactory.html
@@ -0,0 +1,292 @@
[Javadoc page (HTML) for class org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.StagerFactory:
public static class, extends java.lang.Object, implements DefaultValueFactory<Stager>;
enclosing interface DataflowPipelineDebugOptions. "Creates a Stager object using the
class specified in DataflowPipelineDebugOptions.getStagerClass()."
Constructor: StagerFactory().
Method: Stager create(PipelineOptions options), creates a default value for a getter
marked with Default.InstanceFactory; parameters: options, the current pipeline options;
returns the default value to be used for the annotated getter.]
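StagerFactory follows the standard DefaultValueFactory pattern. A generic sketch of that
pattern (Java; DebugOptions, getTempToken and the computed default are illustrative and
not part of the Dataflow options):

import org.apache.beam.sdk.options.Default;
import org.apache.beam.sdk.options.DefaultValueFactory;
import org.apache.beam.sdk.options.PipelineOptions;

// A hypothetical options interface whose getter defers its default to a
// factory, the same mechanism StagerFactory uses to supply a Stager built
// from DataflowPipelineDebugOptions.getStagerClass().
public interface DebugOptions extends PipelineOptions {
  @Default.InstanceFactory(TempTokenFactory.class)
  String getTempToken();
  void setTempToken(String value);

  // Invoked lazily the first time the getter is read without an explicit value.
  class TempTokenFactory implements DefaultValueFactory<String> {
    @Override
    public String create(PipelineOptions options) {
      return "token-" + options.getJobName();
    }
  }
}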

http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/DataflowPipelineDebugOptions.html
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/DataflowPipelineDebugOptions.html b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/DataflowPipelineDebugOptions.html
new file mode 100644
index 000..5100710
--- /dev/null
+++ b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/DataflowPipelineDebugOptions.html
@@ -0,0 +1,681 @@
[Javadoc page (HTML) for interface org.apache.beam.runners.dataflow.options.DataflowPipelineDebugOptions.]

[34/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/DataflowWorkerHarnessOptions.html
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/DataflowWorkerHarnessOptions.html b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/DataflowWorkerHarnessOptions.html
new file mode 100644
index 000..9f5ca9c
--- /dev/null
+++ b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/options/DataflowWorkerHarnessOptions.html
@@ -0,0 +1,445 @@
[Javadoc page (HTML) for interface org.apache.beam.runners.dataflow.options.DataflowWorkerHarnessOptions:
@Hidden public interface extending DataflowPipelineOptions, with superinterfaces
ApplicationNameOptions, BigQueryOptions, CloudDebuggerOptions, DataflowPipelineDebugOptions,
DataflowPipelineOptions, DataflowPipelineWorkerPoolOptions, DataflowProfilingOptions,
DataflowWorkerLoggingOptions, GcpOptions, GcsOptions, GoogleApiDebugOptions, HasDisplayData,
PipelineOptions, PubsubOptions and StreamingOptions.
"Options that are used exclusively within the Dataflow worker harness.
These options have no effect at pipeline creation time."
Abstract methods: java.lang.String getJobId(), the identity of the Dataflow job;
java.lang.Integer getWorkerCacheMb(), the size of the worker's in-memory cache, in megabytes;
java.lang.String getWorkerId(), the identity of the worker running this pipeline;
void setJobId(java.lang.String value); void setWorkerCacheMb(java.lang.Integer value);
void setWorkerId(java.lang.String value).]

[18/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/flink/FlinkRunner.html
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/flink/FlinkRunner.html b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/flink/FlinkRunner.html
new file mode 100644
index 000..aaf552b
--- /dev/null
+++ b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/flink/FlinkRunner.html
@@ -0,0 +1,332 @@
[Javadoc page (HTML) for class org.apache.beam.runners.flink.FlinkRunner:
public class extending PipelineRunner<PipelineResult>. "A PipelineRunner that executes
the operations in the pipeline by first translating them to a Flink Plan and then
executing them either locally or on a Flink cluster, depending on the configuration."
Methods: static FlinkRunner fromOptions(PipelineOptions options), construct a runner
from the provided options; PipelineResult run(Pipeline pipeline), processes the given
Pipeline, returning the results; FlinkPipelineOptions getPipelineOptions(), for testing;
java.lang.String toString(); protected static java.util.List<java.lang.String>
detectClassPathResourcesToStage(java.lang.ClassLoader classLoader), attempts to detect
all the resources the class loader has access to, without recursing to class loader
parents (so resources from the system class loader are not pulled in); throws
java.lang.IllegalArgumentException if the class loader is not a URLClassLoader or one
of the resources it exposes is not a file resource.]
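A short usage sketch for selecting this runner through options (Java; the "[local]"
master value and the empty pipeline body are illustrative assumptions):

import org.apache.beam.runners.flink.FlinkPipelineOptions;
import org.apache.beam.runners.flink.FlinkRunner;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class FlinkRunnerExample {
  public static void main(String[] args) {
    FlinkPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).as(FlinkPipelineOptions.class);
    options.setRunner(FlinkRunner.class);
    options.setFlinkMaster("[local]");  // or a cluster address such as "host:6123"

    Pipeline p = Pipeline.create(options);
    // ... apply reads, transforms and writes here ...
    p.run();
  }
}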


[51/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
Regenerate website


Project: http://git-wip-us.apache.org/repos/asf/beam-site/repo
Commit: http://git-wip-us.apache.org/repos/asf/beam-site/commit/dc04d667
Tree: http://git-wip-us.apache.org/repos/asf/beam-site/tree/dc04d667
Diff: http://git-wip-us.apache.org/repos/asf/beam-site/diff/dc04d667

Branch: refs/heads/asf-site
Commit: dc04d667ba728bf1a309b880ce9f7131fb893fb9
Parents: 497bcc7
Author: Davor Bonaci 
Authored: Tue Mar 14 15:32:57 2017 -0700
Committer: Davor Bonaci 
Committed: Tue Mar 14 15:32:57 2017 -0700

--
 content/.htaccess   |15 -
 .../2016/03/17/capability-matrix.html   | 6 +-
 .../2016/04/03/presentation-materials.html  | 6 +-
 .../sdk/2016/02/25/python-sdk-now-public.html   | 6 +-
 .../beam/release/2016/06/15/first-release.html  | 6 +-
 .../10/11/strata-hadoop-world-and-beam.html | 6 +-
 .../website/2016/02/22/beam-has-a-logo.html | 6 +-
 .../blog/2016/05/18/splitAtFraction-method.html | 6 +-
 .../05/27/where-is-my-pcollection-dot-map.html  | 6 +-
 .../06/13/flink-batch-runner-milestone.html | 6 +-
 content/blog/2016/08/03/six-months.html | 6 +-
 content/blog/2016/10/20/test-stream.html| 8 +-
 content/blog/2017/01/09/added-apex-runner.html  | 6 +-
 content/blog/2017/01/10/beam-graduates.html | 6 +-
 .../blog/2017/02/01/graduation-media-recap.html | 6 +-
 .../blog/2017/02/13/stateful-processing.html| 6 +-
 content/blog/index.html | 6 +-
 content/coming-soon.html| 6 +-
 .../contribute/contribution-guide/index.html| 6 +-
 content/contribute/design-principles/index.html | 6 +-
 content/contribute/index.html   | 6 +-
 content/contribute/logos/index.html | 6 +-
 content/contribute/maturity-model/index.html| 6 +-
 .../presentation-materials/index.html   | 6 +-
 .../ptransform-style-guide/index.html   | 6 +-
 content/contribute/release-guide/index.html | 6 +-
 content/contribute/source-repository/index.html | 6 +-
 content/contribute/team/index.html  | 6 +-
 content/contribute/testing/index.html   | 6 +-
 content/contribute/work-in-progress/index.html  | 6 +-
 content/documentation/index.html| 6 +-
 .../pipelines/create-your-pipeline/index.html   | 6 +-
 .../pipelines/design-your-pipeline/index.html   | 6 +-
 .../pipelines/test-your-pipeline/index.html |10 +-
 .../documentation/programming-guide/index.html  |16 +-
 content/documentation/resources/index.html  | 6 +-
 content/documentation/runners/apex/index.html   | 6 +-
 .../runners/capability-matrix/index.html| 6 +-
 .../documentation/runners/dataflow/index.html   |10 +-
 content/documentation/runners/direct/index.html |16 +-
 content/documentation/runners/flink/index.html  |10 +-
 content/documentation/runners/spark/index.html  | 8 +-
 content/documentation/sdks/java/index.html  | 6 +-
 .../sdks/javadoc/0.6.0/META-INF/MANIFEST.MF | 5 +
 .../sdks/javadoc/0.6.0/allclasses-frame.html|   914 +
 .../sdks/javadoc/0.6.0/allclasses-noframe.html  |   914 +
 .../sdks/javadoc/0.6.0/constant-values.html |  1267 ++
 .../sdks/javadoc/0.6.0/deprecated-list.html |   459 +
 .../sdks/javadoc/0.6.0/help-doc.html|   223 +
 .../sdks/javadoc/0.6.0/index-all.html   | 19962 +
 .../documentation/sdks/javadoc/0.6.0/index.html |75 +
 .../beam/runners/apex/ApexPipelineOptions.html  |   402 +
 .../ApexRunner.CreateApexPCollectionView.html   |   311 +
 .../apache/beam/runners/apex/ApexRunner.html|   380 +
 .../apex/ApexRunnerRegistrar.Options.html   |   284 +
 .../apex/ApexRunnerRegistrar.Runner.html|   288 +
 .../beam/runners/apex/ApexRunnerRegistrar.html  |   227 +
 .../beam/runners/apex/ApexRunnerResult.html |   446 +
 .../apex/ApexYarnLauncher.LaunchParams.html |   315 +
 .../apex/ApexYarnLauncher.ProcessWatcher.html   |   296 +
 .../beam/runners/apex/ApexYarnLauncher.html |   426 +
 .../beam/runners/apex/TestApexRunner.html   |   262 +
 .../apache/beam/runners/apex/package-frame.html |33 +
 .../beam/runners/apex/package-summary.html  |   225 +
 .../apache/beam/runners/apex/package-tree.html  |   169 +
 ...tatefulParDoOverrides.BatchStatefulDoFn.html |   338 +
 .../dataflow/BatchStatefulParDoOverrides.html   |   320 +
 .../beam/runners/dataflow/DataflowClient.html   |   426 +
 .../DataflowJobAlreadyExistsException.html  |   286 +
 .../DataflowJobAlreadyUpdatedException.html |   285 +
 .../runners/dataflow/DataflowJobException.html  |   273 +
 .../runners/dataflow/DataflowPipelineJob.html   |   568 +
 .../DataflowPipelineRegistrar.Options.html  |   284 +
 

[46/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/content/documentation/sdks/javadoc/0.6.0/constant-values.html
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/constant-values.html b/content/documentation/sdks/javadoc/0.6.0/constant-values.html
new file mode 100644
index 000..7305a0a
--- /dev/null
+++ b/content/documentation/sdks/javadoc/0.6.0/constant-values.html
@@ -0,0 +1,1267 @@
[Javadoc page (HTML): Constant Field Values. Contents: org.apache.*

org.apache.beam.runners.apex.ApexRunner
  public static final java.lang.String CLASSPATH_SCHEME = "classpath"

org.apache.beam.runners.dataflow.DataflowRunner
  public static final java.lang.String PROJECT_ID_REGEXP = "[a-z][-a-z0-9:.]+[a-z0-9]"

org.apache.beam.runners.dataflow.util.OutputReference
  public final java.lang.String type = "OutputReference"

org.apache.beam.runners.spark.io.hadoop.ShardNameTemplateHelper
  public static final java.lang.String OUTPUT_FILE_PREFIX = "spark.beam.fileoutputformat.prefix"
  public static final java.lang.String OUTPUT_FILE_SUFFIX = "spark.beam.fileoutputformat.suffix"
  public static final java.lang.String OUTPUT_FILE_TEMPLATE = "spark.beam.fileoutputformat.template"

org.apache.beam.sdk.io.BoundedSource.BoundedReader<T>
  public static final long SPLIT_POINTS_UNKNOWN = -1L

org.apache.beam.sdk.io.FileSystems
  public static final java.lang.String DEFAULT_SCHEME = "default"

org.apache.beam.sdk.io.ShardNameTemplate
  public static final java.lang.String DIRECTORY_CONTAINER = "/part-S"
  public static final java.lang.String INDEX_OF_MAX = "-S-of-N"

org.apache.beam.sdk.io.UnboundedSource.UnboundedReader<OutputT>
  public static final long BACKLOG_UNKNOWN = -1L

org.apache.beam.sdk.io.XmlSink
  protected static final java.lang.String XML_EXTENSION = "xml"

org.apache.beam.sdk.io.gcp.datastore.DatastoreV1.Read
  public static final int NUM_QUERY_SPLITS_MAX = 5

org.apache.beam.sdk.io.range.OffsetRangeTracker
  public static final long OFFSET_INFINITY = 9223372036854775807L

org.apache.beam.sdk.testing.PAssert
  public static final java.lang.String FAILURE_COUNTER = "PAssertFailure"
  public static final java.lang.String SUCCESS_COUNTER = "PAssertSuccess"

org.apache.beam.sdk.transforms.ApproximateQuantiles.ApproximateQuantilesCombineFn<T, ComparatorT extends java.util.Comparator<T> & java.io.Serializable>
  public static final long DEFAULT_MAX_NUM_ELEMENTS = 10L

org.apache.beam.sdk.transforms.reflect.ByteBuddyDoFnInvokerFactory
  public static final java.lang.String CONTEXT_PARAMETER_METHOD = "context"
  public static final java.lang.String INPUT_PROVIDER_PARAMETER_METHOD = "inputProvider"
  public static final java.lang.String ON_TIMER_CONTEXT_PARAMETER_METHOD = "onTimerContext"
  public static final java.lang.String OUTPUT_RECEIVER_PARAMETER_METHOD = "outputReceiver"
  public static final java.lang.String PROCESS_CONTEXT_PARAMETER_METHOD = "processContext"
  public static final java.lang.String RESTRICTION_TRACKER_PARAMETER_METHOD = "restrictionTracker"
  public static final java.lang.String STATE_PARAMETER_METHOD = "state"
  public static final java.lang.String TIMER_PARAMETER_METHOD = "timer"
  public static final java.lang.String WINDOW_PARAMETER_METHOD = "window"]

[40/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/DataflowJobAlreadyExistsException.html
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/DataflowJobAlreadyExistsException.html b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/DataflowJobAlreadyExistsException.html
new file mode 100644
index 000..502f3b5
--- /dev/null
+++ b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/DataflowJobAlreadyExistsException.html
@@ -0,0 +1,286 @@
[Javadoc page (HTML) for class org.apache.beam.runners.dataflow.DataflowJobAlreadyExistsException:
hierarchy java.lang.Object > java.lang.Throwable > java.lang.Exception >
java.lang.RuntimeException > DataflowJobException > DataflowJobAlreadyExistsException;
implements java.io.Serializable. "An exception that is thrown if the unique job name
constraint of the Dataflow service is broken because an existing job with the same job
name is currently active. The DataflowPipelineJob contained within this exception
contains information about the pre-existing job."
Constructor: DataflowJobAlreadyExistsException(DataflowPipelineJob job, java.lang.String message),
creates a new DataflowJobAlreadyExistsException with the specified DataflowPipelineJob and message.
Inherits getJob from DataflowJobException.]
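A sketch of handling this exception at submission time (Java; the wrapper class and the
log message are illustrative, and `pipeline` is assumed to be configured for the Dataflow
runner elsewhere):

import org.apache.beam.runners.dataflow.DataflowJobAlreadyExistsException;
import org.apache.beam.runners.dataflow.DataflowPipelineJob;
import org.apache.beam.sdk.Pipeline;

class SubmitOnce {
  static void submit(Pipeline pipeline) {
    try {
      pipeline.run();
    } catch (DataflowJobAlreadyExistsException e) {
      // The exception carries the pre-existing job, per the Javadoc above.
      DataflowPipelineJob existing = e.getJob();
      System.err.println("Job name already in use by job " + existing.getJobId());
    }
  }
}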

http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/DataflowJobAlreadyUpdatedException.html
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/DataflowJobAlreadyUpdatedException.html b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/DataflowJobAlreadyUpdatedException.html
new file mode 100644
index 000..3425afe
--- /dev/null
+++ b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/dataflow/DataflowJobAlreadyUpdatedException.html
@@ -0,0 +1,285 @@
[Javadoc page (HTML) for class org.apache.beam.runners.dataflow.DataflowJobAlreadyUpdatedException.]

[14/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/aggregators/AggAccumParam.html
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/aggregators/AggAccumParam.html b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/aggregators/AggAccumParam.html
new file mode 100644
index 000..3184b14
--- /dev/null
+++ b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/aggregators/AggAccumParam.html
@@ -0,0 +1,338 @@
[Javadoc page (HTML) for class org.apache.beam.runners.spark.aggregators.AggAccumParam:
public class, extends java.lang.Object, implements java.io.Serializable,
org.apache.spark.AccumulableParam<NamedAggregators, NamedAggregators> and
org.apache.spark.AccumulatorParam<NamedAggregators>. "Aggregator accumulator param."
Constructor: AggAccumParam().
Methods: NamedAggregators addAccumulator(NamedAggregators current, NamedAggregators added);
NamedAggregators addInPlace(NamedAggregators current, NamedAggregators added);
NamedAggregators zero(NamedAggregators initialValue).]

http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/spark/aggregators/AggregatorsAccumulator.AccumulatorCheckpointingSparkListener.html
--
diff --git 

[24/52] [abbrv] [partial] beam-site git commit: Regenerate website

2017-03-14 Thread davor
http://git-wip-us.apache.org/repos/asf/beam-site/blob/dc04d667/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/CopyOnAccessInMemoryStateInternals.html
--
diff --git a/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/CopyOnAccessInMemoryStateInternals.html b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/CopyOnAccessInMemoryStateInternals.html
new file mode 100644
index 000..c68ac7c
--- /dev/null
+++ b/content/documentation/sdks/javadoc/0.6.0/org/apache/beam/runners/direct/CopyOnAccessInMemoryStateInternals.html
@@ -0,0 +1,385 @@
[Javadoc page (HTML) for class org.apache.beam.runners.direct.CopyOnAccessInMemoryStateInternals<K>:
public class, extends java.lang.Object, implements org.apache.beam.runners.core.StateInternals<K>.
"StateInternals built on top of an underlying StateTable that contains instances of
InMemoryStateInternals.InMemoryState. Whenever state that exists in the underlying
StateTable is accessed, an independent copy will be created within this table."
Methods: static <K> CopyOnAccessInMemoryStateInternals<K> withUnderlying(K key,
@Nullable CopyOnAccessInMemoryStateInternals<K> underlying), creates a new instance with
the underlying (possibly null) StateInternals; CopyOnAccessInMemoryStateInternals<K> commit(),
ensures this instance is complete, so other copies of state for the same Step and Key may
be discarded after invoking this method; for each StateNamespace, each address not yet
bound in this instance receives a reference to that state, and the WatermarkHoldState with
the earliest time bound is stored after the commit completes, enabling calls to
getEarliestWatermarkHold(); org.joda.time.Instant getEarliestWatermarkHold(), gets the
earliest watermark hold present in this table; K getKey(), the key for this StateInternals;
boolean isEmpty(); state(StateNamespace namespace, StateTag<? super K, T> address) and
state(StateNamespace namespace, StateTag<? super K, T> address, StateContext<?> c), return
the state associated with address in the specified namespace.]

[beam] Git Push Summary

2017-03-14 Thread altay
Repository: beam
Updated Tags:  refs/tags/v0.6.0 [created] fa5fa7b0c


Jenkins build became unstable: beam_PostCommit_Java_RunnableOnService_Flink #1929

2017-03-14 Thread Apache Jenkins Server
See 




[jira] [Issue Comment Deleted] (BEAM-1542) Need Source/Sink for Spanner

2017-03-14 Thread Guy Molinari (JIRA)

 [ 
https://issues.apache.org/jira/browse/BEAM-1542?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Guy Molinari updated BEAM-1542:
---
Comment: was deleted

(was: Here is the isolated error:

[INFO] 
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-compiler-plugin:3.6.1:compile (default-compile) 
on project beam-sdks-java-io-google-cloud-platform: Compilation failure
[ERROR] 
/Users/gmolinar/beam/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/spanner/SpannerIO.java:[112,12]
 no suitable method found for 
apply(java.lang.String,org.apache.beam.sdk.transforms.ParDo.Bound)
[ERROR] method 
org.apache.beam.sdk.values.PCollection.apply(org.apache.beam.sdk.transforms.PTransform,OutputT>) is not applicable
[ERROR] (cannot infer type-variable(s) OutputT
[ERROR] (actual and formal argument lists differ in length))
[ERROR] method 
org.apache.beam.sdk.values.PCollection.apply(java.lang.String,org.apache.beam.sdk.transforms.PTransform,OutputT>) is not applicable
[ERROR] (cannot infer type-variable(s) OutputT
[ERROR] (argument mismatch; 
org.apache.beam.sdk.transforms.ParDo.Bound
 cannot be converted to org.apache.beam.sdk.transforms.PTransform,OutputT>))
[ERROR] -> [Help 1]
[ERROR] )

> Need Source/Sink for Spanner
> 
>
> Key: BEAM-1542
> URL: https://issues.apache.org/jira/browse/BEAM-1542
> Project: Beam
>  Issue Type: New Feature
>  Components: sdk-java-gcp
>Reporter: Guy Molinari
>Assignee: Guy Molinari
>
> Is there a source/sink for Spanner in the works?   If not I would gladly give 
> this a shot.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


svn commit: r18744 - in /release/beam: ./ 0.6.0/

2017-03-14 Thread davor
Author: davor
Date: Tue Mar 14 21:52:43 2017
New Revision: 18744

Log:
Add Apache Beam, release 0.6.0


Added:
release/beam/0.6.0/
release/beam/0.6.0/apache-beam-0.6.0-python.zip   (with props)
release/beam/0.6.0/apache-beam-0.6.0-python.zip.asc
release/beam/0.6.0/apache-beam-0.6.0-python.zip.md5
release/beam/0.6.0/apache-beam-0.6.0-python.zip.sha1
release/beam/0.6.0/apache-beam-0.6.0-source-release.zip   (with props)
release/beam/0.6.0/apache-beam-0.6.0-source-release.zip.asc
release/beam/0.6.0/apache-beam-0.6.0-source-release.zip.md5
release/beam/0.6.0/apache-beam-0.6.0-source-release.zip.sha1
Modified:
release/beam/KEYS

Added: release/beam/0.6.0/apache-beam-0.6.0-python.zip
==
Binary file - no diff available.

Propchange: release/beam/0.6.0/apache-beam-0.6.0-python.zip
--
svn:mime-type = application/octet-stream

Added: release/beam/0.6.0/apache-beam-0.6.0-python.zip.asc
==
--- release/beam/0.6.0/apache-beam-0.6.0-python.zip.asc (added)
+++ release/beam/0.6.0/apache-beam-0.6.0-python.zip.asc Tue Mar 14 21:52:43 2017
@@ -0,0 +1,17 @@
+-BEGIN PGP SIGNATURE-
+Version: GnuPG v1
+
+iQIcBAABAgAGBQJYyEAHAAoJENYucUFglvoAV5kP/3i/fj0u+pUU235UmvHyQ1T6
+h4EpuTIM+DOQDEmByF+kSCGN3DjrVZZ48qDqw5+wUcAwMdAGd9nZpNGEbLI9axlY
+B431SZB0uvIn6zJNd0thlQgOZFqoSmKfWWEwRS1/c405L5y4Le9YnVT8H7Uvtx6a
+GiO0OKN6s9AVq4iWA3kJ9fmL5S2+wbhgbw9zRjYDO+6Dvy6w7NeRI1GunpltojhR
+hyR+mjRyj4Xy42NgRX79/wizHsQ5+ZFG18nLYerYdZSXHMHdCzWNJOPV/l4E60Zp
+F6TnYRGX33NLErxq1PtUwa5Q4xDaSansIQvxtjELip82M+AM2BLFsIeTUlbKowDG
+hzaBhpKU5h+qjHzgKh7Mu8a++LqBGJ4tSm0HhuOS8OFa7xy8iOjnPeIKaZzpyvfz
+9u1Vgf6HDfblTjIQIU2wjB350jwKripognyVKMrYtHhRjgg9RAx+vsBV+UarWN6I
+Vop+iJGe8C9jj+vA/NUIJJ0tvwYNkcNMAAxN3eTijYe3Z0dn64QcPlix4qt2TR/v
+ilG9OuSqlZds9BUbCzcIJjw4dHadBpu0QnofDe+EZ4GgC7ivU59S/CLyeREttGm9
+XLd4EyBXw3VleAu5Pb4D1pyOKbGG0CirXLSkyqnDzjn/Kai0BK9ttSDisX06mwrX
+GhoH2xl5TMgUOIzjM/lY
+=wzbq
+-END PGP SIGNATURE-

Added: release/beam/0.6.0/apache-beam-0.6.0-python.zip.md5
==
--- release/beam/0.6.0/apache-beam-0.6.0-python.zip.md5 (added)
+++ release/beam/0.6.0/apache-beam-0.6.0-python.zip.md5 Tue Mar 14 21:52:43 2017
@@ -0,0 +1 @@
+64ae57554571979d344b934b629c1748  apache-beam-0.6.0-python.zip

Added: release/beam/0.6.0/apache-beam-0.6.0-python.zip.sha1
==
--- release/beam/0.6.0/apache-beam-0.6.0-python.zip.sha1 (added)
+++ release/beam/0.6.0/apache-beam-0.6.0-python.zip.sha1 Tue Mar 14 21:52:43 
2017
@@ -0,0 +1 @@
+0d604b1454666a4e97428415d041cf49b86cac0e  apache-beam-0.6.0-python.zip

Added: release/beam/0.6.0/apache-beam-0.6.0-source-release.zip
==
Binary file - no diff available.

Propchange: release/beam/0.6.0/apache-beam-0.6.0-source-release.zip
--
svn:mime-type = application/octet-stream

Added: release/beam/0.6.0/apache-beam-0.6.0-source-release.zip.asc
==
--- release/beam/0.6.0/apache-beam-0.6.0-source-release.zip.asc (added)
+++ release/beam/0.6.0/apache-beam-0.6.0-source-release.zip.asc Tue Mar 14 
21:52:43 2017
@@ -0,0 +1,17 @@
+-BEGIN PGP SIGNATURE-
+Version: GnuPG v1
+
+iQIcBAABAgAGBQJYw1elAAoJENYucUFglvoANXMP/2LeHIfeg25VhtGToAEcofAx
+1Tr2g/We+oNVkcrvsrt3x66ZbMIOzZJseW6M99bnQ9h8A0FoCV0Tc2MUrT+WtjVt
+weyV0/JjdSlgvvM1L8DQbpgktEEmTjKgQI+3ZvIoe4ySW+07wkMGte3rSmaL9PW7
+nxlwVhPG8G/IG5TxBPegK45ug1p2cP1vcKL86UD/QTahmaUFphHOEPwyokzIhuG0
+L+5XsM3tCcbv5KBEqb2NkQ5G0aYqg8I4/GjLSV5SuyFsZ6bY7ZDzqIyiHMrByFr4
+7SnZDakch1kwtnFCMTW4MMUCSRqKYRvZd+VGOimOVgvGodXtgRA/1s/HltAi68tn
+J4LpVbihVH9mPJdUUTK9PLgQbGyiYzqV9s+hRWr2jrJBqgJb2KnY62I03gl/n6FU
+ehDbn7JcyiKaO3IVuat1mQ5k3gEoPozyObW2Mp3hV+rzbaiUpWJDzDj9Tg5oqEpz
++s4Fyv6jlElVRZ+vQNJWl41+nOUVwKXXB82tkQ3TiAXZjpmE3YwoTbi54yBoPDCx
+gjMlGk+RDszmxC0QjJR31l4KtgXfrYchOeO1kf5mTQcf5Ho5IQHQxAUtSmL+8HUy
+RhQfVyzmbGpuHlnTJn1gXrxK+jjiI8bI+6l5PGH0ka5tHqywbaMdHJATnLW23gsy
+j9DVkRowyyUjGjcx6iEK
+=Z72Z
+-END PGP SIGNATURE-

Added: release/beam/0.6.0/apache-beam-0.6.0-source-release.zip.md5
==
--- release/beam/0.6.0/apache-beam-0.6.0-source-release.zip.md5 (added)
+++ release/beam/0.6.0/apache-beam-0.6.0-source-release.zip.md5 Tue Mar 14 
21:52:43 2017
@@ -0,0 +1 @@
+6497f78594b60d0085b1802a842678b8  apache-beam-0.6.0-source-release.zip

Added: release/beam/0.6.0/apache-beam-0.6.0-source-release.zip.sha1
==
--- 

[jira] [Commented] (BEAM-1260) PAssert should capture the assertion site

2017-03-14 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1260?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15925055#comment-15925055
 ] 

ASF GitHub Bot commented on BEAM-1260:
--

GitHub user jkff opened a pull request:

https://github.com/apache/beam/pull/2247

[BEAM-1260] Another shot at capturing PAssert site

This is a modified resubmit of https://github.com/apache/beam/pull/1753, 
which was later rolled back in https://github.com/apache/beam/pull/1767 .

The current PR is basically the same, but I updated the Dataflow worker to 
reduce nesting of wrapped exceptions, so now the error message goes through 
(whereas previously it would lead to an exception so long that it was 
truncated, and the original site was not visible).

R: @tgroh  


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jkff/incubator-beam rollforward-passert-site

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2247.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2247






> PAssert should capture the assertion site
> -
>
> Key: BEAM-1260
> URL: https://issues.apache.org/jira/browse/BEAM-1260
> Project: Beam
>  Issue Type: Improvement
>  Components: sdk-java-core
>Reporter: Eugene Kirpichov
>Assignee: Eugene Kirpichov
> Fix For: 0.5.0
>
>
> When a PAssert assertion fails, it doesn't tell where the assertion 
> (PAssert.that(blah)) call was made in code - only when the failure was 
> detected, i.e. somewhere deep in worker code usually.
> It also doesn't allow specifying a message (unlike JUnit).
> This issue is about improving the way PAssert failures are reported.
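For context, the assertion site in question is the PAssert.that(...) line in a test such
as this sketch (Java; the test class and element values are illustrative):

import org.apache.beam.sdk.testing.PAssert;
import org.apache.beam.sdk.testing.TestPipeline;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.values.PCollection;

public class WordsTest {
  public void testWords() {
    TestPipeline p = TestPipeline.create();
    PCollection<String> output = p.apply(Create.of("a", "b"));

    // This line is the assertion site; before this change a failure reported
    // only where the worker detected the mismatch, not this location.
    PAssert.that(output).containsInAnyOrder("a", "b");

    p.run();
  }
}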



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[GitHub] beam pull request #2247: [BEAM-1260] Another shot at capturing PAssert site

2017-03-14 Thread jkff
GitHub user jkff opened a pull request:

https://github.com/apache/beam/pull/2247

[BEAM-1260] Another shot at capturing PAssert site

This is a modified resubmit of https://github.com/apache/beam/pull/1753, 
which was later rolled back in https://github.com/apache/beam/pull/1767 .

The current PR is basically the same, but I updated the Dataflow worker to 
reduce nesting of wrapped exceptions, so now the error message goes through 
(whereas previously it would lead to an exception so long that it was 
truncated, and the original site was not visible).

R: @tgroh  


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/jkff/incubator-beam rollforward-passert-site

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2247.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2247








[jira] [Commented] (BEAM-1676) SdkCoreApiSurfaceTest Failed in JDK7&8 and OpenJDK7&8 on Jenkins

2017-03-14 Thread Mark Liu (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1676?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15925018#comment-15925018
 ] 

Mark Liu commented on BEAM-1676:


[~kenn] Any thoughts? This blocks us from using the Jenkins matrix configuration 
plugin. If there is no further progress on that, I'd rather not use it for now.

> SdkCoreApiSurfaceTest Failed in JDK7&8 and OpenJDK7&8 on Jenkins
> 
>
> Key: BEAM-1676
> URL: https://issues.apache.org/jira/browse/BEAM-1676
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Mark Liu
>Assignee: Mark Liu
>
> After running beam_PostCommit_Java_MavenInstall in different JDK versions 
> (JDK7&8, OpenJDK7&8) on Jenkins, all failed due to the following error:
> {code}
> Failed tests: 
>   SdkCoreApiSurfaceTest.testSdkApiSurface:59 
> Expected: API surface to include only:
>   Classes in package "org.apache.beam"
>   Classes in package "com.google.api.client"
>   Classes in package "com.google.api.services.bigquery"
>   Classes in package "com.google.api.services.cloudresourcemanager"
>   Classes in package "com.google.api.services.pubsub"
>   Classes in package "com.google.api.services.storage"
>   Classes in package "com.google.auth"
>   Classes in package "com.google.protobuf"
>   Classes in package "com.fasterxml.jackson.annotation"
>   Classes in package "com.fasterxml.jackson.core"
>   Classes in package "com.fasterxml.jackson.databind"
>   Classes in package "org.apache.avro"
>   Classes in package "org.hamcrest"
>   Classes in package "org.codehaus.jackson"
>   Classes in package "org.joda.time"
>   Classes in package "org.junit"
>   
>  but: The following white-listed scopes did not have matching classes on 
> the API surface:
>   No Classes in package "com.fasterxml.jackson.annotation"
>   No Classes in package "com.fasterxml.jackson.core"
>   No Classes in package "com.fasterxml.jackson.databind"
>   No Classes in package "com.google.api.client"
>   No Classes in package "com.google.api.services.bigquery"
>   No Classes in package "com.google.api.services.cloudresourcemanager"
>   No Classes in package "com.google.api.services.pubsub"
>   No Classes in package "com.google.api.services.storage"
>   No Classes in package "com.google.auth"
>   No Classes in package "com.google.protobuf"
>   No Classes in package "org.apache.avro"
>   No Classes in package "org.apache.beam"
>   No Classes in package "org.codehaus.jackson"
>   No Classes in package "org.hamcrest"
>   No Classes in package "org.joda.time"
>   No Classes in package "org.junit"
> {code}
> Job link:
> https://builds.apache.org/job/beam_PostCommit_Java_Version_Test/14/
> Multi-JDK version test is based on this PR:
> https://github.com/apache/beam/pull/2204/files
> Our beam_PostCommit_Java_MavenInstall is using JDK 1.8 (latest), which is in 
> good health. And the maven command in the version test is the same as 
> beam_PostCommit_Java_MavenInstall.
> Any ideas?



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Comment Edited] (BEAM-1676) SdkCoreApiSurfaceTest Failed in JDK7&8 and OpenJDK7&8 on Jenkins

2017-03-14 Thread Mark Liu (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1676?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15925018#comment-15925018
 ] 

Mark Liu edited comment on BEAM-1676 at 3/14/17 9:15 PM:
-

[~kenn] [~davor] Any thoughts? This blocks us from using the Jenkins matrix 
configuration plugin. If there is no further progress on that, I'd rather not 
use it for now.


was (Author: markflyhigh):
[~kenn] Any thoughts? This blocks us from using the Jenkins matrix configuration 
plugin. If there is no further progress on that, I'd rather not use it for now.

> SdkCoreApiSurfaceTest Failed in JDK7&8 and OpenJDK7&8 on Jenkins
> 
>
> Key: BEAM-1676
> URL: https://issues.apache.org/jira/browse/BEAM-1676
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Mark Liu
>Assignee: Mark Liu
>
> After running beam_PostCommit_Java_MavenInstall in different JDK versions 
> (JDK7&8, OpenJDK7&8) on Jenkins, all failed due to the following error:
> {code}
> Failed tests: 
>   SdkCoreApiSurfaceTest.testSdkApiSurface:59 
> Expected: API surface to include only:
>   Classes in package "org.apache.beam"
>   Classes in package "com.google.api.client"
>   Classes in package "com.google.api.services.bigquery"
>   Classes in package "com.google.api.services.cloudresourcemanager"
>   Classes in package "com.google.api.services.pubsub"
>   Classes in package "com.google.api.services.storage"
>   Classes in package "com.google.auth"
>   Classes in package "com.google.protobuf"
>   Classes in package "com.fasterxml.jackson.annotation"
>   Classes in package "com.fasterxml.jackson.core"
>   Classes in package "com.fasterxml.jackson.databind"
>   Classes in package "org.apache.avro"
>   Classes in package "org.hamcrest"
>   Classes in package "org.codehaus.jackson"
>   Classes in package "org.joda.time"
>   Classes in package "org.junit"
>   
>  but: The following white-listed scopes did not have matching classes on 
> the API surface:
>   No Classes in package "com.fasterxml.jackson.annotation"
>   No Classes in package "com.fasterxml.jackson.core"
>   No Classes in package "com.fasterxml.jackson.databind"
>   No Classes in package "com.google.api.client"
>   No Classes in package "com.google.api.services.bigquery"
>   No Classes in package "com.google.api.services.cloudresourcemanager"
>   No Classes in package "com.google.api.services.pubsub"
>   No Classes in package "com.google.api.services.storage"
>   No Classes in package "com.google.auth"
>   No Classes in package "com.google.protobuf"
>   No Classes in package "org.apache.avro"
>   No Classes in package "org.apache.beam"
>   No Classes in package "org.codehaus.jackson"
>   No Classes in package "org.hamcrest"
>   No Classes in package "org.joda.time"
>   No Classes in package "org.junit"
> {code}
> Job link:
> https://builds.apache.org/job/beam_PostCommit_Java_Version_Test/14/
> Multi-JDK version test is based on this PR:
> https://github.com/apache/beam/pull/2204/files
> Our beam_PostCommit_Java_MavenInstall is using JDK 1.8 (latest), which is in 
> good health. And the maven command in the version test is the same as 
> beam_PostCommit_Java_MavenInstall.
> Any ideas?



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Created] (BEAM-1721) Reshuffle can shift elements in time

2017-03-14 Thread Thomas Groh (JIRA)
Thomas Groh created BEAM-1721:
-

 Summary: Reshuffle can shift elements in time
 Key: BEAM-1721
 URL: https://issues.apache.org/jira/browse/BEAM-1721
 Project: Beam
  Issue Type: Bug
  Components: sdk-java-core
Reporter: Thomas Groh
Assignee: Thomas Groh


The reshuffle transform is meant to have no visible effects on the data that it 
processes. However, due to the use of a {{GroupByKey}}, the timestamp of the 
output elements is determined by the {{OutputTimeFn}} of the input 
{{WindowingStrategy}}.

Elements should not be shifted in time when being processed in {{Reshuffle}}. 
Currently this would require reifying all timestamps before applying the 
GroupByKey and reapplying them after. As an intermediate solution, elements 
should never be shifted forwards in time, as doing so permits the watermark to 
advance improperly (if the elements already contain their timestamps, for 
example), and prevents the timestamps from being reassigned within a {{DoFn}} 
or via the {{WithTimestamps}} transform.
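A minimal sketch of the "reify the timestamps, then reapply them" idea described above
(Java; these DoFns are illustrative rather than the actual Reshuffle implementation, and
the step that flattens the grouped values after the GroupByKey is elided):

import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.TimestampedValue;
import org.joda.time.Duration;

// Capture each element's timestamp into the value before the GroupByKey...
class ReifyTimestampsFn<K, V> extends DoFn<KV<K, V>, KV<K, TimestampedValue<V>>> {
  @ProcessElement
  public void process(ProcessContext c) {
    c.output(KV.of(c.element().getKey(),
        TimestampedValue.of(c.element().getValue(), c.timestamp())));
  }
}

// ...and restore it afterwards, so the GroupByKey's OutputTimeFn cannot
// shift elements in time.
class RestoreTimestampsFn<K, V> extends DoFn<KV<K, TimestampedValue<V>>, KV<K, V>> {
  @ProcessElement
  public void process(ProcessContext c) {
    TimestampedValue<V> tv = c.element().getValue();
    c.outputWithTimestamp(KV.of(c.element().getKey(), tv.getValue()), tv.getTimestamp());
  }

  // Allow re-emitting with the original, possibly earlier, timestamps.
  @Override
  public Duration getAllowedTimestampSkew() {
    return Duration.millis(Long.MAX_VALUE);
  }
}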



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[jira] [Commented] (BEAM-1721) Reshuffle can shift elements in time

2017-03-14 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/BEAM-1721?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15925017#comment-15925017
 ] 

ASF GitHub Bot commented on BEAM-1721:
--

GitHub user tgroh opened a pull request:

https://github.com/apache/beam/pull/2246

[BEAM-1721] Do not shift Timestamps forwards in Reshuffle

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.txt).

---
Timestamps can be shifted forwards after the fact, but cannot generally
be shifted backwards. Because reshuffle outputs "as quickly as
possible", only elements that arrive approximately simulatenously with
each other will have their timestamps shifted.

There is currently no way to output all input elements with their
original timestamps without explicitly reifying those timestamps and
reassigning them on the output elements.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/tgroh/beam reshuffle_output_time_fn

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2246.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2246


commit 347250b80fafecbdb12059233255e526370a1623
Author: Thomas Groh 
Date:   2017-03-14T21:05:44Z

Do not shift Timestamps forwards in Reshuffle

Timestamps can be shifted forwards after the fact, but cannot generally
be shifted backwards. Because reshuffle outputs "as quickly as
possible", only elements that arrive approximately simultaneously with
each other will have their timestamps shifted.

There is currently no way to output all input elements with their
original timestamps without explicitly reifying those timestamps and
reassigning them on the output elements.




> Reshuffle can shift elements in time
> 
>
> Key: BEAM-1721
> URL: https://issues.apache.org/jira/browse/BEAM-1721
> Project: Beam
>  Issue Type: Bug
>  Components: sdk-java-core
>Reporter: Thomas Groh
>Assignee: Thomas Groh
>
> The reshuffle transform is meant to have no visible effects on the data that 
> it processes. However, due to the use of a {{GroupByKey}}, the timestamps of 
> the output elements are determined by the {{OutputTimeFn}} of the input 
> {{WindowingStrategy}}.
> Elements should not be shifted in time when being processed by {{Reshuffle}}. 
> Fully fixing this would require reifying all timestamps before applying the 
> {{GroupByKey}} and reapplying them afterwards. As an intermediate solution, 
> elements should at least never be shifted forwards in time: doing so permits 
> the watermark to advance improperly (for example, if the elements already 
> contain their timestamps) and prevents the timestamps from being reassigned 
> within a {{DoFn}} or via the {{WithTimestamps}} transform.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)


[GitHub] beam pull request #2246: [BEAM-1721] Do not shift Timestamps forwards in Res...

2017-03-14 Thread tgroh
GitHub user tgroh opened a pull request:

https://github.com/apache/beam/pull/2246

[BEAM-1721] Do not shift Timestamps forwards in Reshuffle

Be sure to do all of the following to help us incorporate your contribution
quickly and easily:

 - [ ] Make sure the PR title is formatted like:
   `[BEAM-] Description of pull request`
 - [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable
   Travis-CI on your fork and ensure the whole test matrix passes).
 - [ ] Replace `` in the title with the actual Jira issue
   number, if there is one.
 - [ ] If this contribution is large, please file an Apache
   [Individual Contributor License 
Agreement](https://www.apache.org/licenses/icla.txt).

---
Timestamps can be shifted forwards after the fact, but cannot generally
be shifted backwards. Because reshuffle outputs "as quickly as
possible", only elements that arrive approximately simultaneously with
each other will have their timestamps shifted.

There is currently no way to output all input elements with their
original timestamps without explicitly reifying those timestamps and
reassigning them on the output elements.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/tgroh/beam reshuffle_output_time_fn

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/beam/pull/2246.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2246


commit 347250b80fafecbdb12059233255e526370a1623
Author: Thomas Groh 
Date:   2017-03-14T21:05:44Z

Do not shift Timestamps forwards in Reshuffle

Timestamps can be shifted forwards after the fact, but cannot generally
be shifted backwards. Because reshuffle outputs "as quickly as
possible", only elements that arrive approximately simultaneously with
each other will have their timestamps shifted.

There is currently no way to output all input elements with their
original timestamps without explicitly reifying those timestamps and
reassigning them on the output elements.




---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

