I'm writing a SpannerIO.Write cross-language transform, and when I try to run it 
from Python I get the following errors:

On Flink:
apache_beam.utils.subprocess_server: INFO: b'Caused by: 
java.lang.IllegalArgumentException: Transform external_1HolderCoder uses 
unknown accumulator coder id %s'
apache_beam.utils.subprocess_server: INFO: b'\tat 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:216)'
apache_beam.utils.subprocess_server: INFO: b'\tat 
org.apache.beam.runners.core.construction.graph.PipelineValidator.validateCombine(PipelineValidator.java:273)'

On DirectRunner:
  File 
"/Users/piotr/beam/sdks/python/apache_beam/runners/portability/fn_api_runner/fn_runner.py",
 line 181, in run_via_runner_api
    self._validate_requirements(pipeline_proto)
  File 
"/Users/piotr/beam/sdks/python/apache_beam/runners/portability/fn_api_runner/fn_runner.py",
 line 264, in _validate_requirements
    raise ValueError(
ValueError: Missing requirement declaration: 
{'beam:requirement:pardo:splittable_dofn:v1'}

I suppose SpannerIO.Write uses a transform that cannot be translated for 
cross-language usage? I'm not sure whether there is anything I can do about it.
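
For reference, this is roughly how I invoke the transform from the Python side 
(a simplified sketch; the URN, expansion service address and configuration 
fields below are placeholders standing in for what my wrapper actually 
registers):

import typing

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms.external import ExternalTransform
from apache_beam.transforms.external import ImplicitSchemaPayloadBuilder

# Placeholder URN and address -- my wrapper registers its own URN and I run a
# local Java expansion service while developing.
SPANNER_WRITE_URN = 'beam:external:java:spanner:write:v1'
EXPANSION_SERVICE = 'localhost:8097'

# Rows are sent to the Java side as schema'd (Row-coded) elements.
MutationRow = typing.NamedTuple('MutationRow', [('key', str), ('value', str)])
beam.coders.registry.register_coder(MutationRow, beam.coders.RowCoder)

with beam.Pipeline(options=PipelineOptions()) as p:
    _ = (
        p
        | beam.Create([MutationRow('k1', 'v1')]).with_output_types(MutationRow)
        | ExternalTransform(
            SPANNER_WRITE_URN,
            ImplicitSchemaPayloadBuilder({
                'project_id': 'my-project',
                'instance_id': 'my-instance',
                'database_id': 'my-database',
                'table': 'my-table',
            }),
            EXPANSION_SERVICE))

I start the pipeline with the usual options, e.g. --runner=FlinkRunner 
--flink_master=<address> for the Flink run and the default DirectRunner 
otherwise.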
