That is strange. I did nothing special but cloned your repo and then:
1. docker-compose -f docker-compose.yaml up
2. ran a simple t.py test both ways, and it works well
t.py:
import apache_beam as beam

with beam.Pipeline() as p:
    _ = (
        p
        | beam.Create(
            [(
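For completeness, a minimal runnable version of such a test might look like the following; the Create elements and the trailing Map(print) step are assumptions, since the snippet above is cut off:

```python
import apache_beam as beam

with beam.Pipeline() as p:
    _ = (
        p
        # Placeholder elements; the original snippet is truncated at beam.Create.
        | beam.Create([("k1", "v1"), ("k2", "v2")])
        | beam.Map(print)
    )
```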
Hi XQ,
Sorry to bother you again, but I've tested the same thing again in a Linux
env, and it is still not working, showing the same error in the python
worker harness. (Note that this doesn't fail immediately; it fails after the
task is assigned to the task manager and the python worker harness starts.)
I did not do anything special but ran `docker-compose -f
docker-compose.yaml up` from your repo.
Hi XQ,
The code is simplified from my previous work and thus it is still using the
old version. But I've tested with Beam 2.54.0 and the code still works (I
mean, using my company's image). If this is running well on your Linux
machine, I guess there could be something related to how I build the Docker image.
I cloned your repo on my Linux machine, which made it super easy to run. Not
sure why you use Beam 2.41, but anyway, I tried this on my Linux machine:
python t.py \
  --topic test --group test-group --bootstrap-server localhost:9092 \
  --job_endpoint localhost:8099 \
  --artifact_endpoint localhost:8
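As a rough sketch of how a script like t.py might wire these flags together, assuming it parses the Kafka arguments itself and passes the remaining flags (--job_endpoint, --artifact_endpoint, etc.) through to Beam's portable pipeline options; the argument names mirror the command above, everything else is an assumption:

```python
import argparse

import apache_beam as beam
from apache_beam.io.kafka import ReadFromKafka
from apache_beam.options.pipeline_options import PipelineOptions

parser = argparse.ArgumentParser()
parser.add_argument("--topic")
parser.add_argument("--group")
parser.add_argument("--bootstrap-server", dest="bootstrap_server")
known, beam_args = parser.parse_known_args()

# --job_endpoint / --artifact_endpoint are standard portable-runner options,
# so they can be passed straight through to PipelineOptions.
options = PipelineOptions(beam_args, runner="PortableRunner", streaming=True)

with beam.Pipeline(options=options) as p:
    _ = (
        p
        | ReadFromKafka(
            consumer_config={
                "bootstrap.servers": known.bootstrap_server,
                "group.id": known.group,
            },
            topics=[known.topic],
        )
        | beam.Map(print)
    )
```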
Hi,
Just FYI, the same thing works on a different image, the one I built using
my company's image as the base image; I've only replaced the base image with
Ubuntu. But given that the error log is completely unhelpful, it's really
hard for me to continue debugging this issue.
The do
Hello,
The pipeline runs on the host, while host.docker.internal would only be
resolved in containers that run with the host network mode. I guess the
pipeline cannot resolve host.docker.internal and fails to run.
If everything before ReadFromKafka works successfully, a docker container
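One way to sanity-check that part, assuming the broker runs as a docker-compose service (the service name kafka below is hypothetical), is to point bootstrap.servers at an address that resolves from inside the containers that actually run the connector, rather than one that only resolves on the host:

```python
from apache_beam.io.kafka import ReadFromKafka

# Hypothetical: "kafka:9092" assumes the broker is a compose service named
# "kafka" on the same docker-compose network as the job server and the
# python worker harness; localhost or host.docker.internal as seen from the
# host may not resolve there.
kafka_source = ReadFromKafka(
    consumer_config={"bootstrap.servers": "kafka:9092"},
    topics=["test"],
)
```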
Hi,
I have an issue when setting up a POC of the Python SDK with the Flink
runner to run in docker-compose. The python worker harness was not returning
any error, but:
```
python-worker-harness-1 | 2024/03/17 07:10:17 Executing: python -m apache_beam.runners.worker.sdk_worker_main
python-worker-harness-