Is it possible to add a component/unit test for this case? I believe we should aim for all precommits to be executable on an isolated single box.
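A minimal sketch of what such a test could look like, since the failure in the traceback quoted below happens inside PipelineOptions.get_all_options() and needs no runner or GCP project. The '--beam_plugins' value and the idea of passing it twice are my assumptions drawn from the quoted traceback, not a verified reproduction:

import unittest

from apache_beam.options.pipeline_options import PipelineOptions


class UnknownFlagOptionsTest(unittest.TestCase):

  def test_get_all_options_tolerates_repeated_unknown_flag(self):
    # '--beam_plugins' is lifted from the traceback quoted below; passing it
    # twice is an assumption about how the duplicate registration might be
    # triggered, not a confirmed repro of the Dataflow worker failure.
    options = PipelineOptions(
        ['--beam_plugins', 'plugin.A', '--beam_plugins', 'plugin.B'])
    # The worker died with argparse.ArgumentError inside this call; a unit
    # test only needs to assert that the call completes without raising.
    options.get_all_options(drop_default=True)


if __name__ == '__main__':
  unittest.main()

Whether that exact input triggers the regression would need checking against BEAM-5442, but something in this shape runs in seconds on a single box. The traceback below is ultimately a plain argparse conflict; a standalone illustration of that follows the quoted thread.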
Regards,
--Mikhail

On Mon, Oct 8, 2018, 09:27 Udi Meiri <eh...@google.com> wrote:

> It's the current status: I believe having a basic wordcount integration
> test in precommit would have caught this issue, since it seems to have
> broken all tests using the Dataflow service.
>
> On Sun, Oct 7, 2018 at 9:06 PM Kenneth Knowles <k...@apache.org> wrote:
>
>> Out of curiosity - is it a logical necessity, or just current status,
>> that one needs to run a full job to catch this?
>>
>> On Sat, Oct 6, 2018 at 4:26 AM Maximilian Michels <m...@apache.org> wrote:
>>
>>> My changes to the Python option parsing broke the PostCommit. PreCommit
>>> passed, as well as the Portable Runner tests. Sorry about that.
>>>
>>> +1 It would be great to have some more basic integration tests in the
>>> PreCommit. That will give us more confidence before merge without always
>>> running the PostCommit.
>>>
>>> > The same goes for Flink: we should be running wordcount and
>>> > wordcount_streaming integration tests as part of pre-commit tests.
>>>
>>> Can look into that.
>>>
>>> Thanks,
>>> Max
>>>
>>> On 05.10.18 23:08, Ahmet Altay wrote:
>>> > On Fri, Oct 5, 2018 at 1:51 PM, Udi Meiri <eh...@google.com> wrote:
>>> >
>>> >     I was sure that we ran some basic Dataflow integration tests in
>>> >     pre-commit, and that they should have caught this issue.
>>> >     But then I remembered that we only have those in Java SDK.
>>> >     I opened this bug to add end-to-end tests to Python pre-commits as
>>> >     well: https://issues.apache.org/jira/browse/BEAM-5058
>>> >
>>> > +1
>>> >
>>> >     The same goes for Flink: we should be running wordcount and
>>> >     wordcount_streaming integration tests as part of pre-commit tests.
>>> >
>>> >     On Fri, Oct 5, 2018 at 1:37 PM Thomas Weise <t...@apache.org> wrote:
>>> >
>>> >         Fixed. Can someone please take a look at the usage of
>>> >         the --beam_plugins flag in the Dataflow runner so that we can
>>> >         address the root cause?
>>> >
>>> >         We can probably do more to avoid Friday Python post commit
>>> >         excitement. In this case, extra checking was done pre-merge by
>>> >         running the Python VR tests for Flink, but the failure occurred
>>> >         with the Dataflow runner.
>>> >
>>> > It would be good to have a list of what post commit tests are available,
>>> > what they test, and what the keyword is to trigger them from a PR.
>>> >
>>> >         The changes were pipeline options related, so (pre-existing)
>>> >         test coverage should have been better.
>>> >
>>> >         But beyond that, we can probably make it easier for contributors
>>> >         and reviewers to know what extra checks are available and
>>> >         possibly appropriate to run pre-commit. Should we add some
>>> >         pointers to
>>> >         https://beam.apache.org/contribute/testing/#pre-commit
>>> >         or is there a better place?
>>> >
>>> >         Thanks
>>> >
>>> >         On Fri, Oct 5, 2018 at 10:38 AM Udi Meiri <eh...@google.com> wrote:
>>> >
>>> >             More details in
>>> >             https://issues.apache.org/jira/browse/BEAM-5442
>>> >
>>> >             On Fri, Oct 5, 2018 at 10:26 AM Udi Meiri <eh...@google.com> wrote:
>>> >
>>> >                 I'm seeing these errors at least in one test:
>>> >                 "Python sdk harness failed:
>>> >                 Traceback (most recent call last):
>>> >                   File "/usr/local/lib/python2.7/dist-packages/apache_beam/runners/worker/sdk_worker_main.py", line 133, in main
>>> >                     sdk_pipeline_options.get_all_options(drop_default=True))
>>> >                   File "/usr/local/lib/python2.7/dist-packages/apache_beam/options/pipeline_options.py", line 224, in get_all_options
>>> >                     parser.add_argument(arg.split('=', 1)[0], nargs='?')
>>> >                   File "/usr/lib/python2.7/argparse.py", line 1308, in add_argument
>>> >                     return self._add_action(action)
>>> >                   File "/usr/lib/python2.7/argparse.py", line 1682, in _add_action
>>> >                     self._optionals._add_action(action)
>>> >                   File "/usr/lib/python2.7/argparse.py", line 1509, in _add_action
>>> >                     action = super(_ArgumentGroup, self)._add_action(action)
>>> >                   File "/usr/lib/python2.7/argparse.py", line 1322, in _add_action
>>> >                     self._check_conflict(action)
>>> >                   File "/usr/lib/python2.7/argparse.py", line 1460, in _check_conflict
>>> >                     conflict_handler(action, confl_optionals)
>>> >                   File "/usr/lib/python2.7/argparse.py", line 1467, in _handle_conflict_error
>>> >                     raise ArgumentError(action, message % conflict_string)
>>> >                 ArgumentError: argument --beam_plugins: conflicting option string(s): --beam_plugins"
>>> >
>>> >                 This looks like https://github.com/apache/beam/pull/6557
>>> >
>>> >                 On Fri, Oct 5, 2018 at 9:41 AM Boyuan Zhang <boyu...@google.com> wrote:
>>> >
>>> >                     Seems like tests failed:
>>> >                     test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT)
>>> >                     -> Bigquery table not found
>>> >                     test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT)
>>> >                     -> Bigquery table not found
>>> >                     streaming related tests -> Assertion errors
>>> >
>>> >                     On Fri, Oct 5, 2018 at 9:33 AM Udi Meiri <eh...@google.com> wrote:
>>> >
>>> >                         I'm seeing post-commit failures in
>>> >                         :beam-sdks-python:postCommitITTests:
>>> >                         https://builds.apache.org/job/beam_PostCommit_Python_Verify/6181/console
>>> >                         https://builds.apache.org/job/beam_PostCommit_Python_Verify/6182/console
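For reference, the ArgumentError in the traceback quoted above is ordinary argparse behaviour, independent of Beam: registering the same option string twice on one parser raises. A standalone sketch, plain Python only:

import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--beam_plugins', nargs='?')
try:
  # get_all_options() adds retained unknown args with nargs='?' (see the
  # pipeline_options.py frame in the traceback); if the same flag is
  # already registered on the parser, the default conflict handler raises.
  parser.add_argument('--beam_plugins', nargs='?')
except argparse.ArgumentError as e:
  print('Reproduced: %s' % e)

So any path that registers --beam_plugins twice on one parser, whichever code change caused it, produces exactly this failure, and a service-free test like the sketch earlier in this mail should be able to catch it.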