[ 
https://issues.apache.org/jira/browse/BEAM-6908?focusedWorklogId=238955&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-238955
 ]

ASF GitHub Bot logged work on BEAM-6908:
----------------------------------------

                Author: ASF GitHub Bot
            Created on: 08/May/19 03:02
            Start Date: 08/May/19 03:02
    Worklog Time Spent: 10m 
      Work Description: tvalentyn commented on pull request #8518: [BEAM-6908] Refactor Python performance test groovy file for easy configuration
URL: https://github.com/apache/beam/pull/8518#discussion_r281897962
 
 

 ##########
 File path: .test-infra/jenkins/job_PerformanceTests_Python.groovy
 ##########
 @@ -18,46 +18,107 @@
 
 import CommonJobProperties as commonJobProperties
 
-// This job runs the Beam Python performance tests on PerfKit Benchmarker.
-job('beam_PerformanceTests_Python'){
-  // Set default Beam job properties.
-  commonJobProperties.setTopLevelMainJobProperties(delegate)
-
-  // Run job in postcommit every 6 hours, don't trigger every push.
-  commonJobProperties.setAutoJob(
-      delegate,
-      'H */6 * * *')
-
-  // Allows triggering this build against pull requests.
-  commonJobProperties.enablePhraseTriggeringFromPullRequest(
-      delegate,
-      'Python SDK Performance Test',
-      'Run Python Performance Test')
-
-  def pipelineArgs = [
-      project: 'apache-beam-testing',
-      staging_location: 'gs://temp-storage-for-end-to-end-tests/staging-it',
-      temp_location: 'gs://temp-storage-for-end-to-end-tests/temp-it',
-      output: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output'
-  ]
-  def pipelineArgList = []
-  pipelineArgs.each({
-    key, value -> pipelineArgList.add("--$key=$value")
-  })
-  def pipelineArgsJoined = pipelineArgList.join(',')
-
-  def argMap = [
-      beam_sdk                 : 'python',
-      benchmarks               : 'beam_integration_benchmark',
-      bigquery_table           : 'beam_performance.wordcount_py_pkb_results',
-      beam_it_class            : 'apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it',
-      beam_it_module           : 'sdks/python',
-      beam_prebuilt            : 'true',  // skip beam prebuild
-      beam_python_sdk_location : 'build/apache-beam.tar.gz',
-      beam_runner              : 'TestDataflowRunner',
-      beam_it_timeout          : '1200',
-      beam_it_args             : pipelineArgsJoined,
-  ]
-
-  commonJobProperties.buildPerformanceTest(delegate, argMap)
+
+class PerformanceTestConfigurations {
+  String jobName
+  String jobDescription
+  String jobTriggerPhrase
+  String buildSchedule = 'H */6 * * *'  // every 6 hours
+  String benchmarkName = 'beam_integration_benchmark'
+  String sdk = 'python'
+  String bigqueryTable
+  String itClass
+  String itModule
+  Boolean skipPrebuild = false
+  String pythonSdkLocation
+  String runner = 'TestDataflowRunner'
+  Integer itTimeout = 1200
+  Map extraPipelineArgs
+}
+
+// Common pipeline args for Dataflow job.
+def dataflowPipelineArgs = [
+    project         : 'apache-beam-testing',
+    staging_location: 'gs://temp-storage-for-end-to-end-tests/staging-it',
+    temp_location   : 'gs://temp-storage-for-end-to-end-tests/temp-it',
+]
+
+
+// Configurations of each Jenkins job.
+def testConfigurations = [
+    new PerformanceTestConfigurations(
+        jobName           : 'beam_PerformanceTests_Python',
+        jobDescription    : 'Python SDK Performance Test',
+        jobTriggerPhrase  : 'Run Python Performance Test',
+        bigqueryTable     : 'beam_performance.wordcount_py_pkb_results',
+        skipPrebuild      : true,
+        pythonSdkLocation : 'build/apache-beam.tar.gz',
+        itClass           : 'apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it',
+        itModule          : 'sdks/python',
+        extraPipelineArgs : dataflowPipelineArgs + [
+            output: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output'
+        ],
+    ),
+    new PerformanceTestConfigurations(
+        jobName           : 'beam_PerformanceTests_Python35',
+        jobDescription    : 'Python35 SDK Performance Test',
+        jobTriggerPhrase  : 'Run Python35 Performance Test',
+        bigqueryTable     : 'beam_performance.wordcount_py35_pkb_results',
+        skipPrebuild      : true,
+        pythonSdkLocation : 'test-suites/dataflow/py35/build/apache-beam.tar.gz',
 
 Review comment:
   I wonder if we can always use the same location here. As I mentioned before, it makes no difference which interpreter version is used to create the tarball.
   If we must specify a location per suite, perhaps we could have a single parameter, such as `testRoot`, and derive the `sdkLocation` and `itModule` values from it. As mentioned above, I am also not sure what `itModule` is for.
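
   A minimal sketch of what that could look like, assuming a hypothetical `testRoot` property (the name comes from this comment; the derived paths are illustrative, not from the PR):

```groovy
// Hypothetical: one configurable root per suite; the two values the PR
// currently sets separately are derived from it instead.
class PerformanceTestConfigurations {
  String testRoot = 'sdks/python'  // e.g. 'test-suites/dataflow/py35' for Python 3.5

  // SDK tarball location, assumed to always live under <testRoot>/build.
  String getPythonSdkLocation() {
    "${testRoot}/build/apache-beam.tar.gz"
  }

  // Module path for the integration test, assumed equal to the root.
  String getItModule() {
    testRoot
  }
}
```

   This would keep each suite's configuration down to one path and avoid the two values drifting apart, at the cost of assuming the build layout is uniform across suites.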
 
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


Issue Time Tracking
-------------------

    Worklog Id:     (was: 238955)

> Add Python3 performance benchmarks
> ----------------------------------
>
>                 Key: BEAM-6908
>                 URL: https://issues.apache.org/jira/browse/BEAM-6908
>             Project: Beam
>          Issue Type: Sub-task
>          Components: testing
>            Reporter: Mark Liu
>            Assignee: Mark Liu
>            Priority: Major
>          Time Spent: 6h 50m
>  Remaining Estimate: 0h
>
> Similar to 
> [beam_PerformanceTests_Python|https://builds.apache.org/view/A-D/view/Beam/view/PerformanceTests/job/beam_PerformanceTests_Python/],
>  we want to have a Python3 benchmark running on Jenkins to detect performance 
> regression during code adoption.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
