[jira] [Commented] (BEAM-8384) Spark runner is not respecting spark.default.parallelism user defined configuration

2022-06-04 Thread Danny McCormick (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8384?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17548207#comment-17548207
 ] 

Danny McCormick commented on BEAM-8384:
---

This issue has been migrated to https://github.com/apache/beam/issues/19854

> Spark runner is not respecting spark.default.parallelism user defined 
> configuration
> ---
>
> Key: BEAM-8384
> URL: https://issues.apache.org/jira/browse/BEAM-8384
> Project: Beam
>  Issue Type: Bug
>  Components: runner-spark
>Affects Versions: 2.16.0
>Reporter: Ismaël Mejía
>Priority: P3
>
> It was reported on [the mailing 
> list|https://lists.apache.org/thread.html/792fb7fc2a5113837fbcdafce6a5d9100309881b366c1a7163d2c898@%3Cdev.beam.apache.org%3E]
>  that the Spark runner is not respecting the user-defined Spark default 
> parallelism configuration. We should investigate and, if that is the case, 
> ensure that a user-defined configuration is always respected. Runner 
> optimizations should apply only to default (unconfigured) values; otherwise we 
> will confuse users and prevent them from tuning Spark as they see fit.
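For context, a minimal, hypothetical sketch of the configuration in question (class name, master, and values are illustrative, and the provided-SparkContext route is only one of the ways to pass the setting): the user sets spark.default.parallelism explicitly and expects the runner to honor it rather than override it.

{code:java}
import org.apache.beam.runners.spark.SparkContextOptions;
import org.apache.beam.runners.spark.SparkRunner;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class ParallelismExample {
  public static void main(String[] args) {
    // The user sets spark.default.parallelism explicitly on their own context...
    SparkConf conf = new SparkConf()
        .setAppName("parallelism-example")
        .setMaster("local[4]")
        .set("spark.default.parallelism", "200");
    JavaSparkContext jsc = new JavaSparkContext(conf);

    // ...and hands that context to the Spark runner.
    SparkContextOptions options = PipelineOptionsFactory.as(SparkContextOptions.class);
    options.setRunner(SparkRunner.class);
    options.setUsesProvidedSparkContext(true);
    options.setProvidedSparkContext(jsc);

    Pipeline p = Pipeline.create(options);
    p.apply(Create.of(1, 2, 3, 4, 5));
    // The expectation is that downstream shuffles use the configured
    // parallelism (200) rather than a runner-derived value.
    p.run().waitUntilFinish();
  }
}
{code}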





[jira] [Commented] (BEAM-8384) Spark runner is not respecting spark.default.parallelism user defined configuration

2020-06-16 Thread Beam JIRA Bot (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8384?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17137107#comment-17137107
 ] 

Beam JIRA Bot commented on BEAM-8384:
-

This issue was marked "stale-P2" and has not received a public comment in 14 
days. It is now automatically moved to P3. If you are still affected by it, you 
can comment and move it back to P2.

> Spark runner is not respecting spark.default.parallelism user defined 
> configuration
> ---
>
> Key: BEAM-8384
> URL: https://issues.apache.org/jira/browse/BEAM-8384
> Project: Beam
>  Issue Type: Bug
>  Components: runner-spark
>Affects Versions: 2.16.0
>Reporter: Ismaël Mejía
>Priority: P3
>
> It was reported on [the mailing 
> list|https://lists.apache.org/thread.html/792fb7fc2a5113837fbcdafce6a5d9100309881b366c1a7163d2c898@%3Cdev.beam.apache.org%3E]
>  that the Spark runner is not respecting the user-defined Spark default 
> parallelism configuration. We should investigate and, if that is the case, 
> ensure that a user-defined configuration is always respected. Runner 
> optimizations should apply only to default (unconfigured) values; otherwise we 
> will confuse users and prevent them from tuning Spark as they see fit.





[jira] [Commented] (BEAM-8384) Spark runner is not respecting spark.default.parallelism user defined configuration

2020-06-01 Thread Beam JIRA Bot (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8384?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17122626#comment-17122626
 ] 

Beam JIRA Bot commented on BEAM-8384:
-

This issue is P2 but has been unassigned without any comment for 60 days, so it 
has been labeled "stale-P2". If this issue is still affecting you, we care! 
Please comment and remove the label. Otherwise, in 14 days the issue will be 
moved to P3.

Please see https://beam.apache.org/contribute/jira-priorities/ for a detailed 
explanation of what these priorities mean.


> Spark runner is not respecting spark.default.parallelism user defined 
> configuration
> ---
>
> Key: BEAM-8384
> URL: https://issues.apache.org/jira/browse/BEAM-8384
> Project: Beam
>  Issue Type: Bug
>  Components: runner-spark
>Affects Versions: 2.16.0
>Reporter: Ismaël Mejía
>Priority: P2
>  Labels: stale-P2
>
> It was reported on [the mailing 
> list|https://lists.apache.org/thread.html/792fb7fc2a5113837fbcdafce6a5d9100309881b366c1a7163d2c898@%3Cdev.beam.apache.org%3E]
>  that the Spark runner is not respecting the user-defined Spark default 
> parallelism configuration. We should investigate and, if that is the case, 
> ensure that a user-defined configuration is always respected. Runner 
> optimizations should apply only to default (unconfigured) values; otherwise we 
> will confuse users and prevent them from tuning Spark as they see fit.





[jira] [Commented] (BEAM-8384) Spark runner is not respecting spark.default.parallelism user defined configuration

2019-11-19 Thread Jira


[ 
https://issues.apache.org/jira/browse/BEAM-8384?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16978142#comment-16978142
 ] 

Ismaël Mejía commented on BEAM-8384:


I had forgotten about this one. It looks like a regression, but I am not 
convinced it is a blocker. I will move the Fix Version tag to unblock the 
release and send a cherry-pick later if there is still time.

> Spark runner is not respecting spark.default.parallelism user defined 
> configuration
> ---
>
> Key: BEAM-8384
> URL: https://issues.apache.org/jira/browse/BEAM-8384
> Project: Beam
>  Issue Type: Bug
>  Components: runner-spark
>Affects Versions: 2.16.0
>Reporter: Ismaël Mejía
>Assignee: Ismaël Mejía
>Priority: Major
>
> It was reported on [the mailing 
> list|https://lists.apache.org/thread.html/792fb7fc2a5113837fbcdafce6a5d9100309881b366c1a7163d2c898@%3Cdev.beam.apache.org%3E]
>  that the Spark runner is not respecting the user-defined Spark default 
> parallelism configuration. We should investigate and, if that is the case, 
> ensure that a user-defined configuration is always respected. Runner 
> optimizations should apply only to default (unconfigured) values; otherwise we 
> will confuse users and prevent them from tuning Spark as they see fit.





[jira] [Commented] (BEAM-8384) Spark runner is not respecting spark.default.parallelism user defined configuration

2019-11-19 Thread Kenneth Knowles (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8384?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16978068#comment-16978068
 ] 

Kenneth Knowles commented on BEAM-8384:
---

[~iemejia] is this in progress? Is there a chance for it to be done for 2.17.0? 
Is it a regression?

> Spark runner is not respecting spark.default.parallelism user defined 
> configuration
> ---
>
> Key: BEAM-8384
> URL: https://issues.apache.org/jira/browse/BEAM-8384
> Project: Beam
>  Issue Type: Bug
>  Components: runner-spark
>Affects Versions: 2.16.0
>Reporter: Ismaël Mejía
>Assignee: Ismaël Mejía
>Priority: Major
> Fix For: 2.17.0
>
>
> It was reported on [the mailing 
> list|https://lists.apache.org/thread.html/792fb7fc2a5113837fbcdafce6a5d9100309881b366c1a7163d2c898@%3Cdev.beam.apache.org%3E]
>  that the Spark runner is not respecting the user-defined Spark default 
> parallelism configuration. We should investigate and, if that is the case, 
> ensure that a user-defined configuration is always respected. Runner 
> optimizations should apply only to default (unconfigured) values; otherwise we 
> will confuse users and prevent them from tuning Spark as they see fit.





[jira] [Commented] (BEAM-8384) Spark runner is not respecting spark.default.parallelism user defined configuration

2019-10-11 Thread Ryan Skraba (Jira)


[ 
https://issues.apache.org/jira/browse/BEAM-8384?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16949333#comment-16949333
 ] 

Ryan Skraba commented on BEAM-8384:
---

Related to BEAM-8191 (exploding number of partitions during flatten).

There could definitely be some cleanup around 
[getPartitioner|https://github.com/apache/beam/blob/a5e7e671d4571d86991151e79586c98fd107a2b1/runners/spark/src/main/java/org/apache/beam/runners/spark/translation/TransformTranslator.java#L571]
 and how bundleSize impacts the partitioner choice.
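
For reference, a paraphrased sketch of the kind of decision being discussed (simplified signature, not the actual Beam source): whether an explicit partitioner is supplied at all hinges on bundleSize, and only the non-bundled branch ever picks up the context's default parallelism.

{code:java}
import org.apache.spark.HashPartitioner;
import org.apache.spark.Partitioner;
import org.apache.spark.api.java.JavaSparkContext;

class PartitionerChoiceSketch {
  // Paraphrased sketch: when bundleSize > 0 no explicit partitioner is
  // returned; otherwise a HashPartitioner sized from the Spark context's
  // defaultParallelism is used. Only the second branch would ever reflect a
  // user-supplied spark.default.parallelism.
  static Partitioner choosePartitioner(JavaSparkContext jsc, long bundleSize) {
    return (bundleSize > 0)
        ? null // let Spark decide partitioning downstream
        : new HashPartitioner(jsc.defaultParallelism());
  }
}
{code}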

> Spark runner is not respecting spark.default.parallelism user defined 
> configuration
> ---
>
> Key: BEAM-8384
> URL: https://issues.apache.org/jira/browse/BEAM-8384
> Project: Beam
>  Issue Type: Bug
>  Components: runner-spark
>Affects Versions: 2.16.0
>Reporter: Ismaël Mejía
>Assignee: Ismaël Mejía
>Priority: Major
> Fix For: 2.17.0
>
>
> It was reported on [the mailing 
> list|https://lists.apache.org/thread.html/792fb7fc2a5113837fbcdafce6a5d9100309881b366c1a7163d2c898@%3Cdev.beam.apache.org%3E]
>  that the Spark runner is not respecting the user-defined Spark default 
> parallelism configuration. We should investigate and, if that is the case, 
> ensure that a user-defined configuration is always respected. Runner 
> optimizations should apply only to default (unconfigured) values; otherwise we 
> will confuse users and prevent them from tuning Spark as they see fit.


