[jira] [Updated] (SPARK-21685) Params isSet in scala Transformer triggered by _setDefault in pyspark
[ https://issues.apache.org/jira/browse/SPARK-21685?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ratan Rai Sur updated SPARK-21685:
----------------------------------
Description:

I'm trying to write a PySpark wrapper for a Transformer whose transform method includes the line

{code:scala}
require(!(isSet(outputNodeName) && isSet(outputNodeIndex)), "Can't set both outputNodeName and outputNodeIndex")
{code}

This should throw an exception only when both of these parameters are explicitly set.

In the PySpark wrapper for the Transformer, __init__ contains this line:

{code:python}
self._setDefault(outputNodeIndex=0)
{code}

Here is the line in the main Python script showing how the model is configured:

{code:python}
cntkModel = CNTKModel().setInputCol("images").setOutputCol("output").setModelLocation(spark, model.uri).setOutputNodeName("z")
{code}

As you can see, only setOutputNodeName is explicitly set, but the exception is still thrown.

If you need more context, https://github.com/RatanRSur/mmlspark/tree/default-cntkmodel-output is the branch with the code; the files I'm referring to here are CNTKModel.scala and _CNTKModel.py.

> Params isSet in scala Transformer triggered by _setDefault in pyspark
> ---------------------------------------------------------------------
>
>                 Key: SPARK-21685
>                 URL: https://issues.apache.org/jira/browse/SPARK-21685
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.1.0
>            Reporter: Ratan Rai Sur

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
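The behavior the require(...) guard relies on can be sketched with a minimal Python stand-in for Spark ML's Params semantics. This is illustrative only (not the actual pyspark.ml or Scala implementation, and the method names are hypothetical): an explicit set() should mark a param as "set", while a default installed via setDefault() should not.

```python
class Params:
    """Minimal stand-in for Spark ML Params (illustrative, not pyspark.ml)."""

    def __init__(self):
        self._params = {}    # values explicitly set by the user
        self._defaults = {}  # default values, invisible to is_set()

    def set(self, name, value):
        self._params[name] = value
        return self

    def set_default(self, name, value):
        self._defaults[name] = value
        return self

    def is_set(self, name):
        # Only explicit set() calls count; defaults must not register here.
        return name in self._params

    def get_or_default(self, name):
        return self._params.get(name, self._defaults.get(name))


model = Params().set_default("outputNodeIndex", 0).set("outputNodeName", "z")

# Mirrors the require(...) in transform(): the guard should pass, because
# only outputNodeName was explicitly set; outputNodeIndex is just a default.
assert not (model.is_set("outputNodeName") and model.is_set("outputNodeIndex"))
```

Under these semantics the configuration above is legal, which is why the exception reported here is unexpected.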
[jira] [Updated] (SPARK-21685) Params isSet in scala Transformer triggered by _setDefault in pyspark
[ https://issues.apache.org/jira/browse/SPARK-21685?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ratan Rai Sur updated SPARK-21685:
----------------------------------
Description: (only the file listing changed; the rest of the description is the same as above)

If you need more context, https://github.com/RatanRSur/mmlspark/tree/default-cntkmodel-output is the branch with the code. The files I'm referring to here are:

src/cntk-model/src/main/scala/CNTKModel.scala
src/src/main/resources/mmlspark/_CNTKModel.py
notebooks/tests/301 - CIFAR10 CNTK CNN Evaluation.ipynb
[jira] [Updated] (SPARK-21685) Params isSet in scala Transformer triggered by _setDefault in pyspark
[ https://issues.apache.org/jira/browse/SPARK-21685?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ratan Rai Sur updated SPARK-21685:
----------------------------------
Description: (only the file listing changed; the rest of the description is the same as above)

If you need more context, https://github.com/RatanRSur/mmlspark/tree/default-cntkmodel-output is the branch with the code. The files I'm referring to here that are tracked are the following:

src/cntk-model/src/main/scala/CNTKModel.scala
notebooks/tests/301 - CIFAR10 CNTK CNN Evaluation.ipynb

The PySpark wrapper code is autogenerated.
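One plausible mechanism for the reported behavior, sketched below with hypothetical names (this is not the actual mmlspark/pyspark/py4j codegen): if the autogenerated wrapper forwards Python-side defaults to the JVM object with an explicit set rather than a setDefault, then on the Scala side the default becomes indistinguishable from a user-supplied value, and isSet(outputNodeIndex) returns true.

```python
class ScalaParams:
    """Hypothetical stand-in for the JVM-side Params object."""

    def __init__(self):
        self._set_params = {}
        self._defaults = {}

    def set(self, name, value):          # explicit set -> is_set() == True
        self._set_params[name] = value

    def set_default(self, name, value):  # default -> is_set() stays False
        self._defaults[name] = value

    def is_set(self, name):
        return name in self._set_params


def buggy_transfer(jvm_obj, py_defaults):
    # Suspected bug: defaults are pushed across the bridge with set(),
    # so the JVM side can't tell a default from a user-supplied value.
    for name, value in py_defaults.items():
        jvm_obj.set(name, value)


def fixed_transfer(jvm_obj, py_defaults):
    # Fix: forward defaults with set_default() so is_set() stays False.
    for name, value in py_defaults.items():
        jvm_obj.set_default(name, value)


buggy = ScalaParams()
buggy_transfer(buggy, {"outputNodeIndex": 0})
buggy.set("outputNodeName", "z")
# Both params now look explicitly set, so a guard equivalent to
# require(!(isSet(outputNodeName) && isSet(outputNodeIndex))) would fire.
assert buggy.is_set("outputNodeName") and buggy.is_set("outputNodeIndex")

fixed = ScalaParams()
fixed_transfer(fixed, {"outputNodeIndex": 0})
fixed.set("outputNodeName", "z")
# With defaults forwarded as defaults, the guard passes as intended.
assert fixed.is_set("outputNodeName") and not fixed.is_set("outputNodeIndex")
```

If this is what the generated wrapper does, forwarding defaults through the Scala setDefault instead of the setter would restore the expected isSet semantics.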