[GitHub] spark issue #16486: [SPARK-13610][ML] Create a Transformer to disassemble ve...

2018-08-02 Thread AlbertPlaPlanas
Github user AlbertPlaPlanas commented on the issue:

https://github.com/apache/spark/pull/16486
  
Was this ever implemented?





[GitHub] spark issue #16486: [SPARK-13610][ML] Create a Transformer to disassemble ve...

2017-12-14 Thread AmplabJenkins
Github user AmplabJenkins commented on the issue:

https://github.com/apache/spark/pull/16486
  
Can one of the admins verify this patch?





[GitHub] spark issue #16486: [SPARK-13610][ML] Create a Transformer to disassemble ve...

2017-05-01 Thread leonfl
Github user leonfl commented on the issue:

https://github.com/apache/spark/pull/16486
  
@mrjrdnthms, yes, your understanding is correct. In Scala it looks like this:

```
val rows: RDD[Row] = df.rdd.map(
  rowIn => {
    // handle rowIn and return a Row
  }
)
val newDF = df.sqlContext.createDataFrame(rows, /* create the newDF schema */)
```
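
A minimal sketch of how the elided pieces might be filled in, assuming Spark 2.x, an existing DataFrame `df`, and an illustrative vector column named "features" of known fixed length (none of these names come from the patch itself):

```
import org.apache.spark.ml.linalg.Vector
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{DoubleType, StructField, StructType}

// Assume df has a vector column "features" whose length is known up front.
val vecSize = 3
val rows: RDD[Row] = df.rdd.map { rowIn =>
  val v = rowIn.getAs[Vector]("features")
  // Append each element of the vector to the row as its own double field.
  Row.fromSeq(rowIn.toSeq ++ v.toArray)
}
val newSchema = StructType(
  df.schema.fields ++
    (0 until vecSize).map(i => StructField(s"features_$i", DoubleType, nullable = false))
)
val newDF = df.sqlContext.createDataFrame(rows, newSchema)
```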





[GitHub] spark issue #16486: [SPARK-13610][ML] Create a Transformer to disassemble ve...

2017-04-30 Thread mrjrdnthms
Github user mrjrdnthms commented on the issue:

https://github.com/apache/spark/pull/16486
  
@leonfl The Python UDF is too slow for my task. By "mapPartitions and a row 
iterator" do you mean doing the transformation on the RDD directly instead of 
on the DataFrame? Sorry for the basic question; I am new to Spark. Thanks for 
the help.





[GitHub] spark issue #16486: [SPARK-13610][ML] Create a Transformer to disassemble ve...

2017-04-24 Thread leonfl
Github user leonfl commented on the issue:

https://github.com/apache/spark/pull/16486
  
@mrjrdnthms, this is implemented with a UDF, which runs a little slower but is 
easy to use.
If you want it to run faster, you can implement it with mapPartitions and a row 
iterator instead of a UDF.
That implementation reduces the running time considerably.
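
A rough sketch of that mapPartitions approach, assuming Spark 2.x, an existing DataFrame `df`, and an illustrative vector column named "features" of known fixed length; these names are assumptions, not part of the patch:

```
import org.apache.spark.ml.linalg.Vector
import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{DoubleType, StructField, StructType}

val vecSize = 3
// Walk each partition's row iterator in a single pass instead of invoking a UDF per row.
val rows = df.rdd.mapPartitions { iter =>
  iter.map { rowIn =>
    val v = rowIn.getAs[Vector]("features")
    Row.fromSeq(rowIn.toSeq ++ v.toArray)
  }
}
val schema = StructType(
  df.schema.fields ++
    (0 until vecSize).map(i => StructField(s"features_$i", DoubleType, nullable = false))
)
val disassembled = df.sqlContext.createDataFrame(rows, schema)
```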





[GitHub] spark issue #16486: [SPARK-13610][ML] Create a Transformer to disassemble ve...

2017-04-19 Thread mrjrdnthms
Github user mrjrdnthms commented on the issue:

https://github.com/apache/spark/pull/16486
  
I could use this. I have a UDF to pick out the single values I want, but my 
implementation is slow. Here is my Python UDF:
`probTrue_udf = udf(lambda value: value[1].item(), FloatType())`
I was hoping there would be a lower-level API that does the disassemble 
transformation quickly.
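
Much of the overhead of a Python UDF comes from serializing rows between the JVM and the Python workers, so one workaround (not part of this patch) is to do the same extraction with a JVM-side UDF. A minimal sketch, assuming a predictions DataFrame with a "probability" vector column; the names are illustrative:

```
import org.apache.spark.ml.linalg.Vector
import org.apache.spark.sql.functions.{col, udf}

// Extract the second element of the probability vector, i.e. P(label = 1).
val probTrue = udf((v: Vector) => v(1))
val withProbTrue = predictions.withColumn("probTrue", probTrue(col("probability")))
```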





[GitHub] spark issue #16486: [SPARK-13610][ML] Create a Transformer to disassemble ve...

2017-01-08 Thread leonfl
Github user leonfl commented on the issue:

https://github.com/apache/spark/pull/16486
  
@jkbradley, could you also help review this patch, since you are familiar 
with this issue? Thanks.





[GitHub] spark issue #16486: [SPARK-13610][ML] Create a Transformer to disassemble ve...

2017-01-06 Thread leonfl
Github user leonfl commented on the issue:

https://github.com/apache/spark/pull/16486
  
@mengxr, could you help review this patch? Thanks.





[GitHub] spark issue #16486: [SPARK-13610][ML] Create a Transformer to disassemble ve...

2017-01-06 Thread leonfl
Github user leonfl commented on the issue:

https://github.com/apache/spark/pull/16486
  
It's a counterpart to VectorAssembler that makes it easy for users to move 
between individual fields and a vector field.
Pulling out a single field is easy, but pulling out all of the fields in a 
vector still requires some code from users. This Transformer would make that 
easier to understand and use, right?
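
For reference, the existing VectorAssembler goes in the opposite direction; a minimal usage sketch with illustrative column names (and an existing DataFrame `df`) shows the kind of API the proposed disassembler would mirror:

```
import org.apache.spark.ml.feature.VectorAssembler

// Combine individual numeric columns into a single vector column.
val assembler = new VectorAssembler()
  .setInputCols(Array("age", "income"))
  .setOutputCol("features")
val assembled = assembler.transform(df)
```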





[GitHub] spark issue #16486: [SPARK-13610][ML] Create a Transformer to disassemble ve...

2017-01-06 Thread srowen
Github user srowen commented on the issue:

https://github.com/apache/spark/pull/16486
  
I don't think this is worth adding. It's pretty easy to pull a single 
field out of a vector already.





[GitHub] spark issue #16486: [SPARK-13610][ML] Create a Transformer to disassemble ve...

2017-01-05 Thread AmplabJenkins
Github user AmplabJenkins commented on the issue:

https://github.com/apache/spark/pull/16486
  
Can one of the admins verify this patch?

