Re: SparkR + binary type + how to get value

2019-02-19 Thread Felix Cheung
Thijs Haarhuis wrote (Tuesday, February 19, 2019): Hi Felix, thanks. I got it working now by using the unlist function. I have another question that maybe you can help me with, since I did …
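
A minimal sketch of the unlist approach mentioned above, reusing the results name that appears in the code later in this thread; the payload column name is a placeholder, and the final line assumes the bytes hold UTF-8 text:

  # Collect the SparkDataFrame to a local R data.frame; per this thread,
  # the binary column comes back as a list column of raw vectors.
  local_df <- collect(results)

  # unlist() strips the wrapping list and leaves the bare raw vector.
  raw_bytes <- unlist(local_df$payload[1])

  # If the bytes are UTF-8 text, decode them to an R character string.
  rawToChar(raw_bytes)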

Re: SparkR + binary type + how to get value

2019-02-19 Thread Thijs Haarhuis
Felix Cheung wrote (Sunday, February 17, 2019): A byte buffer in R is the raw vector type, so it seems to be working as expected. What do you have in the raw …
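
As background for the raw vector remark, a base-R-only illustration (no Spark involved) of how a byte buffer is represented in R:

  # charToRaw() turns a string into a raw vector, R's byte-buffer type.
  bytes <- charToRaw("SparkR")
  class(bytes)      # "raw"
  bytes             # 53 70 61 72 6b 52 (hexadecimal byte values)
  rawToChar(bytes)  # back to "SparkR"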

Re: SparkR + binary type + how to get value

2019-02-17 Thread Felix Cheung
Thijs Haarhuis wrote (Thursday, February 14, 2019): Hi Felix, sure. I have the following code:
  printSchema(results)
  cat("\n\n\n")
  firstRow <- first(results …
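
The code in that snippet is cut off by the archive; a hedged sketch of the same kind of inspection, with a hypothetical payload column name added for the last step:

  # printSchema() prints the Spark-side schema; a binary column is
  # reported with type "binary".
  printSchema(results)
  cat("\n\n\n")

  # In SparkR, first() brings the first row back as a one-row local
  # R data.frame.
  firstRow <- first(results)

  # Inspect how the binary cell is represented on the R side.
  str(firstRow$payload)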

Re: SparkR + binary type + how to get value

2019-02-14 Thread Thijs Haarhuis
Any idea how to get the actual value, or how to process the individual bytes? Thanks, Thijs. Felix Cheung wrote (Thursday, February 14, 2019): Please share …
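
On processing the individual bytes: once the raw vector is extracted, base R covers this directly. A small sketch, assuming raw_bytes is the raw vector pulled out of the collected column:

  # Individual bytes can be indexed like any other vector element.
  raw_bytes[1]            # first byte, printed in hex

  # Convert to integers in the range 0-255 for arithmetic or comparison.
  as.integer(raw_bytes)

  # Number of bytes in the buffer.
  length(raw_bytes)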

Re: SparkR + binary type + how to get value

2019-02-13 Thread Felix Cheung
Please share your code. Thijs Haarhuis wrote (Wednesday, February 13, 2019): Hi all, does anybody have any experience in accessing the data from a column which has a binary …

SparkR + binary type + how to get value

2019-02-13 Thread Thijs Haarhuis
Hi all, does anybody have any experience in accessing the data from a column of binary type in a Spark DataFrame in R? I have a Spark DataFrame with a column of binary type. I want to access this data and process it. In my case I collect the Spark DataFrame to an R …
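
For completeness, a sketch of the setup the question implies: start a SparkR session, load a dataset that has a binary column, and collect it locally. The parquet path and source format are placeholders, not taken from the thread; the unlist step from the 2019-02-19 reply above then applies to the collected column.

  library(SparkR)
  sparkR.session()

  # Read a dataset that contains a binary-typed column.
  results <- read.df("/path/to/data.parquet", source = "parquet")
  printSchema(results)

  # collect() returns a local R data.frame; the binary column arrives
  # as a list of raw vectors (see the unlist sketch earlier on this page).
  local_df <- collect(results)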