Hello Jacek,
I was just facing the same issue and found a possible solution:
scala> ds.select('id.as[Int], 'text.as[String]).show
+---+-----+
| _1|   _2|
+---+-----+
|  0|hello|
|  1|world|
+---+-----+
The only thing is that the resulting DS loses the field names ;-(
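If you do need the names back, one workaround (just a rough sketch, assuming the usual spark-shell implicits are in scope; the case class Record is made up for the example) is to map the typed tuple into a case class:

case class Record(id: Int, text: String)

// select as a typed tuple, then restore the field names via the case class
val named = ds.select('id.as[Int], 'text.as[String])
  .map { case (id, text) => Record(id, text) }

named.show()
// +---+-----+
// | id| text|
// +---+-----+
// |  0|hello|
// |  1|world|
// +---+-----+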
Regards,
Seba
Matt, have you tried using the --proxy-user matt parameter?
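Roughly like this (just a sketch; the master, class and jar are placeholders for whatever you are actually submitting):

spark-submit \
  --master yarn \
  --proxy-user matt \
  --class com.example.MyApp \
  my-app.jar

Note that impersonation also has to be allowed on the Hadoop side for the submitting user (the hadoop.proxyuser.* settings), otherwise the submit will fail with an impersonation error.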
On Apr 2, 2016 8:17 AM, "Mich Talebzadeh" wrote:
> Matt,
>
> What OS are you using on your laptop? Sounds like Ubuntu or something?
>
> Thanks
>
> Dr Mich Talebzadeh
Hello Michel,
I had a similar issue when running my custom-built Spark 1.6.1. In the end I
resolved it by building both Spark and my jar with the JVM bundled with CDH:
export JAVA_HOME=/usr/java/jdk1.7.0_67-cloudera/
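With JAVA_HOME pointing at the Cloudera JDK as above, the build then looks something like this (a sketch; the Maven profiles -Phadoop-2.6, -Pyarn and -Phive are assumptions about the target cluster, adjust them to yours):

# Spark 1.6.x ships make-distribution.sh at the repo root; profiles are assumptions
./make-distribution.sh --tgz -Phadoop-2.6 -Pyarn -Phive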
Regards and hope this helps,
Sebastian
On Tue, Mar 22, 2016 at 10:13 AM, Michel wrote:
Hello,
I am wondering if anyone else is also facing this issue:
https://issues.apache.org/jira/browse/SPARK-11147