Re: Spark build 1.6.2 error

2016-09-03 Thread Nachiketa
I think the difference was the -Dscala-2.11 flag on the command line. I have seen this show up when I miss that.

Regards,
Nachiketa

On Sat 3 Sep, 2016, 12:14 PM Diwakar Dhanuskodi, <diwakar.dhanusk...@gmail.com> wrote:
> Hi,
>
> Just re-ran again without killing zinc server pr
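For reference, the full build sequence being discussed looks roughly like the sketch below. The profile flags (-Pyarn, -Phadoop-2.6) are illustrative only and depend on the target cluster; the key points from this thread are running the change-scala-version script before Maven and passing -Dscala-2.11 on the command line.

```sh
# Must run before invoking Maven: rewrites the POMs for Scala 2.11.
./dev/change-scala-version.sh 2.11

# Build with the Scala 2.11 flag; the profiles shown are examples only.
./build/mvn -Pyarn -Phadoop-2.6 -Dscala-2.11 -DskipTests clean package
```

Omitting either step (the script or the -D flag) leaves the build mixing Scala 2.10 and 2.11 artifacts, which matches the kind of failure described above.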

Re: Spark build 1.6.2 error

2016-08-31 Thread Nachiketa
Hi Diwakar,

Could you please share the entire Maven command that you are using to build? And also the JDK version you are using? Could you also confirm that you executed the script to change the Scala version to 2.11 before starting the build?

Thanks.

Regards,
Nachiketa

On Wed, Aug

Re: removing header from csv file

2016-04-27 Thread Nachiketa
Why "without sqlcontext" ? Could you please describe what is it that you are trying to accomplish ? Thanks. Regards, Nachiketa On Wed, Apr 27, 2016 at 10:54 AM, Ashutosh Kumar <kmr.ashutos...@gmail.com> wrote: > I see there is a library spark-csv which can be used

Re: Spark 1.4.0, Secure YARN Cluster, Application Master throws 500 connection refused (Resolved)

2015-06-25 Thread Nachiketa
Setting yarn.resourcemanager.webapp.address.rm1 and yarn.resourcemanager.webapp.address.rm2 in yarn-site.xml seems to have resolved the issue. Would appreciate any comments on whether this is a regression from 1.3.1.

Thanks.

Regards,
Nachiketa

On Fri, Jun 26, 2015 at 1:28 AM, Nachiketa nachiketa.shu
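For readers hitting the same error, the fix described above amounts to declaring an explicit web UI address for each HA ResourceManager. A sketch of the yarn-site.xml fragment, with placeholder hostnames and the default web app port:

```xml
<!-- Explicit web app addresses for both HA ResourceManagers.
     Hostnames and ports below are placeholders for your cluster. -->
<property>
  <name>yarn.resourcemanager.webapp.address.rm1</name>
  <value>rm1.example.com:8088</value>
</property>
<property>
  <name>yarn.resourcemanager.webapp.address.rm2</name>
  <value>rm2.example.com:8088</value>
</property>
```

The rm1/rm2 suffixes must match the IDs listed in yarn.resourcemanager.ha.rm-ids; without the explicit webapp addresses, the Spark Application Master proxy can fail to resolve the RM web UI endpoint.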

Re: Spark 1.4.0, Secure YARN Cluster, Application Master throws 500 connection refused

2015-06-25 Thread Nachiketa
this issue: https://issues.apache.org/jira/browse/SPARK-5837, and multiple references to other YARN issues related to the same. Continuing to understand and explore the possibilities documented there.

Regards,
Nachiketa

On Fri, Jun 26, 2015 at 12:52 AM, Nachiketa nachiketa.shu...@gmail.com wrote: Spark

Spark 1.4.0, Secure YARN Cluster, Application Master throws 500 connection refused

2015-06-25 Thread Nachiketa
well against a non-secure cluster (same HDP distribution). No debug logs or stack trace are readily visible. Where do I look for what is going wrong? And has anything changed in Spark security that could be contributing to this?

Thank you for your help with this.

Regards,
Nachiketa