Re: Spark Dataframe returning null columns when schema is specified

2017-09-08 Thread Praneeth Gayam
What is the desired behaviour when a field is null for only a few records? You cannot avoid nulls in that case. But if all rows are guaranteed to be uniform (either all-null or all-non-null), you can *take* the first row of the DF and *drop* the columns with null fields. On Fri, Sep 8, 2017 at
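A minimal sketch of the suggestion above (the DataFrame name `df` is assumed; this only works if columns are uniformly all-null or all-non-null):

```scala
// Take the first row, find which fields are null in it,
// and drop those columns from the DataFrame.
val first = df.first()
val nullCols = df.columns.filter(c => first.isNullAt(first.fieldIndex(c)))
val cleaned = df.drop(nullCols: _*)
```

Note this inspects only a single row, so it silently drops real data if the uniformity assumption does not hold.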

Re: Chaining Spark Streaming Jobs

2017-09-08 Thread Praneeth Gayam
With a file stream you will have to deal with the following:
1. The file(s) must not be changed once created. If the files are being continuously appended, the new data will not be read. Refer
2.
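The constraint in point 1 can be sketched with a Structured Streaming file source (schema variable and paths are assumptions; files must be moved atomically into the watched directory once complete, since appends to an already-seen file are ignored):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.appName("file-stream-sketch").getOrCreate()

// File sources require an explicit schema; `eventSchema` is assumed to be
// a StructType describing the incoming records.
val input = spark.readStream
  .schema(eventSchema)
  .json("/data/incoming")   // watched directory (path assumed)
```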

Re: use WithColumn with external function in a java jar

2017-08-28 Thread Praneeth Gayam
You can create a UDF which will invoke your Java lib:

    def calculateExpense: UserDefinedFunction =
      udf((pexpense: String, cexpense: String) =>
        new MyJava().calculateExpense(pexpense.toDouble, cexpense.toDouble))

On Tue, Aug 29, 2017 at 6:53 AM, purna pradeep wrote: >
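The UDF above would then be applied with `withColumn`; a hedged usage sketch (the DataFrame `df` and column names `pexpense`/`cexpense` are assumptions from the thread context):

```scala
import org.apache.spark.sql.functions.udf
import spark.implicits._

// Attach a new "expense" column computed by the Java library via the UDF.
val withExpense = df.withColumn(
  "expense",
  calculateExpense($"pexpense", $"cexpense"))
```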

Re: Error while reading the CSV

2017-04-07 Thread Praneeth Gayam
Try the following:

    spark-shell --master yarn-client --name nayan /opt/packages/-data-prepration/target/scala-2.10/-data-prepration-assembly-1.0.jar

On Thu, Apr 6, 2017 at 6:36 PM, nayan sharma wrote: > Hi All, > I am getting error while loading CSV file. > >
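Note that `spark-shell` normally takes extra classpath jars via `--jars` rather than as a positional argument; once the shell is up, the CSV itself can be loaded along these lines (Spark 2.x API; the path and options are assumptions, since the original error message is truncated):

```scala
// Read a CSV with a header row, letting Spark infer column types.
val df = spark.read
  .option("header", "true")
  .option("inferSchema", "true")
  .csv("/path/to/file.csv")   // path assumed
```

On Spark 1.x with Scala 2.10 (as the assembly name suggests), the equivalent would go through the `com.databricks.spark.csv` data source instead.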