Turn off logs in spark-sql shell

2015-10-15 Thread Muhammad Ahsan
…Logger.getLogger("akka").setLevel(Level.OFF) Thanks -- Muhammad Ahsan
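The snippet above can be expanded into a minimal sketch of the usual pattern for muting Spark's console chatter from inside the shell. This assumes log4j 1.x is on the classpath (as it is in Spark 1.x distributions); the logger names "org" and "akka" are the conventional catch-alls for Spark's and Akka's packages, not something stated in the thread.

```scala
// Sketch: silence console logging from inside spark-shell / spark-sql.
// Assumes log4j 1.x (bundled with Spark 1.x). Paste at the REPL prompt
// before running queries; OFF suppresses everything, WARN is a softer choice.
import org.apache.log4j.{Level, Logger}

Logger.getLogger("org").setLevel(Level.OFF)
Logger.getLogger("akka").setLevel(Level.OFF)
```

A less drastic alternative is editing `conf/log4j.properties` so the levels persist across shell sessions.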

ERROR: "Size exceeds Integer.MAX_VALUE" Spark 1.5

2015-10-05 Thread Muhammad Ahsan
…p.java:354) at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111) at java.lang.Thread.run(Thread.java:745) Thanks in advance. -- Muhammad Ahsan
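For context on the error in this thread's subject: "Size exceeds Integer.MAX_VALUE" typically means a single partition (shuffle block or cached block) grew past 2 GB, the limit of a Java byte array / `ByteBuffer`. The common workaround is to spread the data over more, smaller partitions. The sketch below illustrates that idea; the paths, `numPartitions` value, and variable names are hypothetical, not from the thread.

```scala
// Sketch: work around "Size exceeds Integer.MAX_VALUE" by repartitioning
// so that no single block exceeds the ~2 GB ByteBuffer limit.
import org.apache.spark.{SparkConf, SparkContext}

object RepartitionSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("repartition-sketch").setMaster("local[*]"))

    val big = sc.textFile("hdfs:///path/to/big/input") // hypothetical path
    // Choose a partition count so each partition stays well under 2 GB;
    // 2000 here is purely illustrative.
    val numPartitions = 2000
    val resized = big.repartition(numPartitions)
    resized.saveAsTextFile("hdfs:///path/to/output")   // hypothetical path
    sc.stop()
  }
}
```

Raising `spark.sql.shuffle.partitions` (for SQL jobs) has a similar effect without an explicit `repartition` call.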

Re: ERROR YarnClientClusterScheduler: Lost executor Akka client disassociated

2014-12-11 Thread Muhammad Ahsan
-- Code --
scala> import org.apache.spark.SparkContext._
import org.apache.spark.SparkContext._
scala> import org.apache.spark.rdd.RDD
import org.apache.spark.rdd.RDD
scala> import org.apache.spark.sql.SchemaRDD

Re: parquet file not loading (spark v 1.1.0)

2014-12-11 Thread Muhammad Ahsan
Hi, it worked for me like this: just define the case class outside of any class (at the top level) to write to Parquet format successfully. I am using Spark version 1.1.1. case class person(id: Int, name: String, fathername: String, officeid: Int) object Program { def main (args: Array[String]) { val
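The truncated snippet above can be fleshed out into a sketch of the Spark 1.1-era Parquet write path. The key point from the thread is that the case class must sit at the top level (outside any class or method) so Spark's reflection can see it. The sample data, paths, and the `createSchemaRDD` wiring below are my reconstruction of the typical Spark 1.x pattern, not the poster's exact code.

```scala
// Sketch (Spark 1.1-era API): top-level case class + SchemaRDD -> Parquet.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Must be top-level: defining this inside main() breaks Spark's reflection.
case class Person(id: Int, name: String, fathername: String, officeid: Int)

object Program {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("parquet-sketch").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)
    // Implicit conversion RDD[Product] -> SchemaRDD (Spark 1.x only).
    import sqlContext.createSchemaRDD

    val people = sc.parallelize(Seq(Person(1, "Ali", "Ahmed", 10)))
    people.saveAsParquetFile("people.parquet") // fails if the path already exists
    sc.stop()
  }
}
```

In Spark 1.3+ the same idea is spelled `toDF().write.parquet(...)` on a `DataFrame`.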

Re: Error outputing to CSV file

2014-12-11 Thread Muhammad Ahsan
Hi, saveAsTextFile is a member of RDD, whereas fields.map(_.mkString("|")).mkString("\n") is a String. You have to transform it into an RDD using something like sc.parallelize(...) before calling saveAsTextFile. Thanks
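The distinction above can be sketched concretely: the `mkString` chain yields a plain `String` with no `saveAsTextFile`, so the rows must be wrapped in an RDD first. The `fields` collection and output directory below are hypothetical stand-ins for the original poster's data.

```scala
// Sketch: a String has no saveAsTextFile; wrap the lines in an RDD first.
import org.apache.spark.{SparkConf, SparkContext}

object CsvOut {
  def main(args: Array[String]): Unit = {
    val fields: Seq[Seq[String]] = Seq(Seq("1", "alice"), Seq("2", "bob"))

    // This is just a String ("1|alice\n2|bob") -- no saveAsTextFile here.
    val asOneString: String = fields.map(_.mkString("|")).mkString("\n")

    val sc = new SparkContext(
      new SparkConf().setAppName("csv-sketch").setMaster("local[*]"))
    // RDD[String], one pipe-delimited line per row -- this CAN be saved.
    val lines = sc.parallelize(fields.map(_.mkString("|")))
    lines.saveAsTextFile("out-csv") // hypothetical output directory
    sc.stop()
  }
}
```

Note that `saveAsTextFile` writes a directory of part files, not a single CSV file.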