output folder structure not getting committed and remains as _temporary

2015-06-30 Thread nkd
I am running a Spark application on a standalone cluster in a Windows 7 environment. The details are as follows: Spark version = 1.4.0, Windows/Standalone mode. I built Hadoop 2.6.0 on Windows and set the environment variables like so: HADOOP_HOME = E:\hadooptar260\hadoop-2.6.0, HADOOP_CONF_DIR
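
[Illustration, not part of the original post] A minimal sketch of the kind of job that hits this on Windows, with a hypothetical master URL and output path. On Windows, Hadoop file operations used by Spark's output committer also need winutils.exe under %HADOOP_HOME%\bin; without it, output can be left sitting in the _temporary subdirectory instead of being renamed into place at job commit.

import org.apache.spark.{SparkConf, SparkContext}

object TemporaryOutputRepro {
  def main(args: Array[String]): Unit = {
    // HADOOP_HOME must point at a Windows build of Hadoop containing bin\winutils.exe,
    // otherwise the FileOutputCommitter may fail to move part files out of _temporary.
    val conf = new SparkConf()
      .setAppName("temporary-output-repro")
      .setMaster("spark://master:7077") // hypothetical standalone master URL

    val sc = new SparkContext(conf)
    try {
      // Hypothetical data and output path, used only for illustration.
      val data = sc.parallelize(Seq("a", "b", "c"))
      data.saveAsTextFile("E:/spark-output/demo")
      // On a successful commit, files move from E:/spark-output/demo/_temporary/...
      // to E:/spark-output/demo/part-00000 (plus a _SUCCESS marker).
    } finally {
      sc.stop()
    }
  }
}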

Re: any workaround to support nesting of RDDs other than join

2014-04-06 Thread nkd
It worked when I converted the nested RDD to an array -- case class TradingTier(tierId: String, lowerLimit: Int, upperLimit: Int, transactionFees: Double) // userTransactions: Seq[(accountId, numTransactions)] val userTransactionsRDD =
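
[Illustration, not part of the original post] A sketch of the pattern described above, using hypothetical sample data: since an RDD cannot be referenced inside another RDD's transformation, the small tier table is collected to a driver-side Array and used inside the closure instead.

import org.apache.spark.{SparkConf, SparkContext}

object NestedRddWorkaround {
  case class TradingTier(tierId: String, lowerLimit: Int, upperLimit: Int, transactionFees: Double)

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("nested-rdd-workaround").setMaster("local[*]"))

    // userTransactions: Seq[(accountId, numTransactions)] -- hypothetical sample data.
    val userTransactionsRDD = sc.parallelize(Seq(("acct-1", 5), ("acct-2", 42)))

    val tiersRDD = sc.parallelize(Seq(
      TradingTier("bronze", 0, 10, 2.50),
      TradingTier("silver", 11, 100, 1.75)
    ))

    // Nesting tiersRDD inside userTransactionsRDD.map would fail, so collect
    // the small tier table to the driver; Spark ships the Array with each task.
    val tiers: Array[TradingTier] = tiersRDD.collect()

    val feesPerAccount = userTransactionsRDD.map { case (accountId, numTransactions) =>
      val tier = tiers.find(t => numTransactions >= t.lowerLimit && numTransactions <= t.upperLimit)
      (accountId, tier.map(_.transactionFees * numTransactions).getOrElse(0.0))
    }

    feesPerAccount.collect().foreach(println)
    sc.stop()
  }
}

If the tier table is reused across many transformations, wrapping it in sc.broadcast(tiers) avoids re-shipping the array with every task; the collect-to-array approach above is simply the minimal version of the workaround mentioned in the thread.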