Using the SQL CONVERT() function in Spark SQL

2015-08-25 Thread Rajeshkumar J
Hi All,

  I want to use the SQL CONVERT() function in one of my Spark SQL queries.
Can anyone tell me whether it is supported or not?
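
(Editorial note: Spark SQL follows HiveQL here and, as far as I know, does not
provide the T-SQL style CONVERT() function; the usual equivalent is CAST. A
minimal sketch, assuming a HiveContext named sqlContext, with "t" and "amount"
as hypothetical table/column names used only for illustration:

// Hedged sketch: CAST is the Spark SQL counterpart of T-SQL CONVERT().
// Table "t" and column "amount" are placeholder names, not from the thread.
org.apache.spark.sql.Row[] rows =
    sqlContext.sql("SELECT CAST(amount AS DOUBLE) FROM t").collect();
)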



Fwd: org.apache.spark.SparkException: Detected yarn-cluster mode, but isn't running on a cluster. Deployment to YARN is not supported directly by SparkContext. Please use spark-submit

2015-08-03 Thread Rajeshkumar J
Hi Everyone,

 I have been using Apache Spark for two weeks, and so far I am querying Hive
tables through the Spark Java API. It works fine on a single-node Hadoop
setup, but when I ran the same code on a multi-node Hadoop cluster it threw:

org.apache.spark.SparkException: Detected yarn-cluster mode, but isn't
running on a cluster. Deployment to YARN is not supported directly by
SparkContext. Please use spark-submit

   This is the Java code I tried on the single-node cluster:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.hive.HiveContext;

// "path" points at the local Spark installation directory.
SparkConf sparkConf = new SparkConf()
    .setAppName("Hive").setMaster("local").setSparkHome(path);
JavaSparkContext ctx = new JavaSparkContext(sparkConf);
HiveContext sqlContext = new HiveContext(ctx.sc());
Row[] result = sqlContext.sql("SELECT * FROM tablename").collect();

On the multi-node cluster I changed "local" to "yarn-cluster". Can anyone
help me with this?
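
(Editorial note: the exception message itself points to the fix. In Spark 1.x,
yarn-cluster mode cannot be started by constructing a SparkContext with
setMaster("yarn-cluster"); the application has to be packaged as a jar and
launched through spark-submit, which supplies the master. A minimal sketch,
where the main class and jar path are hypothetical placeholders:

// Hedged sketch: leave the master out of the code; spark-submit sets it.
SparkConf sparkConf = new SparkConf().setAppName("Hive");
JavaSparkContext ctx = new JavaSparkContext(sparkConf);
HiveContext sqlContext = new HiveContext(ctx.sc());
// Then launch from the command line (class and jar names are hypothetical):
//   spark-submit --class com.example.HiveQueryJob \
//       --master yarn-cluster /path/to/app.jar
)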