Re: Not able to pass 3rd party jars to mesos executors

2016-05-11 Thread Giri P
> On Wed, May 11, 2016 at 10:05 PM, Giri P <gpatc...@gmail.com> wrote:
>> I'm not using docker
>>
>> On Wed, May 11, 2016 at 8:47 AM, Raghavendra Pandey <raghavendra.pan...@gmail.com> wrote:
>>> By any chance, are you using docker to execute?

Re: Not able to pass 3rd party jars to mesos executors

2016-05-11 Thread Giri P
I'm not using docker.

On Wed, May 11, 2016 at 8:47 AM, Raghavendra Pandey <raghavendra.pan...@gmail.com> wrote:
> By any chance, are you using docker to execute?
> On 11 May 2016 21:16, "Raghavendra Pandey" wrote:
>> On 11 May 2016 02:13, "gpatcham"
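
For reference, a minimal sketch of one common way to ship third-party jars to the executors, assuming the standard spark.jars property (the master URL and jar paths below are hypothetical):

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical Mesos master URL and jar paths; spark.jars is the standard
// property for adding jars to both the driver and executor classpaths.
val conf = new SparkConf()
  .setMaster("mesos://host:5050")
  .setAppName("third-party-jars-demo")
  .set("spark.jars", "/path/to/third-party.jar,/path/to/other.jar")
val sc = new SparkContext(conf)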

Re: using spark context in map function Task not serializable error

2016-01-20 Thread Giri P
method1 looks like this:

reRDD.map(row => method1(row, sc)).saveAsTextFile(outputDir)

reRDD has userIds.

def method1(sc: SparkContext, userId: String): String = {
  sc.cassandraTable("Keyspace", "Table2").where("userid = ?", userId)
  // ...do something
  "Test"
}

On Wed, Jan 20, 2016 at 11:00 AM,
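
A hedged sketch of the usual fix for this pattern (reusing the names above): SparkContext cannot be serialized into the map closure, but the connector's joinWithCassandraTable does the per-key lookup on the executors instead:

import com.datastax.spark.connector._

// reRDD holds userIds; wrap each id in a Tuple1 so it lines up with the
// table's partition key, then let the connector fetch the matching rows on
// the executors -- no SparkContext is captured by the closure.
reRDD
  .map(Tuple1(_))
  .joinWithCassandraTable("Keyspace", "Table2")
  .saveAsTextFile(outputDir)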

Re: using spark context in map function Task not serializable error

2016-01-18 Thread Giri P
I'm using the spark cassandra connector to do this, and the way we access a cassandra table is:

sc.cassandraTable("keySpace", "tableName")

Thanks
Giri

On Mon, Jan 18, 2016 at 12:37 PM, Ted Yu wrote:
> Can you pass the properties which are needed for accessing Cassandra
> without
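
If a per-row query really is unavoidable, one hedged alternative (the RDD name and CQL are hypothetical) is CassandraConnector, which is serializable and safe to use inside mapPartitions:

import com.datastax.spark.connector.cql.CassandraConnector

val connector = CassandraConnector(sc.getConf)
userIdRDD.mapPartitions { ids =>
  connector.withSessionDo { session =>
    // materialize inside the block: the session is released when
    // withSessionDo returns, so a lazy iterator would fail later
    ids.map(id =>
      session.execute("SELECT * FROM keySpace.tableName WHERE userid = ?", id).one())
      .toList.iterator
  }
}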

Re: using spark context in map function Task not serializable error

2016-01-18 Thread Giri P
Can we use @transient?

On Mon, Jan 18, 2016 at 12:44 PM, Giri P <gpatc...@gmail.com> wrote:
> I'm using spark cassandra connector to do this and the way we access
> cassandra table is
>
> sc.cassandraTable("keySpace", "tableName")
>
> Thanks
> Giri

Re: using spark context in map function Task not serializable error

2016-01-18 Thread Giri P
that would work. Doesn't seem to be good practice.

On Mon, Jan 18, 2016 at 1:27 PM, Giri P <gpatc...@gmail.com> wrote:
>> Can we use @transient?
>>
>> On Mon, Jan 18, 2016 at 12:44 PM, Giri P <gpatc...@gmail.com> wrote:
>>
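
For reference, a minimal sketch of the @transient pattern under discussion (the class and field names are hypothetical): the marked field is skipped during closure serialization and rebuilt lazily on each executor:

import org.apache.spark.SparkConf
import com.datastax.spark.connector.cql.CassandraConnector

class SessionHolder(conf: SparkConf) extends Serializable {
  // not shipped from the driver; re-created lazily after deserialization
  @transient lazy val session = CassandraConnector(conf).openSession()
}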

Re: query avro hive table in spark sql

2015-08-28 Thread Giri P
Any idea what's causing this error?

15/08/28 21:03:03 WARN scheduler.TaskSetManager: Lost task 34.0 in stage 9.0 (TID 20, dtord01hdw0228p.dc.dotomi.net): java.lang.RuntimeException: cannot find field message_campaign_id from [0:error_error_error_error_error_error_error, 1:cannot_determine_schema,

Re: query avro hive table in spark sql

2015-08-27 Thread Giri P
Can we run hive queries using spark-avro? In our case it's not just reading the avro file; we have a view in hive which is based on multiple tables.

On Thu, Aug 27, 2015 at 9:41 AM, Giri P gpatc...@gmail.com wrote:
> we are using hive 1.1. I was able to fix the below error when I used the right version
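
A hedged sketch of querying a hive view from Spark SQL on the 1.x line (the view name is hypothetical): HiveContext resolves the view through the metastore, so a view spanning several avro-backed tables is queried like any table:

import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)
// the view definition lives in the metastore; Spark expands it at query time
val df = hiveContext.sql("SELECT * FROM my_view LIMIT 10")
df.show()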

Re: query avro hive table in spark sql

2015-08-27 Thread Giri P
we are using hive 1.1. I was able to fix the below error when I used the right version of spark:

15/08/26 17:51:12 WARN avro.AvroSerdeUtils: Encountered AvroSerdeException determining schema. Returning signal schema to indicate problem
org.apache.hadoop.hive.serde2.avro.AvroSerdeException: Neither avro.schema.literal nor avro.schema.url specified, can't determine table schema
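
One hedged way to clear that AvroSerdeException (the table name and schema URL are hypothetical) is to point the table at its schema explicitly, which is what the SerDe checks before falling back to the signal schema above:

import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)
// avro.schema.url (or avro.schema.literal) must resolve for the SerDe
hiveContext.sql(
  """ALTER TABLE my_avro_table
    |SET TBLPROPERTIES ('avro.schema.url' = 'hdfs:///schemas/my_avro_table.avsc')""".stripMargin)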

Re: resource allocation spark on yarn

2014-12-12 Thread Giri P
but on spark 0.9 we don't have these options:

--num-executors: controls how many executors will be allocated
--executor-memory: RAM for each executor
--executor-cores: CPU cores for each executor

On Fri, Dec 12, 2014 at 12:27 PM, Sameer Farooqui same...@databricks.com wrote: Hi, FYI - There
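
For context, a hedged sketch of the rough 0.9-era equivalents (names recalled from the old "Running Spark on YARN" docs and renamed in 1.0, so treat them as assumptions):

// In Spark 0.9 yarn-client mode, resources were set via environment
// variables, with executors still called "workers":
//   SPARK_WORKER_INSTANCES  ~ --num-executors
//   SPARK_WORKER_MEMORY     ~ --executor-memory
//   SPARK_WORKER_CORES      ~ --executor-cores
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(
  new SparkConf().setMaster("yarn-client").setAppName("resource-demo"))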