Re: Getting an error while submitting spark jar

2016-01-11 Thread Gaini Rajeshwar
Hi Sree, I think it has to be *--class mllib.perf.TestRunner* instead of *--class mllib.perf.TesRunner*

On Mon, Jan 11, 2016 at 1:19 PM, Sree Eedupuganti wrote:
> The way how i submitting jar
>
> hadoop@localhost:/usr/local/hadoop/spark$ ./bin/spark-submit \
> > --class mllib.perf.TesRunner \
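Putting that one-word correction together with the command quoted in the thread (all paths and flags exactly as posted below), the full corrected invocation would be:

```shell
./bin/spark-submit \
  --class mllib.perf.TestRunner \
  --master spark://localhost:7077 \
  --executor-memory 2G \
  --total-executor-cores 100 \
  /usr/local/hadoop/spark/lib/mllib-perf-tests-assembly.jar \
  1000
```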

Getting an error while submitting spark jar

2016-01-10 Thread Sree Eedupuganti
The way I am submitting the jar:

hadoop@localhost:/usr/local/hadoop/spark$ ./bin/spark-submit \
> --class mllib.perf.TesRunner \
> --master spark://localhost:7077 \
> --executor-memory 2G \
> --total-executor-cores 100 \
> /usr/local/hadoop/spark/lib/mllib-perf-tests-assembly.jar \
> 1000

Re: Getting an error in insertion to mysql through sparkcontext in java..

2015-12-20 Thread Ted Yu
Was there a stack trace following the error? Which Spark release are you using?

Cheers

> On Dec 19, 2015, at 10:43 PM, Sree Eedupuganti wrote:
>
> i had 9 rows in my Mysql table
>
> options.put("dbtable", "(select * from employee");
> options.put("lowerBound", "1");
> options

Getting an error in insertion to mysql through sparkcontext in java..

2015-12-19 Thread Sree Eedupuganti
I had 9 rows in my MySQL table.

options.put("dbtable", "(select * from employee");
options.put("lowerBound", "1");
options.put("upperBound", "8");
options.put("numPartitions", "2");

Error: Parameter index out of range (1 > number of parameters, which is 0)

--
Best Regards,
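Two things stand out in the options above: the subquery passed as "dbtable" is missing its closing parenthesis and an alias, and the lowerBound/upperBound/numPartitions options for a partitioned JDBC read are only used together with a partitionColumn. A sketch of a corrected options map, written PySpark-style (the option names mirror the Java options.put calls; the URL, driver class, and the numeric column "id" are illustrative assumptions, not taken from the thread):

```python
# Sketch of corrected JDBC options. Assumptions (not from the thread):
# the MySQL URL, the driver class, and the partition column name "id".
options = {
    "url": "jdbc:mysql://localhost:3306/test",    # assumed connection URL
    "driver": "com.mysql.jdbc.Driver",            # assumed driver class
    # Close the parenthesis and give the subquery an alias:
    "dbtable": "(select * from employee) as emp",
    # lowerBound/upperBound/numPartitions only take effect together with
    # partitionColumn, a numeric column to range-partition the read on:
    "partitionColumn": "id",                      # assumed column name
    "lowerBound": "1",
    "upperBound": "8",
    "numPartitions": "2",
}
# df = sqlContext.read.format("jdbc").options(**options).load()
```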

Re: Getting an error when trying to read a GZIPPED file

2015-09-04 Thread Akhil Das
Are you doing a .cache after the sc.textFile? If so, you can set the StorageLevel to MEMORY_AND_DISK to avoid that.

Thanks
Best Regards

On Thu, Sep 3, 2015 at 10:11 AM, Spark Enthusiast wrote:
> Folks,
>
> I have an input file which is gzipped. I use sc.textFile("foo.gz") when I
> see the follo
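Background on why this matters for a .gz input: gzip is not a splittable codec, so there are no sync points a reader can seek to, and decompression has to start at byte 0 and run sequentially through the stream. The whole file therefore lands in a single partition, and any recomputation repeats the full decompression. A small pure-Python illustration of that single-stream property (no Spark required):

```python
# gzip is a single sequential stream: the only way to reach the last line
# is to decompress everything before it. This is why Spark cannot split a
# .gz file across tasks, and why recomputing the RDD is expensive.
import gzip
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "foo.gz")
with gzip.open(path, "wt") as f:
    f.write("line1\nline2\nline3\n")

# Reading any part of the file means decompressing from the start.
with gzip.open(path, "rt") as f:
    lines = f.read().splitlines()

print(lines)
```

Because every Spark action would otherwise repeat that sequential decompression, persisting the decoded RDD with StorageLevel.MEMORY_AND_DISK (as suggested in the reply above) lets partitions that do not fit in memory spill to disk instead of being recomputed from the archive.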

Getting an error when trying to read a GZIPPED file

2015-09-02 Thread Spark Enthusiast
Folks,

I have an input file which is gzipped. I use sc.textFile("foo.gz") and I see the following problem. Can someone help me fix this?

15/09/03 10:05:32 INFO deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
15/09/03 10:05:32 INFO CodecPool: Got brand-new decompress

Re: getting an error

2014-05-02 Thread Mayur Rustagi
> file/n4933/pic4.png>
> <http://apache-spark-user-list.1001560.n3.nabble.com/file/n4933/pic1.png>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/getting-an-error-tp4933.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.

getting an error

2014-04-28 Thread Joe L
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/getting-an-error-tp4933.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.