Hi Sree,
I think it has to be *--class mllib.perf.TestRunner* instead of *--class
mllib.perf.TesRunner*
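With that one change, the full command (everything else exactly as in your quoted command) would be:

```shell
./bin/spark-submit \
  --class mllib.perf.TestRunner \
  --master spark://localhost:7077 \
  --executor-memory 2G \
  --total-executor-cores 100 \
  /usr/local/hadoop/spark/lib/mllib-perf-tests-assembly.jar \
  1000
```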
On Mon, Jan 11, 2016 at 1:19 PM, Sree Eedupuganti wrote:
This is the way I am submitting the jar:
hadoop@localhost:/usr/local/hadoop/spark$ ./bin/spark-submit \
> --class mllib.perf.TesRunner \
> --master spark://localhost:7077 \
> --executor-memory 2G \
> --total-executor-cores 100 \
> /usr/local/hadoop/spark/lib/mllib-perf-tests-assembly.jar \
> 1000
Was there a stack trace following the error?
Which Spark release are you using?
Cheers
> On Dec 19, 2015, at 10:43 PM, Sree Eedupuganti wrote:
I have 9 rows in my MySQL table.
options.put("dbtable", "(select * from employee");
options.put("lowerBound", "1");
options.put("upperBound", "8");
options.put("numPartitions", "2");
Error : Parameter index out of range (1 > number of parameters, which is 0)
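For what it's worth, two things look off in the options above: the dbtable subquery is missing its closing parenthesis and an alias, and lowerBound/upperBound/numPartitions only take effect when partitionColumn is also supplied. A minimal sketch of a corrected option map; the "id" partition column is an assumption, so substitute a numeric column from your employee table:

```java
import java.util.HashMap;
import java.util.Map;

public class JdbcOptions {
    public static Map<String, String> buildOptions() {
        Map<String, String> options = new HashMap<>();
        // Close the subquery and give it an alias, as JDBC sources require.
        options.put("dbtable", "(select * from employee) as emp");
        // Partitioned reads need all four of these options together;
        // "id" is an assumed numeric column in the employee table.
        options.put("partitionColumn", "id");
        options.put("lowerBound", "1");
        options.put("upperBound", "8");
        options.put("numPartitions", "2");
        return options;
    }
}
```

The map would then be passed to the JDBC data source as before.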
--
Best Regards,
Are you doing a .cache after the sc.textFile? If so, you can set the
StorageLevel to MEMORY_AND_DISK to avoid that.
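A minimal sketch of that suggestion in Java, assuming a Spark runtime on the classpath (the "foo.gz" path comes from the question below; a gzip file loads as a single partition because gzip is not splittable, so caching a large one can exhaust memory):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.storage.StorageLevel;

public class GzipCache {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("gzip-cache").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        // The gzip file is read as one partition since gzip is not splittable.
        JavaRDD<String> lines = sc.textFile("foo.gz");
        // MEMORY_AND_DISK spills partitions that do not fit in memory to disk
        // instead of dropping and recomputing them, as suggested above.
        lines.persist(StorageLevel.MEMORY_AND_DISK());
        System.out.println(lines.count());
        sc.stop();
    }
}
```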
Thanks
Best Regards
On Thu, Sep 3, 2015 at 10:11 AM, Spark Enthusiast
wrote:
Folks,
I have an input file which is gzipped. When I use sc.textFile("foo.gz"), I see
the following problem. Can someone help me fix this?
15/09/03 10:05:32 INFO deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
15/09/03 10:05:32 INFO CodecPool: Got brand-new decompressor
[screenshots attached in the original post: http://apache-spark-user-list.1001560.n3.nabble.com/file/n4933/pic1.png]
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/getting-an-error-tp4933.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>