The drop() function is a Scala method on Array, not a Spark operation.
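To illustrate the point above: drop() can be called on the Array returned by split(), but it is a Scala collections method, not something provided by Spark itself. A minimal sketch in plain Scala (field names are illustrative):

```scala
// drop() belongs to Scala's collections API, e.g. on Array:
val fields: Array[String] = "userid,name,age".split(",")
// Drop the first element and keep the rest.
println(fields.drop(1).mkString(","))  // prints "name,age"
```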
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-SQL-ArrayIndexOutofBoundsException-tp15639p28127.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
),
p(780), p(781), p(782), p(783), p(784)
i.e. by specifying all 785 elements explicitly.
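Rather than writing out all 785 elements by hand as described above, the Row can be built directly from the parsed array. This is a sketch assuming p is the Array[String] produced by splitting each line (the name p is taken from the fragment above):

```scala
import org.apache.spark.sql.Row

// Assumption: p is the Array[String] from line.split(",").
// Row.fromSeq builds the row from the whole sequence,
// avoiding the need to enumerate p(0) ... p(784) manually.
val row = Row.fromSeq(p.toSeq)
```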
val unique_count = sql_cxt.sql("SELECT COUNT(DISTINCT userid) FROM tusers")
  .collect().head.getLong(0)
println(unique_count)
-- Forwarded message --
From: Liquan Pei <liquan...@gmail.com>
Date: Thu, Oct 2, 2014 at 3:42 PM
Subject: Re: Spark SQL: ArrayIndexOutofBoundsException
To: SK <skrishna...@gmail.com>
There is only one place where you use index 1. One possible issue is that
the array may have only one element, so accessing index 1 fails.
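A quick defensive check avoids the exception when a malformed or empty line yields fewer fields than expected. This is only a sketch; the name lines is a hypothetical RDD[String] of the input rows:

```scala
// Hypothetical guard: keep only rows with at least two fields,
// so that p(1) can never throw ArrayIndexOutOfBoundsException.
val safe = lines.map(_.split(",")).filter(_.length > 1)
```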
the header line of
the file and use the rest as the data would be useful, just as a
suggestion. Currently I am deleting the header line manually before
processing the file in Spark.
thanks
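One common way to skip the header without editing the file by hand is to drop the first line of the first partition only. A sketch, assuming rdd is the RDD[String] returned by sc.textFile:

```scala
// The header sits at the start of partition 0; drop it there
// and pass every other partition through unchanged.
val noHeader = rdd.mapPartitionsWithIndex { (idx, iter) =>
  if (idx == 0) iter.drop(1) else iter
}
```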