Re: scala.MatchError while doing BinaryClassificationMetrics

2016-11-15 Thread Bhaarat Sharma
from the test score and label DF. But you may prefer to just use spark.ml evaluators, which work with DataFrames. Try BinaryClassificationEvaluator. On Mon, 14 Nov 2016 at 19:30, Bhaarat Sharma <bhaara...@gmail.com> wrote: I am getting scala.MatchError in the co

Re: scala.MatchError while doing BinaryClassificationMetrics

2016-11-14 Thread Nick Pentreath
use spark.ml evaluators, which work with DataFrames. Try BinaryClassificationEvaluator. On Mon, 14 Nov 2016 at 19:30, Bhaarat Sharma <bhaara...@gmail.com> wrote: I am getting scala.MatchError in the code below. I'm not able to see why this would be happening. I am using Spark 2.0.1 scala

Re: scala.MatchError while doing BinaryClassificationMetrics

2016-11-14 Thread Bhaarat Sharma
valuator. On Mon, 14 Nov 2016 at 19:30, Bhaarat Sharma <bhaara...@gmail.com> wrote: I am getting scala.MatchError in the code below. I'm not able to see why this would be happening. I am using Spark 2.0.1 scala> testResults.columns res538

Re: scala.MatchError while doing BinaryClassificationMetrics

2016-11-14 Thread Nick Pentreath
com> wrote: I am getting scala.MatchError in the code below. I'm not able to see why this would be happening. I am using Spark 2.0.1 scala> testResults.columns res538: Array[String] = Array(TopicVector, subject_id, hadm_id, isElective, isNewborn, isUrgent, is

scala.MatchError while doing BinaryClassificationMetrics

2016-11-14 Thread Bhaarat Sharma
I am getting scala.MatchError in the code below. I'm not able to see why this would be happening. I am using Spark 2.0.1 scala> testResults.columns res538: Array[String] = Array(TopicVector, subject_id, hadm_id, isElective, isNewborn, isUrgent, isEmergency, isMale, isFemale, oasis_sc
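The fix suggested earlier in this thread is the DataFrame-based evaluator. A minimal sketch for Spark 2.0.x, assuming `testResults` has `label` and `rawPrediction`/`probability` columns (the column names are assumptions, not taken from the thread):

```scala
import org.apache.spark.ml.evaluation.BinaryClassificationEvaluator
import org.apache.spark.ml.linalg.Vector
import org.apache.spark.sql.Row

// DataFrame route suggested in the thread: no manual Row matching needed.
val evaluator = new BinaryClassificationEvaluator()
  .setLabelCol("label")                  // assumed column name
  .setRawPredictionCol("rawPrediction")  // assumed column name
  .setMetricName("areaUnderROC")
val auc = evaluator.evaluate(testResults)

// If the RDD-based BinaryClassificationMetrics is used instead, the Row
// pattern must match the actual column types; a mismatch (e.g. treating a
// Vector column as a Double) throws scala.MatchError at runtime:
val scoreAndLabel = testResults.select("probability", "label").rdd.map {
  case Row(p: Vector, l: Double) => (p(1), l)  // types must match the schema
}
```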

Re: scala.MatchError on stand-alone cluster mode

2016-07-17 Thread Mekal Zheng
: Mekal Zheng <mekal.zh...@gmail.com>, spark users <user@spark.apache.org> Subject: Re: scala.MatchError on stand-alone cluster mode Hi Mekal, It may be a Scala version mismatch error, kindly check whether you are running both (

Re: scala.MatchError on stand-alone cluster mode

2016-07-15 Thread Saisai Shao
The error stack is throwing from your code: Caused by: scala.MatchError: [Ljava.lang.String;@68d279ec (of class [Ljava.lang.String;) at com.jd.deeplog.LogAggregator$.main(LogAggregator.scala:29) at com.jd.deeplog.LogAggregator.main(LogAggregator.scala) I think you should debug

scala.MatchError on stand-alone cluster mode

2016-07-15 Thread Mekal Zheng
pl.java:43) at java.lang.reflect.Method.invoke(Method.java:497) at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:58) at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala) Caused by: scala.MatchError: [Ljava.lang.String;@68d279ec (of class [Ljava.l
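As Saisai Shao notes, the stack points into the user's own code (LogAggregator.scala:29). A hypothetical reconstruction of the failure mode, with no names taken from the thread: `scala.MatchError: [Ljava.lang.String;` usually means an `Array[String]` reached a `match` with no case covering its shape.

```scala
// A log line with too few fields (hypothetical example).
val line = "2016-07-15\tuser42"
val fields: Array[String] = line.split("\t")

// This throws scala.MatchError, because no case matches a 2-element array:
// val parsed = fields match {
//   case Array(ts, user, url) => (ts, user, url)
// }

// A wildcard case makes the match total:
val parsed = fields match {
  case Array(ts, user, url) => Some((ts, user, url))
  case _                    => None  // malformed line
}
```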

Re: [MARKETING] Spark Streaming stateful transformation mapWithState function getting error scala.MatchError: [Ljava.lang.Object]

2016-03-14 Thread Vinti Maheshwari
ari [mailto:vinti.u...@gmail.com] *Sent:* 12 March 2016 22:10 *To:* user *Subject:* [MARKETING] Spark Streaming stateful transformation mapWithState function getting error scala.MatchError: [Ljava.lang.Object] Hi All, I wanted to replace my upda

Spark Streaming stateful transformation mapWithState function getting error scala.MatchError: [Ljava.lang.Object]

2016-03-12 Thread Vinti Maheshwari
://docs.cloud.databricks.com/docs/spark/1.6/index.html#examples/Streaming%20mapWithState.html but i am getting error *scala.MatchError: [Ljava.lang.Object]* org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 71.0 failed 4 times, most recent failure: Lost task 0.3 in stage 71.0 (TID 88
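For reference, a minimal `mapWithState` sketch in the Spark 1.6 API shape used by the linked Databricks example, assuming a `DStream[(String, Int)]` named `pairStream` (the stream and names are assumptions). The mapping function's declared key/value/state types must line up with the stream's actual element types:

```scala
import org.apache.spark.streaming.{State, StateSpec}

// Running sum per key; types must match the DStream's (String, Int) pairs.
def mappingFunc(key: String, value: Option[Int], state: State[Int]): (String, Int) = {
  val sum = value.getOrElse(0) + state.getOption.getOrElse(0)
  state.update(sum)
  (key, sum)
}

val stateDStream = pairStream.mapWithState(StateSpec.function(mappingFunc _))
```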

sql:Exception in thread "main" scala.MatchError: StringType

2016-01-03 Thread Bonsen
": "fantasy" }, { "firstName": "Frank", "lastName": "Peretti", "genre": "christianfiction" } ], "musicians": [ { "fi

Re: sql:Exception in thread "main" scala.MatchError: StringType

2016-01-03 Thread Jeff Zhang
"firstName": "Tad", "lastName": "Williams", "genre": "fantasy" }, { "firstName": "Frank", "lastName": "Peretti",

Re: Reading JSON in Pyspark throws scala.MatchError

2015-10-20 Thread Jeff Zhang
k.vija...@gmail.com> wrote: Running Windows 8.1, Python 2.7.x, Scala 2.10.5, Spark 1.4.1. I'm trying to read in a large quantity of json data in a couple of files and I receive a scala.MatchError when I do so. Json, Python and stack

Re: Reading JSON in Pyspark throws scala.MatchError

2015-10-20 Thread Jeff Zhang
On Fri, Oct 2, 2015 at 1:42 PM, balajikvijayan <balaji.k.vija...@gmail.com> wrote: Running Windows 8.1, Python 2.7.x, Scala 2.10.5, Spark 1.4.1. I'm trying to read in a large quantity of json data in a couple of files and I re

Re: Reading JSON in Pyspark throws scala.MatchError

2015-10-20 Thread Balaji Vijayan
ricks.com> wrote: Could you create a JIRA to track this bug? On Fri, Oct 2, 2015 at 1:42 PM, balajikvijayan <balaji.k.vija...@gmail.com> wrote: Running Windows 8.1, Python 2.7.x, Scala 2.10.5, Spark 1.4.1.

Re: Reading JSON in Pyspark throws scala.MatchError

2015-10-05 Thread Davies Liu
and I receive a scala.MatchError when I do so. Json, Python and stack trace all shown below. Json: { "dataunit": { "page_view": { "nonce": 438058072, "person": {

Reading JSON in Pyspark throws scala.MatchError

2015-10-02 Thread balajikvijayan
Running Windows 8.1, Python 2.7.x, Scala 2.10.5, Spark 1.4.1. I'm trying to read in a large quantity of json data in a couple of files and I receive a scala.MatchError when I do so. Json, Python and stack trace all shown below. Json: { "dataunit": { "page_view":

Re: Reading JSON in Pyspark throws scala.MatchError

2015-10-02 Thread Ted Yu
trying to read in a large quantity of json data in a couple of files and I receive a scala.MatchError when I do so. Json, Python and stack trace all shown below. Json: { "dataunit": { "page_view": { "n
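When schema inference over inconsistent JSON records is the trigger, one standard workaround (not stated in this thread) is to supply an explicit schema. A sketch in Scala for Spark 1.4 (the thread itself used PySpark); the field names below come from the excerpted JSON, but the nesting and types are assumptions:

```scala
import org.apache.spark.sql.types._

// Declaring the schema up front avoids relying on inference across files.
val schema = StructType(Seq(
  StructField("dataunit", StructType(Seq(
    StructField("page_view", StructType(Seq(
      StructField("nonce", LongType, nullable = true))), nullable = true))),
    nullable = true)))

val df = sqlContext.read.schema(schema).json("pageviews.json")  // hypothetical path
```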

Re: SparkSQL: using Hive UDF returning Map throws error: scala.MatchError: interface java.util.Map (of class java.lang.Class) (state=,code=0)

2015-06-05 Thread Okehee Goh
- From: Cheng, Hao [mailto:hao.ch...@intel.com] Sent: Friday, June 5, 2015 12:35 PM To: ogoh; user@spark.apache.org Subject: RE: SparkSQL: using Hive UDF returning Map throws error: scala.MatchError: interface java.util.Map (of class java.lang.Class) (state=,code=0) Which version of Hive

Re: SparkSQL: using Hive UDF returning Map throws error: scala.MatchError: interface java.util.Map (of class java.lang.Class) (state=,code=0)

2015-06-05 Thread Okehee Goh
: using Hive UDF returning Map throws error: scala.MatchError: interface java.util.Map (of class java.lang.Class) (state=,code=0) Hello, I tested some custom udf on SparkSql's ThriftServer Beeline (Spark 1.3.1). Some udfs work fine (access array parameter and returning int or string type

RE: SparkSQL: using Hive UDF returning Map throws error: scala.MatchError: interface java.util.Map (of class java.lang.Class) (state=,code=0)

2015-06-05 Thread Cheng, Hao
Hive UDF returning Map throws error: scala.MatchError: interface java.util.Map (of class java.lang.Class) (state=,code=0) Which version of Hive jar are you using? Hive 0.13.1 or Hive 0.12.0? -Original Message- From: ogoh [mailto:oke...@gmail.com] Sent: Friday, June 5, 2015 10:10 AM

SparkSQL: using Hive UDF returning Map throws error: scala.MatchError: interface java.util.Map (of class java.lang.Class) (state=,code=0)

2015-06-04 Thread ogoh
Hello, I tested some custom udf on SparkSql's ThriftServer Beeline (Spark 1.3.1). Some udfs work fine (access array parameter and returning int or string type). But my udf returning map type throws an error: Error: scala.MatchError: interface java.util.Map (of class java.lang.Class) (state

RE: SparkSQL: using Hive UDF returning Map throws error: scala.MatchError: interface java.util.Map (of class java.lang.Class) (state=,code=0)

2015-06-04 Thread Cheng, Hao
Which version of Hive jar are you using? Hive 0.13.1 or Hive 0.12.0? -Original Message- From: ogoh [mailto:oke...@gmail.com] Sent: Friday, June 5, 2015 10:10 AM To: user@spark.apache.org Subject: SparkSQL: using Hive UDF returning Map throws error: scala.MatchError: interface
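A commonly suggested workaround for this class of error (not stated in the thread, so treat it as an assumption): the MatchError comes from mapping the simple UDF's Java return type back to a Catalyst type, and a `GenericUDF` sidesteps that path by declaring its return type through an ObjectInspector. A sketch in Scala:

```scala
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF.DeferredObject
import org.apache.hadoop.hive.serde2.objectinspector.{ObjectInspector, ObjectInspectorFactory}
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory

// Hypothetical UDF returning map<string,string>; the return type is declared
// via an ObjectInspector instead of being inferred by reflection.
class MapReturningUDF extends GenericUDF {
  override def initialize(args: Array[ObjectInspector]): ObjectInspector =
    ObjectInspectorFactory.getStandardMapObjectInspector(
      PrimitiveObjectInspectorFactory.javaStringObjectInspector,
      PrimitiveObjectInspectorFactory.javaStringObjectInspector)

  override def evaluate(args: Array[DeferredObject]): AnyRef = {
    val m = new java.util.HashMap[String, String]()
    m.put("key", "value")  // placeholder logic
    m
  }

  override def getDisplayString(children: Array[String]): String = "map_udf()"
}
```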

scala.MatchError: class org.apache.avro.Schema (of class java.lang.Class)

2015-04-07 Thread Yamini
Using spark(1.2) streaming to read avro schema based topics flowing in kafka and then using spark sql context to register data as temp table. Avro maven plugin(1.7.7 version) generates the java bean class for the avro file but includes a field named SCHEMA$ of type org.apache.avro.Schema which is

Re: scala.MatchError: class org.apache.avro.Schema (of class java.lang.Class)

2015-04-07 Thread Yamini Maddirala
For more details on my question http://apache-spark-user-list.1001560.n3.nabble.com/How-to-generate-Java-bean-class-for-avro-files-using-spark-avro-project-tp22413.html Thanks, Yamini On Tue, Apr 7, 2015 at 2:23 PM, Yamini Maddirala yamini.m...@gmail.com wrote: Hi Michael, Yes, I did try

Re: scala.MatchError: class org.apache.avro.Schema (of class java.lang.Class)

2015-04-07 Thread Michael Armbrust
Have you looked at spark-avro? https://github.com/databricks/spark-avro On Tue, Apr 7, 2015 at 3:57 AM, Yamini yamini.m...@gmail.com wrote: Using spark(1.2) streaming to read avro schema based topics flowing in kafka and then using spark sql context to register data as temp table. Avro maven

Re: scala.MatchError: class org.apache.avro.Schema (of class java.lang.Class)

2015-04-07 Thread Yamini Maddirala
Hi Michael, Yes, I did try spark-avro 0.2.0 databricks project. I am using CDH 5.3, which is based on Spark 1.2. Hence I'm bound to use spark-avro 0.2.0 instead of the latest. I'm not sure how the spark-avro project can help me in this scenario. 1. I have JavaDStream of type avro generic record
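One way around the reflection failure described above (an assumption on my part, not a fix from the thread): don't register the Avro-generated bean at all, since its `SCHEMA$` field of type `org.apache.avro.Schema` is what schema inference trips over. Instead, map each `GenericRecord` into a plain case class first. Field names and the `genericRecords` RDD are hypothetical:

```scala
// Plain case class with only SQL-friendly field types.
case class Event(id: String, value: Long)

val events = genericRecords.map { r =>
  Event(r.get("id").toString, r.get("value").asInstanceOf[Long])
}

// Spark 1.2-era registration via the implicit RDD -> SchemaRDD conversion.
import sqlContext.createSchemaRDD
events.registerTempTable("events")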

Q about Spark MLlib- Decision tree - scala.MatchError: 2.0 (of class java.lang.Double)

2014-12-14 Thread jake Lim
tasks have all completed, from pool org.apache.spark.SparkException: Job aborted due to stage failure: Task 21.0:0 failed 4 times, most recent failure: Exception failure in TID 34 on host krbda1anode01.kr.test.com: scala.MatchError: 2.0 (of class java.lang.Double
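One plausible cause of `scala.MatchError: 2.0 (of class java.lang.Double)` in MLlib decision-tree training (a guess, since the thread excerpt doesn't show the training call): classification labels must be doubles in `[0, numClasses)`, so a label of 2.0 needs `numClasses >= 3`. A sketch:

```scala
import org.apache.spark.mllib.tree.DecisionTree
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.linalg.Vectors

// Labels 0.0, 1.0, 2.0 require numClasses = 3; the data here is illustrative.
val data = sc.parallelize(Seq(
  LabeledPoint(0.0, Vectors.dense(1.0, 0.0)),
  LabeledPoint(1.0, Vectors.dense(0.0, 1.0)),
  LabeledPoint(2.0, Vectors.dense(1.0, 1.0))))

val model = DecisionTree.trainClassifier(
  data, numClasses = 3, categoricalFeaturesInfo = Map[Int, Int](),
  impurity = "gini", maxDepth = 5, maxBins = 32)
```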

scala.MatchError on SparkSQL when creating ArrayType of StructType

2014-12-05 Thread Hao Ren
Hi, I am using SparkSQL on 1.1.0 branch. The following code leads to a scala.MatchError at org.apache.spark.sql.catalyst.expressions.Cast.cast$lzycompute(Cast.scala:247) val scm = StructType(inputRDD.schema.fields.init :+ StructField(list, ArrayType(StructType

Re: scala.MatchError on SparkSQL when creating ArrayType of StructType

2014-12-05 Thread Michael Armbrust
code leads to a scala.MatchError at org.apache.spark.sql.catalyst.expressions.Cast.cast$lzycompute(Cast.scala:247) val scm = StructType(inputRDD.schema.fields.init :+ StructField(list, ArrayType(StructType(Seq(StructField(date, StringType, nullable

RE: scala.MatchError

2014-11-12 Thread Naveen Kumar Pokala
) case class Instrument(issue: Issue = null) -Naveen From: Michael Armbrust [mailto:mich...@databricks.com] Sent: Wednesday, November 12, 2014 12:09 AM To: Xiangrui Meng Cc: Naveen Kumar Pokala; user@spark.apache.org Subject: Re: scala.MatchError Xiangrui is correct that it must be a java bean

scala.MatchError

2014-11-11 Thread Naveen Kumar Pokala
) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:483) at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:162) Caused by: scala.MatchError: class sample.spark.test.Issue (of class java.lang.Class

Re: scala.MatchError

2014-11-11 Thread Xiangrui Meng
) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:483) at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:162) Caused by: scala.MatchError: class sample.spark.test.Issue (of class

Re: scala.MatchError

2014-11-11 Thread Michael Armbrust
:43) at java.lang.reflect.Method.invoke(Method.java:483) at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:162) Caused by: scala.MatchError: class sample.spark.test.Issue (of class java.lang.Class
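The advice in this thread is that the Java-bean reflection path can't handle `sample.spark.test.Issue` as written; it expects a proper bean. A minimal sketch of a bean-shaped class (written in Scala here; the field names are hypothetical):

```scala
// Java-bean conventions: no-arg constructor plus get/set accessors, which is
// what the Spark 1.1 Java bean reflection path looks for.
class Issue extends java.io.Serializable {
  private var id: String = _
  private var price: Double = _

  def getId: String = id
  def setId(v: String): Unit = { id = v }
  def getPrice: Double = price
  def setPrice(v: Double): Unit = { price = v }
}
```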

scala.MatchError: class java.sql.Timestamp

2014-10-19 Thread Ge, Yao (Y.)
I am working with Spark 1.1.0 and I believe Timestamp is a supported data type for Spark SQL. However I keep getting this MatchError for java.sql.Timestamp when I try to use reflection to register a Java Bean with Timestamp field. Anything wrong with my code below? public

RE: scala.MatchError: class java.sql.Timestamp

2014-10-19 Thread Wang, Daoyuan
Can you provide the exception stack? Thanks, Daoyuan From: Ge, Yao (Y.) [mailto:y...@ford.com] Sent: Sunday, October 19, 2014 10:17 PM To: user@spark.apache.org Subject: scala.MatchError: class java.sql.Timestamp I am working with Spark 1.1.0 and I believe Timestamp is a supported data type

RE: scala.MatchError: class java.sql.Timestamp

2014-10-19 Thread Ge, Yao (Y.)
scala.MatchError: class java.sql.Timestamp (of class java.lang.Class) at org.apache.spark.sql.api.java.JavaSQLContext$$anonfun$getSchema$1.apply(JavaSQLContext.scala:189) at org.apache.spark.sql.api.java.JavaSQLContext$$anonfun$getSchema$1.apply

RE: scala.MatchError: class java.sql.Timestamp

2014-10-19 Thread Cheng, Hao
Seems like a bug in JavaSQLContext.getSchema(), which doesn't enumerate all of the data types supported by Catalyst. From: Ge, Yao (Y.) [mailto:y...@ford.com] Sent: Sunday, October 19, 2014 11:44 PM To: Wang, Daoyuan; user@spark.apache.org Subject: RE: scala.MatchError: class java.sql.Timestamp

RE: scala.MatchError: class java.sql.Timestamp

2014-10-19 Thread Wang, Daoyuan
I have created an issue for this https://issues.apache.org/jira/browse/SPARK-4003 From: Cheng, Hao Sent: Monday, October 20, 2014 9:20 AM To: Ge, Yao (Y.); Wang, Daoyuan; user@spark.apache.org Subject: RE: scala.MatchError: class java.sql.Timestamp Seems bugs in the JavaSQLContext.getSchema
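Until the SPARK-4003 fix landed, one workaround consistent with Cheng Hao's diagnosis (my sketch, not from the thread) was to bypass bean reflection and declare the schema by hand, including `TimestampType`. Approximate Spark 1.1-era API; the field names and the `events` RDD are hypothetical:

```scala
import java.sql.Timestamp
import org.apache.spark.sql._  // Spark 1.1-era types: Row, StructType, ...

// Explicit schema: no reflection over the bean, so Timestamp is handled.
val schema = StructType(Seq(
  StructField("name", StringType, nullable = true),
  StructField("createdAt", TimestampType, nullable = true)))

val rowRDD = events.map(e => Row(e.name, e.createdAt))
val schemaRDD = sqlContext.applySchema(rowRDD, schema)
schemaRDD.registerTempTable("events")
```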

Are scala.MatchError messages a problem?

2014-06-08 Thread Jeremy Lee
running job streaming job 1402245172000 ms.2 scala.MatchError: 0101-01-10 (of class java.lang.String) at SimpleApp$$anonfun$6$$anonfun$apply$6.apply(SimpleApp.scala:218) at SimpleApp$$anonfun$6$$anonfun$apply$6.apply(SimpleApp.scala:217) at scala.collection.IndexedSeqOptimized

Re: Are scala.MatchError messages a problem?

2014-06-08 Thread Sean Owen
scala.MatchError: 0101-01-10 (of class java.lang.String) at SimpleApp$$anonfun$6$$anonfun$apply$6.apply(SimpleApp.scala:218) at SimpleApp$$anonfun$6$$anonfun$apply$6.apply(SimpleApp.scala:217) at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33

Re: Are scala.MatchError messages a problem?

2014-06-08 Thread Nick Pentreath
been a bit of a horror and I need to sleep now. Should I be worried about these errors? Or did I just have the old log4j.config tuned so I didn't see them? 14/06/08 16:32:52 ERROR scheduler.JobScheduler: Error running job streaming job 1402245172000 ms.2 scala.MatchError: 0101-01-10 (of class

Re: Are scala.MatchError messages a problem?

2014-06-08 Thread Mark Hamstra
streaming job 1402245172000 ms.2 scala.MatchError: 0101-01-10 (of class java.lang.String) at SimpleApp$$anonfun$6$$anonfun$apply$6.apply(SimpleApp.scala:218) at SimpleApp$$anonfun$6$$anonfun$apply$6.apply(SimpleApp.scala:217) at scala.collection.IndexedSeqOptimized

Re: Are scala.MatchError messages a problem?

2014-06-08 Thread Jeremy Lee
On Sun, Jun 8, 2014 at 10:00 AM, Nick Pentreath nick.pentre...@gmail.com wrote: When you use match, the match must be exhaustive. That is, a match error is thrown if the match fails. Ahh, right. That makes sense. Scala is applying its strong typing rules here instead of no ceremony... but

Re: Are scala.MatchError messages a problem?

2014-06-08 Thread Tobias Pfeiffer
Jeremy, On Mon, Jun 9, 2014 at 10:22 AM, Jeremy Lee unorthodox.engine...@gmail.com wrote: When you use match, the match must be exhaustive. That is, a match error is thrown if the match fails. Ahh, right. That makes sense. Scala is applying its strong typing rules here instead of no
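The point the repliers make in this thread can be shown in two lines of plain Scala: a `match` with no case for the incoming value (such as the string "0101-01-10" above) throws `scala.MatchError` at runtime, and a wildcard case makes it total:

```scala
// Partial match: classify("0101-01-10") throws scala.MatchError.
def classify(s: String): String = s match {
  case "a" => "letter a"
}

// Total match: the wildcard covers every other value.
def classifySafe(s: String): String = s match {
  case "a" => "letter a"
  case _   => "something else"
}
```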