Fwd: posts are not accepted

2015-07-23 Thread Mattmann, Chris A (3980)


Sent from my iPhone

Begin forwarded message:

From: Rob Sargent rob.sarg...@utah.edu
Date: July 23, 2015 at 1:14:04 PM PDT
To: user-ow...@spark.apache.org
Subject: posts are not accepted

Hello,

My user name is iceback and my email is rob.sarg...@utah.edu.

There seems to be a problem with my account as my posts are never accepted.

Any information would be appreciated,

rjs



Re: Where to find Spark-project-hive

2015-07-23 Thread Steve Loughran

On 22 Jul 2015, at 18:57, Xiaoyu Ma hzmaxia...@corp.netease.com wrote:

Hi guys,
I’m trying to patch the Hive Thrift server code related to HIVE-7620. I saw that 
Spark pulls in a private fork of Hive under the spark-project hive name.
Any idea where I can find its source code?

Thanks~

马晓宇 / Xiaoyu Ma
hzmaxia...@corp.netease.com





The JIRA related to this issue is 
https://issues.apache.org/jira/browse/SPARK-5111


Re: Where to find Spark-project-hive

2015-07-23 Thread Steve Loughran
Xiaoyu,

In SPARK-8064 I've been working on getting Spark and Hive working against Hive 
1.2.1, targeting Spark 1.5.

That should pick up the Hive fix automatically, though we need to get the 
sql/hive tests all working first, and there aren't any tests backed by MiniKdc 
to verify secure operation.



On 22 Jul 2015, at 18:57, Xiaoyu Ma hzmaxia...@corp.netease.com wrote:

Hi guys,
I’m trying to patch the Hive Thrift server code related to HIVE-7620. I saw that 
Spark pulls in a private fork of Hive under the spark-project hive name.
Any idea where I can find its source code?

Thanks~

马晓宇 / Xiaoyu Ma
hzmaxia...@corp.netease.com







Re: Shouldn't SparseVector constructor give an error when the declared number of elements is less than the array length?

2015-07-23 Thread Manoj Kumar
Hi,

I think this should raise an error in both the Scala code and the Python API.

Please open a JIRA.

On Thu, Jul 23, 2015 at 4:22 PM, Andrew Vykhodtsev yoz...@gmail.com wrote:

 Dear Developers,

 I found that one can create a SparseVector inconsistently, and it will lead
 to a Java error at runtime, for example when training
 LogisticRegressionWithSGD.

 Here is the test case (from an IPython session):

 sc.version
 # u'1.3.1'

 from pyspark.mllib.linalg import SparseVector
 from pyspark.mllib.regression import LabeledPoint
 from pyspark.mllib.classification import LogisticRegressionWithSGD

 x = SparseVector(2, {1: 1, 2: 2, 3: 3, 4: 4, 5: 5})
 l = LabeledPoint(0, x)
 r = sc.parallelize([l])
 m = LogisticRegressionWithSGD.train(r)

 Error:


 Py4JJavaError: An error occurred while calling 
 o86.trainLogisticRegressionModelWithSGD.
 : org.apache.spark.SparkException: Job aborted due to stage failure: Task 7 
 in stage 11.0 failed 1 times, most recent failure: Lost task 7.0 in stage 
 11.0 (TID 47, localhost): *java.lang.ArrayIndexOutOfBoundsException: 2*



 Attached is the notebook with the scenario and the full message:



 Should I raise a JIRA for this (forgive me if such a JIRA already exists and I 
 did not notice it)?
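
 [Editorial note] The check Manoj suggests could look something like the
 sketch below. This is a hypothetical standalone helper, not the actual
 MLlib constructor; the name validate_sparse is invented for illustration.
 It rejects any index outside the declared vector size, so the malformed
 vector from the test case fails fast at construction time instead of
 crashing later inside LogisticRegressionWithSGD.

```python
def validate_sparse(size, index_value_pairs):
    """Hypothetical check: raise if any index is out of range for `size`."""
    for idx in index_value_pairs:
        if idx < 0 or idx >= size:
            raise ValueError(
                "index %d out of range for SparseVector of size %d"
                % (idx, size))


# The vector from the test case declares size 2 but uses indices up to 5,
# so it would now be rejected immediately:
try:
    validate_sparse(2, {1: 1, 2: 2, 3: 3, 4: 4, 5: 5})
except ValueError as e:
    print(e)

# A consistent vector (all indices < size) passes silently:
validate_sparse(6, {1: 1, 2: 2, 3: 3, 4: 4, 5: 5})
```

 A usage note: validating eagerly in the constructor trades a small O(nnz)
 cost at creation time for a clear error message, rather than a deferred
 ArrayIndexOutOfBoundsException deep inside a training job.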




 -
 To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
 For additional commands, e-mail: dev-h...@spark.apache.org




-- 
Godspeed,
Manoj Kumar,
http://manojbits.wordpress.com
http://github.com/MechCoder