Re: Remove / update version in spark-packages.org

2016-07-26 Thread Julio Antonio Soto de Vicente
Hi Burak,

Yes, you're right.

Thanks.

> On 27 Jul 2016, at 00:19, Burak Yavuz wrote:
> 
> Hi,
> 
> It's bad practice to change the jar for an already-released version, and doing so is 
> prohibited in Spark Packages. Please bump your version number and make a new release.
> 
> Best regards,
> Burak
> 
>> On Tue, Jul 26, 2016 at 3:51 AM, Julio Antonio Soto de Vicente wrote:
>> Hi all,
>> 
>> Maybe I am missing something, but... Is there a way to update a package 
>> uploaded to spark-packages.org under the same version?
>> 
>> Given a release called my_package 1.1.2, I would like to re-upload it due to a 
>> build failure, but I want to publish it under the same version, 1.1.2...
>> 
>> Thank you.
> 


Re: Remove / update version in spark-packages.org

2016-07-26 Thread Burak Yavuz
Hi,

It's bad practice to change the jar for an already-released version, and doing so is
prohibited in Spark Packages. Please bump your version number and make a new release.
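
For illustration only: re-publishing under a bumped version might look like the
following in build.sbt, assuming the package is built with the sbt-spark-package
plugin (the package name and version numbers below are made up):

    // build.sbt -- hypothetical coordinates; only the version bump matters here
    spName := "someorg/my_package"   // spark-packages name (assumed)
    version := "1.1.3"               // was 1.1.2; the fixed jar goes out under the new version
    sparkVersion := "2.0.0"          // Spark version the package is built against

After publishing 1.1.3, users simply depend on the new version instead of 1.1.2.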

Best regards,
Burak

On Tue, Jul 26, 2016 at 3:51 AM, Julio Antonio Soto de Vicente <
ju...@esbet.es> wrote:

> Hi all,
>
> Maybe I am missing something, but... Is there a way to update a package
> uploaded to spark-packages.org under the same version?
>
> Given a release called my_package 1.1.2, I would like to re-upload it due to a
> build failure, but I want to publish it under the same version, 1.1.2...
>
> Thank you.
>
>


Re: [VOTE] Release Apache Spark 2.0.0 (RC5)

2016-07-26 Thread Stephen Hellberg
Yeah, I thought the vote was closed... but I couldn't think of a better
thread to remark upon!
That's a useful comment on Derby's role - thanks.  Certainly, we'd just
attempted a build-and-test run with the Derby level revised to the current
10.12.1.1, and hadn't observed any issues... a PR will be forthcoming.
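
For anyone who wants to repeat that experiment locally, one way to pin the Derby level
in an sbt-based test build is a dependency override (just a sketch, not the eventual PR):

    // build.sbt -- force the patched Derby release for a local build-and-test run
    dependencyOverrides += "org.apache.derby" % "derby" % "10.12.1.1"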






Re: [VOTE] Release Apache Spark 2.0.0 (RC5)

2016-07-26 Thread Sean Owen
The release vote has already closed and passed. Derby is only used in
tests AFAIK, so I don't think this is even critical, let alone a
blocker. Updating is fine, though; open a PR.

On Tue, Jul 26, 2016 at 3:37 PM, Stephen Hellberg wrote:
>  -1   Sorry, I've just noted that the RC5 proposal includes shipping Derby @
> 10.11.1.1, which is vulnerable to CVE-2015-1832.
> It would be ideal if we could instead ship 10.12.1.1 real soon.
>
>
>




Re: [VOTE] Release Apache Spark 2.0.0 (RC5)

2016-07-26 Thread Stephen Hellberg
 -1   Sorry, I've just noted that the RC5 proposal includes shipping Derby @
10.11.1.1, which is vulnerable to CVE-2015-1832.
It would be ideal if we could instead ship 10.12.1.1 real soon.






[no subject]

2016-07-26 Thread thibaut
unsuscribe




Re: Outer Explode needed

2016-07-26 Thread Yong Zhang
The reason for no response is that this feature is not available yet.


You can vote for and follow this JIRA, https://issues.apache.org/jira/browse/SPARK-13721,
if you really need this feature.


Yong



From: Don Drake 
Sent: Monday, July 25, 2016 9:12 PM
To: dev@spark.apache.org
Subject: Fwd: Outer Explode needed

No response on the Users list, so I thought I would repost here.

See below.

-Don
-- Forwarded message --
From: Don Drake
Date: Sun, Jul 24, 2016 at 2:18 PM
Subject: Outer Explode needed
To: user


I have a nested data structure (an array of structs) that I'm flattening with the DSL 
df.explode() API.  However, when the array is empty, I'm not getting the rest of the 
row in my output, as it is skipped.

This is the intended behavior of explode(), and Hive supports a SQL "LATERAL VIEW OUTER 
explode()" to generate the row even when the explode would not yield any output.

https://cwiki.apache.org/confluence/display/Hive/LanguageManual+LateralView

Can we get this same outer explode in the DSL?  I have to jump through some 
outer join hoops to get the rows where the array is empty.
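
For reference, a rough sketch of the kind of outer-join workaround I mean, assuming a
DataFrame df with a unique key column id and an array column items (all names are made up):

    import org.apache.spark.sql.functions._

    // explode() drops rows whose array is empty or null
    val exploded = df.select(col("id"), explode(col("items")).as("item"))

    // join the exploded rows back on the key; rows with empty arrays come back with item = null
    val withEmptyRows = df.join(exploded, Seq("id"), "left_outer")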

Thanks.

-Don

--
Donald Drake
Drake Consulting
http://www.drakeconsulting.com/
https://twitter.com/dondrake
800-733-2143



--
Donald Drake
Drake Consulting
http://www.drakeconsulting.com/
https://twitter.com/dondrake
800-733-2143


Renaming spark.driver.appUIAddress to spark.yarn.driver.appUIAddress?

2016-07-26 Thread Jacek Laskowski
Hi,

Since spark.driver.appUIAddress is only used in Spark on YARN to
"announce" the web UI's address, I think the setting should rather be
called spark.yarn.driver.appUIAddress (for consistency with the other
YARN-specific settings).

What do you think? I'd like to hear your thoughts before filing a JIRA issue.
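
To make the idea concrete, a backward-compatible lookup could prefer the new key and fall
back to the existing one (a sketch only; spark.driver.appUIAddress is the existing key,
while the YARN-prefixed name is just the proposal):

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
    // prefer the proposed YARN-scoped key, fall back to the current one
    val appUIAddress = conf.getOption("spark.yarn.driver.appUIAddress")
      .orElse(conf.getOption("spark.driver.appUIAddress"))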

Regards,
Jacek Laskowski

https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski




Remove / update version in spark-packages.org

2016-07-26 Thread Julio Antonio Soto de Vicente
Hi all,

Maybe I am missing something, but... Is there a way to update a package 
uploaded to spark-packages.org under the same version?

Given a release called my_package 1.1.2, I would like to re-upload it due to a 
build failure, but I want to publish it under the same version, 1.1.2...

Thank you.