I hope https://github.com/apache/spark/pull/16252 can be merged before
release 2.1.0. It fixes the case where a broadcast cannot fit in memory.
On Sat, Dec 17, 2016 at 10:23 AM, Joseph Bradley wrote:
> +1
>
+1
On Fri, Dec 16, 2016 at 3:21 PM, Herman van Hövell tot Westerflier <hvanhov...@databricks.com> wrote:
> +1
+1
On Sat, Dec 17, 2016 at 12:14 AM, Xiao Li wrote:
> +1
+1
Xiao Li
2016-12-16 12:19 GMT-08:00 Felix Cheung :
> For R we have a license field in the DESCRIPTION, and this is standard
> practice (and requirement) for R packages.
>
> https://cran.r-project.org/doc/manuals/R-exts.html#Licensing
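For context, this is what the License field looks like in an R package's DESCRIPTION file. The field names follow CRAN's Writing R Extensions manual; the surrounding values here are an illustrative sketch, not SparkR's actual metadata:

```
Package: SparkR
Title: R Front End for Apache Spark
Version: 2.1.0
License: Apache License (== 2.0)
```

CRAN validates the License field against its license database, so the exact spelling matters.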
I'd be happy to review a PR. At the moment I'm still learning Spark
SQL, so writing documentation might be a bit of a stretch, but reviewing
would be fine.
Thanks!
On 12/16/2016 08:39 AM, Thakrar, Jayesh wrote:
> Yes - that sounds good Anton, I can work on documenting the window
> functions.
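As a sketch of what such documentation might illustrate: the semantics of a SQL window function like `rank() OVER (PARTITION BY dept ORDER BY revenue DESC)` can be emulated in plain Python. This is not Spark's API, just a hypothetical dataset and a standalone illustration of the ranking rules (standard SQL rank(): ties share a rank, and a gap follows):

```python
from itertools import groupby
from operator import itemgetter

# Hypothetical data: (department, employee, revenue).
rows = [
    ("toys", "ann", 300),
    ("toys", "bob", 500),
    ("books", "cat", 200),
    ("books", "dan", 200),
    ("books", "eve", 100),
]

def rank_over_partition(rows, partition_key, order_key):
    """Emulate rank() OVER (PARTITION BY ... ORDER BY ... DESC)."""
    # Sort so that groupby sees one partition at a time,
    # ordered by descending revenue within each partition.
    rows = sorted(rows, key=lambda r: (partition_key(r), -order_key(r)))
    ranked = []
    for _, group in groupby(rows, key=partition_key):
        rank, prev, seen = 0, None, 0
        for row in group:
            seen += 1
            if order_key(row) != prev:
                rank = seen          # ties keep the same rank; gaps follow
                prev = order_key(row)
            ranked.append(row + (rank,))
    return ranked

for row in rank_over_partition(rows, itemgetter(0), itemgetter(2)):
    print(row)
```

In Spark SQL the same result would come from a `Window.partitionBy(...).orderBy(...)` spec, which is exactly the kind of thing the docs could walk through.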
RC5 is also tested on CentOS 6.8, OpenJDK 1.8.0_111, R 3.3.2 with profiles
`-Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver -Psparkr`.
BTW, there are still five ongoing issues in JIRA with target version 2.1.0.
1. SPARK-16845
(If you have a template for these emails, maybe update it to use https
links; they work for apache.org domains. After all, we are asking people to
verify the integrity of release artifacts, so it might as well be secure.)
(Also, the new archives use .tar.gz instead of .tgz like the others. No big deal.)
Hi,
I am using Spark 1.6 and have a question about fine-grained mode in Spark on Mesos.
I have a simple Spark application that transforms A -> B. It's a single-stage
application that starts with 48 partitions.
When the program starts running, the Mesos UI shows 48 tasks and 48 CPUs
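For reference, fine-grained mode is what you get when coarse-grained scheduling is disabled. The property name below is real (`spark.mesos.coarse`); whether you need to set it explicitly depends on your Spark 1.6 defaults, so treat this spark-defaults.conf fragment as a sketch:

```
# Use Mesos fine-grained mode: one Mesos task per Spark task.
spark.mesos.coarse  false
```

In fine-grained mode each Spark task maps to a Mesos task, which is consistent with seeing 48 tasks for 48 partitions.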
Yes - that sounds good Anton, I can work on documenting the window functions.
From: Anton Okolnychyi
Date: Thursday, December 15, 2016 at 4:34 PM
To: Conversant
Cc: Michael Armbrust , Jim Hughes ,
Hi,
I tried your example with the latest Spark master branch and branch-2.0. It
works well.
-
Liang-Chi Hsieh | @viirya
Spark Technology Center
http://www.spark.tc/
Thanks for the specific mention of the new PySpark packaging, Shivaram.
For *nix (Linux, Unix, OS X, etc.) Python users interested in helping test
the new artifacts, you can do the following:
Set up PySpark with pip:
1. Download the artifact from