Github user pierre-borckmans commented on the pull request:
https://github.com/apache/spark/pull/10429#issuecomment-166737236
@rxin I fixed the test title and the Scala style issues.
I ran `dev/scalastyle` successfully.
---
If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well.
Github user pierre-borckmans commented on a diff in the pull request:
https://github.com/apache/spark/pull/10429#discussion_r48297482
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/DataFrameComplexTypeSuite.scala ---
@@ -43,4 +43,12 @@ class DataFrameComplexTypeSuite
Github user pierre-borckmans commented on a diff in the pull request:
https://github.com/apache/spark/pull/10429#discussion_r48297441
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/DataFrameComplexTypeSuite.scala ---
@@ -43,4 +43,12 @@ class DataFrameComplexTypeSuite
Github user pierre-borckmans commented on the pull request:
https://github.com/apache/spark/pull/10392#issuecomment-166645424
Thanks, good to know. Is there work planned on that matter? Or do you guys
just specialise in "seeing that it was a flaky test, just restart
Github user pierre-borckmans commented on the pull request:
https://github.com/apache/spark/pull/10392#issuecomment-166637885
@srowen Thanks. Just out of curiosity, the failing tests were completely
unrelated, right? Does that mean master was in an unstable state? Or are the
unit
Github user pierre-borckmans commented on the pull request:
https://github.com/apache/spark/pull/10392#issuecomment-166572991
@srowen I fixed the Scala style issues and the string interpolator.
Github user pierre-borckmans commented on the pull request:
https://github.com/apache/spark/pull/10392#issuecomment-166565536
@srowen Agreed. I reverted my changes, and updated the warning message.
Thanks for your review.
Github user pierre-borckmans commented on the pull request:
https://github.com/apache/spark/pull/10429#issuecomment-166562978
@rxin This PR incidentally also fixes another issue. Accessing a null
element in an array of IntegerType erroneously returned 0:
```
scala> val
```
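The session above is truncated in the archive. As a hedged, plain-Scala sketch of the class of bug described (hypothetical names, not the actual Tungsten code): a fixed-width slot read that ignores the null bit returns the slot's default value, 0, instead of signalling null.

```scala
// Hypothetical sketch: a fixed-width row slot stores an Int plus a null bit.
// Reading the value without consulting the null bit silently yields 0,
// the same symptom described above for null elements in an IntegerType array.
case class IntSlot(isNull: Boolean, value: Int)

// Buggy read: ignores the null bit, so a null element comes back as 0.
def readBuggy(s: IntSlot): Int = s.value

// Fixed read: checks the null bit before exposing the value.
def readFixed(s: IntSlot): Option[Int] =
  if (s.isNull) None else Some(s.value)

val nullElem = IntSlot(isNull = true, value = 0)
println(readBuggy(nullElem)) // 0, even though the element is null
println(readFixed(nullElem)) // None
```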
Github user pierre-borckmans commented on the pull request:
https://github.com/apache/spark/pull/10429#issuecomment-166561023
@rxin I added a small test, let me know if more should be added.
Github user pierre-borckmans commented on the pull request:
https://github.com/apache/spark/pull/10429#issuecomment-166557024
@rxin Where should it go to be sure?
Github user pierre-borckmans commented on the pull request:
https://github.com/apache/spark/pull/10429#issuecomment-166556972
@rxin Sure!
GitHub user pierre-borckmans opened a pull request:
https://github.com/apache/spark/pull/10429
[SPARK-12477][SQL] Tungsten projection fails for null values in array fields
Accessing null elements in an array field fails when Tungsten is enabled.
It works in Spark 1.3.1, and in
Github user pierre-borckmans commented on the pull request:
https://github.com/apache/spark/pull/10400#issuecomment-166118539
Nice small DSL. One small comment: with the regular constructor, the `nullable`
field is optional, defaulting to `true`. You might want to make this DSL
follow
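The suggestion above can be sketched in a few lines (hypothetical names, not the actual DSL from the PR): give the DSL the same default as the regular `StructField` constructor, where `nullable` defaults to `true`.

```scala
// Hypothetical field DSL with nullable defaulting to true,
// mirroring StructField(name, dataType, nullable = true).
case class Field(name: String, dataType: String, nullable: Boolean = true)

val age = Field("age", "int")                   // nullable defaults to true
val id  = Field("id", "long", nullable = false) // opt out explicitly
```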
Github user pierre-borckmans commented on a diff in the pull request:
https://github.com/apache/spark/pull/10392#discussion_r48103410
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -2072,14 +2072,15 @@ class SparkContext(config: SparkConf) extends Logging
GitHub user pierre-borckmans opened a pull request:
https://github.com/apache/spark/pull/10392
[SPARK-12440] - [core] - Avoid setCheckpoint warning when directory is not
local
In the SparkContext method `setCheckpointDir`, a warning is issued when the
Spark master is not local and the
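The description is cut off in the archive. As a hedged sketch of the condition at issue (assumed shape, not the actual SparkContext code): warn only when the master is non-local but the checkpoint directory has no filesystem scheme, i.e. likely points at a local path that executors cannot share.

```scala
// Hypothetical sketch of the warning condition: a non-local master combined
// with a checkpoint directory that has no scheme (so it is likely local).
def shouldWarnNonLocalDir(master: String, checkpointDir: String): Boolean = {
  val isLocalMaster = master.startsWith("local")
  val hasScheme = checkpointDir.contains("://")
  !isLocalMaster && !hasScheme
}

println(shouldWarnNonLocalDir("spark://host:7077", "/tmp/ckpt"))    // true
println(shouldWarnNonLocalDir("spark://host:7077", "hdfs:///ckpt")) // false
println(shouldWarnNonLocalDir("local[4]", "/tmp/ckpt"))             // false
```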
GitHub user pierre-borckmans opened a pull request:
https://github.com/apache/spark/pull/5083
[SPARK-6402] - Remove references to Shark in docs and ec2
The EC2 script and job scheduling documentation still referred to Shark.
I removed these references.
I also removed a
Github user pierre-borckmans commented on the pull request:
https://github.com/apache/spark/pull/2354#issuecomment-77120890
Ok good to know!
It's a really interesting debugging tool, and gives us a lot of insight into
what's going on inside Catalyst.
Hope to see it re
Github user pierre-borckmans commented on the pull request:
https://github.com/apache/spark/pull/2354#issuecomment-76424493
Why was this deleted?
Github user pierre-borckmans commented on the pull request:
https://github.com/apache/spark/pull/4284#issuecomment-73201804
@chenghao-intel No worries, thanks for the update!
Github user pierre-borckmans commented on the pull request:
https://github.com/apache/spark/pull/4284#issuecomment-73137097
@chenghao-intel Why did you close this? Is there another PR to fix this
issue?
Github user pierre-borckmans commented on the pull request:
https://github.com/apache/spark/pull/600#issuecomment-41852359
One last idea: how about this sbt plugin?
https://github.com/sbt/sbt-buildinfo
It could definitely do the trick.
We would still need something
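As a hedged sketch of how sbt-buildinfo could be wired up (a modern-style `build.sbt` fragment, with an assumed package name; not taken from the PR): the plugin generates a `BuildInfo` object at compile time from sbt settings, so the version never needs to be hardcoded.

```scala
// build.sbt — hypothetical sketch using sbt-buildinfo.
enablePlugins(BuildInfoPlugin)
buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion)
buildInfoPackage := "org.apache.spark"
// Code can then read BuildInfo.version instead of a hardcoded string.
```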
Github user pierre-borckmans commented on the pull request:
https://github.com/apache/spark/pull/600#issuecomment-41840133
@pwendell AFAIK it's only made available in the META-INF of the fat jar. So
you are right.
Sorry about that, I overlooked these two scenarios.
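The META-INF point can be illustrated with plain JVM API (a hedged sketch; the helper name is made up): `Implementation-Version` is read from the jar manifest, so it is only present when the class is loaded from a packaged jar, not when running from compiled classes on disk, which is the limitation conceded above.

```scala
// Sketch: reading the version from the jar manifest. getImplementationVersion
// returns null when the class is not loaded from a jar with a populated
// META-INF/MANIFEST.MF, hence the Option wrapping.
def manifestVersion(clazz: Class[_]): Option[String] =
  Option(clazz.getPackage).flatMap(p => Option(p.getImplementationVersion))

// e.g. manifestVersion(classOf[SomeSparkClass]): Some(version) when running
// from the fat jar, None when running from unpackaged classes.
```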
Github user pierre-borckmans commented on a diff in the pull request:
https://github.com/apache/spark/pull/600#discussion_r12160496
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -1037,6 +1037,11 @@ private[spark] object Utils extends Logging
Github user pierre-borckmans commented on a diff in the pull request:
https://github.com/apache/spark/pull/600#discussion_r12156870
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -1037,6 +1037,11 @@ private[spark] object Utils extends Logging
Github user pierre-borckmans commented on a diff in the pull request:
https://github.com/apache/spark/pull/600#discussion_r12156082
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -1037,6 +1037,11 @@ private[spark] object Utils extends Logging
Github user pierre-borckmans commented on a diff in the pull request:
https://github.com/apache/spark/pull/600#discussion_r12156040
--- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
@@ -1037,6 +1037,11 @@ private[spark] object Utils extends Logging
Github user pierre-borckmans commented on a diff in the pull request:
https://github.com/apache/spark/pull/600#discussion_r12151396
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -1565,4 +1565,3 @@ private[spark] class WritableConverter[T](
val
Github user pierre-borckmans commented on a diff in the pull request:
https://github.com/apache/spark/pull/600#discussion_r12151374
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -1253,7 +1253,7 @@ class SparkContext(config: SparkConf) extends Logging
GitHub user pierre-borckmans opened a pull request:
https://github.com/apache/spark/pull/600
Remove hardcoded Spark version string to use SBT defined version instead
This PR attempts to remove hardcoded references to the Spark version, which
must be updated for each release