Github user EugenCepoi commented on the pull request:
https://github.com/apache/spark/pull/10203#issuecomment-163345847
Looks good
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this
Github user EugenCepoi commented on a diff in the pull request:
https://github.com/apache/spark/pull/10203#discussion_r47127712
--- Diff:
extras/kinesis-asl/src/main/scala/org/apache/spark/streaming/kinesis/KinesisReceiver.scala
---
@@ -222,7 +222,7 @@ private[kinesis] class
Github user EugenCepoi commented on the pull request:
https://github.com/apache/spark/pull/10203#issuecomment-162990380
That problem actually applies to all types for which Kryo provides a
default ser/de, mostly because Kryo will try to deserialize to the type known
during
Github user EugenCepoi commented on the pull request:
https://github.com/apache/spark/pull/10203#issuecomment-162985706
Why don't you just replace the use of SynchronizedMap in KinesisReceiver
with a ConcurrentHashMap instead?
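The suggestion is to drop the synchronized-wrapper map in KinesisReceiver in favor of `java.util.concurrent.ConcurrentHashMap`, whose individual operations are thread-safe without external locking. A minimal Java sketch of the pattern (class and field names here are illustrative, not the actual KinesisReceiver members):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Instead of wrapping a plain map in a synchronized view, hold a
// ConcurrentHashMap directly; put/get/remove are each thread-safe.
public class BlockIdRegistry {
    private final Map<String, String> seqNumberByBlockId = new ConcurrentHashMap<>();

    public void record(String blockId, String seqNumber) {
        seqNumberByBlockId.put(blockId, seqNumber);
    }

    public String lookup(String blockId) {
        return seqNumberByBlockId.get(blockId);
    }

    public String remove(String blockId) {
        return seqNumberByBlockId.remove(blockId);
    }
}
```

A side benefit in this context: `ConcurrentHashMap` is a concrete, well-known type, whereas Scala's `SynchronizedMap` mixin produces an anonymous subclass, which is likely the kind of type the Kryo default-serializer problem above trips over.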
Github user EugenCepoi commented on the pull request:
https://github.com/apache/spark/pull/7122#issuecomment-136454383
Np, thanks for the review :)
Github user EugenCepoi commented on the pull request:
https://github.com/apache/spark/pull/7122#issuecomment-124728202
@squito I don't get what fails. Is it some random issue or did I really
break something? If I didn't, should we just re-run the tests?
Github user EugenCepoi commented on a diff in the pull request:
https://github.com/apache/spark/pull/7122#discussion_r35452547
--- Diff:
core/src/main/scala/org/apache/spark/serializer/JavaSerializer.scala ---
@@ -62,17 +62,34 @@ private[spark] class JavaDeserializationStream(in
Github user EugenCepoi commented on a diff in the pull request:
https://github.com/apache/spark/pull/7122#discussion_r35418323
--- Diff:
core/src/test/scala/org/apache/spark/serializer/JavaSerializerSuite.scala ---
@@ -25,4 +25,18 @@ class JavaSerializerSuite extends SparkFunSuite
Github user EugenCepoi commented on a diff in the pull request:
https://github.com/apache/spark/pull/7122#discussion_r35416682
--- Diff:
core/src/test/scala/org/apache/spark/serializer/JavaSerializerSuite.scala ---
@@ -25,4 +25,18 @@ class JavaSerializerSuite extends SparkFunSuite
GitHub user EugenCepoi opened a pull request:
https://github.com/apache/spark/pull/7122
Fixes SPARK-8730 - Deser objects containing a primitive class attribute
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/EugenCepoi/spark
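The failure mode behind SPARK-8730: a custom `ObjectInputStream` that resolves classes via `Class.forName` (as a serializer with its own class loader must) breaks on fields holding primitive `Class` values such as `int.class`, because `Class.forName("int")` throws `ClassNotFoundException`. A hedged Java sketch of this kind of fix, mapping primitive type names explicitly (the class name and structure here are illustrative, not Spark's actual `JavaDeserializationStream`):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.ObjectInputStream;
import java.io.ObjectStreamClass;
import java.util.HashMap;
import java.util.Map;

public class PrimitiveAwareInputStream extends ObjectInputStream {
    // Class.forName cannot resolve primitive type names, so look them up here.
    private static final Map<String, Class<?>> PRIMITIVES = new HashMap<>();
    static {
        PRIMITIVES.put("boolean", boolean.class);
        PRIMITIVES.put("byte", byte.class);
        PRIMITIVES.put("char", char.class);
        PRIMITIVES.put("short", short.class);
        PRIMITIVES.put("int", int.class);
        PRIMITIVES.put("long", long.class);
        PRIMITIVES.put("float", float.class);
        PRIMITIVES.put("double", double.class);
        PRIMITIVES.put("void", void.class);
    }

    public PrimitiveAwareInputStream(InputStream in) throws IOException {
        super(in);
    }

    @Override
    protected Class<?> resolveClass(ObjectStreamClass desc) throws ClassNotFoundException {
        Class<?> primitive = PRIMITIVES.get(desc.getName());
        if (primitive != null) {
            return primitive;
        }
        // A real serializer would pass its configured class loader here.
        return Class.forName(desc.getName(), false, getClass().getClassLoader());
    }
}
```

Without the primitive lookup, round-tripping an object whose `Class<?>` attribute is `int.class` through such a stream fails at deserialization time.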
Github user EugenCepoi commented on a diff in the pull request:
https://github.com/apache/spark/pull/5708#discussion_r32741755
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -926,7 +926,9 @@ class SparkContext(config: SparkConf) extends Logging
with
Github user EugenCepoi commented on a diff in the pull request:
https://github.com/apache/spark/pull/5708#discussion_r32720782
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -926,7 +926,9 @@ class SparkContext(config: SparkConf) extends Logging
with
Github user EugenCepoi commented on a diff in the pull request:
https://github.com/apache/spark/pull/5708#discussion_r32719064
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -926,7 +926,9 @@ class SparkContext(config: SparkConf) extends Logging
with
Github user EugenCepoi commented on a diff in the pull request:
https://github.com/apache/spark/pull/5708#discussion_r32718265
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -926,7 +926,9 @@ class SparkContext(config: SparkConf) extends Logging
with
Github user EugenCepoi commented on the pull request:
https://github.com/apache/spark/pull/2481#issuecomment-57824300
@andrewor14 great! Glad to see it merged into master :)
Github user EugenCepoi commented on the pull request:
https://github.com/apache/spark/pull/2481#issuecomment-57786084
@andrewor14 The trailing % came from a bad copy-paste, nice catch :) Double
percent exists but was wrong in this case,
http://stackoverflow.com/questions/14509652
Github user EugenCepoi commented on the pull request:
https://github.com/apache/spark/pull/997#issuecomment-56305844
@andrewor14 @pwendell I updated it and opened a new PR #2481 against master.
GitHub user EugenCepoi opened a pull request:
https://github.com/apache/spark/pull/2481
SPARK-2058: Overriding SPARK_HOME/conf with SPARK_CONF_DIR
Update of PR #997.
With this PR, setting SPARK_CONF_DIR overrides SPARK_HOME/conf (not only
spark-defaults.conf and spark-env
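The override the PR describes, preferring SPARK_CONF_DIR wholesale over SPARK_HOME/conf, boils down to a default-fallback in the launch scripts. A simplified shell sketch (the helper function is hypothetical; the real scripts such as compute-classpath.sh inline this logic when building the classpath):

```shell
# Echo the effective configuration directory: SPARK_CONF_DIR when set and
# non-empty, otherwise fall back to $SPARK_HOME/conf.
pick_conf_dir() {
  echo "${SPARK_CONF_DIR:-$SPARK_HOME/conf}"
}

# The chosen directory is then prepended to the classpath, e.g.:
# CLASSPATH="$(pick_conf_dir):$CLASSPATH"
```

The `${var:-default}` expansion keeps the old behaviour intact for users who never set SPARK_CONF_DIR.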
Github user EugenCepoi commented on the pull request:
https://github.com/apache/spark/pull/997#issuecomment-54427688
@andrewor14 Agreed it's too verbose. I will make those changes and reopen
it against branch-1.1, but I won't have the time to do it right now.
Github user EugenCepoi commented on the pull request:
https://github.com/apache/spark/pull/997#issuecomment-46086066
Updated the doc. BTW, the behaviour is actually not symmetric; for example,
spark-env is only searched in SPARK_CONF_DIR if defined, but for the configs
loaded from
Github user EugenCepoi commented on the pull request:
https://github.com/apache/spark/pull/997#issuecomment-45997041
Thx for the comments. Updated the PR based on them.
Github user EugenCepoi commented on a diff in the pull request:
https://github.com/apache/spark/pull/997#discussion_r13710617
--- Diff: bin/compute-classpath.sh ---
@@ -30,6 +30,11 @@ FWDIR="$(cd `dirname $0`/..; pwd)"
# Build up classpath
CLASSPATH="$
GitHub user EugenCepoi opened a pull request:
https://github.com/apache/spark/pull/997
SPARK-2058: Overriding config from SPARK_HOME with SPARK_CONF_DIR
Fixes https://issues.apache.org/jira/browse/SPARK-2058
The general idea is to use SPARK_CONF_DIR content instead of