GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/10674
[SPARK-12736][CORE][DEPLOY] Standalone Master cannot be started due to NoClassDefFoundError: org/spark-project/guava/collect/Maps
/cc @srowen @rxin
You can merge this pull
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/10595#issuecomment-169633803
On it. I'll rebase and push update. Thanks @srowen for code review.
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/10636#discussion_r49051693
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/json/JSONRelation.scala
---
@@ -68,29 +68,12 @@ private[sql] class
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/10603#discussion_r49048172
--- Diff: docs/streaming-custom-receivers.md ---
@@ -257,9 +257,9 @@ The following table summarizes the characteristics of
both types of receivers
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/10636
[MINOR] Fix for BUILD FAILURE for Scala 2.11
It was introduced in 917d3fc069fb9ea1c1487119c9c12b373f4f9b77
/cc @cloud-fan @rxin
You can merge this pull request into a Git
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/10603#discussion_r48852732
--- Diff: docs/streaming-custom-receivers.md ---
@@ -273,9 +273,9 @@ class CustomActor extends Actor with ActorHelper {
And a new input stream
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/10603
[STREAMING][DOCS][EXAMPLES] Minor fixes
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/jaceklaskowski/spark
streaming-actor-custom
Github user jaceklaskowski closed the pull request at:
https://github.com/apache/spark/pull/10591
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/10591#issuecomment-168976471
Merged to #10595
Github user jaceklaskowski closed the pull request at:
https://github.com/apache/spark/pull/10592
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/10592#issuecomment-168976132
Merged to #10595.
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/10595#issuecomment-168975843
Merged the other branches and ran build locally. Please review and merge at
your convenience @srowen @rxin. Thanks!
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/10595#discussion_r48830750
--- Diff:
streaming/src/main/scala/org/apache/spark/streaming/dstream/DStream.scala ---
@@ -286,7 +286,7 @@ abstract class DStream[T: ClassTag
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/10595#discussion_r48830663
--- Diff:
streaming/src/main/scala/org/apache/spark/streaming/scheduler/JobSet.scala ---
@@ -59,17 +59,15 @@ case class JobSet(
// Time
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/10595#discussion_r48830059
--- Diff:
streaming/src/main/scala/org/apache/spark/streaming/scheduler/JobSet.scala ---
@@ -59,17 +59,15 @@ case class JobSet(
// Time
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/10595#issuecomment-168949749
Yes, sure! Been worried that @srowen might cross a few lines out that would completely devastate my mood today :) On to...
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/10595
[STREAMING][MINOR] More contextual information in logs + minor code improvements
Please review and merge at your convenience. Thanks!
You can merge this pull request into a
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/10592
[STREAMING][MINOR] Scaladoc fixes...mostly
Please review and merge at your convenience. Thanks.
You can merge this pull request into a Git repository by running:
$ git pull https
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/10591
[CORE][MINOR] scaladoc fixes
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/jaceklaskowski/spark core-scaladoc
Alternatively you can
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/7385#issuecomment-168384426
@zsxwing @JoshRosen Does the comment need attention since the PR is closed,
https://github.com/apache/spark/blob/master/streaming/src/main/scala/org/apache
Github user jaceklaskowski closed the pull request at:
https://github.com/apache/spark/pull/10433
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/10433#issuecomment-166628562
Doh, you're *again* inviting me to make more involved changes (which is great, but I don't think it helps others to contribute). What I'm gonna do is
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/10433#issuecomment-166618819
I disagree since Spark core is hard due to what it does and making it
harder can hinder contributions. There are just such little things that can
change how
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/10433
[CORE] Refactoring: Use pattern matching and dedicated method
It should be slightly easier to see what really happens (without questions
like what other state a task can be here
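For context, the technique named in that PR title is a standard Scala refactoring; the following is a minimal, purely illustrative sketch (hypothetical task states, not the actual Spark diff) of trading chained if/else state checks for an exhaustive pattern match behind a dedicated method.
```scala
// Illustrative only -- hypothetical types, not code from the Spark PR.
sealed trait TaskState
case object Running  extends TaskState
case object Finished extends TaskState
case object Failed   extends TaskState

object TaskStates {
  // Before: chained if/else checks, easy to leave a state unhandled.
  def describeOld(state: TaskState): String =
    if (state == Running) "task is running"
    else if (state == Finished) "task has finished"
    else "task has failed"

  // After: a dedicated method with an exhaustive pattern match --
  // the compiler warns if a new state is added and not handled here.
  def describe(state: TaskState): String = state match {
    case Running  => "task is running"
    case Finished => "task has finished"
    case Failed   => "task has failed"
  }
}
```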
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/10432
Minor corrections, i.e. typo fixes and follow deprecated
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/jaceklaskowski/spark minor
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/10273#discussion_r47426529
--- Diff:
examples/src/main/java/org/apache/spark/examples/BasicAuthFilter.java ---
@@ -0,0 +1,108 @@
+/*
+ * Licensed to the Apache Software
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/10273#discussion_r47426522
--- Diff:
examples/src/main/java/org/apache/spark/examples/BasicAuthFilter.java ---
@@ -0,0 +1,108 @@
+/*
+ * Licensed to the Apache Software
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/9603#issuecomment-157309968
I don't agree with @andrewor14. It does add value by being consistent with how Spark informs about its status - if it says "Stopping..." at INF
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/9603#discussion_r44723534
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -506,6 +506,7 @@ class SparkContext(config: SparkConf) extends Logging
with
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/9538#issuecomment-154988152
Knock, knock. Can we do something with the patch, as the build procedure for Scala 2.11 has expanded with another step - `git merge pr-9538` beside `./dev/change-scala
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/9538#issuecomment-154707778
It worked fine for me.
```
➜ spark git:(master) ✗ ./build/mvn -Pyarn -Phadoop-2.6
-Dhadoop.version=2.7.1 -Dscala-2.11 -Phive -Phive-thriftserver
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/9501#issuecomment-154572569
Thanks again @srowen! Next pull req should be from starter tag.
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/9501#issuecomment-154457887
The only files that touch spaces are `ShuffleMapTask.scala` and `TaskSet.scala` (and these could be reverted). The others are more than space fixes
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/9501#issuecomment-154199308
Thanks @srowen for reviewing the changes. They indeed are small since
they're the outcome of me going through the code and learning Spark this way.
When
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/9501#discussion_r44071949
--- Diff: core/src/main/scala/org/apache/spark/rdd/HadoopRDD.scala ---
@@ -195,9 +196,6 @@ class HadoopRDD[K, V](
// add the credentials here
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/9501
Typo fixes + code readability improvements
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/jaceklaskowski/spark typos-with-style
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/9444#issuecomment-153579178
Thanks @rxin! I knew it might've been too small a change, but since it's in the UI, I thought I'd *not* wait till I find other typos.
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/9444
Fix typo in WebUI
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/jaceklaskowski/spark TImely-fix
Alternatively you can review and
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/9432#issuecomment-153362113
Could you change `Usage: Worker [options]` and `Usage: Master [options]` to be the scripts' names themselves, not the cryptic Master and Worker?
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/9250#issuecomment-150852862
All for now. Merge at will. Thanks.
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/9250#issuecomment-150756926
I need a few more hours. I'll ping you. Thanks.
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/9250#issuecomment-150659267
Thanks @rxin @srowen for support! It took me 30 minutes to find and use
`aspell` on a single Scala package and it drove me crazy with tons of false
positives
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/9250#issuecomment-150606925
Ok, deal. I can run a spell-checker and see what I can fix within a
half-an-hour timeframe. Should I go and create a JIRA task for it? Any
particular module
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/9250
Fix typos
Two typos squashed.
BTW, let me know how to proceed with other typos if I run across any. I don't feel good about leaving them aside, nor about sending pull requests with
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/9230
Fix a (very tiny) typo
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/jaceklaskowski/spark utils-seconds-typo
Alternatively you can
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/8976#issuecomment-148512484
@srowen Mind looking at the PR once again? I'd appreciate it.
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/8976#issuecomment-147602196
Hey @srowen @jerryshao How does the change look now?
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/8976#issuecomment-146574566
SPARK-2089 is marked as resolved so any discussion there is done (or am I
mistaken?)
Also, I'm saying that the comments say it's used in YARN
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/8976#issuecomment-146506978
Sure, but how do you know **now** what the public API is going to look like? It is **currently** causing trouble in understanding the code and we only **wish
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/8976#issuecomment-146474747
Fully agree. I'm however not skilled enough to work on the feature to enable locality preference and haven't heard about anyone working on it, and the code as
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/8976#issuecomment-146462053
At some point in the future, very likely, but it's not happening now and I'd remove it now (to make the code clearer) and once it's needed it
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/8976#issuecomment-145958130
Thanks @vanzin @srowen! That helped a lot. I'm going to fix it. How can I
run the compatibility check locally? Is it `./build/sbt
core/mima-report-binary-i
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/8976#discussion_r41303229
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -147,23 +139,41 @@ class SparkContext(config: SparkConf) extends Logging
with
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/8976#issuecomment-145910412
@srowen @SparkQA says "This patch fails MiMa tests.", but I can't seem to
decipher what exactly the pull request has broken. I remember you mentio
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/8976
[SPARK-10921] [YARN] Completely remove the use of SparkContext.preferredNodeLocationData
You can merge this pull request into a Git repository by running:
$ git pull https
Github user jaceklaskowski closed the pull request at:
https://github.com/apache/spark/pull/8753
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/8795#issuecomment-141894712
> Still, probably best to just merge this, as it's unlikely to cause much
if any trouble.
Would you? I'd greatly appreciate it (and propos
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/8795#issuecomment-141827044
I disagree with not accepting this change in this version **with** the superfluous spaces at the end of lines removed -- they're simply garbage (and shoul
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/8795#issuecomment-141779426
Do you want me to split the pull request into two - one with the code formatting in the table and another for the *unfortunate* excessive spaces? And what JIRA would that
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/8795#issuecomment-141675546
It should be better now. While fixing the docs, Atom fixed the additional
spaces at the end (that I remember you mentioned not to fix, but since it was
done
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/8795#issuecomment-141569374
So you want me to review the other documents for the no-backtick-code-formatted-in-table issue? I'm fine with it, but just need to confirm my thinking.
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/8795#issuecomment-141048359
Is [SPARK-10662](https://issues.apache.org/jira/browse/SPARK-10662) what
you were thinking about? What should be the next steps? Guide me in the jira,
please
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/8795
[MINOR][DOCS] Fixes markup so code is properly formatted
* Backticks are processed properly in Spark Properties table
* Removed unnecessary spaces
* See
http://people.apache.org
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/8753#issuecomment-140414166
Should we then move the discussion to the mailing list and/or create a task in JIRA?
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/8762#discussion_r39517063
--- Diff: docs/running-on-yarn.md ---
@@ -54,8 +54,8 @@ In `yarn-cluster` mode, the driver runs on a different
machine than the client
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/8762#discussion_r39504260
--- Diff: docs/running-on-yarn.md ---
@@ -54,8 +54,8 @@ In `yarn-cluster` mode, the driver runs on a different
machine than the client
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/8762
[DOCS] Small fixes to Spark on Yarn doc
* a follow-up to 16b6d18613e150c7038c613992d80a7828413e66 as
`--num-executors` flag is not supported.
* links + formatting
You can merge this
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/8759
Small fixes to docs
Links now work properly + consistent use of *Spark standalone cluster* (Spark uppercase + lowercase the rest -- seems agreed upon in other places in the docs).
You can
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/8753#issuecomment-140193323
Is this better discussed on the user or dev mailing lists? I disagree on
the use of another dependency hop using such an object, but can agree with you
on doing
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/8753
Introduce config constants object
A small refactoring to introduce a Scala object to keep
property/environment names in a single place for YARN cluster deployment first
(as I hate seeing
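The description above is truncated, but the "config constants object" idea it names is a common Scala pattern; a minimal sketch with invented identifiers (none of these names come from the actual PR) might look like the following.
```scala
// Hypothetical sketch of a config-constants object -- the object and field
// names are invented for illustration and do not come from the Spark change.
object YarnDeployKeys {
  // Spark properties read by the YARN cluster deployment path.
  val DriverMemory  = "spark.driver.memory"
  val ExecutorCores = "spark.executor.cores"

  // Environment variables, kept next to the properties they relate to.
  val EnvSparkHome   = "SPARK_HOME"
  val EnvYarnConfDir = "YARN_CONF_DIR"
}

// Call sites then reference YarnDeployKeys.DriverMemory instead of
// repeating the raw string "spark.driver.memory" across several files.
```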
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/8629#discussion_r39024388
--- Diff: docs/cluster-overview.md ---
@@ -33,9 +34,9 @@ There are several useful things to note about this
architecture:
2. Spark is agnostic to
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/8629#discussion_r39024259
--- Diff: docs/building-spark.md ---
@@ -163,11 +164,9 @@ the `spark-parent` module).
Thus, the full flow for running continuous-compilation
Github user jaceklaskowski closed the pull request at:
https://github.com/apache/spark/pull/8630
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/8630
Space out
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/jaceklaskowski/spark build-space-out
Alternatively you can review and apply
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/8629
Docs small fixes
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/jaceklaskowski/spark docs-fixes
Alternatively you can review and apply
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/8479
[SPARK-9613] [HOTFIX] Fix usage of JavaConverters removed in Scala 2.11
Fix for
[JavaConverters.asJavaListConverter](http://www.scala-lang.org/api/2.10.5/index.html
Github user jaceklaskowski closed the pull request at:
https://github.com/apache/spark/pull/830
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/830#issuecomment-54374855
No worries. Do what you think is going to be the best solution for the
project. I don't mind closing the PR.
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/830#issuecomment-43816303
Please review the changes that were introduced after @tdas's comments.
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/830#discussion_r12870325
--- Diff: docs/streaming-programming-guide.md ---
@@ -306,12 +304,16 @@ need to know to write your streaming applications.
## Linking
To
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/830#discussion_r12870184
--- Diff: docs/streaming-programming-guide.md ---
@@ -105,23 +104,22 @@ generating multiple new records from each record in
the source DStream. In this
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/830#discussion_r12870111
--- Diff: docs/streaming-programming-guide.md ---
@@ -83,21 +82,21 @@ import org.apache.spark.streaming.api._
val ssc = new StreamingContext("
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/830#discussion_r12869294
--- Diff: docs/streaming-programming-guide.md ---
@@ -83,21 +82,21 @@ import org.apache.spark.streaming.api._
val ssc = new StreamingContext("
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/830
Small updates to Streaming Programming Guide
Please merge. More updates will come soon.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/706#discussion_r12490430
--- Diff: project/SparkBuild.scala ---
@@ -16,17 +16,18 @@
*/
import sbt._
-import sbt.Classpaths.publishTask
-import sbt.Keys
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/706#issuecomment-42659515
Hey @pwendell I'd love to be engaged in the effort if possible. Where could we discuss how much I could do regarding sbt (I think I might be quite helpful her
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/706
Simplify the build with sbt 0.13.2 features
It's a WIP, but I'm pull-requesting the current changes hoping that someone from the dev team will have a look at the changes and guide
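For readers unfamiliar with what "sbt 0.13.x features" can simplify, here is a generic build.sbt-style sketch of the direction such a cleanup tends toward -- the `project` macro with fluent `.settings`/`.dependsOn` calls instead of verbose `Project(...)` constructors. This is illustrative only and not the actual SparkBuild.scala change; project and setting values are assumptions.
```scala
// Generic sbt 0.13-style build definition -- illustrative only.
lazy val core = (project in file("core"))
  .settings(
    name := "example-core",        // hypothetical module name
    scalaVersion := "2.10.4"
  )

lazy val streaming = (project in file("streaming"))
  .dependsOn(core)                 // compile-time dependency on core
  .settings(name := "example-streaming")
```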
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/748#discussion_r12665202
--- Diff: core/src/main/scala/org/apache/spark/SparkEnv.scala ---
@@ -296,18 +297,15 @@ object SparkEnv extends Logging {
// System
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/748
String interpolation + some other small changes
After having been invited to make the change in
https://github.com/apache/spark/commit/6bee01dd04ef73c6b829110ebcdd622d521ea8ff#commitcomment
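Since the body above is cut off, here is a minimal illustration of the technique in the title -- replacing string concatenation with Scala's s-interpolator. It is a generic example, not the actual SparkEnv.scala diff, and the values are invented.
```scala
// Generic before/after for Scala string interpolation -- not the actual diff.
object InterpolationExample {
  val host = "localhost"
  val port = 7077

  // Before: concatenation with + is noisy and easy to get wrong.
  val oldStyle: String = "Connecting to master at " + host + ":" + port

  // After: the s-interpolator embeds expressions directly in the literal.
  val newStyle: String = s"Connecting to master at $host:$port"
}
```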
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/706#issuecomment-42785356
Thanks! I've said it before (when @pwendell asked to hold off) and now I'll say it again as my changes don't seem to be finding a home soon before the
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/706#discussion_r12508408
--- Diff: project/SparkBuild.scala ---
@@ -297,7 +273,7 @@ object SparkBuild extends Build {
val chillVersion = "0.3.6"
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/706#issuecomment-42767260
Knock, knock. Could I ask you to approve the changes or let me know what portion to change to get it in?
Github user jaceklaskowski commented on a diff in the pull request:
https://github.com/apache/spark/pull/706#discussion_r12508264
--- Diff: project/SparkBuild.scala ---
@@ -16,17 +16,18 @@
*/
import sbt._
-import sbt.Classpaths.publishTask
-import sbt.Keys
Github user jaceklaskowski commented on the pull request:
https://github.com/apache/spark/pull/706#issuecomment-42700893
As to the changes I proposed in the PR, I think that, whatever the future steps with sbt-pom-reader are, they're easily applicable to the build. They (are suppos
GitHub user jaceklaskowski opened a pull request:
https://github.com/apache/spark/pull/671
sbt assembly and environment variables
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/jaceklaskowski/spark docs-index
Alternatively you