think things will mostly work, except possibly for the REPL (which requires porting over code from the Scala REPL for each version).
Hi,
Would it be possible to have a JIRA issue for this (so I could have
a branch for the cross-build in sbt and give the task a try)?
Jacek
:54 PM, Jacek Laskowski ja...@japila.pl wrote:
On Sun, May 11, 2014 at 11:08 PM, Matei Zaharia matei.zaha...@gmail.com
wrote:
We do want to support it eventually, possibly as early as Spark 1.1 (which
we’d cross-build on Scala 2.10 and 2.11). If someone wants to look at it
before, feel free
Hi,
I'm curious whether it's a common approach to have discussions in JIRA rather than here.
I don't think that's the ASF way.
Regards,
Jacek Laskowski
http://blog.japila.pl
On 17 May 2014 at 23:55, Matei Zaharia matei.zaha...@gmail.com wrote:
We do actually have replicated StorageLevels in Spark. You can
might accept
ultimately, but that would require some adoption time).
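For context, a replicated storage level is applied when persisting a dataset; a minimal hedged sketch (the RDD contents here are illustrative, and `sc` is assumed to be an existing SparkContext):

```scala
import org.apache.spark.storage.StorageLevel

// MEMORY_ONLY_2 keeps two replicas of every cached partition,
// so losing one node does not force recomputation.
val rdd = sc.parallelize(1 to 100)
rdd.persist(StorageLevel.MEMORY_ONLY_2)
```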
What's wrong with linking a discussion thread to a JIRA issue?
Jacek
--
Jacek Laskowski | http://blog.japila.pl
Never discourage anyone who continually makes progress, no matter how
slow. Plato
Regards,
Jacek
--
Jacek Laskowski | http://blog.japila.pl | http://blog.jaceklaskowski.pl
Follow me at https://twitter.com/jaceklaskowski
Upvote at http://stackoverflow.com/users/1305344/jacek-laskowski
Worked for me. Thanks!
Regards,
Jacek
On Sat, Nov 7, 2015 at 1:56 PM, Ted Yu <yuzhih...@gmail.
Hi,
It appears it's time to switch to my lovely sbt then!
Regards,
Jacek
On Tue, Nov 3, 2015 at 2:58
- 14.0.1
[warn] Run 'evicted' to see detailed eviction warnings
[success] Total time: 3 s, completed Aug 27, 2015 11:58:18 AM
Regards,
Jacek
On Thu, Dec 10, 2015 at 8:10 PM, Shixiong Zhu wrote:
> Jacek, could you create a JIRA for it? I just reproduced it. It's a bug in
> how Master handles the Worker disconnection.
Hi Shixiong,
I'm saved. I kept thinking I was lost in the sources and seeing ghosts :-)
s "./sbin/start-slave.sh spark://localhost:7077".
p.s. Are such questions appropriate for this mailing list?
Regards,
Jacek
--
Jacek Laskowski | https://medium.com/@jaceklaskowski/ |
http://blog.jaceklaskowski.pl
Mastering Spark https://jaceklaskowski.gitbooks.io/mastering-apache-spark/
Follow me at https://twitter.com/jaceklaskowski
Hi,
I'm on yesterday's master HEAD.
Regards,
Jacek
Hi,
Done. See https://github.com/apache/spark/pull/10636
Regards,
Jacek
On Thu, Jan 7, 2016 at 8:10
Figured it out and reported
https://issues.apache.org/jira/browse/SPARK-12736. A fix is coming...
Regards,
Jacek
Hi,
https://github.com/apache/spark/pull/10674
Please review and merge at your convenience. Thanks!
Regards,
Jacek
On Sat, Jan 9, 2016 at 1:48 PM, Sean Owen wrote:
> (For similar reasons I personally don't favor supporting Java 7 or
> Scala 2.10 in Spark 2.x.)
That reflects my sentiments as well. Thanks Sean for bringing that up!
Jacek
dClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 15 more
Regards,
Jacek
Regards,
Jacek
atever you want (log to stdout or whatever)? Just a thought.
Regards,
Jacek
On Mon, Nov 30, 2015 at 10:53 PM, Josh Rosen wrote:
> In SBT, these wind up on the Docker JDBC tests' classpath as a transitive
> dependency of the `spark-sql` test JAR. However, what we should be doing is
> adding them as explicit test dependencies of the
s]
[INFO] Spark Project External Flume ... FAILURE [ 1.010 s]
[INFO] Spark Project External Flume Assembly .. SKIPPED
[1]
https://github.com/apache/spark/commit/3ab0138b0fe0f9208b4b476855294a7c729583b7
Regards,
Jacek
Hi,
Congrats Yanbo!
p.s. It should go to user@, too.
Jacek
On 4 Jun 2016 4:49 a.m., "Matei Zaharia" wrote:
Hi all,
The PMC recently voted to add Yanbo Liang as a committer. Yanbo has been a
super active contributor in many areas of MLlib. Please join me in
welcoming
Regards,
Jacek Laskowski
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski
-
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
]
https://github.com/apache/spark/blob/master/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala#L1298
[2] Path.SEPARATOR
Regards,
Jacek Laskowski
. And
only then I could work on
https://issues.apache.org/jira/browse/YARN-5247. Is this about
changing the annotation(s) only?
Thanks for your support!
Regards,
Jacek Laskowski
my limited understanding of things (and I'm not even sure
how trustworthy it is). Use with extreme caution.
Regards,
Jacek Laskowski
On Sat, May 28, 20
nt (and yarn-cluster) are no longer in use, I'm pretty
sure it's of no use and could be safely removed. If not, we should do
something with it anyway.
Please guide before I file a JIRA issue. Thanks.
p.s. On to hunting SPARK_YARN_MODE...
Regards,
Jacek Laskowski
On Sun, May 29, 2016 at 5:30 PM, jpivar...@gmail.com
wrote:
> If I find a way to provide
> access by modifying Spark source, can I just submit a pull request, or do I
> need to be a recognized Spark developer? If so, is there a process for
> becoming one?
Start a discussion
That'd be awesome to have another 2.0 RC! I know many people who'd
consider it a call to action to play with 2.0.
+1000
Regards,
Jacek Laskowski
On Sat, Jun 18, 2016 at 6:13 AM, Pedro Rodriguez
wrote:
> using Datasets (eg using $ to select columns).
Or even my favourite one, the backtick ` :-)
Jacek
`backend.reviveOffers()`?
p.s. I understand that it's somehow related to how Mesos manages
resources where it offers resources, but can't find anything related
to `reviving offers` in Mesos docs :(
Please guide. Thanks!
Regards,
Jacek Laskowski
+1
Regards,
Jacek Laskowski
On Sat, Jun 18, 2016 at 9:13 AM, Reynold Xin <r...@databricks.com> wrote:
> Looks like that's resolved n
Spark application for execution
both -- memory and cores -- can be specified explicitly.
Would you agree? Am I missing anything important?
I was very surprised when I found this out, as I thought that memory
would also be a limiting factor.
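For reference, both can indeed be set explicitly when configuring an application; a hedged sketch using SparkConf (the values are illustrative, and in standalone mode `spark.cores.max` caps the total cores for the application):

```scala
import org.apache.spark.SparkConf

// Both resources declared explicitly for a standalone cluster.
val conf = new SparkConf()
  .setMaster("spark://localhost:7077")
  .set("spark.executor.memory", "2g") // memory per executor
  .set("spark.cores.max", "4")        // total cores across the cluster
```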
Regards,
Jacek Laskowski
Hi,
I've just done a git pull and it worked for me. Looks like
https://github.com/apache/spark/commit/f13c7f8f7dc8766b0a42406b5c3639d6be55cf33
fixed the issue (or something in-between).
Thanks for such a quick fix!
p.s. Had time for swimming :-)
Regards,
Jacek
Thanks Sean. I'm going to create a JIRA for it and start the work under it.
Regards,
Jacek Laskowski
On Mon, Jun 27, 2016 at 9:19 AM, Sean Owen
/mesos/MesosCoarseGrainedSchedulerBackend.scala#L71
[3]
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/storage/BlockManager.scala#L73-L74
[4]
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/util/Utils.scala#L748
Regards,
Jacek
Hi,
Is there a reason to use conf to read SPARK_WORKER_MEMORY rather than
System.getenv, as is done for the other env vars?
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/worker/WorkerArguments.scala#L45
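For illustration only (this is not the actual WorkerArguments code): the practical difference between the two styles is that a read routed through the conf object can be stubbed out, e.g. in tests, while a direct environment read cannot:

```scala
// Hypothetical sketch of the two read styles being compared:
// a direct JVM environment lookup...
val direct = Option(System.getenv("SPARK_WORKER_MEMORY"))
// ...vs. going through the conf object (conf.getenv in Spark's
// own code), which test code can override without touching the
// process environment.
```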
Regards,
Jacek
Test Tags FAILURE [ 0.321 s]
Regards,
Jacek
On Wed, Jan 20, 2016 at 8:48 PM, Marcelo Vanzin <van...@cloudera.com> wrote:
> On Wed, Jan 20, 2016 at 11:46 AM, Jacek Laskowski <ja...@japila.pl> wrote:
>> /Users/jacek/dev/oss/spark/tags/target/scala-2.11/classes...
>> [error] Cannot run program "javac"
Hi,
My very rough investigation has shown that the commit that may have
broken the build was
https://github.com/apache/spark/commit/555127387accdd7c1cf236912941822ba8af0a52
(nongli committed with rxin 7 hours ago).
Found a fix and building the source again...
Regards,
Jacek
Hi devs,
Following up on this, it appears that spark.worker.ui.port can only be
set in --properties-file. I wonder why conf/spark-defaults.conf is
*not* used for the spark.worker.ui.port property? Any reason for the
decision?
Regards,
Jacek
On Fri, Feb 12, 2016 at 11:08 PM, Sean Owen wrote:
> I think that difference in the code is just an oversight. They
> actually do the same thing.
Correct. I just meant to know the reason, if there was any.
> Why do you say this property can only be set in a file?
I said that
ark-mllib-local_2.11
Regards,
Jacek Laskowski
/961M
[INFO]
Thank you so much for the prompt solution! And that's while I was
driving from Toronto to Mississauga. Thanks!
Regards,
Jacek Laskowski
it.
Regards,
Jacek Laskowski
Hi Ted,
Yeah, I saw the line, but forgot it's a test that may have been
checking that closures should not have a return. Clearer now. Thanks!
Regards,
Jacek Laskowski
https://github.com/apache/spark/commit/c59abad052b7beec4ef550049413e95578e545be.
Is this a real issue with the build now or is this just me? I may have
seen a similar case before, but can't remember what the fix was.
Looking into it.
Regards,
Jacek Laskowski
OS X
➜ spark git:(master) ✗ java -version
java version "1.8.0_77"
Java(TM) SE Runtime Environment (build 1.8.0_77-b03)
Java HotSpot(TM) 64-Bit Server VM (build 25.77-b03, mixed mode)
Regards,
Jacek Laskowski
Hi,
After a few weeks with spark.ml, I've come to the conclusion that the
Transformer concept from the Pipeline API (spark.ml/MLlib) should be
part of DataFrame (SQL), where it fits better. Are there any plans to
migrate the Transformer API (ML) to DataFrame (SQL)?
Regards,
Jacek Laskowski
Regards,
Jacek Laskowski
Regards,
Jacek Laskowski
On Sat, Mar 26, 2016 at 3:23 AM, Joseph Bradley <jos...@databricks.com> wrote:
> There have been some comments ab
(and this discussion is a sign that the process has not been
conducted properly, as people have concerns, me included).
Thanks Mridul!
Regards,
Jacek Laskowski
l` (`final`)
** `LogisticRegressionModel`
** `NaiveBayesModel`
Regards,
Jacek Laskowski
Hi Praveen,
I've spent a few hours on the changes related to streaming dataframes
(included in SPARK-8360) and concluded that it's currently only
possible to read.stream(), but not write.stream(), since there are no
streaming Sinks yet.
Regards,
Jacek Laskowski
OK... it's building properly
now... https://github.com/apache/spark/pull/11567 + git mv
scalastyle-config.xml dev/
How to fix it in the repo? Should I send a pull request to...pull
request #11567? Guide me or fix it yourself...somehow :-)
Regards,
Jacek Laskowski
Hi,
At first glance it appears the commit *yesterday* (Warsaw time) broke
the build :(
https://github.com/apache/spark/commit/0eea12a3d956b54bbbd73d21b296868852a04494
Regards,
Jacek Laskowski
Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions,
please read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Regards,
Jacek Laskowski
Hi,
Nope. It's not. It's at
https://github.com/apache/spark/commit/0eea12a3d956b54bbbd73d21b296868852a04494#diff-600376dffeb79835ede4a0b285078036L2249.
I've got that and testing...
Regards,
Jacek Laskowski
Sure! Go ahead and...fix the build. Thanks.
Regards,
Jacek Laskowski
On Tue, Mar 8, 2016 at 8:48 AM, Dongjoon Hyun <dongj...@apache.org>
On Tue, Mar 8, 2016 at 8:55 AM, Jacek Laskowski <ja...@japila.pl> wrote:
> Sure! Go ahead and...fix the build. Thanks.
>
Hi Praveen,
I don't really know. I think TD or Michael should know, as they were
personally involved in the task (as far as I could figure out from
the JIRA and the changes). Ping people on the JIRA so they notice your
question(s).
Regards,
Jacek Laskowski
g? Please guide.
Regards,
Jacek Laskowski
implicits._ (52 terms, 31 are implicit)
2) import sqlContext.sql (1 terms)
scala> sc.version
res19: String = 2.0.0-SNAPSHOT
Regards,
Jacek Laskowski
if at least some
> traits from org.apache.spark.ml.param.shared.sharedParams were
> public?HasInputCol(s) and HasOutputCol for example. These are useful
> pretty much every time you create custom Transformer.
>
> --
> Regards,
> Maciej Szymkiewicz
>
>
> On 03
Hi,
Looks related to the recent commit...
Repository: spark
Updated Branches:
refs/heads/master 2262a9335 -> 1f0c5dceb
[SPARK-14350][SQL] EXPLAIN output should be in a single cell
Jacek
On 03.04.2016 at 7:00 PM, "Ted Yu" wrote:
> Hi,
> Based on master branch refreshed
]
Thanks Reynold! :)
Regards,
Jacek Laskowski
On Tue, Apr 26, 2016 at 6:38 AM
]^
[error] one error found
[error] Compile failed at Apr 26, 2016 6:28:01 AM [0.449s]
Regards,
Jacek Laskowski
On Tue, Apr 26, 2016 at 6
?
[1]
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/TaskSchedulerImpl.scala#L131-L142
Regards,
Jacek Laskowski
it's not a released feature I didn't mean to file an issue in
JIRA - please guide if needed).
Regards,
Jacek Laskowski
When you say "in the future", do you have any specific timeframe in
mind? You got me curious :)
Regards,
Jacek Laskowski
On Mon, Ap
ing it could have.
Regards,
Jacek Laskowski
Hi Yanbo,
https://issues.apache.org/jira/browse/SPARK-14730
Thanks!
Regards,
Jacek Laskowski
On Tue, Apr 19, 2016 at 8:55 AM, Yanbo Liang
? It appears that we call
the former to execute the latter (?). I'm confused. Please explain :)
I'd appreciate it.
Regards,
Jacek Laskowski
Kill 'em all -- one by one slowly yet gradually! :)
Regards,
Jacek Laskowski
On Wed, Jul 27, 2016 at 9:11 PM, Holden Karau <
Hi,
It seems that the current master is broken twice. I've just sent a PR
for the first one. Please review and merge.
https://github.com/apache/spark/pull/14315
Regards,
Jacek Laskowski
hts before filing a JIRA issue.
Regards,
Jacek Laskowski
om/apache/spark/blob/master/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala#L205
[4]
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/util/ShutdownHookManager.scala#L146-L147
Regards,
Jacek Laskowski
la#L355
[3]
https://github.com/apache/spark/blob/master/yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnAllocator.scala#L360
[4]
https://github.com/apache/spark/blob/master/yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnAllocator.scala#L200
Regards,
Jacek Laskowski
Hi Olivier,
I don't know either, but am curious what you've tried already.
Jacek
On 3 Aug 2016 10:50 a.m., "Olivier Girardot" <
o.girar...@lateral-thoughts.com> wrote:
> Hi everyone,
> I'm currently to use Spark 2.0.0 and making Dataframes work with kryo.
> registrationRequired=true
> Is it
s and PMCs should do, not
users:
"Do not include any links on the project website that might encourage
non-developers to download and use nightly builds, snapshots, release
candidates, or any other similar package."
Regards,
Jacek Laskowski
+1000
Thanks Ismael for bringing this up! I meant to send it earlier too,
since I've been struggling with an sbt-based Scala project for a Spark
package myself this week and haven't yet found out how to do local
publishing.
If such a guide existed for Maven, I could easily adapt it for sbt too.
And the reason is that not all Spark installations are for YARN as the
cluster manager.
Jacek
On 13 Jul 2016 9:23 a.m., "Sean Owen" wrote:
> It's activated by a profile called 'yarn', like several other modules.
>
> On Wed, Jul 13, 2016 at 5:15 AM, Niranda Perera
>
Hi,
That makes sense. Thanks Dongjoon for the very prompt response!
Regards,
Jacek Laskowski
On Tue, Jun 28, 2016 at 6:58 PM, Dongjoon Hyun
Hi,
While reviewing the release notes for 1.6.2 I stumbled upon
https://issues.apache.org/jira/browse/SPARK-13522. It's got Target
Version/s: 2.0.0 with Fix Version/s: 1.6.2, 2.0.0.
What's the meaning of Target Version/s in Spark?
Regards,
Jacek Laskowski
Hi Reynold,
Is this already reported and tracked somewhere? I'm quite sure that
people will be asking about the reasons Spark does this. Where are
such issues usually reported?
Regards,
Jacek Laskowski
On Mon, Jul 4, 2016 at 6:14 AM, wrote:
> Repository: spark
> Updated Branches:
> refs/heads/master 88134e736 -> 8cdb81fa8
>
>
> [SPARK-15204][SQL] improve nullability inference for Aggregator
>
> ## What changes were proposed in this pull request?
>
>
unc:
org.apache.spark.api.java.function.ForeachPartitionFunction[Record])Unit
(f: Iterator[Record] => Unit)Unit
cannot be applied to (Unit)
ds.foreachPartition(println)
^
scala> sc.version
res9: String = 2.0.0-SNAPSHOT
Regards,
Jacek Laskowski
ds is a Dataset, and the problem is that println (or any other
one-argument function) would not work here (and perhaps the same goes
for other methods with two variants, Java's and Scala's).
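A workaround that usually resolves the ambiguity is to pass an explicitly typed Scala function so the compiler picks the Scala overload (a sketch; Record stands for the element type from the thread):

```scala
// Ambiguous between the Java and Scala overloads:
// ds.foreachPartition(println)

// An explicit Iterator[Record] => Unit selects the Scala variant:
ds.foreachPartition((records: Iterator[Record]) => records.foreach(println))
```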
Regards,
Jacek Laskowski
Hi,
Use jps -lm and see the processes on the machine(s) to kill.
Regards,
Jacek Laskowski
On Wed, Jul 6, 2016 at 9:49 PM, Mr rty ff <y
Hi,
It appears you're running local mode (local[*] assumed) so killing
spark-shell *will* kill the one and only executor -- the driver :)
Regards,
Jacek Laskowski
Hi,
Read the doc http://spark.apache.org/docs/latest/spark-standalone.html
which seems to be the cluster manager the OP uses.
Regards,
Jacek Laskowski
Scala I want to use Scala API right). It appears that
any single-argument-function operators in Datasets are affected :(
My question was whether there is work to fix it (if that's possible
-- I don't know if it is).
Regards,
Jacek Laskowski
Thanks Cody, Reynold, and Ryan! Learnt a lot and feel "corrected".
Regards,
Jacek Laskowski
On Wed, Jul 6, 2016 at 2:46 AM, Shixiong
Hi,
Fixed now. git pull and start over.
https://github.com/apache/spark/commit/e1bd70f44b11141b000821e9754efeabc14f24a5
Regards,
Jacek Laskowski
Hi,
Thanks Sean! It makes sense.
I'm not fully convinced that's how it should be, so I apologize if I
ever ask about the version management in Spark again :)
Regards,
Jacek Laskowski
Hi,
Always release from master. What could be the gotchas?
Regards,
Jacek Laskowski
On Sat, Jul 2, 2016 at 11:36 PM, Sean Owen <
k
On 3 Jul 2016 2:59 a.m., "Reynold Xin" <r...@databricks.com> wrote:
> Because in that case you cannot merge anything meant for 2.1 until 2.0 is
> released.
>
> On Saturday, July 2, 2016, Jacek Laskowski <ja...@japila.pl> wrote:
>
>> Hi,
>>
>&
g? I must be missing something, but can't see it.
You're right, it has nothing to do with the pace of releases, but the
project needs frequent releases, say quarterly.
Regards,
Jacek Laskowski
On Sun, Jul 3, 2016 at 3:49 PM, Sean Owen <so...@cloudera.com> wrote:
> On Sun, Jul 3, 2016 at 2:42 PM, Jacek Laskowski <ja...@japila.pl> wrote:
>> 2. Add new features to master (versions - master: 2.0.0-SNAPSHOT
>> branch: 2.0.0-RC1)
>
> Either:
> a) you proh
Hi Tim,
AWESOME. Thanks a lot for releasing it. That makes me even more eager
to see it in Spark's codebase (and to see it replace the current
RDD-based API)!
Regards,
Jacek Laskowski