[
https://issues.apache.org/jira/browse/SPARK-7286?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15014647#comment-15014647
]
Jakob Odersky edited comment on SPARK-7286 at 11/19/15 10:52 PM:
-
I just
[
https://issues.apache.org/jira/browse/SPARK-11288?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15011855#comment-15011855
]
Jakob Odersky commented on SPARK-11288:
---
Tell me if I'm missing the point, but you can also pass
Jakob Odersky created SPARK-11832:
-
Summary: Spark shell does not work from sbt with scala 2.11
Key: SPARK-11832
URL: https://issues.apache.org/jira/browse/SPARK-11832
Project: Spark
Issue
[
https://issues.apache.org/jira/browse/SPARK-7286?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15012643#comment-15012643
]
Jakob Odersky commented on SPARK-7286:
--
Going through the code, I saw that catalyst also defines
[
https://issues.apache.org/jira/browse/SPARK-7286?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15012532#comment-15012532
]
Jakob Odersky commented on SPARK-7286:
--
The problem is that !== is recognized as an assignment
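The grammar quirk behind this can be shown without Spark. A minimal plain-Scala sketch (made-up `Expr` class, not Spark's `Column`): per the Scala spec, an operator ending in `=` that does not start with `=` and is not one of `<=`, `>=`, `!=` is an *assignment operator* and gets the lowest possible precedence.

```scala
// Illustration of the precedence surprise discussed in SPARK-7286.
// (Plain Scala; Expr is a made-up stand-in for a DataFrame Column.)
case class Expr(s: String) {
  def ===(o: Expr): Expr = Expr(s"($s === ${o.s})")
  def !==(o: Expr): Expr = Expr(s"($s !== ${o.s})")
}

val a = Expr("a"); val b = Expr("b"); val c = Expr("c")

// `!=` and `==` share one precedence level, so `a != b == c` means
// `(a != b) == c`. But `!==` is an assignment operator and binds lowest
// of all, so the line below parses as `a !== (b === c)`.
val parsed = a !== b === c
```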
[
https://issues.apache.org/jira/browse/SPARK-7286?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15012532#comment-15012532
]
Jakob Odersky edited comment on SPARK-7286 at 11/19/15 1:16 AM
[
https://issues.apache.org/jira/browse/SPARK-11832?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15012372#comment-15012372
]
Jakob Odersky commented on SPARK-11832:
---
I'm on to something; it seems as though sbt actually

[
https://issues.apache.org/jira/browse/SPARK-9875?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15009891#comment-15009891
]
Jakob Odersky commented on SPARK-9875:
--
I'm not sure I understand the issue. Are you trying to force
[
https://issues.apache.org/jira/browse/SPARK-11765?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15007740#comment-15007740
]
Jakob Odersky commented on SPARK-11765:
---
I think adding a "blacklist" of ports
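The idea might look like the following sketch (hedged: the helper name and the example ports are made up for illustration, not Spark's actual port-binding code or any real default list):

```scala
// Hypothetical sketch of a port "blacklist": when probing for a free
// service port, skip any candidate known to conflict with other services.
val blacklist: Set[Int] = Set(4045, 8081) // example entries only

def candidatePorts(start: Int, maxRetries: Int): List[Int] =
  (start to start + maxRetries).filterNot(blacklist).toList
```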
[
https://issues.apache.org/jira/browse/SPARK-11688?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15003600#comment-15003600
]
Jakob Odersky commented on SPARK-11688:
---
Registering a UDF requires a function (instance
[
https://issues.apache.org/jira/browse/SPARK-11688?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15003603#comment-15003603
]
Jakob Odersky commented on SPARK-11688:
---
As a workaround, you could register your function twice
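The workaround alluded to might look like the following sketch (hedged: it assumes a Spark 1.5 `SQLContext` named `sqlContext`, and that the underlying problem is wanting one UDF callable with different argument counts; the UDF names are illustrative):

```scala
// Scala cannot register one function under a single name with two different
// signatures, so the same logic is registered twice under distinct names.
// (Assumes an existing SQLContext `sqlContext`; names are made up.)
def strLen(s: String): Int = s.length

sqlContext.udf.register("strLen", (s: String) => strLen(s))
sqlContext.udf.register("strLenPad", (s: String, pad: Int) => strLen(s) + pad)
```

After registration, both names are callable from SQL, e.g. `SELECT strLen(name) FROM people`.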
Hey Jeff,
Do you mean reading from multiple text files? In that case, as a
workaround, you can use the RDD#union() (or ++) method to concatenate
multiple rdds. For example:
val lines1 = sc.textFile("file1")
val lines2 = sc.textFile("file2")
val rdd = lines1 union lines2
regards,
--Jakob
On 11
Hi Sukant,
Regarding the first point: when building spark during my daily work, I
always use Scala 2.11 and have only run into build problems once. Assuming
a working build, I have never had any issues with the resulting artifacts.
More generally however, I would advise you to go with Scala 2.11
if they are still actively being developed?
thanks,
--Jakob
On 10 November 2015 at 14:58, Jakob Odersky <joder...@gmail.com> wrote:
> (accidental keyboard-shortcut sent the message)
> ... spark-shell from the spark 1.5.2 binary distribution.
> Also, running "spPublishLocal" has the same effect.
Hi Simone,
I'm afraid I don't have an answer to your question. However, I noticed the
DAG figures in the attachment. How did you generate these? I am myself
working on a project in which I am trying to generate visual
representations of the spark scheduler DAG. If such a tool already exists,
I
(accidental keyboard-shortcut sent the message)
... spark-shell from the spark 1.5.2 binary distribution.
Also, running "spPublishLocal" has the same effect.
thanks,
--Jakob
On 10 November 2015 at 14:55, Jakob Odersky <joder...@gmail.com> wrote:
> Hi,
> I ran into an err
Hi,
I ran into an error trying to run spark-shell with an external package that
I built and published locally
using the spark-package sbt plugin (
https://github.com/databricks/sbt-spark-package).
To my understanding, spark packages can be published simply as maven
artifacts, yet after running
it will change.
>> >
>> > Any improvements for the sbt build are of course welcome (it is still
>> used
>> > by many developers), but i would not do anything that increases the
>> burden
>> > of maintaining two build systems.
>> >
>> >
Hi everyone,
in the process of learning Spark, I wanted to get an overview of the
interaction between all of its sub-projects. I therefore decided to have a
look at the build setup and its dependency management.
Since I am a lot more comfortable using sbt than maven, I decided to try to
port the
[repost to mailing list, ok I gotta really start hitting that
reply-to-all-button]
Hi,
Spark uses Log4j which unfortunately does not support fine-grained
configuration over the command line. Therefore some configuration file
editing will have to be done (unless you want to configure Loggers
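Concretely, that file editing usually means copying the bundled template and adjusting levels. A sketch (assuming a standard Spark 1.x distribution, which ships `conf/log4j.properties.template`; the logger names and levels below are examples):

```properties
# conf/log4j.properties -- start from conf/log4j.properties.template.
# Quiet everything by default, then raise verbosity for selected loggers.
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
# Fine-grained, per-logger configuration:
log4j.logger.org.apache.spark.scheduler=DEBUG
```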
[repost to mailing list]
I don't know much about packages, but have you heard about the
sbt-spark-package plugin?
Looking at the code, specifically
https://github.com/databricks/sbt-spark-package/blob/master/src/main/scala/sbtsparkpackage/SparkPackagePlugin.scala,
might give you insight on the
:05 PM, Adrian Tanase <atan...@adobe.com> wrote:
>
>> Do you mean hadoop-2.4 or 2.6? not sure if this is the issue but I'm also
>> compiling the 1.5.1 version with scala 2.11 and hadoop 2.6 and it works.
>>
>> -adrian
>>
>> Sent from my iPhone
>>
>
[
https://issues.apache.org/jira/browse/SPARK-0?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14957721#comment-14957721
]
Jakob Odersky commented on SPARK-0:
---
exactly what I got, I'll take a look at it
> Scala 2
Jakob Odersky created SPARK-11122:
-
Summary: Fatal warnings in sbt are not displayed as such
Key: SPARK-11122
URL: https://issues.apache.org/jira/browse/SPARK-11122
Project: Spark
Issue Type
Jakob Odersky created SPARK-11094:
-
Summary: Test runner script fails to parse Java version.
Key: SPARK-11094
URL: https://issues.apache.org/jira/browse/SPARK-11094
Project: Spark
Issue Type
Jakob Odersky created SPARK-11092:
-
Summary: Add source URLs to API documentation.
Key: SPARK-11092
URL: https://issues.apache.org/jira/browse/SPARK-11092
Project: Spark
Issue Type
[
https://issues.apache.org/jira/browse/SPARK-11092?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Jakob Odersky updated SPARK-11092:
--
Description:
It would be nice to have source URLs in the Spark scaladoc, similar
[
https://issues.apache.org/jira/browse/SPARK-11092?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14955996#comment-14955996
]
Jakob Odersky commented on SPARK-11092:
---
I can't set the assignee field, though I'd like to resolve
[
https://issues.apache.org/jira/browse/SPARK-11092?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Jakob Odersky updated SPARK-11092:
--
Description:
It would be nice to have source URLs in the Spark scaladoc, similar
the path of the source file defining the event API is
`core/src/main/scala/org/apache/spark/scheduler/SparkListener.scala`
On 13 October 2015 at 16:29, Jakob Odersky <joder...@gmail.com> wrote:
> Hi,
> I came across the spark listener API while checking out possible UI
> extensi
Hi,
I came across the spark listener API while checking out possible UI
extensions recently. I noticed that all events inherit from a sealed trait
`SparkListenerEvent` and that a SparkListener has a corresponding
`onEventXXX(event)` method for every possible event.
Considering that events inherit
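The structure described above can be sketched in a few lines (a simplified model with made-up event fields; the real types live in org.apache.spark.scheduler):

```scala
// Simplified model of the listener API described above.
sealed trait SparkListenerEvent
case class SparkListenerJobStart(jobId: Int) extends SparkListenerEvent
case class SparkListenerJobEnd(jobId: Int) extends SparkListenerEvent

// One onXXX method per event type, each with an empty default implementation.
trait SparkListener {
  def onJobStart(event: SparkListenerJobStart): Unit = ()
  def onJobEnd(event: SparkListenerJobEnd): Unit = ()
}

// Because the trait is sealed, a match over events can be checked for
// exhaustiveness by the compiler.
def dispatch(listener: SparkListener, event: SparkListenerEvent): Unit =
  event match {
    case e: SparkListenerJobStart => listener.onJobStart(e)
    case e: SparkListenerJobEnd   => listener.onJobEnd(e)
  }
```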
I'm having trouble compiling Spark with SBT for Scala 2.11. The command I
use is:
dev/change-version-to-2.11.sh
build/sbt -Pyarn -Phadoop-2.11 -Dscala-2.11
followed by
compile
in the sbt shell.
The error I get specifically is:
Hi everyone,
I am just getting started working on spark and was thinking of a first way
to contribute whilst still trying to wrap my head around the codebase.
Exploring the web UI, I noticed it is a classic request-response website,
requiring manual refresh to get the latest data.
I think it
[
https://issues.apache.org/jira/browse/SPARK-10876?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14951419#comment-14951419
]
Jakob Odersky commented on SPARK-10876:
---
I'm not sure what you mean. The UI already has a "Dur
[
https://issues.apache.org/jira/browse/SPARK-10876?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Jakob Odersky updated SPARK-10876:
--
Comment: was deleted
(was: I'm not sure what you mean. The UI already has a "Duration"
[
https://issues.apache.org/jira/browse/SPARK-10876?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14951434#comment-14951434
]
Jakob Odersky commented on SPARK-10876:
---
Do you mean to display the total run time of uncompleted
[
https://issues.apache.org/jira/browse/SPARK-10306?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14951362#comment-14951362
]
Jakob Odersky commented on SPARK-10306:
---
Same issue here
> sbt hive/update is