I've created a PR to add the error message guidelines to the Spark
contributing guide. Would appreciate some eyes on it!
https://github.com/apache/spark-website/pull/332
On Wed, Apr 14, 2021 at 5:34 PM Yuming Wang wrote:
> +1 LGTM.
>
+1 LGTM.
On Thu, Apr 15, 2021 at 1:50 AM Karen wrote:
> That makes sense to me - given that an assert failure throws an
> AssertException, I would say that the same guidelines should apply for
> asserts.
>
> On Tue, Apr 13, 2021 at 7:41 PM Yuming Wang wrote:
>
Do we have plans to apply these guidelines to assert? For example:
https://github.com/apache/spark/blob/5b478416f8e3fe2f015af1b6c8faa7fe9f15c05d/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/orc/OrcUtils.scala#L136-L138
I would just go ahead and create a PR for that. Nothing written there looks
unreasonable.
But probably it should be best to wait a couple of days to make sure people
are happy with it.
On Wed, Apr 14, 2021 at 6:38 AM, Karen wrote:
If the proposed guidelines look good, it would be useful to share these
guidelines with the wider community. A good landing page for contributors
could be https://spark.apache.org/contributing.html. What do you think?
Thank you,
Karen Feng
On Wed, Apr 7, 2021 at 8:19 PM Hyukjin Kwon wrote:
LGTM (I took a look, and had some offline discussions w/ some corrections
before it came out)
On Thu, Apr 8, 2021 at 5:28 AM, Karen wrote:
Hi all,
As discussed in SPIP: Standardize Exception Messages in Spark (
https://docs.google.com/document/d/1XGj1o3xAFh8BA7RCn3DtwIPC6--hIFOaNUNSlpaOIZs/edit?usp=sharing),
improving error message quality in Apache Spark involves establishing an
error message guideline for developers. Error message
Actually, there is a really trivial fix for that (an existing file not
being deleted when packaging). Opened SPARK-30489 for it.
On Fri, Jan 10, 2020 at 3:52 PM Jeff Evans wrote:
Thanks for the tip. Fixed by simply removing python/lib/pyspark.zip (since
it's apparently generated), and rebuilding. I guess clean does not remove
it.
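For anyone who hits this later, the workaround above can be sketched in shell (a sketch only; it assumes you are at the Spark source root, and the function name is just for illustration):

```shell
# 'mvn clean' does not delete the generated python/lib/pyspark.zip,
# so remove it by hand before rebuilding.
remove_stale_pyspark_zip() {
  # $1 = path to the Spark source root
  rm -f "$1/python/lib/pyspark.zip"
}

# Typical use, followed by the rebuild command from this thread:
#   remove_stale_pyspark_zip .
#   ./build/mvn -DskipTests clean package
```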
On Fri, Jan 10, 2020 at 3:50 PM Sean Owen wrote:
Sounds like you might have some corrupted file locally. I don't see
any of the automated test builders failing. Nuke your local assembly
build and try again?
On Fri, Jan 10, 2020 at 3:49 PM Jeff Evans wrote:
Greetings,
I'm getting an error when building, on latest master (2bd873181 as of this
writing). Full build command I'm running is: ./build/mvn -DskipTests clean
package
[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-antrun-plugin:1.8:run (create-tmp-dir) on
project
Hi,
Fixed now. git pull and start over.
https://github.com/apache/spark/commit/e1bd70f44b11141b000821e9754efeabc14f24a5
Regards,
Jacek Laskowski
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
I get this error when trying to build from Git master branch:
[ERROR] Failed to execute goal
net.alchim31.maven:scala-maven-plugin:3.2.2:doc-jar (attach-scaladocs) on
project spark-catalyst_2.11: MavenReportException: Error while creating
archive: wrap: Process exited with an error: 1 (Exit
Hi,
I have only encountered 'code too large' errors when changing grammars. I
am using SBT/Idea, no Eclipse.
The size of an ANTLR Parser/Lexer is dependent on the rules inside the
source grammar and the rules it depends on. So we should take a look at the
IdentifiersParser.g/ExpressionParser.g;
Thanks for the pointer. It seems to be really a pathological case, since
the file that's in error is part of the split file (the smaller one,
IdentifiersParser). I'll see if I can work around it by splitting it some more.
iulian
On Thu, Jan 28, 2016 at 4:43 PM, Ted Yu wrote:
After this change:
[SPARK-12681] [SQL] split IdentifiersParser.g into two files
the biggest file under
sql/catalyst/src/main/antlr3/org/apache/spark/sql/catalyst/parser is
SparkSqlParser.g
Maybe split SparkSqlParser.g up as well ?
On Thu, Jan 28, 2016 at 5:21 AM, Iulian Dragoș wrote:
Hi,
Has anyone seen this error?
The code of method specialStateTransition(int, IntStream) is exceeding
the 65535 bytes limit (SparkSqlParser_IdentifiersParser.java:39907)
The error is in ANTLR generated files and it’s (according to Stack
Overflow) due to state explosion in parser (or lexer).
One more question:
Is there a way only to build the MLlib using command line?
--
View this message in context:
http://apache-spark-developers-list.1001551.n3.nabble.com/latest-Spark-build-error-tp15782p15794.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com
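On the MLlib-only question: plain Maven can do this with its -pl/-am reactor options. A hedged sketch (the :spark-mllib_2.10 artifact id is an assumption matching the Scala 2.10 builds of that era; adjust the suffix for your Scala version):

```shell
# -pl selects the module by artifact id; -am also builds the Spark
# modules it depends on. Guarded so it is a no-op outside a checkout.
if [ -x build/mvn ]; then
  ./build/mvn -pl :spark-mllib_2.10 -am -DskipTests package
fi
```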
Updating the Maven version to 3.3.9 solved the issue.
Thanks everyone!
This is because building Spark requires Maven 3.3.3 or later.
http://spark.apache.org/docs/latest/building-spark.html
Regards,
Kazuaki Ishizaki
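As a quick pre-flight check, you can compare the local Maven version against that 3.3.3 minimum with a small helper (a sketch; it relies on GNU `sort -V` for version ordering):

```shell
# version_ge A B: succeeds when version A is at least version B.
# 'sort -V' puts the lower version first, so if B comes out on top,
# A is greater than or equal to B.
version_ge() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

if version_ge "3.3.9" "3.3.3"; then
  echo "Maven 3.3.9 meets the 3.3.3 minimum"
fi
```

Against a live installation you would feed it the real version, e.g. `version_ge "$(mvn -v | awk 'NR==1 {print $3}')" 3.3.3`.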
From: salexln <sale...@gmail.com>
To: dev@spark.apache.org
Date: 2015/12/25 15:52
Subject: latest Spark build error
Hi all,
I'm getting a build error when trying to build a clean version of the latest
Spark. I did the following:
1) git clone https://github.com/apache/spark.git
2) build/mvn -DskipTests clean package
But I get the following error:
Spark Project Parent POM .. FAILURE [2.338s
://repository.apache.org/snapshots was cached in the local
repository,
resolution will not be reattempted until the update interval of
apache.snapshots has elapsed or updates are forced
at org.sonatype.aether.impl.internal.DefaultArtifactResolver.resolve(DefaultArtifactResolver.java:427)
... 26 more
) and then re-cloning from github and switching
to the 1.2 or 1.3 branch.
Does anything persist outside of the spark directory?
Are you able to build either 1.2 or 1.3 w/ Scala-2.11?
:jar:1.3.2-SNAPSHOT: Could not
find artifact org.apache.spark:spark-network-common_2.10:jar:1.3.2-SNAPSHOT
in apache.snapshots (http://repository.apache.org/snapshots)
It means a test failed but you have not shown the test failure. This would
have been logged earlier. You would need to say how you ran tests too. The
tests for 1.2.0 pass for me on several common permutations.
On Dec 29, 2014 3:22 AM, Naveen Madhire vmadh...@umail.iu.edu wrote:
Hi,
I am follow
I am getting a "The command is too long" error.
Is there anything which needs to be done?
However, for the time being I followed the SBT way of building Spark in
IntelliJ.
On Mon, Dec 29, 2014 at 3:52 AM, Sean Owen so...@cloudera.com wrote:
Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/25#issuecomment-37161832
What is news there? You say your environment requires proxy settings and
you successfully identified them. Here you fail to set them.
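For readers finding this via the archive: one standard way to hand proxy settings to a Maven build is through the JVM proxy system properties in MAVEN_OPTS (the host and port below are placeholders, not values from this PR):

```shell
# Standard JVM proxy properties, passed to Maven via MAVEN_OPTS.
# proxy.example.com:3128 is a placeholder; substitute your own proxy.
export MAVEN_OPTS="-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=3128 \
-Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=3128"

# A subsequent build picks these up, e.g.:
#   ./build/mvn -DskipTests clean package
```

Maven's settings.xml `<proxies>` section is the other standard place for this.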