I tried the same command on my MacBook and didn't experience the same error.
Which OS are you using?
Cheers
On Mon, Dec 1, 2014 at 6:42 PM, Stephen Boesch java...@gmail.com wrote:
It seems some additional settings are now required to build Spark.
This should be a snap for most of you
The zip for 0.3.5.3 was downloaded and extracted. Then I ran
sbt dist/create. zinc is launched from
dist/target/zinc-0.3.5.3/bin/zinc
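A minimal sketch of sanity-checking that step, assuming the zinc 0.3.5.3 zip was unpacked in the current directory as described above (the path comes from the message; nothing else here is from the thread):

```shell
# Hypothetical check: after `sbt dist/create`, the zinc launcher should
# exist at this path before you try to start it.
ZINC=dist/target/zinc-0.3.5.3/bin/zinc
if [ -x "$ZINC" ]; then
  echo "zinc launcher found: $ZINC"
else
  echo "zinc launcher missing; run 'sbt dist/create' first"
fi
```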
2014-12-01 20:12 GMT-08:00 Ted Yu yuzhih...@gmail.com:
I use zinc 0.2.0 and started zinc with the same command shown below.
I don't observe such an error.
bq. spark-0.12 also has some nice feature added
Minor correction: you meant Spark 1.2.0 I guess
Cheers
On Fri, Nov 21, 2014 at 3:45 PM, Zhan Zhang zzh...@hortonworks.com wrote:
Thanks Dean, for the information.
Hive-on-Spark is nice. Spark SQL has the advantage of taking full
advantage
Sorry for the late reply.
I tested my patch on Mac with the following JDK:
java version 1.7.0_60
Java(TM) SE Runtime Environment (build 1.7.0_60-b19)
Java HotSpot(TM) 64-Bit Server VM (build 24.60-b09, mixed mode)
Let me see if the problem can be solved upstream in HBase hbase-annotations
it from various hbase modules:
https://github.com/apache/spark/pull/3286
Cheers
On Sat, Nov 15, 2014 at 6:56 AM, Ted Yu yuzhih...@gmail.com wrote:
Have you seen this thread ?
http://search-hadoop.com/m/LgpTk2Pnw6O/andrew+apache+mirrorsubj=Re+All+mirrored+download+links+from+the+Apache+Hadoop+site+are+broken
Cheers
On Wed, Nov 5, 2014 at 7:36 PM, Nicholas Chammas nicholas.cham...@gmail.com
wrote:
As part of my work for SPARK-3821
pointed to also appears to be broken now:
http://apache.mesi.com.ar/hadoop/common/
Nick
On Wed, Nov 5, 2014 at 10:43 PM, Ted Yu yuzhih...@gmail.com wrote:
Koert:
Have you tried adding the following on your command line?
-Dscalastyle.failOnViolation=false
Cheers
On Thu, Oct 23, 2014 at 11:07 AM, Patrick Wendell pwend...@gmail.com
wrote:
Hey Koert,
I think disabling the style checks in maven package could be a good
idea for the reason you
goal
org.scalastyle:scalastyle-maven-plugin:0.4.0:check (default) on project
spark-core_2.10: Failed during scalastyle execution: You have 3 Scalastyle
violation(s). - [Help 1]
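A sketch of how the suggested property would be passed on the Maven command line. Only the failOnViolation flag comes from the thread; the rest of the invocation is illustrative:

```shell
# Hypothetical invocation: report Scalastyle violations but do not fail
# the build. -DskipTests and the goal are illustrative placeholders.
MVN_CMD="mvn -DskipTests -Dscalastyle.failOnViolation=false package"
echo "$MVN_CMD"
```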
On Thu, Oct 23, 2014 at 2:14 PM, Ted Yu yuzhih...@gmail.com wrote:
Created SPARK-4066 and attached patch there.
On Thu, Oct 23, 2014 at 1:07 PM, Koert Kuipers ko...@tresata.com wrote:
great thanks i will do that
On Thu, Oct 23, 2014 at 3:55 PM, Ted Yu yuzhih...@gmail.com wrote:
Koert:
If you have time, you can try this diff - with which you would be able
I performed build on latest master branch but didn't get compilation error.
FYI
On Mon, Oct 20, 2014 at 3:51 PM, Nan Zhu zhunanmcg...@gmail.com wrote:
Hi,
I just submitted a patch https://github.com/apache/spark/pull/2864/files
with one line change
but Jenkins told me it failed to
Please take a look at WhitespaceEndOfLineChecker under:
http://www.scalastyle.org/rules-0.1.0.html
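As a quick local approximation of what WhitespaceEndOfLineChecker flags, a grep for trailing whitespace catches the same violations before Jenkins does. The file name and contents below are made up for illustration:

```shell
# Create a sample file with one trailing-whitespace violation on line 1.
printf 'val x = 1 \nval y = 2\n' > /tmp/Sample.scala
# Flag lines ending in a space or tab, as the checker does.
grep -n '[[:blank:]]$' /tmp/Sample.scala
```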
Cheers
On Wed, Oct 1, 2014 at 2:01 PM, Nicholas Chammas nicholas.cham...@gmail.com
wrote:
As discussed here https://github.com/apache/spark/pull/2619, it would be
good to extend our Scala style
Hi,
Running test suite in trunk, I got:
BasicOperationsSuite:
- map
- flatMap
- filter
- glom
- mapPartitions
- repartition (more partitions)
- repartition (fewer partitions)
- groupByKey
-
From output of dependency:tree:
[INFO] --- maven-dependency-plugin:2.8:tree (default-cli) @
spark-streaming_2.10 ---
[INFO] org.apache.spark:spark-streaming_2.10:jar:1.1.0-SNAPSHOT
[INFO] +- org.apache.spark:spark-core_2.10:jar:1.1.0-SNAPSHOT:compile
[INFO] | +-
Hi,
Using the following command on (refreshed) master branch:
mvn clean package -DskipTests
I got:
constituent[36]: file:/homes/hortonzy/apache-maven-3.1.1/conf/logging/
---
java.lang.reflect.InvocationTargetException
at
How about using the parallel execution feature of maven-surefire-plugin
(assuming all the tests were made parallel-friendly)?
http://maven.apache.org/surefire/maven-surefire-plugin/examples/fork-options-and-parallel-execution.html
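For reference, a hedged sketch of what that surefire configuration might look like in a pom.xml; the values are illustrative, so check the linked page for the exact options:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <!-- run test classes in parallel, assuming they are thread-safe -->
    <parallel>classes</parallel>
    <threadCount>4</threadCount>
  </configuration>
</plugin>
```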
Cheers
On Fri, Aug 8, 2014 at 9:14 AM, Sean Owen
I refreshed my workspace.
I got the following error with this command:
mvn -Pyarn -Phive -Phadoop-2.4 -DskipTests install
[ERROR] bad symbolic reference. A signature in package.class refers to term
scalalogging
in package com.typesafe which is not available.
It may be completely missing from the
Forgot to do that step.
Now compilation passes.
On Wed, Aug 6, 2014 at 1:36 PM, Zongheng Yang zonghen...@gmail.com wrote:
Hi Ted,
By refreshing do you mean you have done 'mvn clean'?
On Wed, Aug 6, 2014 at 1:17 PM, Ted Yu yuzhih...@gmail.com wrote:
I refreshed my workspace.
I got
The following command succeeded (on Linux) on Spark master checked out this
morning:
mvn -Pyarn -Phive -Phadoop-2.4 -DskipTests install
FYI
On Thu, Jul 31, 2014 at 1:36 PM, yao yaosheng...@gmail.com wrote:
Hi TD,
I've asked my colleagues to do the same thing, but the compile still fails.
See Mailing list section of:
https://spark.apache.org/community.html
On Wed, Jul 30, 2014 at 6:53 PM, Grace syso...@gmail.com wrote:
I found 0.13.1 artifacts in maven:
http://search.maven.org/#artifactdetails%7Corg.apache.hive%7Chive-metastore%7C0.13.1%7Cjar
However, Spark uses groupId of org.spark-project.hive, not org.apache.hive
Can someone tell me how it is supposed to work?
Cheers
On Mon, Jul 28, 2014 at 7:44 AM,
I am no expert on the Hive artifacts, just remembering what the issue
was initially in case it helps you get to a similar solution.
On Mon, Jul 28, 2014 at 4:47 PM, Ted Yu yuzhih...@gmail.com wrote:
hive-exec (as of 0.13.1) is published here:
http://search.maven.org/#artifactdetails
can fix that issue. If not, we'll
have to continue forking our own version of Hive to change the way it
publishes artifacts.
- Patrick
On Mon, Jul 28, 2014 at 9:34 AM, Ted Yu yuzhih...@gmail.com wrote:
Talked with Owen offline. He confirmed that as of 0.13, hive-exec is still an
uber jar:
AFAIK, according to a recent talk, the Hulu team in China has built Spark SQL
against Hive 0.13 (or 0.13.1?) successfully. Basically they also
re-packaged Hive 0.13 as the Spark team did. The slides of the talk
haven't been released yet, though.
On Tue, Jul 29, 2014 at 1:01 AM, Ted Yu
PM, Ted Yu yuzhih...@gmail.com wrote:
After manually copying hive 0.13.1 jars to local maven repo, I got the
following errors when building spark-hive_2.10 module :
[ERROR]
/homes/xx/spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveContext.scala:182:
type mismatch;
found
have is, What is the goal of upgrading to hive 0.13.0? Is
it purely because you are having problems connecting to newer metastores?
Are there some features you are hoping for? This will help me prioritize
this effort.
Michael
On Mon, Jul 28, 2014 at 4:05 PM, Ted Yu yuzhih...@gmail.com wrote
HADOOP-10456 is fixed in hadoop 2.4.1
Does this mean that synchronization
on HadoopRDD.CONFIGURATION_INSTANTIATION_LOCK can be bypassed for hadoop
2.4.1?
Cheers
On Fri, Jul 25, 2014 at 6:00 PM, Patrick Wendell pwend...@gmail.com wrote:
The most important issue in this release is actually an
See http://spark.apache.org/news/spark-mailing-lists-moving-to-apache.html
Cheers
On Jul 8, 2014, at 4:17 AM, Leon Zhang leonca...@gmail.com wrote:
This is the correct page: http://spark.apache.org/community.html
Cheers
On Jul 8, 2014, at 4:43 AM, Ted Yu yuzhih...@gmail.com wrote:
/RELEASE_X86_64 x86_64
On Mon, Jun 16, 2014 at 10:04 PM, Andrew Ash and...@andrewash.com wrote:
Maybe it's a Mac OS X thing?
On Mon, Jun 16, 2014 at 9:57 PM, Ted Yu yuzhih...@gmail.com wrote:
I used the same command on Linux and it passed:
Linux k.net 2.6.32-220.23.1.el6.YAHOO.20120713.x86_64 #1 SMP Fri Jul 13
11:40:51 CDT 2012 x86_64 x86_64 x86_64 GNU/Linux
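The kernel string above is `uname -a` output; a quick way to check which side of the suspected Mac-vs-Linux difference you are on is the kernel name alone:

```shell
# Prints the kernel name: "Linux" on Linux, "Darwin" on Mac OS X.
uname -s
```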
Cheers
On Mon, Jun 16, 2014 at 9:29 PM, Andrew Ash and...@andrewash.com wrote:
I can't run sbt/sbt gen-idea on a clean