Hi,
Is there any way to work around the limitations of SparkSqlSerializer2 in the
SQL module? Namely,
1) it does not support complex types, and
2) it assumes key-value pairs.
Is there any other pluggable serializer that can be used here?
Thanks!
spark-ec2 is kind of a mini project within a project.
It’s composed of a set of EC2 AMIs
https://github.com/mesos/spark-ec2/tree/branch-1.4/ami-list under
someone’s account (maybe Patrick’s?) plus the following 2 code bases:
- Main command line tool:
I'll render an opinion although I'm only barely qualified by having
just had a small discussion on this --
It does seem like mesos/spark-ec2 is in the wrong place, although
really, that is at best an issue for Mesos. But it does highlight that
the Spark EC2 support doesn't entirely live with and
As the person maintaining the mesos/spark-ec2 repo, here are my 2 cents
- I don't think it makes sense to put the scripts in the Spark repo itself.
Cloning the scripts on the EC2 instances is an intentional design which
allows us to make minor config changes in EC2 launches without needing a
new
That's mine
Apache Maven 3.3.3 (7994120775791599e205a5524ec3e0dfe41d4a06;
2015-04-22T04:57:37-07:00)
Maven home: /usr/local/Cellar/maven/3.3.3/libexec
Java version: 1.8.0_45, vendor: Oracle Corporation
Java home:
/Library/Java/JavaVirtualMachines/jdk1.8.0_45.jdk/Contents/Home/jre
Default
Thanks, I just tried it with 3.3.3 and I was able to reproduce it as well.
2015-07-03 18:51 GMT-07:00 Tarek Auel tarek.a...@gmail.com:
That's mine
Apache Maven 3.3.3 (7994120775791599e205a5524ec3e0dfe41d4a06;
2015-04-22T04:57:37-07:00)
Maven home: /usr/local/Cellar/maven/3.3.3/libexec
Hm - what if you do a fresh git checkout (just to make sure you don't
have an older maven version downloaded). It also might be that this
really is an issue even with Maven 3.3.3. I just am not sure why it's
not reflected in our continuous integration or the build of the
release packages
I have 3.3.3
USS-Defiant:NW ksankar$ mvn -version
Apache Maven 3.3.3 (7994120775791599e205a5524ec3e0dfe41d4a06;
2015-04-22T04:57:37-07:00)
Maven home: /usr/local/apache-maven-3.3.3
Java version: 1.7.0_60, vendor: Oracle Corporation
Java home:
Let's continue the discussion on the other thread relating to the master build.
On Fri, Jul 3, 2015 at 4:13 PM, Patrick Wendell pwend...@gmail.com wrote:
Thanks - it appears this is just a legitimate issue with the build,
affecting all versions of Maven.
On Fri, Jul 3, 2015 at 4:02 PM,
Okay I did some forensics with Sean Owen. Some things about this bug:
1. The underlying cause is that we added some code to make the tests
of sub modules depend on the core tests. For unknown reasons this
causes Spark to hit MSHADE-148 for *some* combinations of build
profiles.
2. MSHADE-148 can
Patch that added test-jar dependencies:
https://github.com/apache/spark/commit/bfe74b34
Patch that originally disabled dependency reduced poms:
https://github.com/apache/spark/commit/984ad60147c933f2d5a2040c87ae687c14eb1724
Patch that reverted the disabling of dependency reduced poms:
Sorry to say the same happens on 3.3.3. I tried Shade 2.4 too. It is
indeed MSHADE-148 that Andrew was trying to fix in the first place.
I'm also trying to think of workarounds here.
On Fri, Jul 3, 2015 at 11:41 PM, Patrick Wendell pwend...@gmail.com wrote:
What if you use the built-in maven (i.e.
I used the following build command:
build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
this also gave the ‘Dependency-reduced POM’ loop
Robin
On 3 Jul 2015, at 23:41, Patrick Wendell pwend...@gmail.com wrote:
What if you use the built-in maven (i.e. build/mvn).
Patrick,
I assume an RC3 will be out for folks like me to test the distribution.
As usual, I will run the tests when you have a new distribution.
Cheers
k/
On Fri, Jul 3, 2015 at 4:38 PM, Patrick Wendell pwend...@gmail.com wrote:
Patch that added test-jar dependencies:
Yep, happens to me as well. Build loops.
Cheers
k/
On Fri, Jul 3, 2015 at 2:40 PM, Ted Yu yuzhih...@gmail.com wrote:
Patrick:
I used the following command:
~/apache-maven-3.3.1/bin/mvn -DskipTests -Phadoop-2.4 -Pyarn -Phive clean
package
The build doesn't seem to stop.
Here is tail of
Can you try using the built in maven build/mvn...? All of our builds
are passing on Jenkins so I wonder if it's a maven version issue:
https://amplab.cs.berkeley.edu/jenkins/view/Spark-QA-Compile/
- Patrick
On Fri, Jul 3, 2015 at 3:14 PM, Ted Yu yuzhih...@gmail.com wrote:
Please take a look at
Doesn't change anything for me.
On Fri, Jul 3, 2015 at 3:45 PM Patrick Wendell pwend...@gmail.com wrote:
Can you try using the built in maven build/mvn...? All of our builds
are passing on Jenkins so I wonder if it's a maven version issue:
Thanks - it appears this is just a legitimate issue with the build,
affecting all versions of Maven.
On Fri, Jul 3, 2015 at 4:02 PM, Krishna Sankar ksanka...@gmail.com wrote:
I have 3.3.3
USS-Defiant:NW ksankar$ mvn -version
Apache Maven 3.3.3 (7994120775791599e205a5524ec3e0dfe41d4a06;
@Tarek and Ted, what maven versions are you using?
2015-07-03 17:35 GMT-07:00 Krishna Sankar ksanka...@gmail.com:
Patrick,
I assume an RC3 will be out for folks like me to test the distribution.
As usual, I will run the tests when you have a new distribution.
Cheers
k/
On Fri, Jul 3,
Here is mine:
Apache Maven 3.3.1 (cab6659f9874fa96462afef40fcf6bc033d58c1c;
2015-03-13T13:10:27-07:00)
Maven home: /home/hbase/apache-maven-3.3.1
Java version: 1.8.0_45, vendor: Oracle Corporation
Java home: /home/hbase/jdk1.8.0_45/jre
Default locale: en_US, platform encoding: UTF-8
OS name:
This vote is cancelled in favor of RC2. Thanks very much to Sean Owen
for triaging an important bug associated with RC1.
I took a look at the branch-1.4 contents and I think it's safe to cut
RC2 from the head of that branch (i.e. no very high risk patches that I
could see). JIRA management around
Hi all,
I am trying to build the master, but it gets stuck and prints
[INFO] Dependency-reduced POM written at:
/Users/tarek/test/spark/bagel/dependency-reduced-pom.xml
build command: mvn -DskipTests clean package
Do others have the same issue?
Regards,
Tarek
This is what I got (the last line was repeated non-stop):
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing
/home/hbase/spark/bagel/target/spark-bagel_2.10-1.5.0-SNAPSHOT.jar with
/home/hbase/spark/bagel/target/spark-bagel_2.10-1.5.0-SNAPSHOT-shaded.jar
[INFO]
I found a solution; there might be a better one.
https://github.com/apache/spark/pull/7217
On Fri, Jul 3, 2015 at 2:28 PM Robin East robin.e...@xense.co.uk wrote:
Yes me too
On 3 Jul 2015, at 22:21, Ted Yu yuzhih...@gmail.com wrote:
This is what I got (the last line was repeated non-stop):
Please vote on releasing the following candidate as Apache Spark version 1.4.1!
This release fixes a handful of known issues in Spark 1.4.0, listed here:
http://s.apache.org/spark-1.4.1
The tag to be voted on is v1.4.1-rc2 (commit 07b95c7):
Yes me too
On 3 Jul 2015, at 22:21, Ted Yu yuzhih...@gmail.com wrote:
This is what I got (the last line was repeated non-stop):
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing
/home/hbase/spark/bagel/target/spark-bagel_2.10-1.5.0-SNAPSHOT.jar with
Patrick:
I used the following command:
~/apache-maven-3.3.1/bin/mvn -DskipTests -Phadoop-2.4 -Pyarn -Phive clean
package
The build doesn't seem to stop.
Here is tail of build output:
[INFO] Dependency-reduced POM written at:
/home/hbase/spark-1.4.1/bagel/dependency-reduced-pom.xml
[INFO]
Thanks. Forgot about that ;o(
On Thu, Jul 2, 2015 at 11:57 PM, Reynold Xin r...@databricks.com wrote:
except is a keyword in Python unfortunately.
On Thu, Jul 2, 2015 at 11:54 PM, Krishna Sankar ksanka...@gmail.com
wrote:
Guys,
Scala has except while Python has subtract. (I verified
Hi all,
I have a table named test like this:
| a | b |
| 1 | null |
| 2 | null |
After upgrading the cluster from Spark 1.3.1 to 1.4.0, I found that the sum
function in Spark 1.4 and 1.3 behaves differently.
The SQL is: select sum(b) from test
In Spark 1.4.0 the result is 0.0, in spark 1.3.1 the
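To make the expected behavior concrete, here is a minimal plain-Python model of SQL's SUM null semantics (illustrative only, not Spark code): SUM skips NULL inputs and yields NULL when every input is NULL, so 0.0 for an all-NULL column would be a regression.

```python
# Illustrative model of SQL SUM null semantics (not Spark code):
# SUM ignores NULL inputs and returns NULL when all inputs are NULL.
def sql_sum(values):
    non_null = [v for v in values if v is not None]
    return sum(non_null) if non_null else None

# Column b from the example table: two NULLs.
print(sql_sum([None, None]))   # standard SQL semantics give NULL, not 0.0
print(sql_sum([1, None, 2]))   # NULLs are skipped: 3
```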
Dear Spark Devs,
I have written an experimental 1d laplace parallel Spark solver
http://myunsoo-dataworks.blogspot.kr/2015/06/solve-differential-equations-with-spark.html
, out of curiosity regarding this
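For context, a 1-D Laplace solve of this kind is typically a Jacobi-style relaxation. A minimal single-machine sketch (names and structure are illustrative, not taken from the linked post) looks like:

```python
# Minimal single-machine Jacobi relaxation for the 1-D Laplace equation.
# Illustrative only; a Spark solver would parallelize this kind of
# sweep across partitions, exchanging boundary values between them.
def jacobi_1d(interior, left, right, iters):
    u = [left] + list(interior) + [right]
    for _ in range(iters):
        # Jacobi: every interior point becomes the average of its
        # neighbors' *old* values; the boundary values stay fixed.
        u = [u[0]] + [(u[i - 1] + u[i + 1]) / 2
                      for i in range(1, len(u) - 1)] + [u[-1]]
    return u[1:-1]

# The steady state is the linear interpolation between the boundaries.
print([round(x, 4) for x in jacobi_1d([0.0, 0.0, 0.0], 0.0, 4.0, 200)])
```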
Guys,
Scala has except while Python has subtract. (I verified that except
doesn't exist in Python.) Why the difference in syntax for the same
functionality?
Cheers
k/
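A quick way to see the constraint Reynold mentioned: except is a reserved word in Python, so a method named except could never be called as df.except(...) without a syntax error, which is presumably why the Python API needed a different name.

```python
# 'except' is a reserved word in Python, so an attribute access like
# df.except(...) fails to parse before any code runs; a method under
# that name would be uncallable. 'subtract' has no such problem.
import keyword

print(keyword.iskeyword("except"))    # True  -> unusable as a method name
print(keyword.iskeyword("subtract"))  # False -> fine as a method name
```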
Great, thanks for the fix.
Anything marked as fixed for 1.4.2 should now be marked as fixed for
1.4.1, right? I saw you were already updating many of those; OK to
finish that?
From skimming them, it looks like mostly bug fixes and docs, which are
pretty safe. A few things are kind of minor