Re: [ANNOUNCE] YCSB 0.2.0 Release

2015-07-07 Thread Sudhir Menon
This is good news. Let's make sure that the Geode team has had the opportunity to go over the Geode binding and has the ability to commit changes that allow proper configuration of Geode datastores in the benchmark. Suds On Tue, Jul 7, 2015 at 9:38 PM, Sean Busbey wrote: > On Tue, Jul 7, 2015 at

Re: [ANNOUNCE] YCSB 0.2.0 Release

2015-07-07 Thread Sean Busbey
On Tue, Jul 7, 2015 at 11:06 PM, Roman Shaposhnik wrote: > Hi Sean! > > On Tue, Jul 7, 2015 at 8:46 PM, Sean Busbey wrote: > > On behalf of the development community, I am pleased to announce the > > release of YCSB version 0.2.0. > > Awesome news! > > > * ~5 additional datastore bindings in exp

Re: [ANNOUNCE] YCSB 0.2.0 Release

2015-07-07 Thread Roman Shaposhnik
Hi Sean! On Tue, Jul 7, 2015 at 8:46 PM, Sean Busbey wrote: > On behalf of the development community, I am pleased to announce the > release of YCSB version 0.2.0. Awesome news! > * ~5 additional datastore bindings in experimental status (including > GemFire) Obviously I'm super interested in

[ANNOUNCE] YCSB 0.2.0 Release

2015-07-07 Thread Sean Busbey
On behalf of the development community, I am pleased to announce the release of YCSB version 0.2.0. Highlights:
* Apache Cassandra 2.0 CQL support
* Apache HBase 1.0 support
* Apache Accumulo 1.6 support
* MongoDB - support for all production versions released since 2011
* Tarantool 1.6 support
*

Re: [DISCUSS] Fix / update introductory training materials and labs for Apache Geode?

2015-07-07 Thread Roman Shaposhnik
I think this could be pretty useful to the project. Please file a JIRA with the details from this email. Thanks, Roman. On Wed, Jul 1, 2015 at 12:43 PM, Gregory Chase wrote: > Greetings Apache Geode committers, > I wanted to bring this up for discussion before filing a JIRA. > > As part of the i

Re: Where to place "Spark + GemFire" connector.

2015-07-07 Thread John Blum
For clarification... what I specifically mean when I say "level of modularity" can be reflected in the dependencies between modules. The POM distinguishes required vs. non-required dependencies based on the "scope" (i.e. 'compile'-time vs. 'optional', and so on). If you look at the Maven POM files
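The compile-vs-optional distinction John describes can be sketched as a minimal POM fragment (group/artifact names here are illustrative, not taken from the actual Geode build):

```xml
<dependencies>
  <!-- compile scope (the default): required, and pulled in transitively by consumers -->
  <dependency>
    <groupId>org.example</groupId>
    <artifactId>core-module</artifactId>
    <version>1.0.0</version>
    <scope>compile</scope>
  </dependency>
  <!-- optional: consumers must declare it themselves, so it never bloats their classpath -->
  <dependency>
    <groupId>org.example</groupId>
    <artifactId>spark-integration</artifactId>
    <version>1.0.0</version>
    <optional>true</optional>
  </dependency>
</dependencies>
```

Marking integration modules `<optional>true</optional>` is the standard Maven mechanism for keeping a multi-module project's required footprint small.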

Re: Where to place "Spark + GemFire" connector.

2015-07-07 Thread Bruce Schuchardt
+1 On 7/7/2015 3:58 PM, John Blum wrote: There are a few Spring projects that are exemplary (examples) in their modularity, contained within a single repo. The core Spring Framework and Spring Boot are 2 such projects that immediately come to mind. However, this sort of disciplined modulari

Re: Where to place "Spark + GemFire" connector.

2015-07-07 Thread John Blum
There are a few Spring projects that are exemplary (examples) in their modularity, contained within a single repo. The core Spring Framework and Spring Boot are 2 such projects that immediately come to mind. However, this sort of disciplined modularity requires a very important delineation of res

Re: Where to place "Spark + GemFire" connector.

2015-07-07 Thread William Markito
Folks, there are a lot of good and valuable points in this thread; however, we need to discuss some practical actions here, and maybe even see what other projects have already done during their incubation. For example, Apache Zeppelin (incubating) is also dependent on Spark, and what they do is selec

[GitHub] incubator-geode pull request: Port projectgeode.org

2015-07-07 Thread stumped2
Github user stumped2 closed the pull request at: https://github.com/apache/incubator-geode/pull/3

Re: Where to place "Spark + GemFire" connector.

2015-07-07 Thread John Blum
Just a quick word on maintaining different (release) branches for main dependencies (e.g. "driver" dependencies). Again, this is exactly what Spring Data GemFire does to support GemFire, and now Geode. In fact, it has to be this way for Apache Geode and Pivotal GemFire given the fork in the code

Re: Where to place "Spark + GemFire" connector.

2015-07-07 Thread Dan Smith
To support different versions of Spark, wouldn't it be better to have a single code base that has adapters for different versions of Spark? That seems better than maintaining several active branches with semi-duplicate code. I do think it would be better to keep the Geode Spark co
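The adapter approach Dan suggests could look roughly like the following sketch: one shared code base, with a thin adapter per supported Spark version selected at startup. All class and method names here are hypothetical, standing in for the real version-specific connector logic.

```java
// A hypothetical sketch of a per-Spark-version adapter layer.
// The real connector logic would live behind this interface.
interface SparkVersionAdapter {
    String sparkVersion();
    String describeSave(String rddName); // stand-in for version-specific save logic
}

class Spark13Adapter implements SparkVersionAdapter {
    public String sparkVersion() { return "1.3"; }
    public String describeSave(String rddName) {
        return "saving " + rddName + " with the Spark 1.3 API";
    }
}

class Spark14Adapter implements SparkVersionAdapter {
    public String sparkVersion() { return "1.4"; }
    public String describeSave(String rddName) {
        return "saving " + rddName + " with the Spark 1.4 API";
    }
}

public class AdapterDemo {
    // Choose the adapter once, based on the Spark version found on the classpath.
    static SparkVersionAdapter forVersion(String version) {
        return version.startsWith("1.4") ? new Spark14Adapter() : new Spark13Adapter();
    }

    public static void main(String[] args) {
        System.out.println(forVersion("1.4.0").describeSave("events"));
    }
}
```

The trade-off versus separate branches is that version differences are isolated in small adapter classes instead of being duplicated across parallel trees.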

Re: Where to place "Spark + GemFire" connector.

2015-07-07 Thread John Blum
+1 - Bingo, that is the question. Part of the answer lies in having a planned, predictable, and consistent cadence of releases. E.g. the *Spring Data* project [0] is an umbrella project managing 12 individual modules (e.g. SD... JPA, Mongo, Redis, Neo4j, G

Re: Where to place "Spark + GemFire" connector.

2015-07-07 Thread Jianxia Chen
I agree that the Spark Geode Connector should have its own repo. In fact, in order to use the Spark Geode Connector, users write a Spark application (instead of a Geode application) that calls the Spark Geode Connector APIs. There are a bunch of similar Spark connector projects which connect Spark with other da

Re: Where to place "Spark + GemFire" connector.

2015-07-07 Thread Roman Shaposhnik
On Tue, Jul 7, 2015 at 11:21 AM, Gregory Chase wrote: > More important than easy to develop is easy to pick up and use. > > Improving the new user experience is something that needs attention from > Geode. How we develop and provide Spark integration needs to take this > into account. > > Once we

Re: Where to place "Spark + GemFire" connector.

2015-07-07 Thread Gregory Chase
More important than easy to develop is easy to pick up and use. Improving the new user experience is something that needs attention from Geode. How we develop and provide Spark integration needs to take this into account. Once we are able to provide official releases, how can a user know and mak

Re: Where to place "Spark + GemFire" connector.

2015-07-07 Thread Roman Shaposhnik
On Tue, Jul 7, 2015 at 10:34 AM, Anilkumar Gingade wrote: > Agree... And that's the point... The connector code needs to catch up with > the Spark release train; if it's part of Geode then Geode releases need to > happen as often as Spark releases (along with other planned Geode releases)... I don't t

Re: Where to place "Spark + GemFire" connector.

2015-07-07 Thread Jason Huynh
I agree with the github approach, as the Spark connector was originally designed to be in its own repo with dependencies on the Spark and Geode jars. I think the backwards compatibility for the Spark versions would be as John described, based on the sbt dependencies file. If we go with the single
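Pinning the supported Spark version in the sbt build, as Jason mentions, might look like this minimal build.sbt sketch (version numbers are illustrative of the 2015-era connector, not quoted from the actual file):

```scala
// Spark is "provided": the cluster supplies it at runtime, so switching the
// supported Spark version is a one-line change in the build definition.
val sparkVersion = "1.3.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql"  % sparkVersion % "provided"
)
```

With this layout, backwards compatibility is tested by rebuilding against an older `sparkVersion` rather than by maintaining a separate branch.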

Re: Where to place "Spark + GemFire" connector.

2015-07-07 Thread Anilkumar Gingade
Agree... And that's the point... The connector code needs to catch up with the Spark release train; if it's part of Geode then Geode releases need to happen as often as Spark releases (along with other planned Geode releases)... Even if the connector code is compatible with the latest Spark, the previous c

Re: Where to place "Spark + GemFire" connector.

2015-07-07 Thread Eric Pederson
I would vote to support at least the previous Spark release. The big Hadoop distros usually are a version behind in their Spark support. For example, we use MapR which, in their latest release (4.1.0), only supports Spark 1.2.1 and 1.3.1

Re: Where to place "Spark + GemFire" connector.

2015-07-07 Thread Kirk Lund
The recommended ideal time for building and executing all unit tests for a project is 10 minutes.[0][1][2] "Builds should be fast. Anything beyond 10 minutes becomes a dysfunction in the process, because people won’t commit as frequently. Large builds can be broken into multiple jobs and executed

Re: Where to place "Spark + GemFire" connector.

2015-07-07 Thread Anthony Baker
Given the rate of change, it doesn’t seem like we should be trying to add (and maintain) support for every single Spark release. We’re early in the lifecycle of the Spark connector and too much emphasis on backwards-compatibility will be a drag on our ongoing development, particularly since the

Re: Where to place "Spark + GemFire" connector.

2015-07-07 Thread John Blum
> *for each Geode release, we probably need multiple Connector releases, and probably need to maintain the last 2 or 3 Connector releases; for example, we need to support both Spark 1.3 and 1.4 with the current Geode code.* Exactly my point for maintaining the GemFire/Geode Spark Connector as a separat

Re: Where to place "Spark + GemFire" connector.

2015-07-07 Thread Kirk Lund
I would think that github would be a better option for the Spark Geode Connector. That way it's not tightly coupled to the Geode release cycle. I don't see why it's desirable to bloat Geode with every single script, tool, or connector that might interact with Geode. Another reason to consider sep