For the clueless (like me):

https://bahir.apache.org/#home

Apache Bahir provides extensions to distributed analytic platforms such as
Apache Spark.

Initially Apache Bahir will contain streaming connectors that were a part
of Apache Spark prior to version 2.0:

   - streaming-akka
   - streaming-mqtt
   - streaming-twitter
   - streaming-zeromq

The Apache Bahir community welcomes the proposal of new extensions.
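To make that concrete, here is a minimal sketch of what using one of these
connectors (the MQTT one) looks like from Spark Streaming. The broker URL and
topic are placeholders, and the Bahir artifact coordinates are my assumption;
treat it as a sketch rather than official docs:

    // Assumed dependency once this lives in Bahir (coordinates are my guess):
    //   org.apache.bahir:spark-streaming-mqtt_2.11:<version>
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.mqtt.MQTTUtils

    object MqttSketch {
      def main(args: Array[String]): Unit = {
        val ssc = new StreamingContext(new SparkConf().setAppName("MqttSketch"), Seconds(2))

        // Each MQTT message published to the topic arrives as one String element
        val lines = MQTTUtils.createStream(ssc, "tcp://localhost:1883", "sensors/temperature")
        lines.count().print()

        ssc.start()
        ssc.awaitTermination()
      }
    }

The other connectors follow the same pattern: a small utility object that
creates an input DStream on a StreamingContext.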

Nick

On Wed, Jun 22, 2016 at 10:40 AM Sean Owen <so...@cloudera.com> wrote:

> I profess ignorance again, though I really should know by now, but
> what's opposing that? I personally thought this was going to be in 2.0
> and kind of didn't notice it wasn't ...
>
> On Wed, Jun 22, 2016 at 3:29 PM, Cody Koeninger <c...@koeninger.org>
> wrote:
> > I don't have a vote, but I'd just like to reiterate that I think Kafka
> > 0.10 support should be added to a 2.0 release candidate; if not now,
> > then well before release.
> >
> > - it's a completely standalone jar, so shouldn't break anyone who's
> > using the existing 0.8 support
> > - it's like the 5th highest voted open ticket, and has been open for months
> > - Luciano has said multiple times that he wants to merge that PR into
> > Bahir if it isn't in an RC for Spark 2.0, which I think would confuse
> > users and cause maintenance problems
> >
> > On Wed, Jun 22, 2016 at 12:38 AM, Sean Owen <so...@cloudera.com> wrote:
> >> While I'd officially -1 this as long as there are still many blockers, this
> >> should certainly be tested as usual, because they're mostly doc and
> >> "audit" type issues.
> >>
> >>> On Wed, Jun 22, 2016 at 2:26 AM, Reynold Xin <r...@databricks.com> wrote:
> >>> Please vote on releasing the following candidate as Apache Spark version
> >>> 2.0.0. The vote is open until Friday, June 24, 2016 at 19:00 PDT and passes
> >>> if a majority of at least 3 +1 PMC votes are cast.
> >>>
> >>> [ ] +1 Release this package as Apache Spark 2.0.0
> >>> [ ] -1 Do not release this package because ...
> >>>
> >>>
> >>> The tag to be voted on is v2.0.0-rc1
> >>> (0c66ca41afade6db73c9aeddd5aed6e5dcea90df).
> >>>
> >>> This release candidate resolves ~2400 issues:
> >>> https://s.apache.org/spark-2.0.0-rc1-jira
> >>>
> >>> The release files, including signatures, digests, etc. can be found at:
> >>> http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc1-bin/
> >>>
> >>> Release artifacts are signed with the following key:
> >>> https://people.apache.org/keys/committer/pwendell.asc
> >>>
> >>> The staging repository for this release can be found at:
> >>> https://repository.apache.org/content/repositories/orgapachespark-1187/
> >>>
> >>> The documentation corresponding to this release can be found at:
> >>> http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc1-docs/
> >>>
> >>>
> >>> =======================================
> >>> == How can I help test this release? ==
> >>> =======================================
> >>> If you are a Spark user, you can help us test this release by taking an
> >>> existing Spark workload, running it on this release candidate, and then
> >>> reporting any regressions from 1.x.
> >>>
> >>> ================================================
> >>> == What justifies a -1 vote for this release? ==
> >>> ================================================
> >>> Critical bugs impacting major functionalities.
> >>>
> >>> Bugs already present in 1.x, missing features, or bugs related to new
> >>> features will not necessarily block this release. Note that historically
> >>> Spark documentation has been published on the website separately from the
> >>> main release so we do not need to block the release due to documentation
> >>> errors either.

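And since the Kafka 0.10 connector Cody mentions keeps coming up: for anyone
equally in the dark, this is roughly the shape of the direct-stream API in that
standalone spark-streaming-kafka-0-10 jar. The broker address, topic, and group
id below are placeholders, and the details are my sketch of the proposed API,
not something pulled from the RC:

    import org.apache.kafka.common.serialization.StringDeserializer
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka010.KafkaUtils
    import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
    import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

    object Kafka010Sketch {
      def main(args: Array[String]): Unit = {
        val ssc = new StreamingContext(new SparkConf().setAppName("Kafka010Sketch"), Seconds(5))

        // Kafka new-consumer config; broker and group id are placeholders
        val kafkaParams = Map[String, Object](
          "bootstrap.servers" -> "localhost:9092",
          "key.deserializer" -> classOf[StringDeserializer],
          "value.deserializer" -> classOf[StringDeserializer],
          "group.id" -> "rc-test",
          "auto.offset.reset" -> "latest",
          "enable.auto.commit" -> (false: java.lang.Boolean)
        )

        // Direct stream against Kafka 0.10; each message arrives as a ConsumerRecord
        val stream = KafkaUtils.createDirectStream[String, String](
          ssc, PreferConsistent, Subscribe[String, String](Seq("test-topic"), kafkaParams))

        stream.map(r => (r.key, r.value)).count().print()

        ssc.start()
        ssc.awaitTermination()
      }
    }

Being a separate artifact, it can sit alongside the existing 0.8 support
without affecting current users, which is Cody's point above.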