Hi Alastair,

Cypher support looks promising and the dev list thread discussion is interesting. Thanks for your feedback.
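In case it helps anyone else following the thread, here is a minimal, untested sketch of the same toy graph built with GraphFrames and with GraphX (the column names and example data are only illustrative), since that choice is what my original question was about:

    // A rough sketch: the same toy graph with GraphFrames (DataFrame-based)
    // and GraphX (RDD-based). Names and data here are made up for illustration.
    import org.apache.spark.graphx.{Edge, Graph}
    import org.apache.spark.sql.SparkSession
    import org.graphframes.GraphFrame

    val spark = SparkSession.builder().appName("graph-sketch").getOrCreate()
    import spark.implicits._

    // GraphFrames: vertices need an "id" column, edges need "src" and "dst".
    val vertices = Seq(("a", "Alice"), ("b", "Bob")).toDF("id", "name")
    val edges    = Seq(("a", "b", "follows")).toDF("src", "dst", "relationship")
    val gf = GraphFrame(vertices, edges)
    gf.find("(x)-[e]->(y)").show()                        // motif query on DataFrames
    gf.pageRank.resetProbability(0.15).maxIter(10).run()
      .vertices.select("id", "pagerank").show()

    // GraphX: the same graph built from RDDs with Long vertex ids.
    val vRDD = spark.sparkContext.parallelize(Seq((1L, "Alice"), (2L, "Bob")))
    val eRDD = spark.sparkContext.parallelize(Seq(Edge(1L, 2L, "follows")))
    val gx   = Graph(vRDD, eRDD)
    gx.pageRank(0.001).vertices.collect()

    // GraphFrames can also hand back a GraphX view when an RDD API is needed.
    val asGraphX = gf.toGraphX

GraphFrames also exposes a toGraphX cast, which sounds close in spirit to the Morpheus-to-GraphX cast mentioned below; the practical difference between the two APIs is that GraphFrames keeps everything in DataFrames while GraphX works on RDDs.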
On Thu, Oct 17, 2019 at 09:19:28AM +0100, Alastair Green wrote:
> Hi Nicolas,
>
> I was following the current thread on the dev channel about Spark
> Graph, including Cypher support,
>
> http://apache-spark-developers-list.1001551.n3.nabble.com/Add-spark-dependency-on-on-org-opencypher-okapi-shade-okapi-td28118.html
>
> and I remembered your post.
>
> Actually, GraphX and GraphFrames are both not being developed actively,
> so far as I can tell.
>
> The only activity on GraphX in the last two years was a fix for Scala
> 2.13 functionality: to quote the PR
>
> > ### Does this PR introduce any user-facing change?
> > No behavior change at all.
>
> The only activity on GraphFrames since the addition of Pregel support
> in Scala back in December 2018 has been build/test improvements and
> recent builds against 2.4 and 3.0 snapshots. I'm not sure there was a
> lot of functional change before that either.
>
> The efforts to provide graph processing in Spark with the more
> full-featured Cypher query language, which you can see in the proposed
> 3.0 changes discussed in the dev list, and the related
> openCypher/morpheus project (which among many other things allows you
> to cast a Morpheus graph into a GraphX graph, and which extends the
> proposed 3.0 changes in a compatible way), are active.
>
> Yrs,
>
> Alastair
>
> Alastair Green
> Query Languages Standards and Research
>
> Neo4j UK Ltd
> Union House
> 182-194 Union Street
> London, SE1 0LH
>
> +44 795 841 2107
>
> On Sun, Sep 22, 2019 at 21:17, Nicolas Paris <nicolas.pa...@riseup.net> wrote:
>
> hi all
>
> graphframes was intended to replace graphx.
>
> however the former looks not maintained anymore while the latter is
> still active.
>
> any thought ?
> --
> nicolas

--
nicolas