[ https://issues.apache.org/jira/browse/SPARK-5025?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Patrick Wendell resolved SPARK-5025.
------------------------------------
    Resolution: Won't Fix

I'm closing this as won't fix. There are now a bunch of community packages as examples, so I think people can just follow those examples.

> Write a guide for creating well-formed packages for Spark
> ---------------------------------------------------------
>
>                 Key: SPARK-5025
>                 URL: https://issues.apache.org/jira/browse/SPARK-5025
>             Project: Spark
>          Issue Type: Improvement
>          Components: Documentation
>            Reporter: Patrick Wendell
>            Assignee: Patrick Wendell
>
> There are an increasing number of OSS projects providing utilities and
> extensions to Spark. We should write a guide in the Spark docs that explains
> how to create, package, and publish a third-party Spark library. There are a
> few issues here, such as how to list your dependency on Spark and how to deal
> with your own third-party dependencies. We should also cover how to do
> this for Python libraries.
> In general, we should make it easy to build extension points against any of
> Spark's APIs (e.g. for new data sources, streaming receivers, ML algos, etc.)
> and self-publish libraries.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
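
The description mentions two of the packaging questions such a guide would need to answer: how a library lists its dependency on Spark, and how it handles its own third-party dependencies. A minimal sketch of the common convention (the project name, versions, and the choice of sbt here are illustrative assumptions, not anything specified in this issue):

```scala
// build.sbt -- hypothetical third-party Spark library (sbt build definition)

name := "my-spark-extension" // illustrative project name, not from the issue
scalaVersion := "2.12.18"

libraryDependencies ++= Seq(
  // Spark itself is marked "provided": the cluster supplies Spark classes at
  // runtime, so they should not be bundled into the library's published jar.
  "org.apache.spark" %% "spark-sql" % "3.5.0" % "provided",

  // The library's own third-party dependency is a normal compile-scope
  // dependency, pulled in transitively by users of the published package.
  "com.typesafe" % "config" % "1.4.3"
)
```

Declaring Spark as `provided` avoids shipping a second copy of Spark's classes and the version conflicts that would cause against the cluster's runtime; the library's own dependencies remain ordinary compile-scope entries (or are shaded, if they clash with classes Spark already bundles).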