GitHub user alope107 commented on the pull request:
https://github.com/apache/spark/pull/8318#issuecomment-155524260
@holdenk @gracew
Thanks, I bumped the py4j version.
---
If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well.
GitHub user alope107 commented on the pull request:
https://github.com/apache/spark/pull/8318#issuecomment-147128537
Added a check for the version number in the assembly jar if pom.xml is not present.
@davies, is this what you had in mind?
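The pom.xml-then-assembly-jar fallback described above can be sketched roughly as follows; the helper names, file names, and regexes here are assumptions for illustration, not the PR's actual code:

```python
import re

def version_from_pom(pom_text):
    # Naive sketch: grab the first <version> tag. A real implementation
    # would use an XML parser and read only the project-level version.
    m = re.search(r"<version>([^<]+)</version>", pom_text)
    return m.group(1) if m else None

def version_from_assembly_jar(jar_name):
    # Assembly jars are typically named like spark-assembly-1.6.0-hadoop2.6.0.jar.
    m = re.search(r"spark-assembly-(\d+\.\d+\.\d+)", jar_name)
    return m.group(1) if m else None

def spark_version(pom_text=None, jar_name=None):
    # Prefer pom.xml when available; otherwise fall back to parsing
    # the version out of the assembly jar's file name.
    if pom_text is not None:
        return version_from_pom(pom_text)
    if jar_name is not None:
        return version_from_assembly_jar(jar_name)
    return None
```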
GitHub user alope107 commented on a diff in the pull request:
https://github.com/apache/spark/pull/8318#discussion_r41334727
--- Diff: python/pyspark/pyspark_version.py ---
@@ -0,0 +1,17 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
GitHub user alope107 commented on a diff in the pull request:
https://github.com/apache/spark/pull/8318#discussion_r41334652
--- Diff: python/setup.py ---
@@ -0,0 +1,18 @@
+#!/usr/bin/env python
+
+from setuptools import setup
+
+exec(compile(open("py
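The diff is cut off at `exec(compile(open("py`. A common setuptools idiom it likely continues into — an assumption, not the PR's verbatim code — is executing a small version module so setup.py can read `__version__` without importing the package itself:

```python
import os
import tempfile

# Hypothetical reconstruction: create a version module like
# pyspark/pyspark_version.py, then load it the way setup.py appears to.
with tempfile.TemporaryDirectory() as tmp:
    version_file = os.path.join(tmp, "pyspark_version.py")
    with open(version_file, "w") as f:
        f.write('__version__ = "1.6.0"\n')

    scope = {}
    # compile() attaches the file name so tracebacks point at the right file.
    exec(compile(open(version_file).read(), version_file, "exec"), scope)
    version = scope["__version__"]
```

Keeping the version in its own module avoids importing pyspark (and its heavyweight dependencies) at install time.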
GitHub user alope107 commented on a diff in the pull request:
https://github.com/apache/spark/pull/8318#discussion_r41213092
--- Diff: python/pyspark/pyspark_version.py ---
@@ -0,0 +1,17 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
GitHub user alope107 commented on a diff in the pull request:
https://github.com/apache/spark/pull/8318#discussion_r41212659
--- Diff: python/pyspark/__init__.py ---
@@ -36,6 +36,31 @@
Finer-grained cache persistence levels.
"""
+import os
GitHub user alope107 commented on a diff in the pull request:
https://github.com/apache/spark/pull/8318#discussion_r41212537
--- Diff: python/setup.py ---
@@ -0,0 +1,18 @@
+#!/usr/bin/env python
+
+from setuptools import setup
+
+exec(compile(open("py
GitHub user alope107 commented on a diff in the pull request:
https://github.com/apache/spark/pull/8318#discussion_r37657884
--- Diff: python/setup.py ---
@@ -0,0 +1,19 @@
+#!/usr/bin/env python
+
+from setuptools import setup
+
+exec(compile(open("py
GitHub user alope107 commented on a diff in the pull request:
https://github.com/apache/spark/pull/8318#discussion_r37657387
--- Diff: python/pyspark/pyspark_version.py ---
@@ -0,0 +1,17 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
GitHub user alope107 commented on a diff in the pull request:
https://github.com/apache/spark/pull/8318#discussion_r37570383
--- Diff: python/setup.py ---
@@ -0,0 +1,19 @@
+#!/usr/bin/env python
+
+from setuptools import setup
+
+exec(compile(open("py
GitHub user alope107 commented on a diff in the pull request:
https://github.com/apache/spark/pull/8318#discussion_r37569488
--- Diff: python/pyspark/pyspark_version.py ---
@@ -0,0 +1,17 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
GitHub user alope107 commented on a diff in the pull request:
https://github.com/apache/spark/pull/8318#discussion_r37569058
--- Diff: python/pyspark/__init__.py ---
@@ -36,6 +36,31 @@
Finer-grained cache persistence levels.
"""
+import os
GitHub user alope107 commented on the pull request:
https://github.com/apache/spark/pull/8318#issuecomment-133104217
@rgbkrk Yes, as the pyspark and Spark versions must match each other
exactly, it makes sense for deployments to pin both.
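The exact-match requirement above can be made concrete with a small (hypothetical) check; deployments would then pin, e.g., `pyspark==1.6.0` in requirements.txt alongside a 1.6.0 cluster:

```python
def versions_compatible(pyspark_version, spark_version):
    # Exact string equality: even a patch-level mismatch such as
    # 1.6.0 vs 1.6.1 is rejected, which is why both sides are pinned.
    return pyspark_version == spark_version
```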
GitHub user alope107 commented on a diff in the pull request:
https://github.com/apache/spark/pull/8318#discussion_r37561130
--- Diff: python/setup.py ---
@@ -0,0 +1,19 @@
+#!/usr/bin/env python
+
+from setuptools import setup
+
+exec(compile(open("py
GitHub user alope107 commented on a diff in the pull request:
https://github.com/apache/spark/pull/8318#discussion_r37559580
--- Diff: python/pyspark/__init__.py ---
@@ -36,6 +36,31 @@
Finer-grained cache persistence levels.
"""
+import os
GitHub user alope107 commented on a diff in the pull request:
https://github.com/apache/spark/pull/8318#discussion_r37558783
--- Diff: python/setup.py ---
@@ -0,0 +1,19 @@
+#!/usr/bin/env python
+
+from setuptools import setup
+
+exec(compile(open("py
GitHub user alope107 commented on the pull request:
https://github.com/apache/spark/pull/8318#issuecomment-132773416
@justinuang and @nchammas, thanks for the feedback; I've made the suggested changes.
GitHub user alope107 commented on a diff in the pull request:
https://github.com/apache/spark/pull/8318#discussion_r37464406
--- Diff: python/pyspark/pyspark_version.py ---
@@ -0,0 +1,17 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
GitHub user alope107 opened a pull request:
https://github.com/apache/spark/pull/8318
[SPARK-1267][PYSPARK] Adds pip installer for pyspark
Adds a setup.py so that pyspark can be installed and packaged for pip.
This allows for easier setup and declaration of dependencies. Please