Yes, my bad. The code in session.py needs to also catch TypeError like before.
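Roughly something like this (a sketch only; the helper name and structure are
illustrative, not the actual session.py code):

import py4j.protocol
from pyspark import SparkContext
from pyspark.sql import SparkSession

def _create_hive_or_plain_session():
    # Illustrative sketch: probe for HiveConf and fall back to a plain
    # SparkSession when Spark was built without the Hive classes.
    SparkContext._ensure_initialized()
    try:
        # Fails if the Hive classes are not on the JVM classpath.
        SparkContext._jvm.org.apache.hadoop.hive.conf.HiveConf()
        return SparkSession.builder.enableHiveSupport().getOrCreate()
    except (py4j.protocol.Py4JError, TypeError):
        # py4j reports a missing class by raising TypeError
        # ("'JavaPackage' object is not callable"), so it has to be
        # caught in addition to Py4JError.
        return SparkSession.builder.getOrCreate()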
On Thu, Jun 14, 2018 at 11:03 AM, Li Jin wrote:
Sounds good. Thanks all for the quick reply.
https://issues.apache.org/jira/browse/SPARK-24563
On Thu, Jun 14, 2018 at 12:19 PM, Xiao Li wrote:
Thanks for catching this. Please feel free to submit a PR. I do not think
Vanzin meant to introduce this behavior change in that PR. We should do the
code review more carefully.
Xiao
2018-06-14 9:18 GMT-07:00 Li Jin:
Are there objections to restoring the behavior for PySpark users? I am happy
to submit a patch.
On Thu, Jun 14, 2018 at 12:15 PM Reynold Xin wrote:
The behavior change is not good...
On Thu, Jun 14, 2018 at 9:05 AM Li Jin wrote:
Ah, looks like it's this change:
https://github.com/apache/spark/commit/b3417b731d4e323398a0d7ec6e86405f4464f4f9#diff-3b5463566251d5b09fd328738a9e9bc5
It seems strange that by default Spark doesn't build with Hive but by
default PySpark requires it...
This might also be a behavior change for PySpark users.
I think you would have to build with the 'hive' profile? But if so, that
would have been true for a while now.
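For the sbt build I assume that means passing the Hive profile, e.g. (the
exact profile set is an assumption on my part):

build/sbt -Phive package
bin/pyspark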
On Thu, Jun 14, 2018 at 10:38 AM Li Jin wrote:
I can work around it for now by using:
bin/pyspark --conf spark.sql.catalogImplementation=in-memory
but I still wonder what's going on with HiveConf..
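If it helps, the same setting can presumably also be applied when creating
the session in code (assumption on my part; it just has to be set before the
session is first created):

from pyspark.sql import SparkSession

# Assumption: forcing the in-memory catalog avoids touching HiveConf at all.
spark = (SparkSession.builder
         .config("spark.sql.catalogImplementation", "in-memory")
         .getOrCreate())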
On Thu, Jun 14, 2018 at 11:37 AM, Li Jin wrote:
Hey all,
I just did a clean checkout of github.com/apache/spark but failed to start
PySpark. This is what I did:
git clone git@github.com:apache/spark.git; cd spark; build/sbt package;
bin/pyspark
And got this exception:
(spark-dev) Lis-MacBook-Pro:spark icexelloss$ bin/pyspark
Python 3.6.3 |