Please vote on releasing the following candidate as Apache Spark version
2.0.0. The vote is open until Friday, July 22, 2016 at 20:00 PDT and passes
if a majority of at least 3 +1 PMC votes are cast.
[ ] +1 Release this package as Apache Spark 2.0.0
[ ] -1 Do not release this package because ...
Hi All,
I have been trying to access tables from schemas other than the default one
to pull data into a DataFrame.
I was successful doing it using the default schema in the Hive database,
but when I try any other schema/database in Hive, I get the error
below. (Have also not seen any examples
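For reference, the usual way to read a table from a non-default Hive database is to qualify the table name with the database, or to switch the current database first. A minimal sketch, assuming Hive support is configured, a `SparkContext` named `sc` (as in spark-shell), and a hypothetical table `mydb.mytable`:

```scala
// Spark 1.x-style HiveContext usage; "mydb" and "mytable" are hypothetical names.
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)

// Option 1: qualify the table with the database name.
val df1 = hiveContext.table("mydb.mytable")

// Option 2: switch the current database, then use the bare table name.
hiveContext.sql("USE mydb")
val df2 = hiveContext.table("mytable")
```

Without the qualifier (or the `USE`), lookups resolve against the `default` database, which would match the behavior described above.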
Ah in that case: 0
On Tue, Jul 19, 2016 at 3:26 PM, Jonathan Kelly
wrote:
> The docs link from Reynold's initial email is apparently no longer valid.
> He posted an updated link a little later in this same thread.
>
>
>
The docs link from Reynold's initial email is apparently no longer valid.
He posted an updated link a little later in this same thread.
http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc4-docs-updated/
On Tue, Jul 19, 2016 at 3:19 PM Holden Karau wrote:
> -1
I am trying to find the root cause of a recent Spark application failure in
production. While the Spark application is running, I can check the NodeManager's
yarn.nodemanager.log-dir property to get the Spark executor container logs.
The container has logs for both the running Spark applications
Here
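One common approach once the application has finished: if YARN log aggregation is enabled, the container logs can be pulled with the YARN CLI. A sketch, assuming `yarn.log-aggregation-enable=true` and a hypothetical application ID:

```shell
# Fetch the aggregated container logs (stdout/stderr of every executor)
# after the application has finished. The application ID below is made up;
# find the real one with `yarn application -list -appStates FINISHED,FAILED`.
yarn logs -applicationId application_1468880000000_0042 > app.log
```

This only works after aggregation has kicked in; while the app is still running, the per-node log-dir path described above is the place to look.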
-1 : The docs don't seem to be fully built (e.g.
http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc4-docs/streaming-programming-guide.html
is a zero byte file currently) - although if this is a transient apache
issue no worries.
On Thu, Jul 14, 2016 at 11:59 AM, Reynold Xin
+0
Our internal test suites seem mostly happy, except for SPARK-16632.
Since there's a somewhat easy workaround, I don't think it's a blocker
for 2.0.0.
On Thu, Jul 14, 2016 at 11:59 AM, Reynold Xin wrote:
> Please vote on releasing the following candidate as Apache Spark
Hi Reynold,
So far we've been able to transition everything to `SparkSession`. I was just
following up on behalf of Maciej.
Michael
> On Jul 19, 2016, at 11:02 AM, Reynold Xin wrote:
>
> dropping user list
>
> Yup I just took a look -- you are right.
>
> What's the
This line: "build/sbt clean assembly"
should also be changed, right?
On Tue, Jul 19, 2016 at 1:18 AM, Sean Owen wrote:
> If the change is just to replace "sbt assembly/assembly" with "sbt
> package", done. LMK if there are more edits.
>
> On Mon, Jul 18, 2016 at 10:00 PM,
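Per the thread, Spark 2.0 no longer builds an assembly jar, so docs referencing the assembly target need the same substitution. A sketch of the change being discussed:

```shell
# Old (pre-2.0) instruction, which no longer works since the
# assembly module was removed:
#   build/sbt clean assembly

# Updated equivalent for Spark 2.0:
build/sbt clean package
```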
dropping user list
Yup I just took a look -- you are right.
What's the reason you'd need a HiveContext? The only method that
HiveContext has and SQLContext does not have is refreshTable. Given this is
meant for helping code transition, it might be easier to just use
SQLContext and change the
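In Spark 2.0 the `refreshTable` functionality is also reachable without a `HiveContext`, through the session catalog. A sketch, assuming Hive support is enabled and a hypothetical table `mydb.mytable`:

```scala
// Spark 2.0: refresh cached metadata for a table without a HiveContext.
// The table name "mydb.mytable" is hypothetical.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("refresh-example")
  .enableHiveSupport()
  .getOrCreate()

// 2.0 equivalent of HiveContext.refreshTable:
spark.catalog.refreshTable("mydb.mytable")
```

This is one way transitioning code can drop its `HiveContext` dependency entirely, which is the direction suggested above.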
Sorry Reynold, I want to triple check this with you. I'm looking at the
`SparkSession.sqlContext` field in the latest 2.0 branch, and it appears that
that val is set specifically to an instance of the `SQLContext` class. A cast
to `HiveContext` will fail. Maybe there's a misunderstanding here.
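The concern can be reproduced directly in a 2.0 spark-shell (a sketch; `spark` is the session provided by the shell, and `HiveContext` is the deprecated `org.apache.spark.sql.hive.HiveContext`):

```scala
import org.apache.spark.sql.hive.HiveContext

// SparkSession.sqlContext is constructed as a plain SQLContext in the
// 2.0 branch, so this downcast throws a ClassCastException at runtime
// even when Hive support is enabled:
val hc = spark.sqlContext.asInstanceOf[HiveContext]
```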
Yes, but in order to access methods available only in HiveContext, a cast by
the user is required.
On Tuesday, July 19, 2016, Maciej Bryński wrote:
> @Reynold Xin,
> How will this work with Hive support?
> Will SparkSession.sqlContext return a HiveContext?
>
> 2016-07-19 0:26 GMT+02:00
@Reynold Xin,
How will this work with Hive support?
Will SparkSession.sqlContext return a HiveContext?
2016-07-19 0:26 GMT+02:00 Reynold Xin :
> Good idea.
>
> https://github.com/apache/spark/pull/14252
>
>
>
> On Mon, Jul 18, 2016 at 12:16 PM, Michael Armbrust
Can I point out this one? https://issues.apache.org/jira/browse/SPARK-15705
I managed to find a workaround, but this is still IMO a pretty significant
bug.
Are there any 'work in progress' release notes for 2.0.0 yet? I don't see
anything in the rc docs like "what's new" or "migration guide"?
On Thu, 9 Jun 2016 at 10:06 Sean Owen wrote:
> Available but mostly as JIRA output:
>
If the change is just to replace "sbt assembly/assembly" with "sbt
package", done. LMK if there are more edits.
On Mon, Jul 18, 2016 at 10:00 PM, Michael Gummelt
wrote:
> I just flailed on this a bit before finding this email. Can someone please
> update
>
I think unfortunately at least this one is gonna block:
https://issues.apache.org/jira/browse/SPARK-16620
Good news is that just about anything else that's at all a blocker has
been resolved and there are only about 6 issues of any kind at all
targeted for 2.0. It seems very close.
On Thu, Jul