Hi Shivaram,

I created a JIRA issue for the documentation error:
https://issues.apache.org/jira/browse/SPARK-8901

Thanks
Pradeep

On Wed, Jul 8, 2015 at 11:40 AM, Shivaram Venkataraman <
shiva...@eecs.berkeley.edu> wrote:

> Hi Pradeep
>
> Thanks for the catch -- Let's open a JIRA and a PR for it. I don't think
> documentation changes affect the release, though Patrick can confirm that.
>
> Thanks
> Shivaram
>
> On Wed, Jul 8, 2015 at 9:35 AM, Pradeep Bashyal <prad...@bashyal.com>
> wrote:
>
>> Here's one thing I ran into:
>>
>> The SparkR documentation example in
>> http://people.apache.org/~pwendell/spark-releases/latest/sparkr.html is
>> incorrect.
>>
>>     sc <- sparkR.init(packages="com.databricks:spark-csv_2.11:1.0.3")
>>
>> should be
>>
>>     sc <- sparkR.init(sparkPackages="com.databricks:spark-csv_2.11:1.0.3")
>>
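>> With sparkPackages set, the CSV data source can then be used from SparkR,
>> roughly like this (a sketch against the 1.4 read.df API; "cars.csv" and
>> the header option are placeholders):
>>
>>     sqlContext <- sparkRSQL.init(sc)
>>     df <- read.df(sqlContext, "cars.csv",
>>                   source = "com.databricks.spark.csv", header = "true")
>>     head(df)
>>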
>>
>> Thanks
>> Pradeep
>>
>>
>> On Wed, Jul 8, 2015 at 6:18 AM, Sean Owen <so...@cloudera.com> wrote:
>>
>>> The POM issue is resolved and the build succeeds. The license and sigs
>>> still work. The tests pass for me with "-Pyarn -Phadoop-2.6", with the
>>> following two exceptions. Is anyone else seeing these? This is
>>> consistent on Ubuntu 14 with Java 7/8:
>>>
>>> DataFrameStatSuite:
>>> ...
>>> - special crosstab elements (., '', null, ``) *** FAILED ***
>>>   java.lang.NullPointerException:
>>>   at org.apache.spark.sql.execution.stat.StatFunctions$$anonfun$4.apply(StatFunctions.scala:131)
>>>   at org.apache.spark.sql.execution.stat.StatFunctions$$anonfun$4.apply(StatFunctions.scala:121)
>>>   at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>>>   at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>>>   at scala.collection.immutable.Map$Map4.foreach(Map.scala:181)
>>>   at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
>>>   at scala.collection.AbstractTraversable.map(Traversable.scala:105)
>>>   at org.apache.spark.sql.execution.stat.StatFunctions$.crossTabulate(StatFunctions.scala:121)
>>>   at org.apache.spark.sql.DataFrameStatFunctions.crosstab(DataFrameStatFunctions.scala:94)
>>>   at org.apache.spark.sql.DataFrameStatSuite$$anonfun$5.apply$mcV$sp(DataFrameStatSuite.scala:97)
>>>   ...
>>>
>>> HiveSparkSubmitSuite:
>>> - SPARK-8368: includes jars passed in through --jars *** FAILED ***
>>>   Process returned with exit code 1. See the log4j logs for more
>>> detail. (HiveSparkSubmitSuite.scala:92)
>>> - SPARK-8020: set sql conf in spark conf *** FAILED ***
>>>   Process returned with exit code 1. See the log4j logs for more
>>> detail. (HiveSparkSubmitSuite.scala:92)
>>> - SPARK-8489: MissingRequirementError during reflection *** FAILED ***
>>>   Process returned with exit code 1. See the log4j logs for more
>>> detail. (HiveSparkSubmitSuite.scala:92)
>>>
>>> On Tue, Jul 7, 2015 at 8:06 PM, Patrick Wendell <pwend...@gmail.com>
>>> wrote:
>>> > Please vote on releasing the following candidate as Apache Spark
>>> > version 1.4.1!
>>> >
>>> > This release fixes a handful of known issues in Spark 1.4.0, listed
>>> > here:
>>> > http://s.apache.org/spark-1.4.1
>>> >
>>> > The tag to be voted on is v1.4.1-rc3 (commit 3e8ae38):
>>> > https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=3e8ae38944f13895daf328555c1ad22cd590b089
>>> >
>>> > The release files, including signatures, digests, etc. can be found at:
>>> > http://people.apache.org/~pwendell/spark-releases/spark-1.4.1-rc3-bin/
>>> >
>>> > Release artifacts are signed with the following key:
>>> > https://people.apache.org/keys/committer/pwendell.asc
>>> >
>>> > The staging repository for this release can be found at:
>>> > [published as version: 1.4.1]
>>> > https://repository.apache.org/content/repositories/orgapachespark-1123/
>>> > [published as version: 1.4.1-rc3]
>>> > https://repository.apache.org/content/repositories/orgapachespark-1124/
>>> >
>>> > The documentation corresponding to this release can be found at:
>>> > http://people.apache.org/~pwendell/spark-releases/spark-1.4.1-rc3-docs/
>>> >
>>> > Please vote on releasing this package as Apache Spark 1.4.1!
>>> >
>>> > The vote is open until Friday, July 10, at 20:00 UTC and passes
>>> > if a majority of at least 3 +1 PMC votes are cast.
>>> >
>>> > [ ] +1 Release this package as Apache Spark 1.4.1
>>> > [ ] -1 Do not release this package because ...
>>> >
>>> > To learn more about Apache Spark, please see
>>> > http://spark.apache.org/
>>> >
>>> > ---------------------------------------------------------------------
>>> > To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
>>> > For additional commands, e-mail: dev-h...@spark.apache.org
>>> >
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
>>> For additional commands, e-mail: dev-h...@spark.apache.org
>>>
>>>
>>
>
