The Windows issue reported only affects actually running Spark on
Windows (not job submission). However, I agree it's worth cutting a
new RC. I'm going to cancel this vote and propose RC3 with a single
additional patch. Let's try to vote that through so we can ship Spark
1.2.1.

- Patrick

On Sat, Jan 31, 2015 at 7:36 PM, Matei Zaharia <matei.zaha...@gmail.com> wrote:
> This looks like a pretty serious problem, thanks! Glad people are testing on 
> Windows.
>
> Matei
>
>> On Jan 31, 2015, at 11:57 AM, Martin Weindel <martin.wein...@gmail.com> wrote:
>>
>> FYI: Spark 1.2.1 RC2 does not work on Windows!
>>
>> When creating a Spark context, you get the following log output on my
>> Windows machine:
>> INFO  org.apache.spark.SparkEnv:59 - Registering BlockManagerMaster
>> ERROR org.apache.spark.util.Utils:75 - Failed to create local root dir in
>> C:\Users\mweindel\AppData\Local\Temp\. Ignoring this directory.
>> ERROR org.apache.spark.storage.DiskBlockManager:75 - Failed to create any
>> local dir.
>>
>> I have already located the cause. A newly added function, chmod700() in
>> org.apache.spark.util.Utils, uses functionality that only works on a Unix
>> file system.
>>
>> See also pull request [https://github.com/apache/spark/pull/4299] for my
>> suggestion on how to resolve the issue.
>>
>> Best regards,
>>
>> Martin Weindel
>>
>>
>>
>> --
>> View this message in context: 
>> http://apache-spark-developers-list.1001551.n3.nabble.com/VOTE-Release-Apache-Spark-1-2-1-RC2-tp10317p10370.html
>> Sent from the Apache Spark Developers List mailing list archive at 
>> Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
>> For additional commands, e-mail: dev-h...@spark.apache.org
>>
>
>
