Re: Spark 1.6.2 version displayed as 1.6.1

2016-07-25 Thread Krishna Sankar
This intrigued me as well.
- Just to be sure, I downloaded the 1.6.2 code and recompiled.
- spark-shell and pyspark both show 1.6.2 as expected.
Cheers

On Mon, Jul 25, 2016 at 1:45 AM, Daniel Darabos <daniel.dara...@lynxanalytics.com> wrote:
> Another possible explanation is that by
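The same check can be made from inside the shell itself: sc.version and the org.apache.spark.SPARK_VERSION constant are both part of the public API, and on a correctly built 1.6.2 both return "1.6.2" (REPL output shown below is illustrative):

    scala> sc.version
    res0: String = 1.6.2

    scala> org.apache.spark.SPARK_VERSION
    res1: String = 1.6.2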

Re: Spark 1.6.2 version displayed as 1.6.1

2016-07-25 Thread Daniel Darabos
Another possible explanation is that by accident you are still running Spark 1.6.1. Which download are you using? This is what I see:

$ ~/spark-1.6.2-bin-hadoop2.6/bin/spark-shell
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN
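One way to test the stale-install hypothesis from inside the shell is to ask the JVM where the Spark classes were actually loaded from; if the jar path points at a 1.6.1 directory, the wrong build is on the classpath (the path shown below is illustrative, not taken from the thread):

    scala> classOf[org.apache.spark.SparkContext].getProtectionDomain.getCodeSource.getLocation
    res0: java.net.URL = file:/home/user/spark-1.6.2-bin-hadoop2.6/lib/spark-assembly-1.6.2-hadoop2.6.0.jar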

Re: Spark 1.6.2 version displayed as 1.6.1

2016-07-24 Thread Sean Owen
Are you certain? It looks like it was correct in the release:
https://github.com/apache/spark/blob/v1.6.2/core/src/main/scala/org/apache/spark/package.scala

On Mon, Jul 25, 2016 at 12:33 AM, Ascot Moss wrote:
> Hi,
>
> I am trying to upgrade spark from 1.6.1 to 1.6.2, from
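For context, the linked package.scala defines the version as a plain constant, updated for each release; in the v1.6.2 tag it reads roughly as follows (abridged sketch, license header and docs omitted):

    package org.apache

    package object spark {
      val SPARK_VERSION = "1.6.2"
    }

The banner printed by spark-shell and the value returned by sc.version both come from this constant, so a banner reading 1.6.1 means 1.6.1 jars are being loaded.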

Spark 1.6.2 version displayed as 1.6.1

2016-07-24 Thread Ascot Moss
Hi,

I am trying to upgrade Spark from 1.6.1 to 1.6.2. From the 1.6.2 spark-shell, I found the version is still displayed as 1.6.1. Is this a minor typo/bug?

Regards

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.1
      /_/