[GitHub] spark issue #19966: Fix submission request

2017-12-13 Thread Gschiavon
Github user Gschiavon commented on the issue:

https://github.com/apache/spark/pull/19966
  
Jenkins, test this please




---




[GitHub] spark issue #19966: Fix submission request

2017-12-13 Thread Gschiavon
Github user Gschiavon commented on the issue:

https://github.com/apache/spark/pull/19966
  
@felixcheung @cloud-fan @vanzin @susanxhuynh @gatorsmile @ArtRand Here is 
the new PR


---




[GitHub] spark pull request #19966: Fix submission request

2017-12-13 Thread Gschiavon
GitHub user Gschiavon opened a pull request:

https://github.com/apache/spark/pull/19966

Fix submission request

## What changes were proposed in this pull request?

This replaces the previously closed PR, which contains all the review comments -> 
https://github.com/apache/spark/pull/19793

It fixes a problem where submitting a malformed CreateSubmissionRequest to the 
Spark Dispatcher left the Dispatcher in a bad state and made it inactive as a 
Mesos framework.

https://issues.apache.org/jira/browse/SPARK-22574

## How was this patch tested?

All Spark tests passed successfully.

It was tested by sending a malformed request (without appArgs) before and after 
the change. The fix is simple: check whether the value is null before it is accessed.
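
A minimal sketch of that guard, mirroring the servlet diff further down this 
thread (the 'environmentVariables' message wording is an assumption; the diff in 
this thread only shows the 'appArgs' guard):

```
// Fail fast with a REST protocol error instead of letting a null field
// cause a NullPointerException inside the scheduler later on.
val appArgs = Option(request.appArgs).getOrElse {
  throw new SubmitRestMissingFieldException("Application arguments are missing.")
}
val environmentVariables = Option(request.environmentVariables).getOrElse {
  // assumed wording; not shown verbatim in the diff
  throw new SubmitRestMissingFieldException("Environment variables are missing.")
}
```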

This was before the change, leaving the dispatcher inactive:

```
Exception in thread "Thread-22" java.lang.NullPointerException
    at org.apache.spark.scheduler.cluster.mesos.MesosClusterScheduler.getDriverCommandValue(MesosClusterScheduler.scala:444)
    at org.apache.spark.scheduler.cluster.mesos.MesosClusterScheduler.buildDriverCommand(MesosClusterScheduler.scala:451)
    at org.apache.spark.scheduler.cluster.mesos.MesosClusterScheduler.org$apache$spark$scheduler$cluster$mesos$MesosClusterScheduler$$createTaskInfo(MesosClusterScheduler.scala:538)
    at org.apache.spark.scheduler.cluster.mesos.MesosClusterScheduler$$anonfun$scheduleTasks$1.apply(MesosClusterScheduler.scala:570)
    at org.apache.spark.scheduler.cluster.mesos.MesosClusterScheduler$$anonfun$scheduleTasks$1.apply(MesosClusterScheduler.scala:555)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
    at org.apache.spark.scheduler.cluster.mesos.MesosClusterScheduler.scheduleTasks(MesosClusterScheduler.scala:555)
    at org.apache.spark.scheduler.cluster.mesos.MesosClusterScheduler.resourceOffers(MesosClusterScheduler.scala:621)
```

And after:

```
  "message" : "Malformed request: 
org.apache.spark.deploy.rest.SubmitRestProtocolException: Validation of message 
CreateSubmissionRequest 
failed!\n\torg.apache.spark.deploy.rest.SubmitRestProtocolMessage.validate(SubmitRestProtocolMessage.scala:70)\n\torg.apache.spark.deploy.rest.SubmitRequestServlet.doPost(RestSubmissionServer.scala:272)\n\tjavax.servlet.http.HttpServlet.service(HttpServlet.java:707)\n\tjavax.servlet.http.HttpServlet.service(HttpServlet.java:790)\n\torg.spark_project.jetty.servlet.ServletHolder.handle(ServletHolder.java:845)\n\torg.spark_project.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:583)\n\torg.spark_project.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)\n\torg.spark_project.jetty.servlet.ServletHandler.doScope(ServletHandler.java:511)\n\torg.spark_project.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)\n\torg.spark_project.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)\n\torg.s
 
park_project.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)\n\torg.spark_project.jetty.server.Server.handle(Server.java:524)\n\torg.spark_project.jetty.server.HttpChannel.handle(HttpChannel.java:319)\n\torg.spark_project.jetty.server.HttpConnection.onFillable(HttpConnection.java:253)\n\torg.spark_project.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:273)\n\torg.spark_project.jetty.io.FillInterest.fillable(FillInterest.java:95)\n\torg.spark_project.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)\n\torg.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)\n\torg.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)\n\torg.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)\n\torg.spark_project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.
 
java:671)\n\torg.spark_project.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)\n\tjava.lang.Thread.run(Thread.java:745)"
```

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/Gschiavon/spark fix-submission-request

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/19966.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #19966


commit 44fd5d3921299f93d1aab7fe971078a6bce835a2
Author: German Schiavon <germanschia...@gmail.com>
Date:   2017-11-21T15:32:04Z

Check submission request parameters

commit 14d64172500483f9e984ac28ba5f3b52db33ad9e
Author: German Schiavon <germanschia...@gmail.com>
Date:   2017-11-27T08

[GitHub] spark issue #19793: [SPARK-22574] [Mesos] [Submit] Check submission request ...

2017-12-13 Thread Gschiavon
Github user Gschiavon commented on the issue:

https://github.com/apache/spark/pull/19793
  
Hi @vanzin! I just fixed it and pushed it to my branch, but it's not 
reopening the PR; maybe I need to open a new one?



---




[GitHub] spark pull request #19793: [SPARK-22574] [Mesos] [Submit] Check submission r...

2017-11-30 Thread Gschiavon
Github user Gschiavon commented on a diff in the pull request:

https://github.com/apache/spark/pull/19793#discussion_r154072747
  
--- Diff: core/src/test/scala/org/apache/spark/deploy/rest/SubmitRestProtocolSuite.scala ---
@@ -86,6 +86,8 @@ class SubmitRestProtocolSuite extends SparkFunSuite {
 message.clientSparkVersion = "1.2.3"
 message.appResource = "honey-walnut-cherry.jar"
 message.mainClass = "org.apache.spark.examples.SparkPie"
+message.appArgs = Array("hdfs://tmp/auth")
+message.environmentVariables = Map("SPARK_HOME" -> "/test")
--- End diff --

Cool, I can do it when I have some time.

Let me know if I have to do something else here!

Thanks @susanxhuynh 


---




[GitHub] spark pull request #19793: [SPARK-22574] [Mesos] [Submit] Check submission r...

2017-11-29 Thread Gschiavon
Github user Gschiavon commented on a diff in the pull request:

https://github.com/apache/spark/pull/19793#discussion_r153901749
  
--- Diff: core/src/test/scala/org/apache/spark/deploy/rest/SubmitRestProtocolSuite.scala ---
@@ -86,6 +86,8 @@ class SubmitRestProtocolSuite extends SparkFunSuite {
 message.clientSparkVersion = "1.2.3"
 message.appResource = "honey-walnut-cherry.jar"
 message.mainClass = "org.apache.spark.examples.SparkPie"
+message.appArgs = Array("hdfs://tmp/auth")
+message.environmentVariables = Map("SPARK_HOME" -> "/test")
--- End diff --

Perfect then, @susanxhuynh. Maybe fix it in another PR?




---




[GitHub] spark pull request #19793: [SPARK-22574] [Mesos] [Submit] Check submission r...

2017-11-28 Thread Gschiavon
Github user Gschiavon commented on a diff in the pull request:

https://github.com/apache/spark/pull/19793#discussion_r153709924
  
--- Diff: core/src/test/scala/org/apache/spark/deploy/rest/SubmitRestProtocolSuite.scala ---
@@ -86,6 +86,8 @@ class SubmitRestProtocolSuite extends SparkFunSuite {
 message.clientSparkVersion = "1.2.3"
 message.appResource = "honey-walnut-cherry.jar"
 message.mainClass = "org.apache.spark.examples.SparkPie"
+message.appArgs = Array("hdfs://tmp/auth")
+message.environmentVariables = Map("SPARK_HOME" -> "/test")
--- End diff --

@susanxhuynh Yes, we have to make sure we don't break Standalone mode.
I've reviewed the StandaloneSubmitRequestServlet class and I think we might be 
facing the same problem there (L.126-131): there are checks for 'appResource' 
and 'mainClass', but none for 'appArgs' (L.140) or 'environmentVariables' (L.143), 
even though they can be null, since they are initialised as null.

I think it's the same case; let me know what you think.
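
A sketch of one way to harden that standalone path (illustrative only, not part 
of this PR): default the nullable fields to empty values instead of dereferencing 
a null.

```
// Illustrative sketch for StandaloneSubmitRequestServlet: tolerate unset
// fields by falling back to empty values rather than throwing.
val appArgs = Option(request.appArgs).getOrElse(Array.empty[String])
val environmentVariables =
  Option(request.environmentVariables).getOrElse(Map.empty[String, String])
```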


---




[GitHub] spark pull request #19793: [SPARK-22574] [Mesos] [Submit] Check submission r...

2017-11-27 Thread Gschiavon
Github user Gschiavon commented on a diff in the pull request:

https://github.com/apache/spark/pull/19793#discussion_r153415074
  
--- Diff: core/src/test/scala/org/apache/spark/deploy/rest/SubmitRestProtocolSuite.scala ---
@@ -86,6 +86,8 @@ class SubmitRestProtocolSuite extends SparkFunSuite {
 message.clientSparkVersion = "1.2.3"
 message.appResource = "honey-walnut-cherry.jar"
 message.mainClass = "org.apache.spark.examples.SparkPie"
+message.appArgs = Array("hdfs://tmp/auth")
+message.environmentVariables = Map("SPARK_HOME" -> "/test")
--- End diff --

@susanxhuynh I've checked what you said, but those variables are overwritten 
below, so I can't actually check it, right?


---




[GitHub] spark pull request #19793: [SPARK-22574] [Mesos] [Submit] Check submission r...

2017-11-27 Thread Gschiavon
Github user Gschiavon commented on a diff in the pull request:

https://github.com/apache/spark/pull/19793#discussion_r153414519
  
--- Diff: core/src/test/scala/org/apache/spark/deploy/rest/SubmitRestProtocolSuite.scala ---
@@ -86,6 +86,8 @@ class SubmitRestProtocolSuite extends SparkFunSuite {
 message.clientSparkVersion = "1.2.3"
 message.appResource = "honey-walnut-cherry.jar"
 message.mainClass = "org.apache.spark.examples.SparkPie"
+message.appArgs = Array("hdfs://tmp/auth")
+message.environmentVariables = Map("SPARK_HOME" -> "/test")
--- End diff --

@felixcheung okay done :)


---




[GitHub] spark pull request #19793: [SPARK-22574] [Mesos] [Submit] Check submission r...

2017-11-27 Thread Gschiavon
Github user Gschiavon commented on a diff in the pull request:

https://github.com/apache/spark/pull/19793#discussion_r153410771
  
--- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/deploy/rest/mesos/MesosRestServer.scala ---
@@ -82,6 +82,12 @@ private[mesos] class MesosSubmitRequestServlet(
 val mainClass = Option(request.mainClass).getOrElse {
   throw new SubmitRestMissingFieldException("Main class is missing.")
 }
+val appArgs = Option(request.appArgs).getOrElse {
+  throw new SubmitRestMissingFieldException("Application arguments are missing.")
--- End diff --

Done @ArtRand 


---




[GitHub] spark pull request #19793: [SPARK-22574] [Mesos] [Submit] Check submission r...

2017-11-27 Thread Gschiavon
Github user Gschiavon commented on a diff in the pull request:

https://github.com/apache/spark/pull/19793#discussion_r153410217
  
--- Diff: core/src/main/scala/org/apache/spark/deploy/rest/SubmitRestProtocolRequest.scala ---
@@ -46,6 +46,8 @@ private[rest] class CreateSubmissionRequest extends SubmitRestProtocolRequest {
 super.doValidate()
 assert(sparkProperties != null, "No Spark properties set!")
 assertFieldIsSet(appResource, "appResource")
+assertFieldIsSet(appArgs, "appArgs")
+assertFieldIsSet(environmentVariables, "environmentVariables")
--- End diff --

Actually, if the caller didn't set "appArgs" or "environmentVariables", it 
caused a null pointer exception and left the Dispatcher inactive. So now I think 
the caller should pass an empty array; I could add a test for that case, 
@susanxhuynh.
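
A sketch of such a test, in the style of SubmitRestProtocolSuite quoted above 
(the test name is made up; SubmitRestProtocolException matches the validation 
failure shown in the PR description):

```
test("validation fails when appArgs and environmentVariables are unset") {
  val message = new CreateSubmissionRequest
  message.clientSparkVersion = "1.2.3"
  message.appResource = "honey-walnut-cherry.jar"
  message.mainClass = "org.apache.spark.examples.SparkPie"
  // appArgs and environmentVariables deliberately left null
  intercept[SubmitRestProtocolException] { message.validate() }
}
```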


---




[GitHub] spark issue #19793: [SPARK-22574] [Mesos] [Submit] Check submission request ...

2017-11-27 Thread Gschiavon
Github user Gschiavon commented on the issue:

https://github.com/apache/spark/pull/19793
  
Yes @ArtRand! I think it's not documented at all, ping me when it's done 
and I will review it :)




---




[GitHub] spark issue #19793: [SPARK-22574] [Mesos] [Submit] Check submission request ...

2017-11-27 Thread Gschiavon
Github user Gschiavon commented on the issue:

https://github.com/apache/spark/pull/19793
  
Can we retest this?


---




[GitHub] spark pull request #19793: [SPARK-22574] [Mesos] [Submit] Check submission r...

2017-11-27 Thread Gschiavon
Github user Gschiavon commented on a diff in the pull request:

https://github.com/apache/spark/pull/19793#discussion_r153131125
  
--- Diff: core/src/test/scala/org/apache/spark/deploy/rest/SubmitRestProtocolSuite.scala ---
@@ -86,6 +86,8 @@ class SubmitRestProtocolSuite extends SparkFunSuite {
 message.clientSparkVersion = "1.2.3"
 message.appResource = "honey-walnut-cherry.jar"
 message.mainClass = "org.apache.spark.examples.SparkPie"
+message.appArgs = Array("hdfs://tmp/auth")
+message.environmentVariables = Map("SPARK_SCALA_VERSION" -> "2.11")
--- End diff --

In this case it's just a name; I could change it to make it clearer!


---




[GitHub] spark issue #19793: [SPARK-22574] [Mesos] [Submit] Check submission request ...

2017-11-23 Thread Gschiavon
Github user Gschiavon commented on the issue:

https://github.com/apache/spark/pull/19793
  
ping @ArtRand


---




[GitHub] spark pull request #19793: [SPARK-22574] [Mesos] [Submit] Check submission r...

2017-11-21 Thread Gschiavon
GitHub user Gschiavon opened a pull request:

https://github.com/apache/spark/pull/19793

[SPARK-22574] [Mesos] [Submit] Check submission request parameters

## What changes were proposed in this pull request?

It solves a problem where submitting a malformed CreateSubmissionRequest to the 
Spark Dispatcher left the Dispatcher in a bad state and made it inactive as a 
Mesos framework.

## How was this patch tested?

It was tested by sending a malformed request (without appArgs) before and after 
the change. The fix is simple: check whether the value is null before it is accessed.
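
For reference, a hypothetical reproduction in Scala (the dispatcher URL and port 
are assumptions; the JSON field names are those of CreateSubmissionRequest, with 
appArgs deliberately omitted):

```
import java.io.OutputStreamWriter
import java.net.{HttpURLConnection, URL}

// Posts a CreateSubmissionRequest without "appArgs" to the dispatcher's
// REST endpoint. Adjust the URL and port to your deployment.
object MalformedSubmit {
  def main(args: Array[String]): Unit = {
    val body =
      """{
        |  "action" : "CreateSubmissionRequest",
        |  "clientSparkVersion" : "2.2.0",
        |  "appResource" : "hdfs://path/to/app.jar",
        |  "mainClass" : "org.example.Main",
        |  "sparkProperties" : { "spark.app.name" : "repro" }
        |}""".stripMargin
    val conn = new URL("http://localhost:7077/v1/submissions/create")
      .openConnection().asInstanceOf[HttpURLConnection]
    conn.setRequestMethod("POST")
    conn.setRequestProperty("Content-Type", "application/json")
    conn.setDoOutput(true)
    val out = new OutputStreamWriter(conn.getOutputStream)
    out.write(body)
    out.close()
    println(s"HTTP ${conn.getResponseCode}")
  }
}
```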

This was before the change, leaving the dispatcher inactive:

```
Exception in thread "Thread-22" java.lang.NullPointerException
    at org.apache.spark.scheduler.cluster.mesos.MesosClusterScheduler.getDriverCommandValue(MesosClusterScheduler.scala:444)
    at org.apache.spark.scheduler.cluster.mesos.MesosClusterScheduler.buildDriverCommand(MesosClusterScheduler.scala:451)
    at org.apache.spark.scheduler.cluster.mesos.MesosClusterScheduler.org$apache$spark$scheduler$cluster$mesos$MesosClusterScheduler$$createTaskInfo(MesosClusterScheduler.scala:538)
    at org.apache.spark.scheduler.cluster.mesos.MesosClusterScheduler$$anonfun$scheduleTasks$1.apply(MesosClusterScheduler.scala:570)
    at org.apache.spark.scheduler.cluster.mesos.MesosClusterScheduler$$anonfun$scheduleTasks$1.apply(MesosClusterScheduler.scala:555)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
    at org.apache.spark.scheduler.cluster.mesos.MesosClusterScheduler.scheduleTasks(MesosClusterScheduler.scala:555)
    at org.apache.spark.scheduler.cluster.mesos.MesosClusterScheduler.resourceOffers(MesosClusterScheduler.scala:621)
```

And after:

```
  "message" : "Malformed request: 
org.apache.spark.deploy.rest.SubmitRestProtocolException: Validation of message 
CreateSubmissionRequest 
failed!\n\torg.apache.spark.deploy.rest.SubmitRestProtocolMessage.validate(SubmitRestProtocolMessage.scala:70)\n\torg.apache.spark.deploy.rest.SubmitRequestServlet.doPost(RestSubmissionServer.scala:272)\n\tjavax.servlet.http.HttpServlet.service(HttpServlet.java:707)\n\tjavax.servlet.http.HttpServlet.service(HttpServlet.java:790)\n\torg.spark_project.jetty.servlet.ServletHolder.handle(ServletHolder.java:845)\n\torg.spark_project.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:583)\n\torg.spark_project.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)\n\torg.spark_project.jetty.servlet.ServletHandler.doScope(ServletHandler.java:511)\n\torg.spark_project.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)\n\torg.spark_project.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)\n\torg.s
 
park_project.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)\n\torg.spark_project.jetty.server.Server.handle(Server.java:524)\n\torg.spark_project.jetty.server.HttpChannel.handle(HttpChannel.java:319)\n\torg.spark_project.jetty.server.HttpConnection.onFillable(HttpConnection.java:253)\n\torg.spark_project.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:273)\n\torg.spark_project.jetty.io.FillInterest.fillable(FillInterest.java:95)\n\torg.spark_project.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)\n\torg.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)\n\torg.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)\n\torg.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)\n\torg.spark_project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.
 
java:671)\n\torg.spark_project.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)\n\tjava.lang.Thread.run(Thread.java:745)",

```


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/Gschiavon/spark fix-submission-request

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/19793.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #19793


commit 4f28d0011892326c6ec3fd9a4fc6fb48756cd635
Author: German Schiavon <germanschia...@gmail.com>
Date:   2017-11-21T15:32:04Z

Check submission request parameters




---




[GitHub] spark pull request #19205: [SPARK-21982] Set locale to US

2017-09-12 Thread Gschiavon
GitHub user Gschiavon opened a pull request:

https://github.com/apache/spark/pull/19205

[SPARK-21982] Set locale to US

## What changes were proposed in this pull request?

In UtilsSuite the locale was assumed to be US, but at format time it wasn't set, 
so the JVM default locale was used; when that differs from US, this test fails.
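
As a minimal illustration of the underlying issue (not the actual UtilsSuite 
code): "%.1f" formatting is locale-sensitive, so the decimal separator depends 
on the JVM default locale.

```
import java.util.Locale

// The decimal separator follows the locale:
assert("%.1f".formatLocal(Locale.US, 1.5) == "1.5")      // dot
assert("%.1f".formatLocal(Locale.GERMANY, 1.5) == "1,5") // comma
```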

## How was this patch tested?
Unit test (UtilsSuite)

Please review http://spark.apache.org/contributing.html before opening a 
pull request.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/Gschiavon/spark fix/test-locale

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/19205.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #19205


commit 22bbb924eae20b8d3f899008317f5d623c6a49ef
Author: German Schiavon <germanschia...@gmail.com>
Date:   2017-09-12T12:05:03Z

Set locale to US




---




[GitHub] spark pull request #19200: Get default Locale

2017-09-12 Thread Gschiavon
Github user Gschiavon closed the pull request at:

https://github.com/apache/spark/pull/19200


---




[GitHub] spark issue #19200: Get default Locale

2017-09-12 Thread Gschiavon
Github user Gschiavon commented on the issue:

https://github.com/apache/spark/pull/19200
  
Ok, I got it. I will do that then. 
Thanks.


---




[GitHub] spark issue #19200: Get default Locale

2017-09-12 Thread Gschiavon
Github user Gschiavon commented on the issue:

https://github.com/apache/spark/pull/19200
  
As far as I saw, there are other test cases that set the default locale to US. 
This test does not pass when your JVM default locale differs from "US".
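
A hypothetical way to reproduce the failure without changing the system locale 
(not the suite's actual code):

```
import java.util.Locale

// Temporarily force a non-US default locale; "%.1f".format(...) then
// yields "1,5" where the test expects "1.5".
val saved = Locale.getDefault
try {
  Locale.setDefault(Locale.GERMANY)
  assert("%.1f".format(1.5) == "1,5")
} finally {
  Locale.setDefault(saved)
}
```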



---




[GitHub] spark pull request #19200: Set default Locale

2017-09-12 Thread Gschiavon
GitHub user Gschiavon opened a pull request:

https://github.com/apache/spark/pull/19200

Set default Locale

## What changes were proposed in this pull request?

Use the JVM default Locale in UtilsSuite.scala so that it works with locales 
other than US.
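
A sketch of that approach (illustrative; not the exact UtilsSuite change): 
compute the expected string with the same default locale the code under test 
uses, instead of hard-coding US.

```
import java.util.Locale

// The expected value is formatted with the JVM default locale, so the
// assertion holds regardless of which locale the JVM runs under.
val expected = "%.1f".formatLocal(Locale.getDefault, 1.5)
assert("%.1f".format(1.5) == expected)
```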

## How was this patch tested?

Running UtilsSuite.scala 

Please review http://spark.apache.org/contributing.html before opening a 
pull request.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/Gschiavon/spark fix/locale

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/19200.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #19200


commit 632526ba3e9a4d72133202cf0bfcc8a997dc9cb9
Author: German Schiavon <germanschia...@gmail.com>
Date:   2017-09-12T08:33:00Z

Set default Locale




---
