[GitHub] spark pull request #18220: [SPARK-21000][MESOS] Add Mesos labels support to ...

2017-06-11 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/spark/pull/18220


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #18220: [SPARK-21000][MESOS] Add Mesos labels support to ...

2017-06-09 Thread srowen
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/18220#discussion_r121203394
  
--- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosProtoUtils.scala ---
@@ -0,0 +1,50 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.scheduler.cluster.mesos
+
+import scala.collection.JavaConverters._
+
+import org.apache.mesos.Protos
+
+import org.apache.spark.SparkException
+import org.apache.spark.internal.Logging
+
+object MesosProtoUtils extends Logging {
+
+  /** Parses a label string of the format specified in spark.mesos.task.labels. */
+  def mesosLabels(labelsStr: String): Protos.Labels.Builder = {
+    val labels = labelsStr.split("""(?<!\\),""").map { labelStr =>
+      val parts = labelStr.split("""(?<!\\):""")

[GitHub] spark pull request #18220: [SPARK-21000][MESOS] Add Mesos labels support to ...

2017-06-09 Thread mgummelt
Github user mgummelt commented on a diff in the pull request:

https://github.com/apache/spark/pull/18220#discussion_r121194945
  
--- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosProtoUtils.scala ---
@@ -0,0 +1,63 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.scheduler.cluster.mesos
+
+import scala.collection.JavaConverters._
+
+import org.apache.mesos.Protos
+
+import org.apache.spark.SparkException
+import org.apache.spark.internal.Logging
+
+object MesosProtoUtils extends Logging {
+
+  /** Parses a label string of the format specified in spark.mesos.task.labels. */
+  def mesosLabels(labelsStr: String): Protos.Labels.Builder = {
+
+// Return str split around unescaped occurrences of c.
+def splitUnescaped(str: String, c: Char): Seq[String] = {
--- End diff --

That's exactly what I was looking for! Thanks! Fixed.





[GitHub] spark pull request #18220: [SPARK-21000][MESOS] Add Mesos labels support to ...

2017-06-09 Thread srowen
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/18220#discussion_r121084431
  
--- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosProtoUtils.scala ---
@@ -0,0 +1,63 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.scheduler.cluster.mesos
+
+import scala.collection.JavaConverters._
+
+import org.apache.mesos.Protos
+
+import org.apache.spark.SparkException
+import org.apache.spark.internal.Logging
+
+object MesosProtoUtils extends Logging {
+
+  /** Parses a label string of the format specified in spark.mesos.task.labels. */
+  def mesosLabels(labelsStr: String): Protos.Labels.Builder = {
+
+// Return str split around unescaped occurrences of c.
+def splitUnescaped(str: String, c: Char): Seq[String] = {
--- End diff --

I still think you can do this much more directly with regexes, even with 
escapes. It looks a little tricky but a negative lookbehind does the trick. For 
example:

```
scala> """key:value,key2:a\:b,key3:a\,b""".split("""(?<!\\),""")
res0: Array[String] = Array(key:value, key2:a\:b, key3:a\,b)
```
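To spell the suggestion out, here is a hedged sketch (not this PR's actual code; `parseLabels` and its error handling are illustrative) of how a negative-lookbehind split could drive the full key/value parse, including unescaping the separators afterwards:

```scala
// Sketch only: split on unescaped ',' and ':' via negative lookbehind,
// then strip the escape characters from the resulting keys and values.
def parseLabels(labelsStr: String): Seq[(String, String)] = {
  labelsStr.split("""(?<!\\),""").toSeq.filter(_.nonEmpty).map { labelStr =>
    val parts = labelStr.split("""(?<!\\):""")
    require(parts.length == 2, s"Malformed label: $labelStr")
    val Array(key, value) = parts.map(_.replaceAll("""\\(,|:)""", "$1"))
    (key, value)
  }
}
```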

[GitHub] spark pull request #18220: [SPARK-21000][MESOS] Add Mesos labels support to ...

2017-06-08 Thread mgummelt
Github user mgummelt commented on a diff in the pull request:

https://github.com/apache/spark/pull/18220#discussion_r120997560
  
--- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosProtoUtils.scala ---
@@ -0,0 +1,94 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.scheduler.cluster.mesos
+
+import org.apache.mesos.Protos
+
+import org.apache.spark.SparkException
+import org.apache.spark.internal.Logging
+
+object MesosProtoUtils extends Logging {
+
+  /** Parses a label string of the format specified in spark.mesos.task.labels. */
+  def mesosLabels(labelsStr: String): Protos.Labels.Builder = {
+var key: Option[String] = None
+var value: Option[String] = None
+var currStr = ""
+var i = 0
+val labels = Protos.Labels.newBuilder()
+
+// 0 -> parsing key
+// 1 -> parsing value
+var state = 0
+
+def addLabel() = {
+  value = Some(currStr)
+  if (key.isEmpty) {
+throw new SparkException(s"Error while parsing label string: ${labelsStr}.  " +
+  s"Empty label key.")
+  } else {
+val label = Protos.Label.newBuilder().setKey(key.get).setValue(value.get)
+labels.addLabels(label)
+
+key = None
+value = None
+currStr = ""
+state = 0
+  }
+}
+
+while(i < labelsStr.length) {
--- End diff --

this code was removed, so this is fixed.





[GitHub] spark pull request #18220: [SPARK-21000][MESOS] Add Mesos labels support to ...

2017-06-08 Thread mgummelt
Github user mgummelt commented on a diff in the pull request:

https://github.com/apache/spark/pull/18220#discussion_r120997425
  
--- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosProtoUtils.scala ---
@@ -0,0 +1,94 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.scheduler.cluster.mesos
+
+import org.apache.mesos.Protos
+
+import org.apache.spark.SparkException
+import org.apache.spark.internal.Logging
+
+object MesosProtoUtils extends Logging {
+
+  /** Parses a label string of the format specified in spark.mesos.task.labels. */
+  def mesosLabels(labelsStr: String): Protos.Labels.Builder = {
+var key: Option[String] = None
+var value: Option[String] = None
+var currStr = ""
+var i = 0
+val labels = Protos.Labels.newBuilder()
+
+// 0 -> parsing key
+// 1 -> parsing value
+var state = 0
--- End diff --

I simplified it, but I can't do simple regex splitting, because I have to 
condition the match on characters (the escape sequence) that shouldn't 
actually be considered part of the matched string.  So I just wrote a custom 
`splitUnescaped` method to implement what I need.
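For readers following along, a rough sketch of what such a `splitUnescaped` helper could look like — the name matches the comment above, but the body here is illustrative rather than this PR's implementation:

```scala
// Illustrative sketch: split `str` on occurrences of `c` that are not
// preceded by a backslash, dropping the escape character in the output.
def splitUnescaped(str: String, c: Char): Seq[String] = {
  val parts = scala.collection.mutable.Buffer("")
  var escaped = false
  str.foreach { ch =>
    if (escaped) {
      parts(parts.length - 1) += ch  // keep the escaped character; the backslash is consumed
      escaped = false
    } else if (ch == '\\') {
      escaped = true
    } else if (ch == c) {
      parts += ""                    // start a new segment at an unescaped separator
    } else {
      parts(parts.length - 1) += ch
    }
  }
  parts.toSeq
}
```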





[GitHub] spark pull request #18220: [SPARK-21000][MESOS] Add Mesos labels support to ...

2017-06-08 Thread mgummelt
Github user mgummelt commented on a diff in the pull request:

https://github.com/apache/spark/pull/18220#discussion_r120997514
  
--- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosProtoUtils.scala ---
@@ -0,0 +1,94 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.scheduler.cluster.mesos
+
+import org.apache.mesos.Protos
+
+import org.apache.spark.SparkException
+import org.apache.spark.internal.Logging
+
+object MesosProtoUtils extends Logging {
+
+  /** Parses a label string of the format specified in spark.mesos.task.labels. */
+  def mesosLabels(labelsStr: String): Protos.Labels.Builder = {
+var key: Option[String] = None
+var value: Option[String] = None
+var currStr = ""
+var i = 0
+val labels = Protos.Labels.newBuilder()
+
+// 0 -> parsing key
+// 1 -> parsing value
+var state = 0
+
+def addLabel() = {
+  value = Some(currStr)
+  if (key.isEmpty) {
+throw new SparkException(s"Error while parsing label string: ${labelsStr}.  " +
+  s"Empty label key.")
--- End diff --

Fixed





[GitHub] spark pull request #18220: [SPARK-21000][MESOS] Add Mesos labels support to ...

2017-06-08 Thread mgummelt
Github user mgummelt commented on a diff in the pull request:

https://github.com/apache/spark/pull/18220#discussion_r120997036
  
--- Diff: docs/running-on-mesos.md ---
@@ -469,6 +470,15 @@ See the [configuration page](configuration.html) for information on Spark config
   
 
 
+  spark.mesos.driver.labels
+  (none)
+  
+Mesos labels to add to the driver.  See spark.mesos.task.labels
--- End diff --

fixed





[GitHub] spark pull request #18220: [SPARK-21000][MESOS] Add Mesos labels support to ...

2017-06-08 Thread srowen
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/18220#discussion_r120842747
  
--- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosProtoUtils.scala ---
@@ -0,0 +1,94 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.scheduler.cluster.mesos
+
+import org.apache.mesos.Protos
+
+import org.apache.spark.SparkException
+import org.apache.spark.internal.Logging
+
+object MesosProtoUtils extends Logging {
+
+  /** Parses a label string of the format specified in spark.mesos.task.labels. */
+  def mesosLabels(labelsStr: String): Protos.Labels.Builder = {
+var key: Option[String] = None
+var value: Option[String] = None
+var currStr = ""
+var i = 0
+val labels = Protos.Labels.newBuilder()
+
+// 0 -> parsing key
+// 1 -> parsing value
+var state = 0
--- End diff --

This all looks excessively complex. Can't you do this with a regex in a few 
lines?





[GitHub] spark pull request #18220: [SPARK-21000][MESOS] Add Mesos labels support to ...

2017-06-08 Thread srowen
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/18220#discussion_r120842610
  
--- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosProtoUtils.scala ---
@@ -0,0 +1,94 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.scheduler.cluster.mesos
+
+import org.apache.mesos.Protos
+
+import org.apache.spark.SparkException
+import org.apache.spark.internal.Logging
+
+object MesosProtoUtils extends Logging {
+
+  /** Parses a label string of the format specified in spark.mesos.task.labels. */
+  def mesosLabels(labelsStr: String): Protos.Labels.Builder = {
+var key: Option[String] = None
+var value: Option[String] = None
+var currStr = ""
+var i = 0
+val labels = Protos.Labels.newBuilder()
+
+// 0 -> parsing key
+// 1 -> parsing value
+var state = 0
+
+def addLabel() = {
+  value = Some(currStr)
+  if (key.isEmpty) {
+throw new SparkException(s"Error while parsing label string: ${labelsStr}.  " +
+  s"Empty label key.")
+  } else {
+val label = Protos.Label.newBuilder().setKey(key.get).setValue(value.get)
+labels.addLabels(label)
+
+key = None
+value = None
+currStr = ""
+state = 0
+  }
+}
+
+while(i < labelsStr.length) {
--- End diff --

Nit: space after while





[GitHub] spark pull request #18220: [SPARK-21000][MESOS] Add Mesos labels support to ...

2017-06-08 Thread srowen
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/18220#discussion_r120842205
  
--- Diff: docs/running-on-mesos.md ---
@@ -469,6 +470,15 @@ See the [configuration page](configuration.html) for information on Spark config
   
 
 
+  spark.mesos.driver.labels
+  (none)
+  
+Mesos labels to add to the driver.  See spark.mesos.task.labels
--- End diff --

Nit: format props and code with ``





[GitHub] spark pull request #18220: [SPARK-21000][MESOS] Add Mesos labels support to ...

2017-06-08 Thread srowen
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/18220#discussion_r120842136
  
--- Diff: docs/running-on-mesos.md ---
@@ -469,6 +470,15 @@ See the [configuration page](configuration.html) for information on Spark config
   
 
 
+  spark.mesos.driver.labels
--- End diff --

This naming differs a bit from the YARN label property, but I suppose it's 
supporting an expression. And we have .task.labels already. OK.





[GitHub] spark pull request #18220: [SPARK-21000][MESOS] Add Mesos labels support to ...

2017-06-08 Thread srowen
Github user srowen commented on a diff in the pull request:

https://github.com/apache/spark/pull/18220#discussion_r120842358
  
--- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosProtoUtils.scala ---
@@ -0,0 +1,94 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.scheduler.cluster.mesos
+
+import org.apache.mesos.Protos
+
+import org.apache.spark.SparkException
+import org.apache.spark.internal.Logging
+
+object MesosProtoUtils extends Logging {
+
+  /** Parses a label string of the format specified in spark.mesos.task.labels. */
+  def mesosLabels(labelsStr: String): Protos.Labels.Builder = {
+var key: Option[String] = None
+var value: Option[String] = None
+var currStr = ""
+var i = 0
+val labels = Protos.Labels.newBuilder()
+
+// 0 -> parsing key
+// 1 -> parsing value
+var state = 0
+
+def addLabel() = {
+  value = Some(currStr)
+  if (key.isEmpty) {
+throw new SparkException(s"Error while parsing label string: ${labelsStr}.  " +
+  s"Empty label key.")
--- End diff --

Nit, don't need interpolation but whatever





[GitHub] spark pull request #18220: [SPARK-21000][MESOS] Add Mesos labels support to ...

2017-06-07 Thread mgummelt
Github user mgummelt commented on a diff in the pull request:

https://github.com/apache/spark/pull/18220#discussion_r120764076
  
--- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosProtoUtils.scala ---
@@ -0,0 +1,94 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.scheduler.cluster.mesos
+
+import org.apache.mesos.Protos
+
+import org.apache.spark.SparkException
+import org.apache.spark.internal.Logging
+
+object MesosProtoUtils extends Logging {
+
+  /** Parses a label string of the format specified in spark.mesos.task.labels. */
+  def mesosLabels(labelsStr: String): Protos.Labels.Builder = {
+var key: Option[String] = None
+var value: Option[String] = None
+var currStr = ""
+var i = 0
+val labels = Protos.Labels.newBuilder()
+
+// 0 -> parsing key
+// 1 -> parsing value
+var state = 0
+
+def addLabel() = {
+  value = Some(currStr)
+  if (key.isEmpty) {
+throw new SparkException(s"Error while parsing label string: ${labelsStr}.  " +
+  s"Empty label key.")
+  } else {
+val label = Protos.Label.newBuilder().setKey(key.get).setValue(value.get)
+labels.addLabels(label)
+
+key = None
+value = None
+currStr = ""
+state = 0
+  }
+}
+
+while(i < labelsStr.length) {
--- End diff --

Ideally, this would be more functional.  I tried to model it with map/fold, 
but I'm not smart enough.  If someone cares, I can try to rewrite it to be 
recursive, at least.
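
As a side note, a hedged sketch of how a `foldLeft` could express the same split-on-unescaped-separator idea (`splitUnescapedFold` is an illustrative name, not code from this PR):

```scala
// Fold over the characters, tracking whether the previous char was an escape.
// Produces the segments of `str` split on unescaped occurrences of `sep`.
def splitUnescapedFold(str: String, sep: Char): Seq[String] = {
  val (segments, last, _) =
    str.foldLeft((Vector.empty[String], "", false)) {
      case ((done, cur, true), ch)               => (done, cur + ch, false) // escaped char kept
      case ((done, cur, false), '\\')            => (done, cur, true)       // start an escape
      case ((done, cur, false), ch) if ch == sep => (done :+ cur, "", false) // unescaped separator
      case ((done, cur, false), ch)              => (done, cur + ch, false)
    }
  segments :+ last
}
```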





[GitHub] spark pull request #18220: [SPARK-21000][MESOS] Add Mesos labels support to ...

2017-06-06 Thread mgummelt
GitHub user mgummelt opened a pull request:

https://github.com/apache/spark/pull/18220

[SPARK-21000][MESOS] Add Mesos labels support to the Spark Dispatcher

## What changes were proposed in this pull request?

Add Mesos labels support to the Spark Dispatcher
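
For context, a hypothetical example of how these label settings would be supplied as ordinary Spark configuration (the property names are the ones documented in this PR; the label values are made up):

```scala
import org.apache.spark.SparkConf

// Labels are key:value pairs separated by commas; ',' and ':' inside a
// key or value are escaped with a backslash.
val conf = new SparkConf()
  .setAppName("labels-example")
  .set("spark.mesos.task.labels", """team:data,env:prod,note:a\,b""")
  .set("spark.mesos.driver.labels", "owner:data-platform")
```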

## How was this patch tested?

unit tests


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/mesosphere/spark SPARK-21000-dispatcher-labels

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/18220.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #18220


commit ee10af6705dfc607718b4d709bb27e57b077ae00
Author: Michael Gummelt 
Date:   2017-06-06T01:45:10Z

Add Mesos labels support to the Spark Dispatcher



