[GitHub] spark pull request #14344: [SPARK-16706][SQL] support java map in encoder

2016-07-26 Thread asfgit
Github user asfgit closed the pull request at:

https://github.com/apache/spark/pull/14344


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #14344: [SPARK-16706][SQL] support java map in encoder

2016-07-25 Thread rxin
Github user rxin commented on a diff in the pull request:

https://github.com/apache/spark/pull/14344#discussion_r72100760
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/objects/objects.scala ---
@@ -501,6 +501,143 @@ case class MapObjects private(
   }
 }
 
+object ExternalMapToCatalyst {
+  private val curId = new java.util.concurrent.atomic.AtomicInteger()
+
+  def apply(
+      inputMap: Expression,
+      keyType: DataType,
+      keyConverter: Expression => Expression,
+      valueType: DataType,
+      valueConverter: Expression => Expression): ExternalMapToCatalyst = {
+    val id = curId.getAndIncrement()
+    val keyName = "ExternalMapToCatalyst_key" + id
+    val valueName = "ExternalMapToCatalyst_value" + id
+    val valueIsNull = "ExternalMapToCatalyst_value_isNull" + id
+
+    ExternalMapToCatalyst(
+      keyName,
+      keyType,
+      keyConverter(LambdaVariable(keyName, "false", keyType)),
+      valueName,
+      valueIsNull,
+      valueType,
+      valueConverter(LambdaVariable(valueName, valueIsNull, valueType)),
+      inputMap
+    )
+  }
+}
+
+case class ExternalMapToCatalyst private(
--- End diff --

need documentation to explain what this does





[GitHub] spark pull request #14344: [SPARK-16706][SQL] support java map in encoder

2016-07-25 Thread liancheng
Github user liancheng commented on a diff in the pull request:

https://github.com/apache/spark/pull/14344#discussion_r72045939
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/objects/objects.scala ---
@@ -501,6 +501,143 @@ case class MapObjects private(
   }
 }
 
+object ExternalMapToCatalyst {
+  private val curId = new java.util.concurrent.atomic.AtomicInteger()
+
+  def apply(
+      inputMap: Expression,
+      keyType: DataType,
+      keyConverter: Expression => Expression,
+      valueType: DataType,
+      valueConverter: Expression => Expression): ExternalMapToCatalyst = {
+    val id = curId.getAndIncrement()
+    val keyName = "ExternalMapToCatalyst_key" + id
+    val valueName = "ExternalMapToCatalyst_value" + id
+    val valueIsNull = "ExternalMapToCatalyst_value_isNull" + id
+
+    ExternalMapToCatalyst(
+      keyName,
+      keyType,
+      keyConverter(LambdaVariable(keyName, "false", keyType)),
+      valueName,
+      valueIsNull,
+      valueType,
+      valueConverter(LambdaVariable(valueName, valueIsNull, valueType)),
+      inputMap
+    )
+  }
+}
+
+case class ExternalMapToCatalyst private(
+    key: String,
+    keyType: DataType,
+    keyConverter: Expression,
+    value: String,
+    valueIsNull: String,
+    valueType: DataType,
+    valueConverter: Expression,
+    child: Expression)
+  extends UnaryExpression with NonSQLExpression {
+
+  override def foldable: Boolean = false
+
+  override def dataType: MapType = MapType(keyConverter.dataType, valueConverter.dataType)
+
+  override def eval(input: InternalRow): Any =
+    throw new UnsupportedOperationException("Only code-generated evaluation is supported")
+
+  override protected def doGenCode(ctx: CodegenContext, ev: ExprCode): ExprCode = {
+    val inputMap = child.genCode(ctx)
+    val genKeyConverter = keyConverter.genCode(ctx)
+    val genValueConverter = valueConverter.genCode(ctx)
+    val length = ctx.freshName("length")
+    val index = ctx.freshName("index")
+    val convertedKeys = ctx.freshName("convertedKeys")
+    val convertedValues = ctx.freshName("convertedValues")
+    val entry = ctx.freshName("entry")
+    val entries = ctx.freshName("entries")
+
+    val (defineEntries, defineKeyValue) = child.dataType match {
+      case ObjectType(cls) if classOf[java.util.Map[_, _]].isAssignableFrom(cls) =>
+        val javaIteratorCls = classOf[java.util.Iterator[_]].getName
+        val javaMapEntryCls = classOf[java.util.Map.Entry[_, _]].getName
+
+        val defineEntries =
+          s"final $javaIteratorCls $entries = ${inputMap.value}.entrySet().iterator();"
+
+        val defineKeyValue =
+          s"""
+            final $javaMapEntryCls $entry = ($javaMapEntryCls) $entries.next();
+            ${ctx.javaType(keyType)} $key = (${ctx.boxedType(keyType)}) $entry.getKey();
+            ${ctx.javaType(valueType)} $value = (${ctx.boxedType(valueType)}) $entry.getValue();
+          """
+
+        defineEntries -> defineKeyValue
+
+      case ObjectType(cls) if classOf[scala.collection.Map[_, _]].isAssignableFrom(cls) =>
+        val scalaIteratorCls = classOf[Iterator[_]].getName
+        val scalaMapEntryCls = classOf[Tuple2[_, _]].getName
+
+        val defineEntries = s"final $scalaIteratorCls $entries = ${inputMap.value}.iterator();"
+
+        val defineKeyValue =
+          s"""
+            final $scalaMapEntryCls $entry = ($scalaMapEntryCls) $entries.next();
+            ${ctx.javaType(keyType)} $key = (${ctx.boxedType(keyType)}) $entry._1();
+            ${ctx.javaType(valueType)} $value = (${ctx.boxedType(valueType)}) $entry._2();
+          """
+
+        defineEntries -> defineKeyValue
+    }
+
+    val valueNullCheck = if (ctx.isPrimitiveType(valueType)) {
+      s"boolean $valueIsNull = false;"
+    } else {
+      s"boolean $valueIsNull = $value == null;"
+    }
+
+    val arrayCls = classOf[GenericArrayData].getName
+    val mapCls = classOf[ArrayBasedMapData].getName
+    val convertedKeyType = ctx.boxedType(keyConverter.dataType)
+    val convertedValueType = ctx.boxedType(valueConverter.dataType)
+    val code =
+      s"""
+        ${inputMap.code}
+        ${ctx.javaType(dataType)} ${ev.value} = ${ctx.defaultValue(dataType)};
+        if (!${inputMap.isNull}) {
+          final int $length = ${inputMap.value}.size();
+          final Object[] $convertedKeys = new Object[$length];
+          final Object[] $convertedValues = new Object[$length];
+          int $index = 0;
+          $defineE
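For readers skimming the digest, the loop that `doGenCode` splices together behaves roughly like the plain-Java sketch below. This is an illustrative paraphrase, not Spark code: all names (`convert`, `MapToArrays`) are hypothetical, the per-element converter expressions are elided, and the real codegen emits fresh variable names per expression instance.

```java
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;

public class MapToArrays {
    /**
     * Walk the map's entrySet once and collect keys and values into two
     * parallel Object[] arrays, tracking null values the way the generated
     * valueIsNull flag does.
     */
    public static Object[][] convert(Map<Integer, String> inputMap) {
        final int length = inputMap.size();
        final Object[] convertedKeys = new Object[length];
        final Object[] convertedValues = new Object[length];
        int index = 0;
        final Iterator<Map.Entry<Integer, String>> entries = inputMap.entrySet().iterator();
        while (entries.hasNext()) {
            final Map.Entry<Integer, String> entry = entries.next();
            final Integer key = entry.getKey();        // a keyConverter expression would run here
            final String value = entry.getValue();     // likewise a valueConverter expression
            final boolean valueIsNull = value == null; // the generated null check
            convertedKeys[index] = key;
            convertedValues[index] = valueIsNull ? null : value;
            index++;
        }
        return new Object[][]{convertedKeys, convertedValues};
    }

    public static void main(String[] args) {
        Map<Integer, String> m = new LinkedHashMap<>();
        m.put(1, "a");
        m.put(2, null);
        Object[][] result = convert(m);
        System.out.println(java.util.Arrays.toString(result[0])); // [1, 2]
        System.out.println(java.util.Arrays.toString(result[1])); // [a, null]
    }
}
```

The two arrays correspond to the `convertedKeys`/`convertedValues` buffers that the generated code ultimately wraps in `GenericArrayData` and `ArrayBasedMapData`.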

[GitHub] spark pull request #14344: [SPARK-16706][SQL] support java map in encoder

2016-07-25 Thread liancheng
Github user liancheng commented on a diff in the pull request:

https://github.com/apache/spark/pull/14344#discussion_r72036250
  
--- Diff: sql/core/src/test/java/test/org/apache/spark/sql/JavaDatasetSuite.java ---
@@ -673,21 +694,24 @@ public void testJavaBeanEncoder() {
   new byte[]{1, 2},
   new String[]{"hello", null},
   Arrays.asList("a", "b"),
-  Arrays.asList(100L, null, 200L)});
+  Arrays.asList(100L, null, 200L),
+  map1});
 Row row2 = new GenericRow(new Object[]{
   false,
   30,
   new byte[]{3, 4},
   new String[]{null, "world"},
   Arrays.asList("x", "y"),
-  Arrays.asList(300L, null, 400L)});
+  Arrays.asList(300L, null, 400L),
+  map2});
 StructType schema = new StructType()
   .add("a", BooleanType, false)
   .add("b", IntegerType, false)
   .add("c", BinaryType)
   .add("d", createArrayType(StringType))
   .add("e", createArrayType(StringType))
-  .add("f", createArrayType(LongType));
+  .add("f", createArrayType(LongType))
+  .add("g", createMapType(IntegerType, StringType));
--- End diff --

Maybe more test cases for nested key/value types?
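One possible shape for such cases, sketched as plain-Java test fixtures (class and method names are hypothetical; the Spark schema and encoder calls that an actual `JavaDatasetSuite` case would need are omitted):

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class NestedMapFixtures {
    // A map whose values are collections, exercising nested value conversion
    // (roughly createMapType(IntegerType, createArrayType(LongType))).
    public static Map<Integer, List<Long>> mapOfLists() {
        Map<Integer, List<Long>> m = new HashMap<>();
        m.put(1, Arrays.asList(100L, null, 200L)); // null element inside the value
        m.put(2, null);                            // null value at the outer level
        return m;
    }

    // A map nested inside another map, exercising recursive key/value converters.
    public static Map<String, Map<Integer, String>> mapOfMaps() {
        Map<Integer, String> inner = new HashMap<>();
        inner.put(1, "a");
        inner.put(2, null);
        Map<String, Map<Integer, String>> outer = new HashMap<>();
        outer.put("inner", inner);
        return outer;
    }
}
```

Fixtures like these would stress both the null-handling path (`valueIsNull`) and the recursive converter plumbing in `ExternalMapToCatalyst`.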





[GitHub] spark pull request #14344: [SPARK-16706][SQL] support java map in encoder

2016-07-25 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/14344#discussion_r72035859
  

[GitHub] spark pull request #14344: [SPARK-16706][SQL] support java map in encoder

2016-07-25 Thread liancheng
Github user liancheng commented on a diff in the pull request:

https://github.com/apache/spark/pull/14344#discussion_r72035518
  

[GitHub] spark pull request #14344: [SPARK-16706][SQL] support java map in encoder

2016-07-25 Thread liancheng
Github user liancheng commented on a diff in the pull request:

https://github.com/apache/spark/pull/14344#discussion_r72035534
  

[GitHub] spark pull request #14344: [SPARK-16706][SQL] support java map in encoder

2016-07-25 Thread cloud-fan
GitHub user cloud-fan opened a pull request:

https://github.com/apache/spark/pull/14344

[SPARK-16706][SQL] support java map in encoder

## What changes were proposed in this pull request?

Finishes the TODO: adds a new expression, `ExternalMapToCatalyst`, that iterates the map directly.

## How was this patch tested?

A new test in `JavaDatasetSuite`.
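The new test revolves around a JavaBean with a map-typed property, roughly of the shape below. This is a sketch, not the actual suite code; the class and property names are made up, and only the getter/setter convention that `Encoders.bean` relies on is shown:

```java
import java.util.HashMap;
import java.util.Map;

public class SimpleMapBean {
    private Map<Integer, String> g;

    public Map<Integer, String> getG() { return g; }
    public void setG(Map<Integer, String> g) { this.g = g; }

    public static void main(String[] args) {
        SimpleMapBean bean = new SimpleMapBean();
        Map<Integer, String> map1 = new HashMap<>();
        map1.put(1, "a");
        bean.setG(map1);
        // With this PR, Encoders.bean(SimpleMapBean.class) should be able to
        // encode the map field directly instead of hitting the old TODO.
        System.out.println(bean.getG());
    }
}
```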


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/cloud-fan/spark java-map

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/14344.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #14344


commit f23549314cd2558fda0f859418ef36e11d9fe9f9
Author: Wenchen Fan 
Date:   2016-07-25T08:50:08Z

support java map in encoder



