Please find the code attached below.


On Wed, Oct 7, 2015 at 7:36 PM, Ted Yu <yuzhih...@gmail.com> wrote:

> Hemant:
> Can you post the code snippet to the mailing list - other people would be
> interested.
>
> On Wed, Oct 7, 2015 at 5:50 AM, Hemant Bhanawat <hemant9...@gmail.com>
> wrote:
>
>> I will send you the code to your email id.
>>
>> On Wed, Oct 7, 2015 at 4:37 PM, Ophir Cohen <oph...@gmail.com> wrote:
>>
>>> Thanks!
>>> Can you check if you can provide an example of the conversion?
>>>
>>>
>>> On Wed, Oct 7, 2015 at 2:05 PM, Hemant Bhanawat <hemant9...@gmail.com>
>>> wrote:
>>>
>>>> Oh, this is an internal class of our project; I had used it without
>>>> realizing where it came from.
>>>>
>>>> Anyway, the idea is to wrap the InternalRow in a class that derives
>>>> from Row. When you implement the functions of the trait 'Row', the type
>>>> conversions from InternalRow types to Row types have to be done for each
>>>> of the types. But, as far as I can see, the primitive types (apart from
>>>> String) don't need conversions. Map and Array would need some handling.
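>>>>
>>>> For instance, an Int can be read straight through, while a String has to
>>>> be converted from the internal UTF8String form (a rough sketch of the
>>>> idea, not the exact code):
>>>>
>>>>   override def getInt(ordinal: Int): Int = internalRow.getInt(ordinal)
>>>>   override def getString(ordinal: Int): String =
>>>>     internalRow.getUTF8String(ordinal).toString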
>>>>
>>>> I will check with the author of this code; I think it can be
>>>> contributed to Spark.
>>>>
>>>> Hemant
>>>> www.snappydata.io
>>>> linkedin.com/company/snappydata
>>>>
>>>> On Wed, Oct 7, 2015 at 3:30 PM, Ophir Cohen <oph...@gmail.com> wrote:
>>>>
>>>>> Which jar does WrappedInternalRow come from?
>>>>> I can't seem to find it.
>>>>>
>>>>> BTW,
>>>>> what I'm trying to do now is to create a Scala array from the fields
>>>>> and then create a Row out of that array.
>>>>> The problem is that I get type mismatches...
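>>>>>
>>>>> Roughly what I'm trying (a sketch, not my exact code; mutableRow and
>>>>> schema are from my own code):
>>>>>
>>>>>   val values = (0 until mutableRow.numFields)
>>>>>     .map(i => mutableRow.get(i, schema(i).dataType)).toArray
>>>>>   val row = Row.fromSeq(values)  // mismatches: internal values such as
>>>>>                                  // UTF8String aren't the external types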
>>>>>
>>>>> On Wed, Oct 7, 2015 at 8:03 AM, Hemant Bhanawat <hemant9...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> One approach is to wrap your MutableRow in WrappedInternalRow,
>>>>>> which is a subclass of Row.
>>>>>>
>>>>>> Hemant
>>>>>> www.snappydata.io
>>>>>> linkedin.com/company/snappydata
>>>>>>
>>>>>>
>>>>>> On Tue, Oct 6, 2015 at 3:21 PM, Ophir Cohen <oph...@gmail.com> wrote:
>>>>>>
>>>>>>> Hi Guys,
>>>>>>> I'm upgrading to Spark 1.5.
>>>>>>>
>>>>>>> In our previous version (Spark 1.3, but it was OK on 1.4 as well) we
>>>>>>> created a GenericMutableRow
>>>>>>> (org.apache.spark.sql.catalyst.expressions.GenericMutableRow) and
>>>>>>> returned it as org.apache.spark.sql.Row.
>>>>>>>
>>>>>>> Starting from Spark 1.5, GenericMutableRow no longer extends Row.
>>>>>>>
>>>>>>> What do you suggest we do?
>>>>>>> How can I convert GenericMutableRow to Row?
>>>>>>>
>>>>>>> A prompt answer would be highly appreciated!
>>>>>>> Thanks,
>>>>>>> Ophir
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.spark.sql.collection

import org.apache.spark.sql.Row
import org.apache.spark.sql.catalyst.{CatalystTypeConverters, InternalRow}
import org.apache.spark.sql.types.StructType
import org.apache.spark.unsafe.types.UTF8String

/**
 * Wraps an `InternalRow` to expose it through the public `Row` API.
 * Values are converted lazily on access and cached per field; the cache
 * is invalidated whenever a new `InternalRow` is assigned.
 */
final class WrappedInternalRow(override val schema: StructType,
    val converters: Array[(InternalRow, Int) => Any]) extends Row {

  // The currently wrapped row; reassignable so a single wrapper instance
  // can be reused across many internal rows.
  private var _internalRow: InternalRow = _
  // Lazily filled cache of converted (external) values, one slot per field.
  private val cache = new Array[Any](schema.length)

  private[sql] def internalRow: InternalRow = _internalRow

  private[sql] def internalRow_=(row: InternalRow): Unit = {
    _internalRow = row
    // Invalidate any values cached from the previously assigned row.
    java.util.Arrays.fill(cache.asInstanceOf[Array[AnyRef]], null)
  }

  override def length: Int = schema.length

  override def isNullAt(ordinal: Int): Boolean = _internalRow.isNullAt(ordinal)

  // The primitive types share the same internal and external representation,
  // so these accessors delegate directly without any conversion.
  override def getBoolean(ordinal: Int): Boolean = _internalRow.getBoolean(ordinal)

  override def getByte(ordinal: Int): Byte = _internalRow.getByte(ordinal)

  override def getShort(ordinal: Int): Short = _internalRow.getShort(ordinal)

  override def getInt(ordinal: Int): Int = _internalRow.getInt(ordinal)

  override def getLong(ordinal: Int): Long = _internalRow.getLong(ordinal)

  override def getFloat(ordinal: Int): Float = _internalRow.getFloat(ordinal)

  override def getDouble(ordinal: Int): Double = _internalRow.getDouble(ordinal)

  override def getString(ordinal: Int): String = {
    // Strings are stored internally as UTF8String; convert once and cache.
    val v = cache(ordinal)
    if (v == null) {
      val s = _internalRow.getUTF8String(ordinal).toString
      cache(ordinal) = s
      s
    } else {
      v.asInstanceOf[String]
    }
  }

  // Not part of the Row API: exposes the internal UTF8String form directly.
  // This bypasses the cache, which holds external (converted) values only,
  // so mixing getString/get with getUTF8String stays type-safe.
  def getUTF8String(ordinal: Int): UTF8String = _internalRow.getUTF8String(ordinal)

  override def get(ordinal: Int): Any = {
    // Convert through the per-field converter on first access, then cache.
    val v = cache(ordinal)
    if (v == null) {
      val s = converters(ordinal)(_internalRow, ordinal)
      cache(ordinal) = s
      s
    } else {
      v
    }
  }

  override def copy(): Row = {
    // Take a snapshot of the underlying InternalRow, since Spark may reuse
    // the same InternalRow instance across iterator steps.
    val row = new WrappedInternalRow(schema, converters)
    row._internalRow = if (_internalRow ne null) _internalRow.copy() else null
    row
  }
}

object WrappedInternalRow {

  /**
   * Builds one converter per field that reads the internal (Catalyst) value
   * at an ordinal and converts it to its external Scala representation.
   */
  def createConverters(schema: StructType): Array[(InternalRow, Int) => Any] =
    schema.fields.map { f =>
      val toScala = CatalystTypeConverters.createToScalaConverter(f.dataType)
      (row: InternalRow, ordinal: Int) =>
        if (row.isNullAt(ordinal)) null else toScala(row.get(ordinal, f.dataType))
    }
}
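
A minimal usage sketch (untested; assumes a Spark 1.5 classpath, and must
live under an org.apache.spark.sql package because the internalRow setter is
private[sql]; the schema and sample values here are made up):

package org.apache.spark.sql.collection

import org.apache.spark.sql.Row
import org.apache.spark.sql.catalyst.expressions.GenericMutableRow
import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}
import org.apache.spark.unsafe.types.UTF8String

object WrappedInternalRowExample {

  def main(args: Array[String]): Unit = {
    val schema = StructType(Seq(
      StructField("id", IntegerType),
      StructField("name", StringType)))

    // Build the per-field converters once; the wrapper can then be reused
    // across many internal rows.
    val wrapped = new WrappedInternalRow(schema,
      WrappedInternalRow.createConverters(schema))

    val mutable = new GenericMutableRow(2)
    mutable.setInt(0, 1)
    mutable.update(1, UTF8String.fromString("spark"))  // internal string form

    wrapped.internalRow = mutable  // assignment clears the per-field cache
    val row: Row = wrapped
    println(row.getInt(0) + ", " + row.getString(1))  // prints: 1, spark
  }
}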