bq. I followed something similar $"a.x"

Please use expr("...").
For example, if your Dataset has two columns, you can write:
  ds.select(expr("_2 / _1").as[Int])

where _1 refers to the first column and _2 to the second.
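
To make this concrete, here is a minimal, self-contained sketch combining both suggestions from this thread: expr(...) with a typed select, and the $"..." column syntax with joinWith. Note this uses the Spark 2.x SparkSession entry point (the thread dates from the 1.6 era, where you would use SQLContext instead), and the object/app names are just placeholders:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.expr

object ExprAndDollarDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[*]").appName("demo").getOrCreate()
    import spark.implicits._ // enables toDS() and the $"..." column syntax

    // Tuples become columns named _1 and _2.
    val ds = Seq((1, 10), (2, 20)).toDS()

    // expr() parses a SQL expression into a Column; .as[T] turns it into a
    // TypedColumn, which Dataset.select requires. Note that "/" on integers
    // yields a double in Spark SQL, hence .as[Double] here.
    val ratios = ds.select(expr("_2 / _1").as[Double])
    ratios.show()

    // $"..." is an implicit that builds a column reference from a string.
    // Combined with .as("alias"), it lets a joinWith condition name each
    // side's columns, as in the DatasetSuite snippet quoted below.
    val a = Seq(1, 2, 3).toDS().as("a")
    val b = Seq(1, 2).toDS().as("b")
    val joined = a.joinWith(b, $"a.value" === $"b.value", "inner")
    joined.show()

    spark.stop()
  }
}
```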

On Tue, Feb 9, 2016 at 3:31 PM, Raghava Mutharaju <m.vijayaragh...@gmail.com
> wrote:

> Ted,
>
> Thank you for the pointer. That works, but what does a string prefixed
> with a $ sign mean? Is it an expression?
>
> Could you also help me with the select() parameter syntax? I tried
> something similar, $"a.x", but it gives an error saying a TypedColumn
> is expected.
>
> Regards,
> Raghava.
>
>
> On Tue, Feb 9, 2016 at 10:12 AM, Ted Yu <yuzhih...@gmail.com> wrote:
>
>> Please take a look at:
>> sql/core/src/test/scala/org/apache/spark/sql/DatasetSuite.scala
>>
>>     val ds1 = Seq(1, 2, 3).toDS().as("a")
>>     val ds2 = Seq(1, 2).toDS().as("b")
>>
>>     checkAnswer(
>>       ds1.joinWith(ds2, $"a.value" === $"b.value", "inner"),
>>
>> On Tue, Feb 9, 2016 at 7:07 AM, Raghava Mutharaju <
>> m.vijayaragh...@gmail.com> wrote:
>>
>>> Hello All,
>>>
>>> joinWith() method in Dataset takes a condition of type Column. Without
>>> converting a Dataset to a DataFrame, how can we get a specific column?
>>>
>>> For example: case class Pair(x: Long, y: Long)
>>>
>>> A and B are Datasets of type Pair, and I want to join A.x with B.y:
>>>
>>> A.joinWith(B, A.toDF().col("x") === B.toDF().col("y"))
>>>
>>> Is there a way to avoid using toDF()?
>>>
>>> I am having a similar issue with filter(A.x == B.y)
>>>
>>> --
>>> Regards,
>>> Raghava
>>>
>>
>>
>
>
> --
> Regards,
> Raghava
> http://raghavam.github.io
>
