>>>>>> Michael Armbrust <mich...@databricks.com> wrote:
>>>>>>
>>>>>>> You don't need the Seq; `in` is a variadic function.
>>>>>>>
>>>>>>> personTable.where('name in ("foo", "bar"))
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Thu, Aug 28, 2014 at 3:09 AM, Jaonary Rabarisoa <
>>>>>>> jaon...@gmail.com> wrote:
>>>>>>>
>>>>>>>> Hi all,
>>>>>>>>
>>>>>>>> What is the expression I should use with the Spark SQL DSL if I
>>>>>>>> need to retrieve data whose field value is in a given set.
>>>>>>>> For example:
>>>>>>>>
>>>>>>>> I have the following schema
>>>>>>>>
>>>>>>>> case class Person(name: String, age: Int)
>>>>>>>>
>>>>>>>> And I need to do something like :
>>>>>>>>
>>>>>>>> personTable.where('name in Seq("foo", "bar")) ?
>>>>>>>>
>>>>>>>>
>>>>>>>> Cheers.
>>>>>>>>
>>>>>>>>
>>>>>>>> Jaonary
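Michael's point, that `in` is variadic so values are passed directly rather than wrapped in a `Seq(...)`, can be illustrated without a cluster. Below is a minimal plain-Scala sketch (no Spark dependency); `nameIn` is a hypothetical helper mimicking the DSL's behavior, not Spark's actual `in`:

```scala
case class Person(name: String, age: Int)

object InFilterSketch {
  // Variadic filter, analogous to the DSL's `'name in ("foo", "bar")`:
  // the caller passes the values directly, no Seq(...) wrapper needed.
  def nameIn(values: String*): Person => Boolean =
    p => values.contains(p.name)

  def main(args: Array[String]): Unit = {
    val people = List(Person("foo", 30), Person("bar", 25), Person("baz", 40))
    val matched = people.filter(nameIn("foo", "bar"))
    println(matched.map(_.name).mkString(","))  // foo,bar
  }
}
```

Because `values: String*` is a varargs parameter, the call site reads like the DSL version; a `Seq` can still be passed explicitly with `nameIn(mySeq: _*)` if one already exists.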
--
Warm Regards
Abhinav Chowdary
hdfs 1.0.4, but we primarily use Cassandra + Spark (Calliope). I tested it
with both.

> Are you using it with HDFS? What version of Hadoop? 1.0.4?
> Ognen
>
> On 3/10/14, 8:49 PM, abhinav chowdary wrote:
>> for anyone who is interested to know about job server from Ooyala.. we
>> started using it recently and
> +1 (760) 203 3257
> http://www.sigmoidanalytics.com
> @mayur_rustagi <https://twitter.com/mayur_rustagi>
>
>
>
> On Tue, Feb 25, 2014 at 10:24 AM, Ognen Duzlevski <
> og...@nengoiksvelzud.com> wrote:
>
>> Doesn't the fair scheduler solve this?
>>
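For context on Ognen's question: the fair scheduler he refers to is switched on through Spark's scheduler-mode property (the default is FIFO). A sketch of the standard setting, assuming a Spark version that reads `spark-defaults.conf` (earlier versions set the same key on the `SparkConf` instead):

```
# conf/spark-defaults.conf -- change job scheduling within one application
# from the default FIFO to fair sharing across concurrent jobs
spark.scheduler.mode  FAIR
```

Note this governs sharing between jobs submitted to a single SparkContext, which is exactly why the thread keeps coming back to sharing the context itself.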
+1. We have been using Calliope for a few months and it's working out
really well for us. Any plans on integrating it into Spark?
On Mar 10, 2014 1:58 PM, "Rohit Rai" wrote:
> We are happy that you found Calliope useful and glad we could help.
>
> *Founder & CEO, Tuplejump, Inc.*
> On Tue, Feb 25, 2014 at 10:24 AM, Ognen Duzlevski <og...@nengoiksvelzud.com> wrote:
>
>> Doesn't the fair scheduler solve this?
>> Ognen
>>
>>
>> On 2/25/14, 12:08 PM, abhinav chowdary wrote:
>>
>> Sorry for not being clear earlier
>> how do you want to pass the operations to the spark context?
>
>
> Mayur Rustagi
> Ph: +919632149971
> http://www.sigmoidanalytics.com
> https://twitter.com/mayur_rustagi
>
>
>
> On Tue, Feb 25, 2014 at 9:59 AM, abhinav chowdary <
> abhinav.chowd...@gmail.com> wrote:
>
Hi,

I am looking for ways to share the SparkContext, meaning I need to
be able to perform multiple operations on the same Spark context.
Below is the code of a simple app I am testing:

def main(args: Array[String]) {
  println("Welcome to example application!")
  val sc = new SparkContext
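The sharing pattern being asked about is usually achieved by creating the expensive context once, in a singleton, and routing all subsequent operations through it (this is also what a job server does behind an RPC layer). A minimal sketch of the idea using a plain stand-in class, since Spark itself is not on the classpath here; `Context`, `SharedContext`, and `submit` are hypothetical names, and real code would hold an `org.apache.spark.SparkContext`:

```scala
// Stand-in for an expensive, create-once resource like SparkContext.
class Context {
  var opsRun: Int = 0
  def run(op: String): String = { opsRun += 1; s"ran $op" }
}

// Hold a single instance and reuse it across operations,
// instead of constructing a new context per request.
object SharedContext {
  lazy val ctx = new Context  // created once, on first use

  def submit(op: String): String = ctx.run(op)
}

object SharedContextDemo {
  def main(args: Array[String]): Unit = {
    SharedContext.submit("count")
    SharedContext.submit("filter")
    // Both operations hit the same underlying context instance.
    println(SharedContext.ctx.opsRun)
  }
}
```

The `lazy val` gives thread-safe once-only initialization; callers never construct a `Context` themselves, so every operation shares the single instance.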