You can create an RDD[String] using whatever method and pass that to
jsonRDD.
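
Concretely, the two steps Jerry already has can be combined. Below is a minimal sketch against the Spark 1.x API discussed in this thread; it assumes a spark-shell style session where `sc` and `sqlContext` already exist, that the hadoop-lzo library (which provides `com.hadoop.mapreduce.LzoTextInputFormat`) is on the classpath, and that the input path and table name are hypothetical:

```scala
import org.apache.hadoop.io.{LongWritable, Text}
import com.hadoop.mapreduce.LzoTextInputFormat

// Read the LZO-compressed files as (offset, line) pairs via the
// new Hadoop API, keeping only the line text as a String.
val lines = sc.newAPIHadoopFile(
  "hdfs:///path/to/json-lzo",   // hypothetical input path
  classOf[LzoTextInputFormat],
  classOf[LongWritable],
  classOf[Text]
).map { case (_, text) => text.toString }

// jsonRDD infers the schema from the JSON lines, so no input-format
// configuration on the SQL side is needed.
val json = sqlContext.jsonRDD(lines)
json.registerTempTable("events")  // hypothetical table name
```

The decompression happens entirely on the Hadoop InputFormat side; by the time `jsonRDD` sees the data it is just an `RDD[String]` of JSON lines.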

On Wed, Dec 17, 2014 at 8:33 AM, Jerry Lam <chiling...@gmail.com> wrote:
>
> Hi Ted,
>
> Thanks for your help.
> I'm able to read lzo files using sparkContext.newAPIHadoopFile but I
> couldn't do the same for sqlContext because sqlContext.jsonFile does not
> provide ways to configure the input file format. Do you know if there are
> some APIs to do that?
>
> Best Regards,
>
> Jerry
>
> On Wed, Dec 17, 2014 at 11:27 AM, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>> See this thread: http://search-hadoop.com/m/JW1q5HAuFv
>> which references https://issues.apache.org/jira/browse/SPARK-2394
>>
>> Cheers
>>
>> On Wed, Dec 17, 2014 at 8:21 AM, Jerry Lam <chiling...@gmail.com> wrote:
>>>
>>> Hi spark users,
>>>
>>> Do you know how to read json files using Spark SQL that are LZO
>>> compressed?
>>>
>>> I'm looking into sqlContext.jsonFile but I don't know how to configure
>>> it to read lzo files.
>>>
>>> Best Regards,
>>>
>>> Jerry
>>>
>>
