Maybe some SPARQL features in Shark, then?
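
A minimal sketch of what that could look like, assuming the triples are
available as plain (subject, predicate, object) strings and using the Spark 1.x
SQL API; the file name, the Triple case class and the query are made up for
illustration, and the exact method names depend on the Spark version:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

case class Triple(s: String, p: String, o: String)

object SparqlLikeDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("sparql-like"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.createSchemaRDD  // implicit RDD[Triple] -> SchemaRDD

    // naive N-Triples parsing: "<s> <p> <o> ."
    val triples = sc.textFile("triples.nt")
      .map(_.trim.stripSuffix(".").trim.split("\\s+", 3))
      .map(a => Triple(a(0), a(1), a(2)))

    triples.registerTempTable("triples")  // registerAsTable on the earliest 1.0.x

    // roughly the SPARQL pattern: SELECT ?s WHERE { ?s rdf:type foaf:Person }
    val persons = sqlContext.sql(
      """SELECT s FROM triples
        | WHERE p = '<http://www.w3.org/1999/02/22-rdf-syntax-ns#type>'
        |   AND o = '<http://xmlns.com/foaf/0.1/Person>'""".stripMargin)

    persons.collect().foreach(println)
  }
}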

 aℕdy ℙetrella
about.me/noootsab


On Fri, Jun 20, 2014 at 9:45 PM, Mayur Rustagi <mayur.rust...@gmail.com>
wrote:

> You are looking to create Shark operators for RDF? Since the Shark backend is
> shifting to Spark SQL, that would be slightly hard; a better effort would be
> to shift Gremlin to Spark (though a much beefier one :) )
>
> Mayur Rustagi
> Ph: +1 (760) 203 3257
> http://www.sigmoidanalytics.com
> @mayur_rustagi <https://twitter.com/mayur_rustagi>
>
>
>
> On Fri, Jun 20, 2014 at 3:39 PM, andy petrella <andy.petre...@gmail.com>
> wrote:
>
>> For RDF, might GraphX be particularly appropriate?
>>
>>  aℕdy ℙetrella
>> about.me/noootsab
>>
>>
>> On Thu, Jun 19, 2014 at 4:49 PM, Flavio Pompermaier <pomperma...@okkam.it>
>> wrote:
>>
>>> Hi guys,
>>>
>>> I'm looking into the possibility of using Spark to analyze RDF files and
>>> define reusable Shark operators on them (custom filtering, transforming,
>>> aggregating, etc.). Is that possible? Any hints?
>>>
>>> Best,
>>> Flavio
>>>
>>
>>
>
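
Relatedly, for the GraphX suggestion and for Flavio's original question about
reusable operators over RDF, a rough sketch of the idea: resources become
vertices, predicates become edge attributes, and an "operator" is just a
reusable function over the resulting graph. The hashing scheme, file name and
predicate URI below are illustrative assumptions only.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.graphx.{Edge, Graph}
import org.apache.spark.rdd.RDD

object RdfGraphDemo {
  // stable 64-bit id for a URI or literal (a real hash would be safer in practice)
  def id(node: String): Long = node.hashCode.toLong

  // a reusable "operator": keep only the edges whose predicate matches
  def filterByPredicate(g: Graph[String, String], predicate: String): Graph[String, String] =
    g.subgraph(epred = e => e.attr == predicate)

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("rdf-graphx"))

    // naive N-Triples parsing again: "<s> <p> <o> ."
    val triples: RDD[(String, String, String)] = sc.textFile("triples.nt")
      .map(_.trim.stripSuffix(".").trim.split("\\s+", 3))
      .map(a => (a(0), a(1), a(2)))

    val vertices: RDD[(Long, String)] =
      triples.flatMap { case (s, _, o) => Seq((id(s), s), (id(o), o)) }.distinct()

    val edges: RDD[Edge[String]] =
      triples.map { case (s, p, o) => Edge(id(s), id(o), p) }

    val graph = Graph(vertices, edges)

    val knows = filterByPredicate(graph, "<http://xmlns.com/foaf/0.1/knows>")
    println("foaf:knows edges: " + knows.edges.count())
  }
}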
