Thanks, guys, for the responses.
Basically, I am migrating an Oracle PL/SQL procedure to Spark (Java). In
Oracle I have a table with a geometry column, on which I am able to run a
query like "where col = 1 and geom.within(another_geom)".
I am looking for a less complicated way to port such queries into Spark.
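For illustration, the combined predicate above can be expressed as a plain Java filter over rows. This is only a sketch: the `Rect` and `Row` types are hypothetical stand-ins for a real geometry type and table row, and no Oracle or Spark API is involved.

```java
import java.util.List;
import java.util.stream.Collectors;

public class WithinFilter {
    // Minimal stand-in for a geometry: an axis-aligned rectangle.
    record Rect(double minX, double minY, double maxX, double maxY) {
        // True if this rectangle lies entirely inside 'other'.
        boolean within(Rect other) {
            return minX >= other.minX && maxX <= other.maxX
                && minY >= other.minY && maxY <= other.maxY;
        }
    }

    // A row with an ordinary column plus a geometry column.
    record Row(int col, Rect geom) {}

    // The predicate "col = 1 and geom.within(another_geom)"
    // expressed as a plain Java filter over a list of rows.
    static List<Row> filter(List<Row> rows, Rect anotherGeom) {
        return rows.stream()
                   .filter(r -> r.col() == 1 && r.geom().within(anotherGeom))
                   .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        Rect window = new Rect(0, 0, 10, 10);
        List<Row> rows = List.of(
            new Row(1, new Rect(1, 1, 2, 2)),   // matches both predicates
            new Row(2, new Rect(1, 1, 2, 2)),   // wrong col value
            new Row(1, new Rect(9, 9, 12, 12))  // not within the window
        );
        System.out.println(filter(rows, window).size()); // 1
    }
}
```

The same shape carries over to a Spark Dataset filter, where the spatial test becomes a user-defined function alongside the ordinary column comparison.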
Why can't you do this in Magellan?
Can you post a sample query that you are trying to run that has spatial and
logical operators combined? Maybe I am not understanding the issue properly.
Ram
On Tue, Oct 10, 2017 at 2:21 AM, Imran Rajjad wrote:
> I need to have a location
Hi all,
GeoMesa integrates with Spark SQL and allows for queries like:
select * from chicago where case_number = 1 and st_intersects(geom,
st_makeBox2d(st_point(-77, 38), st_point(-76, 39)))
GeoMesa does this by calling package-protected Spark methods to
implement geospatial user-defined functions (UDFs).
GeoSpark (just heard of this): http://geospark.datasyslab.org/
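As a sketch of what that query's spatial predicate computes, here is the point-in-box logic in plain Java. The `stPoint`/`stMakeBox2d`/`stIntersects` names merely mirror the SQL functions above; this is not GeoMesa's implementation, which delegates to real JTS geometries.

```java
public class IntersectsSketch {
    record Point(double x, double y) {}
    record Box(Point lo, Point hi) {}

    // st_point(x, y): construct a point.
    static Point stPoint(double x, double y) { return new Point(x, y); }

    // st_makeBox2d(lo, hi): construct a box from two corner points.
    static Box stMakeBox2d(Point lo, Point hi) { return new Box(lo, hi); }

    // st_intersects(point, box): true if the point falls inside the box.
    static boolean stIntersects(Point p, Box b) {
        return p.x() >= b.lo().x() && p.x() <= b.hi().x()
            && p.y() >= b.lo().y() && p.y() <= b.hi().y();
    }

    public static void main(String[] args) {
        // The box from the query: st_makeBox2d(st_point(-77, 38), st_point(-76, 39))
        Box box = stMakeBox2d(stPoint(-77, 38), stPoint(-76, 39));
        System.out.println(stIntersects(stPoint(-76.5, 38.5), box)); // true
        System.out.println(stIntersects(stPoint(-75.0, 38.5), box)); // false
    }
}
```

In the real query the `case_number = 1` comparison and the spatial test are just two predicates ANDed together by Spark SQL, which is exactly the combination the original question asks about.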
Thanks,
Silvio
From: Imran Rajjad <raj...@gmail.com>
Date: Tuesday, October 10, 2017 at 5:22 AM
To: "user @spark" <user@spark.apache.org>
Subject: best spark spatial lib?
I need to have a location column inside my Dataframe
What about something like GeoMesa?
Anastasios Zouzias wrote on Tue, Oct 10, 2017 at 15:29:
> Hi,
>
> Which spatial operations do you require exactly? Also, I don't follow what
> you mean by combining logical operators?
>
> I have created a library that wraps Lucene's spatial functionality
Hi,
Which spatial operations do you require exactly? Also, I don't follow what
you mean by combining logical operators?
I have created a library that wraps Lucene's spatial functionality here:
https://github.com/zouzias/spark-lucenerdd/wiki/Spatial-search
You could give the library a try.
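For context, the kind of spatial search such a library indexes can be sketched as a brute-force radius query in plain Java. This is illustrative only and not spark-lucenerdd's API; a Lucene spatial index answers the same question without scanning every row.

```java
import java.util.List;
import java.util.stream.Collectors;

public class RadiusSearch {
    record Place(String name, double lat, double lon) {}

    static final double EARTH_RADIUS_KM = 6371.0;

    // Great-circle (haversine) distance between two (lat, lon) pairs, in km.
    static double haversineKm(double lat1, double lon1, double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.pow(Math.sin(dLat / 2), 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.pow(Math.sin(dLon / 2), 2);
        return 2 * EARTH_RADIUS_KM * Math.asin(Math.sqrt(a));
    }

    // Brute-force "places within radius" query over a list of rows.
    static List<Place> withinKm(List<Place> places, double lat, double lon, double km) {
        return places.stream()
                     .filter(p -> haversineKm(p.lat(), p.lon(), lat, lon) <= km)
                     .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Place> places = List.of(
            new Place("Zurich", 47.3769, 8.5417),
            new Place("Bern",   46.9480, 7.4474),
            new Place("Berlin", 52.5200, 13.4050)
        );
        // Places within 150 km of Zurich: Zurich and Bern, not Berlin.
        System.out.println(withinKm(places, 47.3769, 8.5417, 150).size()); // 2
    }
}
```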
I need to have a location column inside my DataFrame so that I can do
spatial queries and geometry operations. Are there any third-party packages
that perform these kinds of operations? I have seen a few, like GeoSpark and
Magellan, but they don't support operations where spatial and logical
operators are combined.