Both options seem good to me, but we have to remember that not all
Sedona users are on cloud solutions; some of them run Spark with
Hadoop. What about the python-adapter module within the Sedona project?
Am I missing something?
Regards,
Paweł

On Thu, 11 Feb 2021 at 14:40, Netanel Malka <netanel...@gmail.com> wrote:

> I think that we can make it work on Databricks without any changes.
> After creating a cluster on Databricks, the user can install the GeoTools
> packages and provide the OSGeo repo (or any other repo) explicitly.
>
> As you can see in the picture:
>
> [image: image.png]
> I can provide the details on how to install it.
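>
> For concreteness, here is a minimal PySpark sketch of the same idea,
> assuming GeoTools 24.0 and the standard OSGeo release repository URL
> (the artifact list is illustrative; use whichever jars Sedona needs):
>
>   from pyspark.sql import SparkSession
>
>   # Tell Spark to search the OSGeo repo in addition to Maven Central,
>   # then pull the GeoTools artifacts as regular Maven packages.
>   spark = (
>       SparkSession.builder
>       .config("spark.jars.repositories",
>               "https://repo.osgeo.org/repository/release/")
>       .config("spark.jars.packages",
>               "org.geotools:gt-main:24.0,"
>               "org.geotools:gt-referencing:24.0,"
>               "org.geotools:gt-epsg-hsqldb:24.0")
>       .getOrCreate()
>   )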
>
> I think it will solve the problem.
> What do you think?
>
>
> On Thu, 11 Feb 2021 at 12:24, Jia Yu <ji...@apache.org> wrote:
>
>> Hi folks,
>>
>> As you can see from the recent discussion on the mailing list
>> (<[Bug][Python] Missing Java class>), in Sedona 1.0.0 Databricks cannot
>> fetch the GeoTools jars, because those LGPL-licensed jars are not on
>> Maven Central (they are only in the OSGeo repo).
>>
>> I believe this will cause lots of trouble for our future Python users.
>> Reading Shapefiles and doing CRS transformations are big selling points
>> for Sedona.
>>
>> The easiest way to fix this, without violating ASF policy, is for me to
>> publish a GeoTools wrapper on Maven Central under the old GeoSpark group
>> ID: https://mvnrepository.com/artifact/org.datasyslab
>>
>> For example, org.datasyslab:geotools-24-wrapper:1.0.0
>>
>> 1. This GeoTools wrapper does nothing but bring the GeoTools jars needed
>> by Sedona onto Maven Central.
>> 2. When Python users call Sedona, they can add one more
>> package: org.datasyslab:geotools-24-wrapper:1.0.0
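>>
>> As a sketch (the Sedona artifact name below is illustrative, and this
>> assumes the wrapper really is published under that coordinate), a Python
>> user would then only need one extra entry in spark.jars.packages:
>>
>>   from pyspark.sql import SparkSession
>>
>>   # The wrapper brings the GeoTools jars in from Maven Central, so no
>>   # extra repository configuration is needed.
>>   spark = (
>>       SparkSession.builder
>>       .config("spark.jars.packages",
>>               "org.apache.sedona:sedona-python-adapter-3.0_2.12:1.0.0,"
>>               "org.datasyslab:geotools-24-wrapper:1.0.0")
>>       .getOrCreate()
>>   )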
>>
>> Another good thing is that this does not require a new source code
>> release from Sedona. We only need to update the website and let users
>> know how to call it.
>>
>> Any better ideas?
>>
>> Thanks,
>> Jia
>>
>>
>>
>
> --
> Best regards,
> Netanel Malka.
>
