This is an automated email from the ASF dual-hosted git repository.
jiayu pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/sedona.git
The following commit(s) were added to refs/heads/master by this push:
new 3a6afea9d [DOCS] Replace `Spark SQL Example` and add Snowflake and Flink to the names of function docs. (#1265)
3a6afea9d is described below
commit 3a6afea9d8d8f8556696beba09897d9bc70198d2
Author: Jia Yu <[email protected]>
AuthorDate: Sun Mar 3 22:47:59 2024 -0800
[DOCS] Replace `Spark SQL Example` and add Snowflake and Flink to the names of function docs. (#1265)
* Replace `Spark SQL Example` with `SQL Example`, append Flink and Snowflake to corresponding functions
* Fix lint issues
---
docs/api/flink/Function.md | 2 +-
docs/api/sql/AggregateFunction.md | 6 +-
docs/api/sql/Constructor.md | 32 ++---
docs/api/sql/Function.md | 228 +++++++++++++++---------------
docs/api/sql/Optimizer.md | 4 +-
docs/api/sql/Predicate.md | 26 ++--
docs/api/sql/Raster-aggregate-function.md | 2 +-
docs/api/sql/Raster-loader.md | 4 +-
docs/api/sql/Raster-operators.md | 190 ++++++++++++-------------
docs/api/sql/Raster-visualizer.md | 8 +-
docs/api/sql/Raster-writer.md | 16 +--
docs/api/viz/sql.md | 12 +-
mkdocs.yml | 24 ++--
13 files changed, 277 insertions(+), 277 deletions(-)
diff --git a/docs/api/flink/Function.md b/docs/api/flink/Function.md
index 799a58c54..1c6f4952c 100644
--- a/docs/api/flink/Function.md
+++ b/docs/api/flink/Function.md
@@ -557,7 +557,7 @@ Introduction: Returns a geometry/geography that represents all points whose dist
Mode of buffer calculation (Since: `v1.6.0`):
-The optional third parameter, `useSpheroid`, controls the mode of buffer calculation.
+The optional third parameter, `useSpheroid`, controls the mode of buffer calculation.
- Planar Buffering (default): When `useSpheroid` is false, `ST_Buffer` performs standard planar buffering based on the provided parameters.
- Spheroidal Buffering:
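For context only (not part of this patch), the two buffering modes described in the hunk above can be invoked as follows; the point literal and distance are illustrative, and the third argument is the `useSpheroid` flag the hunk refers to:

```sql
-- Planar buffering (default): distance in the geometry's coordinate units
SELECT ST_Buffer(ST_GeomFromWKT('POINT (-122.35 37.95)'), 10);

-- Spheroidal buffering: useSpheroid = true, distance treated as meters on the spheroid
SELECT ST_Buffer(ST_GeomFromWKT('POINT (-122.35 37.95)'), 10, true);
```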
diff --git a/docs/api/sql/AggregateFunction.md b/docs/api/sql/AggregateFunction.md
index c31dd0b3e..98579b78d 100644
--- a/docs/api/sql/AggregateFunction.md
+++ b/docs/api/sql/AggregateFunction.md
@@ -6,7 +6,7 @@ Format: `ST_Envelope_Aggr (A: geometryColumn)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Envelope_Aggr(ST_GeomFromText('MULTIPOINT(1.1 101.1,2.1 102.1,3.1 103.1,4.1 104.1,5.1 105.1,6.1 106.1,7.1 107.1,8.1 108.1,9.1 109.1,10.1 110.1)'))
@@ -26,7 +26,7 @@ Format: `ST_Intersection_Aggr (A: geometryColumn)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Intersection_Aggr(ST_GeomFromText('MULTIPOINT(1.1 101.1,2.1 102.1,3.1 103.1,4.1 104.1,5.1 105.1,6.1 106.1,7.1 107.1,8.1 108.1,9.1 109.1,10.1 110.1)'))
@@ -46,7 +46,7 @@ Format: `ST_Union_Aggr (A: geometryColumn)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Union_Aggr(ST_GeomFromText('MULTIPOINT(1.1 101.1,2.1 102.1,3.1 103.1,4.1 104.1,5.1 105.1,6.1 106.1,7.1 107.1,8.1 108.1,9.1 109.1,10.1 110.1)'))
diff --git a/docs/api/sql/Constructor.md b/docs/api/sql/Constructor.md
index d74b79be4..abef2d7bd 100644
--- a/docs/api/sql/Constructor.md
+++ b/docs/api/sql/Constructor.md
@@ -52,7 +52,7 @@ Format: `ST_GeomFromGeoHash(geohash: String, precision: Integer)`
Since: `v1.1.1`
-Spark SQL example:
+SQL Example
```sql
SELECT ST_GeomFromGeoHash('s00twy01mt', 4)
@@ -72,7 +72,7 @@ Format: `ST_GeomFromGeoJSON (GeoJson: String)`
Since: `v1.0.0`
-Spark SQL example:
+SQL Example
```sql
SELECT ST_GeomFromGeoJSON('{
@@ -110,7 +110,7 @@ Output:
POLYGON ((-87.621765 34.873444, -87.617535 34.873369, -87.62119 34.85053, -87.62144 34.865379, -87.621765 34.873444))
```
-Spark SQL example:
+SQL Example
```sql
SELECT ST_GeomFromGeoJSON('{
@@ -207,7 +207,7 @@ Since: `v1.0.0`
The optional srid parameter was added in `v1.3.1`
-Spark SQL example:
+SQL Example
```sql
SELECT ST_GeomFromText('POINT(40.7128 -74.0060)')
@@ -231,7 +231,7 @@ Format:
Since: `v1.0.0`
-Spark SQL example:
+SQL Example
```sql
SELECT ST_GeomFromWKB([01 02 00 00 00 02 00 00 00 00 00 00 00 84 D6 00 C0 00 00 00 00 80 B5 D6 BF 00 00 00 60 E1 EF F7 BF 00 00 00 80 07 5D E5 BF])
@@ -243,7 +243,7 @@ Output:
LINESTRING (-2.1047439575195312 -0.354827880859375, -1.49606454372406 -0.6676061153411865)
```
-Spark SQL example:
+SQL Example
```sql
SELECT ST_asEWKT(ST_GeomFromWKB('01010000a0e6100000000000000000f03f000000000000f03f000000000000f03f'))
@@ -269,7 +269,7 @@ Since: `v1.0.0`
The optional srid parameter was added in `v1.3.1`
-Spark SQL example:
+SQL Example
```sql
SELECT ST_GeomFromWKT('POINT(40.7128 -74.0060)')
@@ -310,7 +310,7 @@ Format:
Since: `v1.2.1`
-Spark SQL example:
+SQL Example
```sql
SELECT ST_LineFromText('LINESTRING(1 2,3 4)')
@@ -330,7 +330,7 @@ Format: `ST_LineStringFromText (Text: String, Delimiter: Char)`
Since: `v1.0.0`
-Spark SQL example:
+SQL Example
```sql
SELECT ST_LineStringFromText('-74.0428197,40.6867969,-74.0421975,40.6921336,-74.0508020,40.6912794', ',')
@@ -398,7 +398,7 @@ Format:
Since: `v1.3.1`
-Spark SQL example:
+SQL Example
```sql
SELECT ST_MLineFromText('MULTILINESTRING((1 2, 3 4), (4 5, 6 7))')
@@ -422,7 +422,7 @@ Format:
Since: `v1.3.1`
-Spark SQL example:
+SQL Example
```sql
SELECT ST_MPolyFromText('MULTIPOLYGON(((0 0 1,20 0 1,20 20 1,0 20 1,0 0 1),(5 5 3,5 7 3,7 7 3,7 5 3,5 5 3)))')
@@ -445,7 +445,7 @@ Since: `v1.0.0`
In `v1.4.0` an optional Z parameter was removed to be more consistent with other spatial SQL implementations.
If you are upgrading from an older version of Sedona - please use ST_PointZ to create 3D points.
-Spark SQL example:
+SQL Example
```sql
SELECT ST_Point(double(1.2345), 2.3456)
@@ -470,7 +470,7 @@ Format:
Since: `v1.4.0`
-Spark SQL example:
+SQL Example
```sql
SELECT ST_AsEWKT(ST_PointZ(1.2345, 2.3456, 3.4567))
@@ -490,7 +490,7 @@ Format: `ST_PointFromText (Text: String, Delimiter: Char)`
Since: `v1.0.0`
-Spark SQL example:
+SQL Example
```sql
SELECT ST_PointFromText('40.7128,-74.0060', ',')
@@ -512,7 +512,7 @@ Format:
Since: `v1.0.0`
-Spark SQL example:
+SQL Example
```sql
SELECT ST_PolygonFromEnvelope(double(1.234),double(2.234),double(3.345),double(3.345))
@@ -532,7 +532,7 @@ Format: `ST_PolygonFromText (Text: String, Delimiter: Char)`
Since: `v1.0.0`
-Spark SQL example:
+SQL Example
```sql
SELECT ST_PolygonFromText('-74.0428197,40.6867969,-74.0421975,40.6921336,-74.0508020,40.6912794,-74.0428197,40.6867969', ',')
diff --git a/docs/api/sql/Function.md b/docs/api/sql/Function.md
index a5e8c4d2e..3347230ef 100644
--- a/docs/api/sql/Function.md
+++ b/docs/api/sql/Function.md
@@ -6,7 +6,7 @@ Format: `GeometryType (A: Geometry)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT GeometryType(ST_GeomFromText('LINESTRING(77.29 29.07,77.42 29.26,77.27 29.31,77.29 29.07)'));
@@ -40,7 +40,7 @@ Format: `ST_3DDistance (A: Geometry, B: Geometry)`
Since: `v1.2.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_3DDistance(ST_GeomFromText("POINT Z (0 0 -5)"),
@@ -65,7 +65,7 @@ Format:
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AddPoint(ST_GeomFromText("LINESTRING(0 0, 1 1, 1 0)"), ST_GeomFromText("Point(21 52)"), 1)
@@ -167,7 +167,7 @@ Format: `ST_Angle(p1, p2, p3, p4) | ST_Angle(p1, p2, p3) | ST_Angle(line1, line2
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Angle(ST_GeomFromWKT('POINT(0 0)'), ST_GeomFromWKT('POINT (1 1)'), ST_GeomFromWKT('POINT(1 0)'), ST_GeomFromWKT('POINT(6 2)'))
@@ -179,7 +179,7 @@ Output:
0.4048917862850834
```
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Angle(ST_GeomFromWKT('POINT (1 1)'), ST_GeomFromWKT('POINT (0 0)'), ST_GeomFromWKT('POINT(3 2)'))
@@ -191,7 +191,7 @@ Output:
0.19739555984988044
```
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Angle(ST_GeomFromWKT('LINESTRING (0 0, 1 1)'), ST_GeomFromWKT('LINESTRING (0 0, 3 2)'))
@@ -211,7 +211,7 @@ Format: `ST_Area (A: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Area(ST_GeomFromText("POLYGON(0 0, 0 10, 10 10, 0 10, 0 0)"))
```
@@ -234,7 +234,7 @@ Format: `ST_AreaSpheroid (A: Geometry)`
Since: `v1.4.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AreaSpheroid(ST_GeomFromWKT('Polygon ((34 35, 28 30, 25 34, 34 35))'))
@@ -254,7 +254,7 @@ Format: `ST_AsBinary (A: Geometry)`
Since: `v1.1.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsBinary(ST_GeomFromWKT('POINT (1 1)'))
@@ -279,7 +279,7 @@ Format: `ST_AsEWKB (A: Geometry)`
Since: `v1.1.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsEWKB(ST_SetSrid(ST_GeomFromWKT('POINT (1 1)'), 3021))
@@ -304,7 +304,7 @@ Format: `ST_AsEWKT (A: Geometry)`
Since: `v1.2.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsEWKT(ST_SetSrid(ST_GeomFromWKT('POLYGON((0 0,0 1,1 1,1 0,0 0))'), 4326))
@@ -316,7 +316,7 @@ Output:
SRID=4326;POLYGON ((0 0, 0 1, 1 1, 1 0, 0 0))
```
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsEWKT(ST_MakePointM(1.0, 1.0, 1.0))
@@ -328,7 +328,7 @@ Output:
POINT M(1 1 1)
```
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsEWKT(ST_MakePoint(1.0, 1.0, 1.0, 1.0))
@@ -348,7 +348,7 @@ Format: `ST_AsGeoJSON (A: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsGeoJSON(ST_GeomFromWKT('POLYGON((1 1, 8 1, 8 8, 1 8, 1 1))'))
@@ -377,7 +377,7 @@ Format: `ST_AsGML (A: Geometry)`
Since: `v1.3.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsGML(ST_GeomFromWKT('POLYGON((1 1, 8 1, 8 8, 1 8, 1 1))'))
@@ -397,7 +397,7 @@ Format: `ST_AsKML (A: Geometry)`
Since: `v1.3.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsKML(ST_GeomFromWKT('POLYGON((1 1, 8 1, 8 8, 1 8, 1 1))'))
@@ -418,7 +418,7 @@ Format: `ST_AsText (A: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsText(ST_SetSRID(ST_Point(1.0,1.0), 3021))
@@ -430,7 +430,7 @@ Output:
POINT (1 1)
```
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsText(ST_MakePointM(1.0, 1.0, 1.0))
@@ -442,7 +442,7 @@ Output:
POINT M(1 1 1)
```
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsText(ST_MakePoint(1.0, 1.0, 1.0, 1.0))
@@ -462,7 +462,7 @@ Format: `ST_Azimuth(pointA: Point, pointB: Point)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Azimuth(ST_POINT(0.0, 25.0), ST_POINT(0.0, 0.0))
@@ -510,7 +510,7 @@ Format: `ST_Boundary(geom: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Boundary(ST_GeomFromWKT('POLYGON((1 1,0 0, -1 1, 1 1))'))
@@ -532,7 +532,7 @@ Format: `ST_BoundingDiagonal(geom: Geometry)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_BoundingDiagonal(ST_GeomFromWKT(geom))
```
@@ -632,7 +632,7 @@ Format: `ST_BuildArea (A: Geometry)`
Since: `v1.2.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_BuildArea(
@@ -659,7 +659,7 @@ Format: `ST_Centroid (A: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Centroid(ST_GeomFromWKT('MULTIPOINT(-1 0, -1 2, 7 8, 9 8, 10 6)'))
@@ -680,7 +680,7 @@ Format: `ST_ClosestPoint(g1: Geometry, g2: Geometry)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsText( ST_ClosestPoint(g1, g2)) As ptwkt;
```
@@ -709,7 +709,7 @@ Format:
Since: `v1.2.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Collect(
@@ -728,7 +728,7 @@ Result:
+---------------------------------------------------------------+
```
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Collect(
@@ -768,7 +768,7 @@ Format:
Since: `v1.2.1`
-Spark SQL Example:
+SQL Example
```sql
WITH test_data as (
@@ -803,7 +803,7 @@ Format:
Since: `v1.4.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_ConcaveHull(ST_GeomFromWKT('POLYGON((175 150, 20 40, 50 60, 125 100, 175 150))'), 1)
@@ -823,7 +823,7 @@ Format: `ST_ConvexHull (A: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_ConvexHull(ST_GeomFromText('POLYGON((175 150, 20 40, 50 60, 125 100, 175 150))'))
@@ -904,7 +904,7 @@ Format: `ST_Degrees(angleInRadian)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Degrees(0.19739555984988044)
@@ -924,7 +924,7 @@ Format: `ST_Difference (A: Geometry, B: Geometry)`
Since: `v1.2.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Difference(ST_GeomFromWKT('POLYGON ((-3 -3, 3 -3, 3 3, -3 3, -3 -3))'), ST_GeomFromWKT('POLYGON ((0 -4, 4 -4, 4 4, 0 4, 0 -4))'))
@@ -944,7 +944,7 @@ Format: `ST_Dimension (A: Geometry) | ST_Dimension (C: Geometrycollection)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Dimension('GEOMETRYCOLLECTION(LINESTRING(1 1,0 0),POINT(0 0))');
@@ -964,7 +964,7 @@ Format: `ST_Distance (A: Geometry, B: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Distance(ST_GeomFromText('POINT(72 42)'), ST_GeomFromText('LINESTRING(-72 -42, 82 92)'))
@@ -989,7 +989,7 @@ Format: `ST_DistanceSphere (A: Geometry)`
Since: `v1.4.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_DistanceSphere(ST_GeomFromWKT('POINT (-0.56 51.3168)'), ST_GeomFromWKT('POINT (-3.1883 55.9533)'))
@@ -1001,7 +1001,7 @@ Output:
543796.9506134904
```
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_DistanceSphere(ST_GeomFromWKT('POINT (-0.56 51.3168)'), ST_GeomFromWKT('POINT (-3.1883 55.9533)'), 6378137.0)
@@ -1026,7 +1026,7 @@ Format: `ST_DistanceSpheroid (A: Geometry)`
Since: `v1.4.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_DistanceSpheroid(ST_GeomFromWKT('POINT (-0.56 51.3168)'), ST_GeomFromWKT('POINT (-3.1883 55.9533)'))
@@ -1047,7 +1047,7 @@ Format: `ST_Dump(geom: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Dump(ST_GeomFromText('MULTIPOINT ((10 40), (40 30), (20 20), (30 10))'))
```
@@ -1066,7 +1066,7 @@ Format: `ST_DumpPoints(geom: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_DumpPoints(ST_GeomFromText('LINESTRING (0 0, 1 1, 1 0)'))
@@ -1086,7 +1086,7 @@ Format: `ST_EndPoint(geom: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_EndPoint(ST_GeomFromText('LINESTRING(100 150,50 60, 70 80, 160 170)'))
@@ -1106,7 +1106,7 @@ Format: `ST_Envelope (A: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Envelope(ST_GeomFromWKT('LINESTRING(0 0, 1 3)'))
@@ -1126,7 +1126,7 @@ Format: `ST_ExteriorRing(geom: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_ExteriorRing(ST_GeomFromText('POLYGON((0 0 1, 1 1 1, 1 2 1, 1 1 1, 0 0 1))'))
@@ -1146,7 +1146,7 @@ Format: `ST_FlipCoordinates(A: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_FlipCoordinates(ST_GeomFromWKT("POINT (1 2)"))
@@ -1166,7 +1166,7 @@ Format: `ST_Force_2D (A: Geometry)`
Since: `v1.2.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Force_2D(ST_GeomFromText('POLYGON((0 0 2,0 5 2,5 0 2,0 0 2),(1 1 2,3 1 2,1 3 2,1 1 2))'))
@@ -1192,7 +1192,7 @@ Format: `ST_Force3D(geometry: Geometry, zValue: Double)`
Since: `v1.4.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsText(ST_Force3D(ST_GeomFromText('POLYGON((0 0 2,0 5 2,5 0 2,0 0 2),(1 1 2,3 1 2,1 3 2,1 1 2))'), 2.3))
@@ -1204,7 +1204,7 @@ Output:
POLYGON Z((0 0 2, 0 5 2, 5 0 2, 0 0 2), (1 1 2, 3 1 2, 1 3 2, 1 1 2))
```
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsText(ST_Force3D(ST_GeomFromText('LINESTRING(0 1,1 0,2 0)'), 2.3))
@@ -1216,7 +1216,7 @@ Output:
LINESTRING Z(0 1 2.3, 1 0 2.3, 2 0 2.3)
```
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsText(ST_Force3D(ST_GeomFromText('LINESTRING EMPTY'), 3))
@@ -1239,7 +1239,7 @@ Format: `ST_FrechetDistance(g1: Geometry, g2: Geometry)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_FrechetDistance(ST_GeomFromWKT('POINT (0 1)'), ST_GeomFromWKT('LINESTRING (0 0, 1 0, 2 0, 3 0, 4 0, 5 0)'))
@@ -1259,7 +1259,7 @@ Format: `ST_GeoHash(geom: Geometry, precision: Integer)`
Since: `v1.1.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_GeoHash(ST_GeomFromText('POINT(21.427834 52.042576573)'), 5) AS geohash
@@ -1301,7 +1301,7 @@ Default parameters: `tolerance: 1e-6, maxIter: 1000, failIfNotConverged: false`
Since: `v1.4.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_GeometricMedian(ST_GeomFromWKT('MULTIPOINT((0 0), (1 1), (2 2), (200 200))'))
```
@@ -1319,7 +1319,7 @@ Format: `ST_GeometryN(geom: Geometry, n: Integer)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_GeometryN(ST_GeomFromText('MULTIPOINT((1 2), (3 4), (5 6), (8 9))'), 1)
@@ -1339,7 +1339,7 @@ Format: `ST_GeometryType (A: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_GeometryType(ST_GeomFromText('LINESTRING(77.29 29.07,77.42 29.26,77.27 29.31,77.29 29.07)'))
@@ -1363,7 +1363,7 @@ Format: `ST_H3CellDistance(cell1: Long, cell2: Long)`
Since: `v1.5.0`
-Spark SQL example:
+SQL Example
```sql
select ST_H3CellDistance(ST_H3CellIDs(ST_GeomFromWKT('POINT(1 2)'), 8, true)[0], ST_H3CellIDs(ST_GeomFromWKT('POINT(1.23 1.59)'), 8, true)[0])
```
@@ -1410,7 +1410,7 @@ Format: `ST_H3CellIDs(geom: geometry, level: Int, fullCover: Boolean)`
Since: `v1.5.0`
-Spark SQL example:
+SQL Example
```sql
SELECT ST_H3CellIDs(ST_GeomFromText('LINESTRING(1 3 4, 5 6 7)'), 6, true)
```
@@ -1437,7 +1437,7 @@ Format: `ST_H3KRing(cell: Long, k: Int, exactRing: Boolean)`
Since: `v1.5.0`
-Spark SQL example:
+SQL Example
```sql
SELECT ST_H3KRing(ST_H3CellIDs(ST_GeomFromWKT('POINT(1 2)'), 8, true)[0], 1, true) cells union select ST_H3KRing(ST_H3CellIDs(ST_GeomFromWKT('POINT(1 2)'), 8, true)[0], 1, false) cells
```
@@ -1462,7 +1462,7 @@ Format: `ST_H3ToGeom(cells: Array[Long])`
Since: `v1.5.0`
-Spark SQL example:
+SQL Example
```sql
SELECT ST_H3ToGeom(ST_H3CellIDs(ST_GeomFromWKT('POINT(1 2)'), 8, true)[0], 1, true))
```
@@ -1495,7 +1495,7 @@ Format: `ST_HausdorffDistance(g1: Geometry, g2: Geometry, densityFrac: Double)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_HausdorffDistance(ST_GeomFromWKT('POINT (0.0 1.0)'), ST_GeomFromWKT('LINESTRING (0 0, 1 0, 2 0, 3 0, 4 0, 5 0)'), 0.1)
@@ -1507,7 +1507,7 @@ Output:
5.0990195135927845
```
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_HausdorffDistance(ST_GeomFromText('POLYGON Z((1 0 1, 1 1 2, 2 1 5, 2 0 1, 1 0 1))'), ST_GeomFromText('POLYGON Z((4 0 4, 6 1 4, 6 4 9, 6 1 3, 4 0 4))'))
@@ -1527,7 +1527,7 @@ Format: `ST_InteriorRingN(geom: Geometry, n: Integer)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_InteriorRingN(ST_GeomFromText('POLYGON((0 0, 0 5, 5 5, 5 0, 0 0), (1 1, 2 1, 2 2, 1 2, 1 1), (1 3, 2 3, 2 4, 1 4, 1 3), (3 3, 4 3, 4 4, 3 4, 3 3))'), 0)
@@ -1547,7 +1547,7 @@ Format: `ST_Intersection (A: Geometry, B: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Intersection(
@@ -1570,7 +1570,7 @@ Format: `ST_IsClosed(geom: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_IsClosed(ST_GeomFromText('LINESTRING(0 0, 1 1, 1 0)'))
@@ -1594,7 +1594,7 @@ Format: `ST_IsCollection(geom: Geometry)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_IsCollection(ST_GeomFromText('MULTIPOINT(0 0), (6 6)'))
@@ -1606,7 +1606,7 @@ Output:
true
```
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_IsCollection(ST_GeomFromText('POINT(5 5)'))
@@ -1626,7 +1626,7 @@ Format: `ST_IsEmpty (A: Geometry)`
Since: `v1.2.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_IsEmpty(ST_GeomFromWKT('POLYGON((0 0,0 1,1 1,1 0,0 0))'))
@@ -1646,7 +1646,7 @@ Format: `ST_IsRing(geom: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_IsRing(ST_GeomFromText("LINESTRING(0 0, 0 1, 1 1, 1 0, 0 0)"))
@@ -1666,7 +1666,7 @@ Format: `ST_IsSimple (A: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_IsSimple(ST_GeomFromWKT('POLYGON((1 1, 3 1, 3 3, 1 3, 1 1))'))
@@ -1764,7 +1764,7 @@ Format: `ST_Length (A: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Length(ST_GeomFromWKT('LINESTRING(38 16,38 50,65 50,66 16,38 16)'))
@@ -1789,7 +1789,7 @@ Format: `ST_LengthSpheroid (A: Geometry)`
Since: `v1.4.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_LengthSpheroid(ST_GeomFromWKT('Polygon ((0 0, 90 0, 0 0))'))
@@ -1809,7 +1809,7 @@ Format: `ST_LineFromMultiPoint (A: Geometry)`
Since: `v1.3.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_LineFromMultiPoint(ST_GeomFromText('MULTIPOINT((10 40), (40 30), (20 20), (30 10))'))
@@ -1829,7 +1829,7 @@ Format: `ST_LineInterpolatePoint (geom: Geometry, fraction: Double)`
Since: `v1.0.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_LineInterpolatePoint(ST_GeomFromWKT('LINESTRING(25 50, 100 125, 150 190)'), 0.2)
@@ -1870,7 +1870,7 @@ Format: `ST_LineMerge (A: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_LineMerge(ST_GeomFromWKT('MULTILINESTRING ((-29 -27, -30 -29.7, -45 -33), (-45 -33, -46 -32))'))
@@ -1892,7 +1892,7 @@ Format:
Since: `v1.0.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_LineSubstring(ST_GeomFromWKT('LINESTRING(25 50, 100 125, 150 190)'), 0.333, 0.666)
@@ -1916,7 +1916,7 @@ Format:
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsText( ST_MakeLine(ST_Point(1,2), ST_Point(3,4)) );
@@ -1928,7 +1928,7 @@ Output:
LINESTRING(1 2,3 4)
```
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsText( ST_MakeLine( 'LINESTRING(0 0, 1 1)', 'LINESTRING(2 2, 3 3)' ) );
@@ -1948,7 +1948,7 @@ Format: `ST_MakePolygon(geom: Geometry, holes: ARRAY[Geometry])`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_MakePolygon(
@@ -1978,7 +1978,7 @@ Format:
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
WITH linestring AS (
@@ -2010,7 +2010,7 @@ Format:
Since: `v1.0.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_MinimumBoundingCircle(ST_GeomFromWKT('LINESTRING(0 0, 0 1)'))
@@ -2030,7 +2030,7 @@ Format: `ST_MinimumBoundingRadius(geom: Geometry)`
Since: `v1.0.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_MinimumBoundingRadius(ST_GeomFromText('POLYGON((1 1,0 0, -1 1, 1 1))'))
@@ -2051,7 +2051,7 @@ Format: `ST_Multi(geom: Geometry)`
Since: `v1.2.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Multi(ST_GeomFromText('POINT(1 1)'))
@@ -2105,7 +2105,7 @@ Format:
Since: `v1.3.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsEWKT(ST_Normalize(ST_GeomFromWKT('POLYGON((0 1, 1 1, 1 0, 0 0, 0 1))')))
@@ -2125,7 +2125,7 @@ Format: `ST_NPoints (A: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_NPoints(ST_GeomFromText('LINESTRING(77.29 29.07,77.42 29.26,77.27 29.31,77.29 29.07)'))
@@ -2175,7 +2175,7 @@ Format: `ST_NumGeometries (A: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_NumGeometries(ST_GeomFromWKT('LINESTRING (-29 -27, -30 -29.7, -45 -33)'))
@@ -2195,7 +2195,7 @@ Format: `ST_NumInteriorRings(geom: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_NumInteriorRings(ST_GeomFromText('POLYGON ((0 0, 0 5, 5 5, 5 0, 0 0), (1 1, 2 1, 2 2, 1 2, 1 1))'))
@@ -2222,7 +2222,7 @@ Format: `ST_NumPoints(geom: Geometry)`
Since: `v1.4.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_NumPoints(ST_GeomFromText('LINESTRING(0 1, 1 0, 2 0)'))
@@ -2242,7 +2242,7 @@ Format: `ST_PointN(geom: Geometry, n: Integer)`
Since: `v1.2.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_PointN(ST_GeomFromText("LINESTRING(0 0, 1 2, 2 4, 3 6)"), 2)
```
@@ -2294,7 +2294,7 @@ Format: `ST_Polygon(geom: Geometry, srid: Integer)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsText( ST_Polygon(ST_GeomFromEWKT('LINESTRING(75 29 1, 77 29 2, 77 29 3, 75 29 1)'), 4326) );
@@ -2314,7 +2314,7 @@ Format: `ST_ReducePrecision (A: Geometry, B: Integer)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_ReducePrecision(ST_GeomFromWKT('Point(0.1234567890123456789 0.1234567890123456789)')
@@ -2340,7 +2340,7 @@ Format:
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_RemovePoint(ST_GeomFromText("LINESTRING(0 0, 1 1, 1 0)"), 1)
@@ -2360,7 +2360,7 @@ Format: `ST_Reverse (A: Geometry)`
Since: `v1.2.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Reverse(ST_GeomFromWKT('LINESTRING(0 0, 1 2, 2 4, 3 6)'))
@@ -2382,7 +2382,7 @@ Format: `ST_S2CellIDs(geom: Geometry, level: Integer)`
Since: `v1.4.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_S2CellIDs(ST_GeomFromText('LINESTRING(1 3 4, 5 6 7)'), 6)
@@ -2401,7 +2401,7 @@ Format: `ST_SetPoint (linestring: Geometry, index: Integer, point: Geometry)`
Since: `v1.3.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_SetPoint(ST_GeomFromText('LINESTRING (0 0, 0 1, 1 1)'), 2, ST_GeomFromText('POINT (1 0)'))
@@ -2421,7 +2421,7 @@ Format: `ST_SetSRID (A: Geometry, srid: Integer)`
Since: `v1.1.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsEWKT(ST_SetSRID(ST_GeomFromWKT('POLYGON((1 1, 8 1, 8 8, 1 8, 1 1))'), 3021))
@@ -2463,7 +2463,7 @@ Since: `v1.0.0`
Format: `ST_SimplifyPreserveTopology (A: Geometry, distanceTolerance: Double)`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_SimplifyPreserveTopology(ST_GeomFromText('POLYGON((8 25, 28 22, 28 20, 15 11, 33 3, 56 30, 46 33,46 34, 47 44, 35 36, 45 33, 43 19, 29 21, 29 22,35 26, 24 39, 8 25))'), 10)
@@ -2489,7 +2489,7 @@ Since: `v1.4.0`
Format: `ST_Split (input: Geometry, blade: Geometry)`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Split(
@@ -2511,7 +2511,7 @@ Format: `ST_SRID (A: Geometry)`
Since: `v1.1.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_SRID(ST_SetSRID(ST_GeomFromWKT('POLYGON((1 1, 8 1, 8 8, 1 8, 1 1))'), 3021))
@@ -2531,7 +2531,7 @@ Format: `ST_StartPoint(geom: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_StartPoint(ST_GeomFromText('LINESTRING(100 150,50 60, 70 80, 160 170)'))
@@ -2551,7 +2551,7 @@ Format: `ST_SubDivide(geom: Geometry, maxVertices: Integer)`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_SubDivide(ST_GeomFromText("POLYGON((35 10, 45 45, 15 40, 10 20, 35 10), (20 30, 35 35, 30 20, 20 30))"), 5)
@@ -2577,7 +2577,7 @@ Output:
]
```
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_SubDivide(ST_GeomFromText("LINESTRING(0 0, 85 85, 100 100, 120 120, 21 21, 10 10, 5 5)"), 5)
@@ -2604,7 +2604,7 @@ Format: `ST_SubDivideExplode(geom: Geometry, maxVertices: Integer)`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
Query:
```sql
@@ -2666,7 +2666,7 @@ Format: `ST_SymDifference (A: Geometry, B: Geometry)`
Since: `v1.2.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_SymDifference(ST_GeomFromWKT('POLYGON ((-3 -3, 3 -3, 3 3, -3 3, -3 -3))'), ST_GeomFromWKT('POLYGON ((-2 -3, 4 -3, 4 3, -2 3, -2 -3))'))
@@ -2748,7 +2748,7 @@ ST_Transform (A: Geometry, TargetCRS: String)
Since: `v1.2.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsText(ST_Transform(ST_GeomFromText('POLYGON((170 50,170 72,-130 72,-130 50,170 50))'),'EPSG:4326', 'EPSG:32649'))
@@ -2779,7 +2779,7 @@ Format:
Since: `v1.4.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Translate(ST_GeomFromText('GEOMETRYCOLLECTION(MULTIPOLYGON(((3 2,3 3,4 3,4 2,3 2)),((3 4,5 6,5 7,3 4))), POINT(1 1 1), LINESTRING EMPTY)'), 2, 2, 3)
@@ -2791,7 +2791,7 @@ Output:
GEOMETRYCOLLECTION (MULTIPOLYGON (((5 4, 5 5, 6 5, 6 4, 5 4)), ((5 6, 7 8, 7 9, 5 6))), POINT (3 3), LINESTRING EMPTY)
```
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Translate(ST_GeomFromText('POINT(-71.01 42.37)'),1,2)
@@ -2811,7 +2811,7 @@ Format: `ST_Union (A: Geometry, B: Geometry)`
Since: `v1.2.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Union(ST_GeomFromWKT('POLYGON ((-3 -3, 3 -3, 3 3, -3 3, -3 -3))'), ST_GeomFromWKT('POLYGON ((1 -2, 5 0, 1 2, 1 -2))'))
@@ -2837,7 +2837,7 @@ Optional parameters:
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT st_astext(ST_VoronoiPolygons(ST_GeomFromText('MULTIPOINT ((0 0), (1 1))')));
@@ -2857,7 +2857,7 @@ Format: `ST_X(pointA: Point)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_X(ST_POINT(0.0 25.0))
@@ -2877,7 +2877,7 @@ Format: `ST_XMax (A: Geometry)`
Since: `v1.2.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_XMax(ST_GeomFromText('POLYGON ((-1 -11, 0 10, 1 11, 2 12, -1 -11))'))
@@ -2897,7 +2897,7 @@ Format: `ST_XMin (A: Geometry)`
Since: `v1.2.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_XMin(ST_GeomFromText('POLYGON ((-1 -11, 0 10, 1 11, 2 12, -1 -11))'))
@@ -2917,7 +2917,7 @@ Format: `ST_Y(pointA: Point)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Y(ST_POINT(0.0 25.0))
@@ -2937,7 +2937,7 @@ Format: `ST_YMax (A: Geometry)`
Since: `v1.2.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_YMax(ST_GeomFromText('POLYGON((0 0 1, 1 1 1, 1 2 1, 1 1 1, 0 0 1))'))
```
@@ -2956,7 +2956,7 @@ Format: `ST_Y_Min (A: Geometry)`
Since: `v1.2.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_YMin(ST_GeomFromText('POLYGON((0 0 1, 1 1 1, 1 2 1, 1 1 1, 0 0 1))'))
@@ -2976,7 +2976,7 @@ Format: `ST_Z(pointA: Point)`
Since: `v1.2.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Z(ST_POINT(0.0 25.0 11.0))
@@ -2996,7 +2996,7 @@ Format: `ST_ZMax(geom: Geometry)`
Since: `v1.3.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_ZMax(ST_GeomFromText('POLYGON((0 0 1, 1 1 1, 1 2 1, 1 1 1, 0 0 1))'))
@@ -3016,7 +3016,7 @@ Format: `ST_ZMin(geom: Geometry)`
Since: `v1.3.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_ZMin(ST_GeomFromText('LINESTRING(1 3 4, 5 6 7)'))
diff --git a/docs/api/sql/Optimizer.md b/docs/api/sql/Optimizer.md
index 21528e248..98fc12f95 100644
--- a/docs/api/sql/Optimizer.md
+++ b/docs/api/sql/Optimizer.md
@@ -10,7 +10,7 @@ Sedona Spatial operators fully supports Apache SparkSQL query optimizer. It has
Introduction: Find geometries from A and geometries from B such that each geometry pair satisfies a certain predicate. Most predicates supported by SedonaSQL can trigger a range join.
-Spark SQL Example:
+SQL Example
```sql
SELECT *
@@ -290,7 +290,7 @@ FROM lefts
Introduction: Given a join query and a predicate in the same WHERE clause, first executes the Predicate as a filter, then executes the join query.
-Spark SQL Example:
+SQL Example
```sql
SELECT *
diff --git a/docs/api/sql/Predicate.md b/docs/api/sql/Predicate.md
index c68be94f0..7a245254a 100644
--- a/docs/api/sql/Predicate.md
+++ b/docs/api/sql/Predicate.md
@@ -6,7 +6,7 @@ Format: `ST_Contains (A: Geometry, B: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Contains(ST_GeomFromWKT('POLYGON((175 150,20 40,50 60,125 100,175 150))'), ST_GeomFromWKT('POINT(174 149)'))
@@ -26,7 +26,7 @@ Format: `ST_Crosses (A: Geometry, B: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Crosses(ST_GeomFromWKT('POLYGON((1 1, 4 1, 4 4, 1 4, 1 1))'),ST_GeomFromWKT('POLYGON((2 2, 5 2, 5 5, 2 5, 2 2))'))
@@ -46,7 +46,7 @@ Format: `ST_Disjoint (A: Geometry, B: Geometry)`
Since: `v1.2.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Disjoint(ST_GeomFromWKT('POLYGON((1 4, 4.5 4, 4.5 2, 1 2, 1 4))'),ST_GeomFromWKT('POLYGON((5 4, 6 4, 6 2, 5 2, 5 4))'))
@@ -72,7 +72,7 @@ Format: `ST_DWithin (leftGeometry: Geometry, rightGeometry: Geometry, distance:
Since: `v1.5.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_DWithin(ST_GeomFromWKT('POINT (0 0)'), ST_GeomFromWKT('POINT (1 0)'), 2.5)
@@ -105,7 +105,7 @@ Format: `ST_Equals (A: Geometry, B: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Equals(ST_GeomFromWKT('LINESTRING(0 0,10 10)'), ST_GeomFromWKT('LINESTRING(0 0,5 5,10 10)'))
@@ -125,7 +125,7 @@ Format: `ST_Intersects (A: Geometry, B: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Intersects(ST_GeomFromWKT('LINESTRING(-43.23456 72.4567,-43.23456 72.4568)'), ST_GeomFromWKT('POINT(-43.23456 72.4567772)'))
@@ -145,7 +145,7 @@ Format: `ST_OrderingEquals(A: geometry, B: geometry)`
Since: `v1.2.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_OrderingEquals(ST_GeomFromWKT('POLYGON((2 0, 0 2, -2 0, 2 0))'), ST_GeomFromWKT('POLYGON((2 0, 0 2, -2 0, 2 0))'))
@@ -157,7 +157,7 @@ Output:
true
```
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_OrderingEquals(ST_GeomFromWKT('POLYGON((2 0, 0 2, -2 0, 2 0))'), ST_GeomFromWKT('POLYGON((0 2, -2 0, 2 0, 0 2))'))
@@ -177,7 +177,7 @@ Format: `ST_Overlaps (A: Geometry, B: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Overlaps(ST_GeomFromWKT('POLYGON((2.5 2.5, 2.5 4.5, 4.5 4.5, 4.5
2.5, 2.5 2.5))'), ST_GeomFromWKT('POLYGON((4 4, 4 6, 6 6, 6 4, 4 4))'))
@@ -197,7 +197,7 @@ Format: `ST_Touches (A: Geometry, B: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Touches(ST_GeomFromWKT('LINESTRING(0 0,1 1,0 2)'),
ST_GeomFromWKT('POINT(0 2)'))
@@ -217,7 +217,7 @@ Format: `ST_Within (A: Geometry, B: Geometry)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Within(ST_GeomFromWKT('POLYGON((0 0,3 0,3 3,0 3,0 0))'),
ST_GeomFromWKT('POLYGON((1 1,2 1,2 2,1 2,1 1))'))
@@ -237,7 +237,7 @@ Format: `ST_Covers (A: Geometry, B: Geometry)`
Since: `v1.3.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Covers(ST_GeomFromWKT('POLYGON((-2 0,0 2,2 0,-2 0))'),
ST_GeomFromWKT('POLYGON((-1 0,0 1,1 0,-1 0))'))
@@ -257,7 +257,7 @@ Format: `ST_CoveredBy (A: Geometry, B: Geometry)`
Since: `v1.3.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_CoveredBy(ST_GeomFromWKT('POLYGON((0 0,3 0,3 3,0 3,0 0))'),
ST_GeomFromWKT('POLYGON((1 1,2 1,2 2,1 2,1 1))'))
diff --git a/docs/api/sql/Raster-aggregate-function.md
b/docs/api/sql/Raster-aggregate-function.md
index b1a68ae5e..2535ff075 100644
--- a/docs/api/sql/Raster-aggregate-function.md
+++ b/docs/api/sql/Raster-aggregate-function.md
@@ -13,7 +13,7 @@ Format: `RS_Union_Aggr(A: rasterColumn, B: indexColumn)`
Since: `v1.5.1`
-Spark SQL Example:
+SQL Example
Contents of `raster_table`.
diff --git a/docs/api/sql/Raster-loader.md b/docs/api/sql/Raster-loader.md
index df31b3715..ef648838e 100644
--- a/docs/api/sql/Raster-loader.md
+++ b/docs/api/sql/Raster-loader.md
@@ -23,7 +23,7 @@ Format: `RS_FromArcInfoAsciiGrid(asc: ARRAY[Byte])`
Since: `v1.4.0`
-Spark SQL Example:
+SQL Example
```scala
var df = sedona.read.format("binaryFile").load("/some/path/*.asc")
@@ -38,7 +38,7 @@ Format: `RS_FromGeoTiff(asc: ARRAY[Byte])`
Since: `v1.4.0`
-Spark SQL Example:
+SQL Example
```scala
var df = sedona.read.format("binaryFile").load("/some/path/*.tiff")
diff --git a/docs/api/sql/Raster-operators.md b/docs/api/sql/Raster-operators.md
index 5b091ac2b..a3b75dbd8 100644
--- a/docs/api/sql/Raster-operators.md
+++ b/docs/api/sql/Raster-operators.md
@@ -10,7 +10,7 @@ Format: `RS_PixelAsCentroid(raster: Raster, colX: Integer,
rowY: Integer)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsText(RS_PixelAsCentroid(RS_MakeEmptyRaster(1, 12, 13, 134, -53,
9), 3, 3))
@@ -31,7 +31,7 @@ Format: `RS_PixelAsCentroids(raster: Raster, band: Integer)`
Since: `v1.5.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsText(RS_PixelAsCentroids(raster, 1)) from rasters
```
@@ -82,7 +82,7 @@ Format: `RS_PixelAsPoint(raster: Raster, colX: Integer, rowY:
Integer)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsText(RS_PixelAsPoint(raster, 2, 1)) from rasters
@@ -93,7 +93,7 @@ Output:
POINT (123.19, -12)
```
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsText(RS_PixelAsPoint(raster, 6, 2)) from rasters
@@ -112,7 +112,7 @@ Format: `RS_PixelAsPoints(raster: Raster, band: Integer)`
Since: `v1.5.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsText(RS_PixelAsPoints(raster, 1)) from rasters
```
@@ -162,7 +162,7 @@ Format: `RS_PixelAsPolygon(raster: Raster, colX: Integer,
rowY: Integer)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsText(RS_PixelAsPolygon(RS_MakeEmptyRaster(1, 5, 10, 123, -230, 8),
2, 3))
@@ -182,7 +182,7 @@ Format: `RS_PixelAsPolygons(raster: Raster, band: Integer)`
Since: `v1.5.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_AsText(RS_PixelAsPolygons(raster, 1)) from rasters
```
@@ -234,7 +234,7 @@ Format: `RS_Envelope (raster: Raster)`
Since: `v1.4.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_Envelope(raster) FROM raster_table
@@ -255,7 +255,7 @@ Format: `RS_ConvexHull(raster: Raster)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_ConvexHull(RS_MakeEmptyRaster(1, 5, 10, 156, -132, 5, 10, 3, 5, 0));
@@ -284,7 +284,7 @@ Format:
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```scala
val inputDf = Seq((Seq(0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0,
0, 0, 1, 0, 0, 0, 0),
@@ -302,7 +302,7 @@ Output:
+----------------------------------------+
```
-Spark SQL Example:
+SQL Example
```scala
val inputDf = Seq((Seq(0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0,
0, 0, 1, 0, 0, 0, 0),
@@ -320,7 +320,7 @@ Output:
+----------------------------------------+
```
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_MinConvexHull(raster, 3) from rasters;
@@ -389,7 +389,7 @@ UpperLeftX + ScaleX * 0.5
UpperLeftY + ScaleY * 0.5
```
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_GeoReference(RS_MakeEmptyRaster(1, 100, 100, -53, 51, 2, -2, 4, 5,
4326))
@@ -406,7 +406,7 @@ Output:
51.000000
```
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_GeoReference(RS_MakeEmptyRaster(1, 3, 4, 100.0, 200.0, 2.0, -3.0,
0.1, 0.2, 0), "GDAL")
@@ -423,7 +423,7 @@ Output:
200.000000
```
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_GeoReference(RS_MakeEmptyRaster(1, 3, 4, 100.0, 200.0, 2.0, -3.0,
0.1, 0.2, 0), "ESRI")
@@ -456,7 +456,7 @@ Format: `RS_GeoTransform(raster: Raster)`
Since: `v1.5.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_GeoTransform(
@@ -478,7 +478,7 @@ Format: `RS_Height(raster: Raster)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_Height(raster) FROM rasters
@@ -498,7 +498,7 @@ Format: `RS_RasterToWorldCoordX(raster: Raster, colX:
Integer, rowY: Integer)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_RasterToWorldCoordX(RS_MakeEmptyRaster(1, 5, 10, -123, 54, 5, -10,
0, 0, 4326), 1, 1) from rasters
@@ -518,7 +518,7 @@ Format: `RS_RasterToWorldCoordY(raster: Raster, colX:
Integer, rowY: Integer)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_RasterToWorldCoordY(RS_MakeEmptyRaster(1, 5, 10, -123, 54, 5, -10,
0, 0, 4326), 1, 1) from rasters
@@ -538,7 +538,7 @@ Format: `RS_RasterToWorldCoord(raster: Raster, colX:
Integer, rowY: Integer)`
Since: `v1.5.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_RasterToWorldCoord(RS_MakeEmptyRaster(1, 5, 10, -123, 54, 5, -10, 0,
0, 4326), 1, 1) from rasters
@@ -558,7 +558,7 @@ Format: `RS_Rotation(raster: Raster)`
Since: `v1.5.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_Rotation(
@@ -585,7 +585,7 @@ Format: `RS_ScaleX(raster: Raster)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_ScaleX(raster) FROM rasters
@@ -610,7 +610,7 @@ Format: `RS_ScaleY(raster: Raster)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_ScaleY(raster) FROM rasters
@@ -629,7 +629,7 @@ Format: `RS_SkewX(raster: Raster)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_SkewX(RS_MakeEmptyRaster(2, 10, 10, 0.0, 0.0, 1.0, -1.0, 0.1, 0.2,
4326))
@@ -649,7 +649,7 @@ Format: `RS_SkewY(raster: Raster)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_SkewY(RS_MakeEmptyRaster(2, 10, 10, 0.0, 0.0, 1.0, -1.0, 0.1, 0.2,
4326))
@@ -669,7 +669,7 @@ Format: `RS_UpperLeftX(raster: Raster)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_UpperLeftX(raster) FROM rasters
@@ -689,7 +689,7 @@ Format: `RS_UpperLeftY(raster: Raster)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_UpperLeftY(raster) FROM rasters
@@ -709,7 +709,7 @@ Format: `RS_Width(raster: Raster)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_Width(raster) FROM rasters
@@ -732,7 +732,7 @@ Format:
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_WorldToRasterCoord(RS_MakeEmptyRaster(1, 5, 5, -53, 51, 1, -1, 0, 0,
4326), -53, 51) from rasters;
@@ -744,7 +744,7 @@ Output:
POINT (1 1)
```
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_WorldToRasterCoord(RS_MakeEmptyRaster(1, 5, 5, -53, 51, 1, -1, 0, 0,
4326), ST_GeomFromText('POINT (-52 51)')) from rasters;
@@ -771,7 +771,7 @@ Format:
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_WorldToRasterCoordX(RS_MakeEmptyRaster(1, 5, 5, -53, 51, 1, -1, 0,
0), -53, 51) from rasters;
@@ -783,7 +783,7 @@ Output:
1
```
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_WorldToRasterCoordX(RS_MakeEmptyRaster(1, 5, 5, -53, 51, 1, -1, 0,
0), ST_GeomFromText('POINT (-53 51)')) from rasters;
@@ -810,7 +810,7 @@ Format:
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_WorldToRasterCoordY(RS_MakeEmptyRaster(1, 5, 5, -53, 51, 1, -1, 0,
0), ST_GeomFromText('POINT (-50 50)'));
@@ -822,7 +822,7 @@ Output:
2
```
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_WorldToRasterCoordY(RS_MakeEmptyRaster(1, 5, 5, -53, 51, 1, -1, 0,
0), -50, 49);
@@ -850,7 +850,7 @@ Format:
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_NumBands(
@@ -881,7 +881,7 @@ Format: `RS_BandNoDataValue (raster: Raster, band: Integer
= 1)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_BandNoDataValue(raster, 1) from rasters;
@@ -893,7 +893,7 @@ Output:
0.0
```
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_BandNoDataValue(raster) from rasters_without_nodata;
@@ -905,7 +905,7 @@ Output:
null
```
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_BandNoDataValue(raster, 3) from rasters;
@@ -925,7 +925,7 @@ Format: `RS_BandIsNoData(raster: Raster, band: Integer = 1)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
WITH rast_table AS (SELECT RS_AddBandFromArray(RS_MakeEmptyRaster(1, 2, 2, 0,
0, 1), ARRAY(10d, 10d, 10d, 10d), 1, 10d) as rast)
@@ -956,7 +956,7 @@ Format: `RS_BandPixelType(rast: Raster, band: Integer = 1)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_BandPixelType(RS_MakeEmptyRaster(2, "D", 5, 5, 53, 51, 1, 1, 0, 0,
0), 2);
@@ -1009,7 +1009,7 @@ Format:
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_Count(RS_MakeEmptyRaster(2, 5, 5, 0, 0, 1, -1, 0, 0, 0), 1, false)
@@ -1021,7 +1021,7 @@ Output:
25
```
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_Count(RS_MakeEmptyRaster(2, 5, 5, 0, 0, 1, -1, 0, 0, 0), 1)
@@ -1052,7 +1052,7 @@ Introduction: Returns summary stats consisting of count,
sum, mean, stddev, min,
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_SummaryStats(RS_MakeEmptyRaster(2, 5, 5, 0, 0, 1, -1, 0, 0, 0), 1,
false)
@@ -1064,7 +1064,7 @@ Output:
25.0, 204.0, 8.16, 9.4678403028357, 0.0, 25.0
```
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_SummaryStats(RS_MakeEmptyRaster(2, 5, 5, 0, 0, 1, -1, 0, 0, 0), 1)
@@ -1120,7 +1120,7 @@ RS_ZonalStats(raster: Raster, zone: Geometry, statType:
String)
Since: `v1.5.1`
-Spark SQL Example:
+SQL Example
```sql
RS_ZonalStats(rast1, geom1, 1, 'sum', false)
@@ -1132,7 +1132,7 @@ Output:
10690406
```
-Spark SQL Example:
+SQL Example
```sql
RS_ZonalStats(rast2, geom2, 1, 'mean', true)
@@ -1188,7 +1188,7 @@ RS_ZonalStatsAll(raster: Raster, zone: Geometry)
Since: `v1.5.1`
-Spark SQL Example:
+SQL Example
```sql
RS_ZonalStatsAll(rast1, geom1, 1, false)
@@ -1200,7 +1200,7 @@ Output:
[184792.0, 1.0690406E7, 57.851021689230684, 0.0, 0.0, 92.13277429243035,
8488.448098819916, 0.0, 255.0]
```
-Spark SQL Example:
+SQL Example
```sql
RS_ZonalStatsAll(rast2, geom2, 1, true)
@@ -1231,7 +1231,7 @@ Format:
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_Contains(RS_MakeEmptyRaster(1, 20, 20, 2, 22, 1),
ST_GeomFromWKT('POLYGON ((5 5, 5 10, 10 10, 10 5, 5 5))')) rast_geom,
@@ -1268,7 +1268,7 @@ Format:
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_Intersects(RS_MakeEmptyRaster(1, 20, 20, 2, 22, 1),
ST_SetSRID(ST_PolygonFromEnvelope(0, 0, 10, 10), 4326)) rast_geom,
@@ -1300,7 +1300,7 @@ Format: `RS_Within(raster0: Raster, raster1: Raster)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_Within(RS_MakeEmptyRaster(1, 20, 20, 2, 22, 1),
ST_GeomFromWKT('POLYGON ((0 0, 0 50, 100 50, 100 0, 0 0))')) rast_geom,
@@ -1346,7 +1346,7 @@ RS_AddBand(toRaster: Raster, fromRaster: Raster)
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_AddBand(raster1, raster2, 2, 1) FROM rasters
@@ -1387,7 +1387,7 @@ Original Raster:
<img alt="Original raster" src="../../../image/original-raster-clip.png"
width="400"/>
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_Clip(
@@ -1401,7 +1401,7 @@ Output:
<img alt="Cropped raster" src="../../../image/cropped-raster.png" width="400"/>
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_Clip(
@@ -1434,7 +1434,7 @@ Format: `RS_MetaData (raster: Raster)`
Since: `v1.4.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_MetaData(raster) FROM raster_table
@@ -1483,7 +1483,7 @@ RS_NormalizeAll (raster: Raster, minLim: Double, maxLim:
Double, normalizeAcross
Since: `v1.6.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_NormalizeAll(raster, 0, 1)
@@ -1497,7 +1497,7 @@ Format: `RS_NumBands (raster: Raster)`
Since: `v1.4.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_NumBands(raster) FROM raster_table
@@ -1549,7 +1549,7 @@ RS_Resample(raster: Raster, referenceRaster: Raster,
useScale: Boolean, algorith
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
WITH INPUT_RASTER AS (
@@ -1575,7 +1575,7 @@ Output:
(-0.33333333333333326,0.19999999999999996,6,5,1.388888888888889,-1.24,0,0,0,1)
```
-Spark SQL Example:
+SQL Example
```sql
WITH INPUT_RASTER AS (
@@ -1601,7 +1601,7 @@ Output:
(0.0, 0.0, 7.0, 5.0, 1.2, -1.4, 0.0, 0.0, 0.0, 1.0)
```
-Spark SQL Example:
+SQL Example
```sql
WITH INPUT_RASTER AS (
@@ -1653,7 +1653,7 @@ RS_SetBandNoDataValue(raster: Raster, bandIndex: Integer
= 1, noDataValue: Doubl
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_BandNoDataValue(
@@ -1701,7 +1701,7 @@ ScaleX SkewY SkewX ScaleY UpperLeftX UpperLeftY
ScaleX SkewY SkewX ScaleY (UpperLeftX + ScaleX * 0.5) (UpperLeftY + ScaleY *
0.5)
```
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_GeoReference(
@@ -1723,7 +1723,7 @@ Output:
3.000000
```
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_GeoReference(
@@ -1745,7 +1745,7 @@ Output:
2.000000
```
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_GeoReference(
@@ -1779,7 +1779,7 @@ RS_SetValue(raster: Raster, bandIndex: Integer = 1, colX:
Integer, rowY: Integer
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_BandAsArray(
@@ -1832,7 +1832,7 @@ The geometry variant of this function accepts all types
of Geometries and it set
!!!Note
If the mentioned `bandIndex` doesn't exist, this will throw an
`IllegalArgumentException`.
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_BandAsArray(
@@ -1852,7 +1852,7 @@ Output:
Array(1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 11.0, 12.0, 13.0, 3.0, 5.0, 14.0, 15.0,
16.0, 0.0, 3.0, 17.0, 18.0, 19.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0)
```
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_BandAsArray(
@@ -1880,7 +1880,7 @@ Format: `RS_SetSRID (raster: Raster, srid: Integer)`
Since: `v1.4.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_SetSRID(raster, 4326)
@@ -1895,7 +1895,7 @@ Format: `RS_SRID (raster: Raster)`
Since: `v1.4.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_SRID(raster) FROM raster_table
@@ -1966,7 +1966,7 @@ RS_Values (raster: Raster, xCoordinates: ARRAY[Integer],
yCoordinates: ARRAY[Int
Since: `v1.4.0`
-Spark SQL Example:
+SQL Example
- For Array of Point geometries:
@@ -2144,7 +2144,7 @@ Since: `v1.4.1`
BandIndex is 1-based and must be between 1 and RS_NumBands(raster). It returns
null if the bandIndex is out of range or the raster is null.
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_BandAsArray(raster, 1) FROM raster_table
@@ -2191,7 +2191,7 @@ Note that: `bandIndex == RS_NumBands(raster) + 1` is an
experimental feature and
!!!Note
RS_AddBandFromArray typecasts the double band values to the given datatype
of the raster. This can lead to overflow values if values beyond the range of
the raster's datatype are provided.
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_AddBandFromArray(raster,
RS_MultiplyFactor(RS_BandAsArray(RS_FromGeoTiff(content), 1), 2)) AS raster
FROM raster_table
@@ -2231,7 +2231,7 @@ Since: `v1.5.0`
`RS_MapAlgebra` runs a script on a raster. The script is written in a map
algebra language called
[Jiffle](https://github.com/geosolutions-it/jai-ext/wiki/Jiffle). The script
takes a raster
as input and returns a raster of the same size as output. The script can be
used to apply a map algebra expression on a raster. The input raster is named
`rast` in the Jiffle script, and the output raster is named `out`.
-Spark SQL Example:
+SQL Example
Calculate the NDVI of a raster with 4 bands (R, G, B, NIR):
@@ -2271,7 +2271,7 @@ Format: `RS_Add (Band1: ARRAY[Double], Band2:
ARRAY[Double])`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```scala
val sumDF = spark.sql("select RS_Add(band1, band2) as sumOfBands from
dataframe")
@@ -2285,7 +2285,7 @@ Format: `RS_Array(length: Integer, value: Double)`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_Array(height * width, 0.0)
@@ -2299,7 +2299,7 @@ Format: `RS_BitwiseAND (Band1: ARRAY[Double], Band2:
ARRAY[Double])`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```scala
val bitwiseandDF = spark.sql("select RS_BitwiseAND(band1, band2) as andvalue
from dataframe")
@@ -2313,7 +2313,7 @@ Format: `RS_BitwiseOR (Band1: ARRAY[Double], Band2:
ARRAY[Double])`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```scala
val bitwiseorDF = spark.sql("select RS_BitwiseOR(band1, band2) as or from
dataframe")
@@ -2327,7 +2327,7 @@ Format: `RS_CountValue (Band1: ARRAY[Double], Target:
Double)`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```scala
val countDF = spark.sql("select RS_CountValue(band1, target) as count from
dataframe")
@@ -2341,7 +2341,7 @@ Format: `RS_Divide (Band1: ARRAY[Double], Band2:
ARRAY[Double])`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```scala
val divideDF = spark.sql("select RS_Divide(band1, band2) as divideBands from
dataframe")
@@ -2359,7 +2359,7 @@ RS_FetchRegion (Band: ARRAY[Double], coordinates:
ARRAY[Integer], dimensions: AR
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```scala
val region = spark.sql("select RS_FetchRegion(Band,Array(0, 0, 1, 2),Array(3,
3)) as Region from dataframe")
@@ -2373,7 +2373,7 @@ Format: `RS_GreaterThan (Band: ARRAY[Double], Target:
Double)`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```scala
val greaterDF = spark.sql("select RS_GreaterThan(band, target) as maskedvalues
from dataframe")
@@ -2387,7 +2387,7 @@ Format: `RS_GreaterThanEqual (Band: ARRAY[Double],
Target: Double)`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```scala
val greaterEqualDF = spark.sql("select RS_GreaterThanEqual(band, target) as
maskedvalues from dataframe")
@@ -2401,7 +2401,7 @@ Format: `RS_LessThan (Band: ARRAY[Double], Target:
Double)`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```scala
val lessDF = spark.sql("select RS_LessThan(band, target) as maskedvalues from
dataframe")
@@ -2415,7 +2415,7 @@ Format: `RS_LessThanEqual (Band: ARRAY[Double], Target:
Double)`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```scala
val lessEqualDF = spark.sql("select RS_LessThanEqual(band, target) as
maskedvalues from dataframe")
@@ -2429,7 +2429,7 @@ Format: `RS_LogicalDifference (Band1: ARRAY[Double],
Band2: ARRAY[Double])`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```scala
val logicalDifference = spark.sql("select RS_LogicalDifference(band1, band2)
as logdifference from dataframe")
@@ -2443,7 +2443,7 @@ Format: `RS_LogicalOver (Band1: ARRAY[Double], Band2:
ARRAY[Double])`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```scala
val logicalOver = spark.sql("select RS_LogicalOver(band1, band2) as logover
from dataframe")
@@ -2457,7 +2457,7 @@ Format: `RS_Mean (Band: ARRAY[Double])`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```scala
val meanDF = spark.sql("select RS_Mean(band) as mean from dataframe")
@@ -2471,7 +2471,7 @@ Format: `RS_Mode (Band: ARRAY[Double])`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```scala
val modeDF = spark.sql("select RS_Mode(band) as mode from dataframe")
@@ -2485,7 +2485,7 @@ Format: `RS_Modulo (Band: ARRAY[Double], Target: Double)`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```scala
val moduloDF = spark.sql("select RS_Modulo(band, target) as modulo from
dataframe")
@@ -2499,7 +2499,7 @@ Format: `RS_Multiply (Band1: ARRAY[Double], Band2:
ARRAY[Double])`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```scala
val multiplyDF = spark.sql("select RS_Multiply(band1, band2) as multiplyBands
from dataframe")
@@ -2513,7 +2513,7 @@ Format: `RS_MultiplyFactor (Band1: ARRAY[Double], Factor:
Double)`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```scala
val multiplyFactorDF = spark.sql("select RS_MultiplyFactor(band1, 2) as
multiplyfactor from dataframe")
@@ -2529,7 +2529,7 @@ Format: `RS_Normalize (Band: ARRAY[Double])`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_Normalize(band)
@@ -2543,7 +2543,7 @@ Format: `RS_NormalizedDifference (Band1: ARRAY[Double],
Band2: ARRAY[Double])`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```scala
val normalizedDF = spark.sql("select RS_NormalizedDifference(band1, band2) as
normdifference from dataframe")
@@ -2557,7 +2557,7 @@ Format: `RS_SquareRoot (Band: ARRAY[Double])`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```scala
val rootDF = spark.sql("select RS_SquareRoot(band) as squareroot from
dataframe")
@@ -2571,7 +2571,7 @@ Format: `RS_Subtract (Band1: ARRAY[Double], Band2:
ARRAY[Double])`
Since: `v1.1.0`
-Spark SQL Example:
+SQL Example
```scala
val subtractDF = spark.sql("select RS_Subtract(band1, band2) as
differenceOfBands from dataframe")
diff --git a/docs/api/sql/Raster-visualizer.md
b/docs/api/sql/Raster-visualizer.md
index 7b044abfe..7df0a6c6f 100644
--- a/docs/api/sql/Raster-visualizer.md
+++ b/docs/api/sql/Raster-visualizer.md
@@ -19,7 +19,7 @@ Format:
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_AsBase64(raster) from rasters
@@ -39,7 +39,7 @@ Format: `RS_AsImage(raster: Raster, imageWidth: Integer =
200)`
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_AsImage(raster, 500) from rasters
@@ -109,7 +109,7 @@ RS_AsMatrix(raster: Raster, band: Integer = 1,
postDecimalPrecision: Integer = 6
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```scala
val inputDf = Seq(Seq(1, 3.333333, 4, 0.0001, 2.2222, 9, 10, 11.11111111, 3,
4, 5, 6)).toDF("band")
@@ -124,7 +124,7 @@ Output:
| 3.00000 4.00000 5.00000 6.00000|
```
-Spark SQL Example:
+SQL Example
```scala
val inputDf = Seq(Seq(1, 3, 4, 0, 2, 9, 10, 11, 3, 4, 5, 6)).toDF("band")
diff --git a/docs/api/sql/Raster-writer.md b/docs/api/sql/Raster-writer.md
index a75ab61c2..ab324b69b 100644
--- a/docs/api/sql/Raster-writer.md
+++ b/docs/api/sql/Raster-writer.md
@@ -23,13 +23,13 @@ Format:
Since: `v1.4.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_AsArcGrid(raster) FROM my_raster_table
```
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_AsArcGrid(raster, 1) FROM my_raster_table
@@ -68,13 +68,13 @@ Format:
Since: `v1.4.1`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_AsGeoTiff(raster) FROM my_raster_table
```
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_AsGeoTiff(raster, 'LZW', '0.75') FROM my_raster_table
@@ -115,7 +115,7 @@ Format:
Since: `v1.5.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_AsPNG(raster) FROM Rasters
@@ -127,7 +127,7 @@ Output:
[-119, 80, 78, 71, 13, 10, 26, 10, 0, 0, 0, 13, 73...]
```
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_AsPNG(RS_Band(raster, Array(3, 1, 2)))
@@ -247,7 +247,7 @@ Since: `v1.5.0`
```
If a raster is provided with any one of these properties, an
IllegalArgumentException is thrown.
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_AsRaster(
@@ -263,7 +263,7 @@ Output:
GridCoverage2D["g...
```
-Spark SQL Example:
+SQL Example
```sql
SELECT RS_AsRaster(
diff --git a/docs/api/viz/sql.md b/docs/api/viz/sql.md
index dfea58175..67582cec0 100644
--- a/docs/api/viz/sql.md
+++ b/docs/api/viz/sql.md
@@ -37,7 +37,7 @@ Since: `v1.0.0`
This function normalizes the weights according to the maximum weight among all
pixels, so different pixels obtain different colors.
-Spark SQL example:
+SQL Example
```sql
SELECT pixels.px, ST_Colorize(pixels.weight, 999) AS color
FROM pixels
@@ -47,7 +47,7 @@ FROM pixels
If a color name is supplied as the third input argument, this function
directly outputs that color without considering the weights. In this case,
every pixel has the same color.
-Spark SQL Example:
+SQL Example
```sql
SELECT pixels.px, ST_Colorize(pixels.weight, 999, 'red') AS color
@@ -72,7 +72,7 @@ Format: `ST_EncodeImage (A: Image)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_EncodeImage(images.img)
@@ -93,7 +93,7 @@ ST_Pixelize (A: Geometry, ResolutionX: Integer, ResolutionY:
Integer, Boundary:
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_Pixelize(shape, 256, 256, (SELECT ST_Envelope_Aggr(shape) FROM pointtable))
@@ -111,7 +111,7 @@ Format: `ST_TileName (A: Pixel, ZoomLevel: Integer)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT ST_TileName(pixels.px, 3)
@@ -128,7 +128,7 @@ Format: `ST_Render (A: Pixel, B: Color, C: Integer -
optional zoom level)`
Since: `v1.0.0`
-Spark SQL Example:
+SQL Example
```sql
SELECT tilename, ST_Render(pixels.px, pixels.color) AS tileimg
diff --git a/mkdocs.yml b/mkdocs.yml
index 1f3b533cc..3f48ae939 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -48,9 +48,9 @@ nav:
- Tune RDD application:
tutorial/Advanced-Tutorial-Tune-your-Application.md
- Storing large raster geometries in Parquet files:
tutorial/storing-blobs-in-parquet.md
- Sedona with Apache Flink:
- - Spatial SQL app: tutorial/flink/sql.md
+ - Spatial SQL app (Flink): tutorial/flink/sql.md
- Sedona with Snowflake:
- - Spatial SQL app: tutorial/snowflake/sql.md
+ - Spatial SQL app (Snowflake): tutorial/snowflake/sql.md
- Examples:
- Scala/Java: tutorial/demo.md
- Python: tutorial/jupyter-notebook.md
@@ -85,18 +85,18 @@ nav:
- Sedona R: api/rdocs
- Sedona with Apache Flink:
- SQL:
- - Overview: api/flink/Overview.md
- - Constructor: api/flink/Constructor.md
- - Function: api/flink/Function.md
- - Aggregator: api/flink/Aggregator.md
- - Predicate: api/flink/Predicate.md
+ - Overview (Flink): api/flink/Overview.md
+ - Constructor (Flink): api/flink/Constructor.md
+ - Function (Flink): api/flink/Function.md
+ - Aggregator (Flink): api/flink/Aggregator.md
+ - Predicate (Flink): api/flink/Predicate.md
- Sedona with Snowflake:
- SQL:
- - Overview: api/snowflake/vector-data/Overview.md
- - Constructor: api/snowflake/vector-data/Constructor.md
- - Function: api/snowflake/vector-data/Function.md
- - Aggregate Function: api/snowflake/vector-data/AggregateFunction.md
- - Predicate: api/snowflake/vector-data/Predicate.md
+ - Overview (Snowflake): api/snowflake/vector-data/Overview.md
+ - Constructor (Snowflake): api/snowflake/vector-data/Constructor.md
+ - Function (Snowflake): api/snowflake/vector-data/Function.md
+ - Aggregate Function (Snowflake):
api/snowflake/vector-data/AggregateFunction.md
+ - Predicate (Snowflake): api/snowflake/vector-data/Predicate.md
- Community:
- Community: community/contact.md
- Contributor Guide: