infoverload commented on a change in pull request #18812:
URL: https://github.com/apache/flink/pull/18812#discussion_r809826161



##########
File path: docs/content/docs/dev/configuration/connector.md
##########
@@ -24,39 +24,42 @@ under the License.
 
 # Connectors and Formats
 
-Flink can read from and write to various external systems via connectors and define the format in 
-which to store the data.
+Flink can read from and write to various external systems via connectors and use the format of your choice
+in order to read/write data from/into records.
 
-The way that information is serialized is represented in the external system and that system needs
-to know how to read this data in a format that can be read by Flink.  This is done through format 
-dependencies.
+An overview of available connectors and formats is available for both
+[DataStream]({{< ref "docs/connectors/datastream/overview.md" >}}) and
+[Table API/SQL]({{< ref "docs/connectors/table/overview.md" >}}).
 
-Most applications need specific connectors to run. Flink provides a set of formats that can be used 
-with connectors (with the dependencies for both being fairly unified). These are not part of Flink's 
-core dependencies and must be added as dependencies to the application.
+In order to use connectors and formats, you need to make sure Flink has access to the artifacts implementing them. 
+For each connector supported by the Flink community, we publish on [Maven Central](https://search.maven.org) two artifacts:
 
-## Adding Dependencies 
+* `flink-connector-<NAME>` which is a thin JAR including only the connector code, but excluding eventual 3rd party dependencies
+* `flink-sql-connector-<NAME>` which is an uber JAR ready to use with all the connector 3rd party dependencies.
 
-For more information on how to add dependencies, refer to the build tools sections on [Maven]({{< ref "docs/dev/configuration/maven" >}})
-and [Gradle]({{< ref "docs/dev/configuration/gradle" >}}). 
+The same applies for formats as well. Also note that some connectors, because they don't require 3rd party dependencies,
+may not have a corresponding `flink-sql-connector-<NAME>` artifact.
 
-## Packaging Dependencies
+{{< hint info >}}
+The uber JARs are supported mostly for being used in conjunction with [SQL client]({{< ref "docs/dev/table/sqlClient" >}}),
+but you can also use them in any DataStream/Table job.

Review comment:
    ```suggestion
    The uber/fat JARs are supported mostly for being used in conjunction with the [SQL client]({{< ref "docs/dev/table/sqlClient" >}}),
    but you can also use them in any DataStream/Table job.
    ```
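For readers following the thin-vs-uber JAR distinction discussed in this diff, declaring the thin connector artifact in an application's `pom.xml` might look like the sketch below. The connector name `flink-connector-kafka` and the version are illustrative assumptions, not taken from this PR:

```xml
<!-- Illustrative sketch only: connector name and version are assumptions.
     The thin JAR excludes the connector's 3rd party dependencies; Maven
     resolves those transitively, and they are typically bundled into the
     application's own uber JAR when packaging for deployment. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka</artifactId>
    <version>1.15.0</version>
</dependency>
```

The `flink-sql-connector-<NAME>` uber JAR, by contrast, is typically placed on the classpath directly (for example for use with the SQL client, as the hinted text above notes) rather than declared as a build dependency.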




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@flink.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

