# flink-cdc-connectors

**Repository Path**: mirrors_ververica/flink-cdc-connectors

## Basic Information

- **Project Name**: flink-cdc-connectors
- **Description**: Flink CDC is a streaming data integration tool
- **Primary Language**: Unknown
- **License**: Apache-2.0
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 4
- **Forks**: 4
- **Created**: 2022-10-11
- **Last Updated**: 2026-04-13

## Categories & Tags

- **Categories**: Uncategorized
- **Tags**: None

## README
Flink CDC is a distributed data integration tool for real-time and batch data, built on top of Apache Flink. It prioritizes efficient end-to-end data integration and offers enhanced functionality such as full database synchronization, sharding table synchronization, schema evolution, and data transformation.

## API Layers

Flink CDC provides three API layers for different usage scenarios:

### 1. YAML API (Pipeline API)

The YAML API provides a declarative, zero-code approach to defining data pipelines. Users describe the source, sink, [routing](https://nightlies.apache.org/flink/flink-cdc-docs-stable/docs/core-concept/route/), [transformation](https://nightlies.apache.org/flink/flink-cdc-docs-stable/docs/core-concept/transform/), and [schema evolution](https://nightlies.apache.org/flink/flink-cdc-docs-stable/docs/core-concept/schema-evolution/) rules in a YAML file and submit it via the `flink-cdc.sh` CLI. Please refer to the [Quickstart Guide](https://nightlies.apache.org/flink/flink-cdc-docs-stable/docs/get-started/introduction/) for detailed setup instructions.
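Once a pipeline definition like the one below is saved to a file, it is submitted with the CLI bundled in the Flink CDC distribution. A minimal sketch, assuming an unpacked Flink CDC distribution, a running Flink cluster reachable via `FLINK_HOME`, and a hypothetical file name `mysql-to-doris.yaml`:

```shell
# Hedged example: paths and the YAML file name are illustrative.
# Requires FLINK_HOME to point at a Flink installation with a running cluster.
cd flink-cdc-*/                           # unpacked Flink CDC distribution directory
bash bin/flink-cdc.sh mysql-to-doris.yaml # submit the pipeline definition
```

The CLI translates the YAML definition into a Flink job and submits it to the cluster, so no application code needs to be compiled.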
```yaml
source:
  type: mysql
  hostname: localhost
  port: 3306
  username: root
  password: 123456
  tables: app_db.\.*

sink:
  type: doris
  fenodes: 127.0.0.1:8030
  username: root
  password: ""

# Transform data on-the-fly
transform:
  - source-table: app_db.orders
    projection: id, order_id, UPPER(product_name) as product_name
    filter: id > 10 AND order_id > 100

# Route source tables to different sink tables
route:
  - source-table: app_db.orders
    sink-table: ods_db.ods_orders
  - source-table: app_db.shipments
    sink-table: ods_db.ods_shipments
  - source-table: app_db.\.*
    sink-table: ods_db.others

pipeline:
  name: Sync MySQL Database to Doris
  parallelism: 2
  schema.change.behavior: evolve # Support schema evolution
```

**Pipeline connectors:** [Doris](https://mvnrepository.com/artifact/org.apache.flink/flink-cdc-pipeline-connector-doris) | [Elasticsearch](https://mvnrepository.com/artifact/org.apache.flink/flink-cdc-pipeline-connector-elasticsearch) | [Fluss](https://mvnrepository.com/artifact/org.apache.flink/flink-cdc-pipeline-connector-fluss) | [Hudi](https://mvnrepository.com/artifact/org.apache.flink/flink-cdc-pipeline-connector-hudi) | [Iceberg](https://mvnrepository.com/artifact/org.apache.flink/flink-cdc-pipeline-connector-iceberg) | [Kafka](https://mvnrepository.com/artifact/org.apache.flink/flink-cdc-pipeline-connector-kafka) | [MaxCompute](https://mvnrepository.com/artifact/org.apache.flink/flink-cdc-pipeline-connector-maxcompute) | [MySQL](https://mvnrepository.com/artifact/org.apache.flink/flink-cdc-pipeline-connector-mysql) | [OceanBase](https://mvnrepository.com/artifact/org.apache.flink/flink-cdc-pipeline-connector-oceanbase) | [Oracle](https://mvnrepository.com/artifact/org.apache.flink/flink-cdc-pipeline-connector-oracle) | [Paimon](https://mvnrepository.com/artifact/org.apache.flink/flink-cdc-pipeline-connector-paimon) | [PostgreSQL](https://mvnrepository.com/artifact/org.apache.flink/flink-cdc-pipeline-connector-postgres) |
[StarRocks](https://mvnrepository.com/artifact/org.apache.flink/flink-cdc-pipeline-connector-starrocks)

See the [connector overview](https://nightlies.apache.org/flink/flink-cdc-docs-stable/docs/connectors/pipeline-connectors/overview/) for a full list and configurations.

### 2. SQL API (Table/SQL API)

The SQL API integrates with Flink SQL, allowing users to define CDC sources using SQL DDL statements. Deploy the SQL connector JAR to `FLINK_HOME/lib/` and use it directly in the Flink SQL Client:

```sql
CREATE TABLE mysql_binlog (
  id INT NOT NULL,
  name STRING,
  description STRING,
  weight DECIMAL(10,3),
  PRIMARY KEY(id) NOT ENFORCED
) WITH (
  'connector' = 'mysql-cdc',
  'hostname' = 'localhost',
  'port' = '3306',
  'username' = 'flinkuser',
  'password' = 'flinkpw',
  'database-name' = 'inventory',
  'table-name' = 'products'
);

SELECT id, UPPER(name), description, weight FROM mysql_binlog;
```

**Available SQL connectors** (dependencies bundled): [MySQL](https://mvnrepository.com/artifact/org.apache.flink/flink-sql-connector-mysql-cdc) | [PostgreSQL](https://mvnrepository.com/artifact/org.apache.flink/flink-sql-connector-postgres-cdc) | [Oracle](https://mvnrepository.com/artifact/org.apache.flink/flink-sql-connector-oracle-cdc) | [SQL Server](https://mvnrepository.com/artifact/org.apache.flink/flink-sql-connector-sqlserver-cdc) | [MongoDB](https://mvnrepository.com/artifact/org.apache.flink/flink-sql-connector-mongodb-cdc) | [OceanBase](https://mvnrepository.com/artifact/org.apache.flink/flink-sql-connector-oceanbase-cdc) | [TiDB](https://mvnrepository.com/artifact/org.apache.flink/flink-sql-connector-tidb-cdc) | [Db2](https://mvnrepository.com/artifact/org.apache.flink/flink-sql-connector-db2-cdc) | [Vitess](https://mvnrepository.com/artifact/org.apache.flink/flink-sql-connector-vitess-cdc)

See the [source connector overview](https://nightlies.apache.org/flink/flink-cdc-docs-stable/docs/connectors/flink-sources/overview/) for a full list and configurations.

### 3. DataStream API

The DataStream API provides programmatic access for building custom Flink streaming applications. Add the corresponding connector as a Maven dependency:

```xml