Debezium is an open-source platform that enables Change Data Capture (CDC) for efficient real-time updates of locally accelerated datasets. Spice supports connecting to a Kafka topic managed by Debezium to keep datasets up-to-date with the source data.
`from` — The `from` field takes the form `debezium:kafka_topic`, where `kafka_topic` is the name of the Kafka topic on which Debezium notifies consumers about upstream changes. For example, `from: debezium:my_kafka_topic_with_debezium_changes` listens to the `my_kafka_topic_with_debezium_changes` topic.
`name` — The dataset name. This will be used as the table name within Spice. The dataset name cannot be a reserved keyword.
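As a sketch, a minimal Spicepod dataset entry using these two fields might look like the following (the dataset name `cdc_events` is a placeholder chosen for illustration):

```yaml
datasets:
  - from: debezium:my_kafka_topic_with_debezium_changes
    name: cdc_events # used as the table name within Spice; must not be a reserved keyword
```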
`params`

| Parameter Name | Description |
|---|---|
| `debezium_transport` | Optional. The message broker transport to use. Default: `kafka`. Possible values: `kafka` (use Kafka as the message broker transport). Spice may support additional transports in the future. |
| `debezium_message_format` | Optional. The message format to use. Default: `json`. Possible values: `json` (use JSON as the message format). Spice is expected to support additional message formats in the future, such as `avro`. |
| `kafka_bootstrap_servers` | Required. A list of host/port pairs for establishing the initial connection to the Kafka cluster. The client will make use of all servers, irrespective of which servers are listed here for bootstrapping; this list only affects the initial hosts used to discover the full set of servers. Format: `host1:port1,host2:port2,...`. |
| `kafka_security_protocol` | Security protocol for Kafka connections. Default: `SASL_SSL`. Options: `PLAINTEXT`, `SSL`, `SASL_PLAINTEXT`, `SASL_SSL`. |
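Putting the connection parameters together, a sketch of the `params` block might look like this (the broker addresses are placeholders):

```yaml
params:
  debezium_transport: kafka         # currently the only supported transport
  debezium_message_format: json     # currently the only supported message format
  kafka_bootstrap_servers: broker1:9092,broker2:9092
  kafka_security_protocol: SASL_SSL
```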
:::warning
Using the Debezium connector requires acceleration to be enabled.
:::
The following acceleration settings apply:
| Parameter Name | Description |
|---|---|
| `enabled` | Required. Must be set to `true` to enable acceleration. |
| `engine` | Required. The acceleration engine to use. Possible valid values: `duckdb` (use DuckDB as the acceleration engine), `sqlite` (use SQLite), `postgres` (use PostgreSQL). |
| `refresh_mode` | Optional. The refresh mode to use. If specified, this must be set to `changes`; any other value is an error. |
| `mode` | Optional. The persistence mode to use. When using the `duckdb` and `sqlite` engines, it is recommended to set this to `file` to persist the data across restarts. Spice also persists metadata about the dataset, so it can resume from the last known state of the dataset instead of re-fetching the entire dataset. |
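A sketch of an acceleration block following these settings, using DuckDB as the engine:

```yaml
acceleration:
  enabled: true        # required for the Debezium connector
  engine: duckdb       # or sqlite / postgres
  refresh_mode: changes
  mode: file           # persist data and dataset state across restarts
```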
Spice integrates with multiple secret stores to help manage sensitive data securely. For detailed information on supported secret stores, refer to the secret stores documentation. Additionally, learn how to use referenced secrets in component parameters by visiting the using referenced secrets guide.
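For example, a secret can be referenced in a parameter rather than hard-coding the value, assuming the `${secrets:...}` reference syntax described in the linked guides and a hypothetical secret named `kafka_password`:

```yaml
params:
  kafka_sasl_username: kafka-user                 # placeholder username
  kafka_sasl_password: ${secrets:kafka_password}  # resolved from a configured secret store
```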
See an example of configuring a dataset to use CDC with Debezium by following the sample Streaming changes in real-time with Debezium CDC.
An example of Streaming changes in real-time with Debezium CDC and SASL/SCRAM authentication is available as well.
Additional Kafka authentication and TLS parameters:

| Parameter Name | Description |
|---|---|
| `kafka_sasl_mechanism` | SASL (Simple Authentication and Security Layer) authentication mechanism. Default: `SCRAM-SHA-512`. Options: `PLAIN`, `SCRAM-SHA-256`, `SCRAM-SHA-512`. |
| `kafka_sasl_username` | SASL username. |
| `kafka_sasl_password` | SASL password. |
| `kafka_ssl_ca_location` | Path to the SSL/TLS CA certificate file for server verification. |
| `kafka_enable_ssl_certificate_verification` | Enable SSL/TLS certificate verification. Default: `true`. |
| `kafka_ssl_endpoint_identification_algorithm` | SSL/TLS endpoint identification algorithm. Default: `https`. Options: `none`, `https`. |
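Combining the authentication parameters, a sketch of a SASL/SCRAM configuration might look like the following (all values are placeholders):

```yaml
params:
  kafka_bootstrap_servers: broker1:9092,broker2:9092
  kafka_security_protocol: SASL_SSL
  kafka_sasl_mechanism: SCRAM-SHA-512
  kafka_sasl_username: kafka-user
  kafka_sasl_password: example-password   # prefer a secret store reference in practice
  kafka_ssl_ca_location: ./certs/ca.crt   # hypothetical path to the CA certificate
```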