Conversation

Contributor

@kay-kim kay-kim commented Oct 7, 2025

@kay-kim kay-kim requested a review from a team as a code owner October 7, 2025 15:23
write data to and details the encoding of that data. You can sink data from a
**materialized** view, a source, or a table.
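
As a rough sketch, a sink from a materialized view could be created along these lines (the view, connection, and topic names are placeholders, and the format/envelope options will vary by setup):

```sql
-- Hypothetical names; assumes a Kafka connection named kafka_connection already exists.
CREATE SINK my_view_sink
  FROM my_materialized_view
  INTO KAFKA CONNECTION kafka_connection (TOPIC 'sink-topic')
  FORMAT JSON
  ENVELOPE DEBEZIUM;
```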

### Creating a sink
Contributor Author


Moved into the Kafka page since it is Kafka-specific.

- Avoid putting sinks on the same cluster that hosts sources to allow for
[blue/green deployment](/manage/dbt/blue-green-deployments).
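
As a sketch of that guideline, a sink can be pinned to a cluster that hosts no sources (the cluster name and size below are placeholders):

```sql
-- Hypothetical: a cluster dedicated to sinks, kept separate from source clusters.
CREATE CLUSTER sink_cluster (SIZE = '50cc');

CREATE SINK my_view_sink
  IN CLUSTER sink_cluster
  FROM my_materialized_view
  INTO KAFKA CONNECTION kafka_connection (TOPIC 'sink-topic')
  FORMAT JSON
  ENVELOPE DEBEZIUM;
```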

### Available guides
Contributor Author


Included in the table.


Materialize bundles a **native connector** that allows writing data to Kafka and
Redpanda. When a user defines a sink to Kafka/Redpanda, Materialize
Contributor Author


Content ported over from the Sinks pages since it was specific to the Kafka/Redpanda connector.


Currently, Materialize only supports sending sink data to Kafka. See
the [Kafka sink documentation](/sql/create-sink/kafka) for details.
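
As a rough illustration of that path (the broker address and security settings are placeholders; see the linked page for the supported options):

```sql
-- Hypothetical broker; production setups typically configure SASL/SSL instead.
CREATE CONNECTION kafka_connection TO KAFKA (
  BROKER 'broker.example.com:9092',
  SECURITY PROTOCOL = 'PLAINTEXT'
);
```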
## Sink methods
Contributor Author

@kay-kim kay-kim Oct 7, 2025



## Overview

Sinks are the inverse of sources and represent a connection to an external stream
Contributor Author


Replaced Kafka-specific content with more generic content (same as in the server-results/sink page).

## Hydration considerations

During creation, Kafka sinks need to load an entire snapshot of the data in memory.
Contributor Author


Probably will tweak to Kafka/Redpanda sinks since ...

- External system: "Various"
  Method: "Use `SUBSCRIBE`"
  Guide(s) or Example(s): |
    - [Sink to Postgres](https://github.com/MaterializeInc/mz-catalog-sync)
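
A minimal sketch of the `SUBSCRIBE` method, assuming a materialized view named `my_materialized_view`; a client application reads the emitted changes and writes them to the external system:

```sql
-- Open a cursor over a subscription and pull whatever changes are available.
BEGIN;
DECLARE c CURSOR FOR SUBSCRIBE my_materialized_view;
FETCH ALL c WITH (timeout = '1s');  -- repeat FETCH to keep consuming
COMMIT;
```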
Contributor Author


Should I mention dbmate in here? That is, `SUBSCRIBE` + dbmate?

Contributor


This repo is private 😅 I'll make it public.

Contributor Author


Oh ... heh heh ... Thank you ❤️ ❤️ ❤️
