I have a requirement to ingest data from Kafka into Snowflake, and I want to design the ingestion using the Snowflake Kafka connector. My understanding is that my sink needs to be a table with a VARIANT column, so the connector will push each Kafka message into that column. Once the data is in the VARIANT column, I will need to run another process to read the column, parse it, and store the data in the respective target table. Please let me know if this is an optimal approach for ingesting Kafka data.
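For context, my understanding is that the connector lands messages in a table with two VARIANT columns, roughly shaped like this (the column names follow the connector's documented defaults; the table name is just illustrative):

```sql
-- Illustrative shape of the landing table the Kafka connector writes to.
-- RECORD_METADATA and RECORD_CONTENT are the connector's default columns;
-- the table name kafka_landing is hypothetical.
CREATE TABLE IF NOT EXISTS kafka_landing (
    RECORD_METADATA VARIANT,  -- topic, partition, offset, timestamp, key
    RECORD_CONTENT  VARIANT   -- the Kafka message payload
);
```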

Thanks

1 Answer

You are correct; this is a good path forward. Your Kafka sink connector will create the landing tables automatically and ingest data into them. You can then use a stream-based task to load data from the landing table into another table. Check out this doc:

https://docs.snowflake.com/en/user-guide/data-pipelines-intro.html
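Here is a minimal sketch of the stream-plus-task pattern that doc describes, assuming the landing table from the question; all object names, the warehouse, the schedule, and the JSON paths are illustrative and should be adjusted to your schema:

```sql
-- Capture new rows arriving in the connector's landing table.
CREATE OR REPLACE STREAM kafka_landing_stream ON TABLE kafka_landing;

-- Periodically parse new VARIANT rows into a typed target table.
CREATE OR REPLACE TASK parse_kafka_messages
  WAREHOUSE = my_wh                        -- hypothetical warehouse
  SCHEDULE  = '1 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('KAFKA_LANDING_STREAM')
AS
  INSERT INTO target_table (event_id, event_type, created_at)
  SELECT
      RECORD_CONTENT:id::NUMBER,           -- illustrative JSON paths
      RECORD_CONTENT:type::STRING,
      RECORD_CONTENT:created_at::TIMESTAMP_NTZ
  FROM kafka_landing_stream;

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK parse_kafka_messages RESUME;
```

Because the task's INSERT selects from the stream, consuming it in DML advances the stream offset, so each message is parsed exactly once per run.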