I'm trying to sink data directly from KSQL into InfluxDB (or any other connector that requires schema definitions). I can get things working in the simple case, but I run into trouble when the schema requires complex types (e.g., tags for InfluxDB).
Here's an example of my stream/schema:
```
Field   | Type
---------------------------------------------------
ROWKEY  | VARCHAR(STRING)  (primary key)
FIELD_1 | VARCHAR(STRING)
FIELD_2 | VARCHAR(STRING)
FIELD_3 | VARCHAR(STRING)
FIELD_4 | DOUBLE
TAGS    | MAP<STRING, VARCHAR(STRING)>
```
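For context, a KSQL statement producing a stream of this shape would look roughly like this (a sketch; the stream and topic names are made up, and ROWKEY is implicit):

```sql
CREATE STREAM my_stream (
    FIELD_1 VARCHAR,
    FIELD_2 VARCHAR,
    FIELD_3 VARCHAR,
    FIELD_4 DOUBLE,
    TAGS MAP<VARCHAR, VARCHAR>
  ) WITH (
    KAFKA_TOPIC='my_topic',
    VALUE_FORMAT='AVRO'
  );
```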
If I manually create an AVRO schema and populate the records from a simple producer, I can follow the getting started guide here and embed the tags for InfluxDB.
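The hand-written Avro schema in that working case looks something like the following (a sketch; the record name is made up), with the tags represented as an Avro map so the connector can pick them up:

```json
{
  "type": "record",
  "name": "Measurement",
  "fields": [
    {"name": "FIELD_1", "type": "string"},
    {"name": "FIELD_2", "type": "string"},
    {"name": "FIELD_3", "type": "string"},
    {"name": "FIELD_4", "type": "double"},
    {"name": "tags", "type": {"type": "map", "values": "string"}}
  ]
}
```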
However, when I move to KSQL and try to sink the AVRO stream directly into InfluxDB, I lose the information carried by the complex types (the tags). I also notice the warning in this blog post: "Warning: ksqlDB/KSQL cannot yet write data in an Avro format that is compatible with this connector."
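For reference, the sink configuration I'm running is along these lines (a sketch; URLs, topic, and database names are placeholders):

```json
{
  "name": "influxdb-sink",
  "config": {
    "connector.class": "io.confluent.influxdb.InfluxDBSinkConnector",
    "topics": "my_topic",
    "influxdb.url": "http://localhost:8086",
    "influxdb.db": "my_db",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://localhost:8081"
  }
}
```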
Next, I try converting the AVRO stream into JSON, but then I understand I would have to embed the schema in each record, similar to what this question describes. I haven't found a way to convert an AVRO stream into a JSON stream that embeds both the schema and the payload.
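To illustrate what I mean, here is my understanding of the envelope that Kafka Connect's JsonConverter expects when `schemas.enable=true`: each record carries both a "schema" and a "payload" object. This is a hand-built Python sketch using field names from the stream above; the map-schema encoding ("keys"/"values") reflects my reading of the Connect JSON format, not anything KSQL produces:

```python
import json

# A single record in the JsonConverter schemas.enable=true envelope.
# Field names mirror the example stream; values are illustrative.
record = {
    "schema": {
        "type": "struct",
        "fields": [
            {"field": "FIELD_1", "type": "string", "optional": True},
            {"field": "FIELD_4", "type": "double", "optional": True},
            {
                "field": "TAGS",
                "type": "map",
                "keys": {"type": "string", "optional": False},
                "values": {"type": "string", "optional": True},
                "optional": True,
            },
        ],
        "optional": False,
    },
    "payload": {
        "FIELD_1": "some-value",
        "FIELD_4": 3.14,
        "TAGS": {"host": "server-1", "region": "us-east"},
    },
}

# This is what would have to be written into the topic for every record.
print(json.dumps(record))
```

KSQL's JSON output contains only the payload portion, which is why the connector can no longer see the tag structure.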
Finally, I see the "jiggling solution" with kafkacat, but that would force me to dump records out of KSQL through kafkacat and back into Kafka before they finally arrive in InfluxDB.
Is there a way to sink records with complex types directly from KSQL into a connector, in either JSON or AVRO format?