I am having trouble figuring out how to test a Spring Cloud Stream Kafka Streams application that uses Avro as the message format together with a (Confluent) Schema Registry.
The configuration could be something like this:
spring:
  application:
    name: shipping-service
  cloud:
    stream:
      schema-registry-client:
        endpoint: http://localhost:8081
      kafka:
        streams:
          binder:
            configuration:
              application:
                id: shipping-service
              default:
                key:
                  serde: org.apache.kafka.common.serialization.Serdes$IntegerSerde
              schema:
                registry:
                  url: ${spring.cloud.stream.schema-registry-client.endpoint}
              value:
                subject:
                  name:
                    strategy: io.confluent.kafka.serializers.subject.RecordNameStrategy
          bindings:
            input:
              consumer:
                valueSerde: io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde
            order:
              consumer:
                valueSerde: io.confluent.kafka.streams.serdes.avro.GenericAvroSerde
            output:
              producer:
                valueSerde: io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde
      bindings:
        input:
          destination: customer
        order:
          destination: order
        output:
          destination: order
server:
  port: 8086
logging:
  level:
    org.springframework.kafka.config: debug
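For reference, the bindings above correspond to a processor interface roughly along these lines (just a sketch: ShippingKStreamProcessor is a made-up name, and the real code uses the generated Avro classes rather than SpecificRecord):

import org.apache.avro.generic.GenericRecord
import org.apache.avro.specific.SpecificRecord
import org.apache.kafka.streams.kstream.KStream
import org.springframework.cloud.stream.annotation.Input
import org.springframework.cloud.stream.annotation.Output

// Keys are Ints (Serdes$IntegerSerde above); "input" and "output" use SpecificAvroSerde,
// while "order" uses GenericAvroSerde and is therefore exposed as GenericRecord.
interface ShippingKStreamProcessor {

    @Input("input")
    fun input(): KStream<Int, SpecificRecord>

    @Input("order")
    fun order(): KStream<Int, GenericRecord>

    @Output("output")
    fun output(): KStream<Int, SpecificRecord>
}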
NOTES:
- The application uses native serialization/deserialization.
- Test framework: JUnit 5
For the Kafka broker I guess I should use an EmbeddedKafkaBroker bean, but as you can see, the application also relies on a Schema Registry that needs to be mocked in some way. How can I do that?
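The only direction I can think of so far is a skeleton like the one below (a sketch only: the class and test names are placeholders, and the mock:// endpoint assumes a Confluent client version whose serializers/serdes resolve mock:// URLs to an in-memory MockSchemaRegistry):

import org.junit.jupiter.api.Test
import org.springframework.boot.test.context.SpringBootTest
import org.springframework.kafka.test.context.EmbeddedKafka
import org.springframework.test.context.TestPropertySource

// An embedded broker for Kafka plus a mock:// schema registry endpoint so the
// Confluent Avro serdes never call a real registry.
@SpringBootTest
@EmbeddedKafka(topics = ["customer", "order"])
@TestPropertySource(properties = [
    // spring.embedded.kafka.brokers is populated by @EmbeddedKafka at runtime.
    "spring.cloud.stream.kafka.streams.binder.brokers=\${spring.embedded.kafka.brokers}",
    // application.yml already resolves schema.registry.url from this endpoint property,
    // so overriding the endpoint alone should redirect the serdes as well.
    "spring.cloud.stream.schema-registry-client.endpoint=mock://test-registry"
])
class ShippingServiceApplicationTests {

    @Test
    fun `should ship an order`() {
        // TODO: produce Avro test records against the embedded broker (using serializers
        // configured with the same mock://test-registry URL) and assert on the output topic.
    }
}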
@Value("\${spring.cloud.stream.schema-registry-client.endpoint}") endpoint: String), this library provides it at runtimethis.getSchemaRegistryUrl()- codependentspring.cloud.stream.schema-registry-client.endpointproperty the same way it's done with the boostrap servers here: github.com/spring-cloud/spring-cloud-stream-samples/blob/master/… - codependent