
I am using Spring Cloud Stream (Kafka) to exchange messages between producer and consumer microservices. The data is currently exchanged with native Java serialization. According to the Spring Cloud Stream documentation, JSON and Avro serialization are supported.

Has anyone tried protobuf serialization (a message converter) in Spring Cloud Stream?

---------------- Edit (added later)

I wrote this MessageConverter:

import com.google.protobuf.AbstractMessage;
import com.google.protobuf.Parser;
import org.springframework.messaging.Message;
import org.springframework.messaging.MessageHeaders;
import org.springframework.messaging.converter.AbstractMessageConverter;
import org.springframework.util.MimeType;

public class ProtobufMessageConverter<T extends AbstractMessage> extends AbstractMessageConverter
{
    private final Parser<T> parser;

    public ProtobufMessageConverter(Parser<T> parser)
    {
        super(new MimeType("application", "protobuf"));
        this.parser = parser;
    }

    @Override
    protected boolean supports(Class<?> clazz)
    {
        // Accept any generated protobuf message type instead of
        // hard-coding a single class such as EquipmentProto.Equipment.
        return AbstractMessage.class.isAssignableFrom(clazz);
    }

    @Override
    protected Object convertFromInternal(Message<?> message, Class<?> targetClass, Object conversionHint)
    {
        if (!(message.getPayload() instanceof byte[]))
        {
            return null;
        }
        try
        {
            // Deserialize the raw bytes with the parser supplied at construction time.
            return parser.parseFrom((byte[]) message.getPayload());
        }
        catch (Exception e)
        {
            this.logger.error(e.getMessage(), e);
        }
        return null;
    }

    @Override
    protected Object convertToInternal(Object payload, MessageHeaders headers, Object conversionHint)
    {
        // Serialize the protobuf message to its binary wire format.
        return ((AbstractMessage) payload).toByteArray();
    }
}
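
For the generated class above, the converter would be instantiated with its protobuf parser. A sketch, assuming protobuf 3.x, where every generated message class exposes a static parser() accessor:

ProtobufMessageConverter<EquipmentProto.Equipment> converter =
        new ProtobufMessageConverter<>(EquipmentProto.Equipment.parser());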

1 Answer


It's really not a question of trying but rather just doing it: message converters are a natural extension mechanism (inherited from Spring Integration) in Spring Cloud Stream that exists specifically to address concerns like this. So yes, you can add your own custom converter.
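
For example, the converter from the question could be registered as a bean so the binder picks it up. A minimal sketch, assuming a Spring Cloud Stream version that provides the @StreamMessageConverter annotation, and reusing the EquipmentProto.Equipment class from the question:

import org.springframework.cloud.stream.annotation.StreamMessageConverter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.messaging.converter.MessageConverter;

@Configuration
public class ProtobufConverterConfiguration
{
    // Sketch: @StreamMessageConverter marks a user-defined converter
    // so the binder adds it to its conversion chain.
    @Bean
    @StreamMessageConverter
    public MessageConverter protobufMessageConverter()
    {
        return new ProtobufMessageConverter<>(EquipmentProto.Equipment.parser());
    }
}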

Also, keep in mind that with Kafka there is also the concept of native serialization/deserialization (serde) performed by the Kafka client itself, so you need to make sure the two mechanisms do not conflict with each other.
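
For instance (a sketch; the binding names input and output are placeholders, and the property names assume the standard binding properties), you would point the bindings at the converter's MIME type and keep native encoding/decoding off, so that the framework's converters rather than Kafka serdes handle the payload:

spring.cloud.stream.bindings.output.contentType=application/protobuf
spring.cloud.stream.bindings.output.producer.useNativeEncoding=false
spring.cloud.stream.bindings.input.contentType=application/protobuf
spring.cloud.stream.bindings.input.consumer.useNativeDecoding=false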