
I'm trying to use spring-cloud-stream 1.0.0.M4 with various binders. I'm not sharing my data transfer objects between producer and consumer (who does that?), so I've run into the need to configure a content type for the bindings.

producer config:

spring:
  cloud:
    stream:
      bindings:
        customer-save: "customer-save"
        customer-save.content-type: application/json
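
For reference, the same producer settings in `.properties` form (a mechanical translation of the YAML above; the dotted `customer-save.content-type` key maps the same way under Spring's relaxed binding):

```
spring.cloud.stream.bindings.customer-save=customer-save
spring.cloud.stream.bindings.customer-save.content-type=application/json
```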

producer code:

public interface CustomerChannels {

    @Output("customer-save")
    MessageChannel save();

}

@Service
public class CustomerServiceImpl implements CustomerService {

    @Autowired
    private CustomerChannels customerChannels;

    ...

    @Override
    public void insertCustomer(Customer customer) {
        customerChannels.save().send(MessageBuilder.withPayload(customer).build());
    }
}

consumer config:

spring:
  cloud:
    stream:
      bindings:
        customer-save: "customer-save"
        customer-save.content-type: application/x-java-object;type=com.build.customer.domain.Customer

consumer code:

public interface CustomerChannels {

    String CUSTOMER_SAVE = "customer-save";

    @Input(CUSTOMER_SAVE)
    SubscribableChannel save();
}

@MessageEndpoint
public class CustomerProcessor {

    @Autowired
    private CustomerDao customerDao;

    @ServiceActivator(inputChannel = CustomerChannels.CUSTOMER_SAVE)
    public void saveCustomer(Customer customer) {
        if (customer.getId() == null) {
            customerDao.insertCustomer(customer);
        } else {
            customerDao.updateCustomer(customer);
        }
    }
}

I've tried this with the rabbit, redis, and kafka binders and found that the JSON hack only works with rabbit. With kafka and redis, the consumer fails with the errors below.

kafka consumer error:

2016-03-05 21:54:43.337 ERROR 18846 --- [pool-8-thread-1] o.s.i.k.listener.LoggingErrorHandler     : Error while processing: KafkaMessage [Message(magic = 0, attributes = 0, crc = 2188302100, key = null, payload = java.nio.HeapByteBuffer[pos=0 lim=248 cap=248]), KafkaMessageMetadata [offset=4, nextOffset=5, Partition[topic='customer-save', id=0]]

org.springframework.messaging.MessageDeliveryException: failed to send Message to channel 'customer-save'; nested exception is java.lang.IllegalArgumentException: Unknown type for contentType header value: class java.util.LinkedHashMap
    at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:468) ~[spring-integration-core-4.2.5.RELEASE.jar:na]
    at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:392) ~[spring-integration-core-4.2.5.RELEASE.jar:na]
    at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:115) ~[spring-messaging-4.2.5.RELEASE.jar:4.2.5.RELEASE]

redis consumer error:

2016-03-05 21:51:04.727 ERROR 18122 --- [hannel-adapter1] o.s.c.s.b.r.RedisMessageChannelBinder$1  : Failed to deliver message; retries exhausted; message sent to queue 'ERRORS:customer-save.anonymous.08e94dac-49fe-464e-b800-28a3844dbaf6' 

org.springframework.messaging.MessageDeliveryException: failed to send Message to channel 'customer-save'; nested exception is java.lang.IllegalArgumentException: Unknown type for contentType header value: class java.util.LinkedHashMap
    at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:468) [spring-integration-core-4.2.5.RELEASE.jar:na]
    at org.springframework.integration.channel.AbstractMessageChannel.send(AbstractMessageChannel.java:392) [spring-integration-core-4.2.5.RELEASE.jar:na]
    at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:115) [spring-messaging-4.2.5.RELEASE.jar:4.2.5.RELEASE]

1 Answer


This is a known issue; it is a side effect of having to serialize the message headers for the Kafka and Redis binders (RabbitMQ supports rich headers natively, which is why the rabbit binder works). When the embedded headers are deserialized on the consumer side, the `contentType` header comes back as a `java.util.LinkedHashMap` rather than the expected type, which is exactly what the `Unknown type for contentType header value` error in your logs is complaining about.

It is fixed on master.

Try 1.0.0.BUILD-SNAPSHOT from the snapshot repo.
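
To pull the snapshot you'll need the Spring snapshot repository on your build path; a minimal Maven sketch (the repository URL and version string reflect the Spring snapshot conventions at the time and may need adjusting to the currently published snapshot):

```
<repositories>
    <repository>
        <id>spring-snapshots</id>
        <url>https://repo.spring.io/snapshot</url>
        <snapshots><enabled>true</enabled></snapshots>
    </repository>
</repositories>

<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-stream</artifactId>
    <version>1.0.0.BUILD-SNAPSHOT</version>
</dependency>
```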