1 vote

I have a Spring Boot-based library (using spring-data-mongo) that creates a PersistentEntities bean. PersistentEntities happens to implement the Supplier<T> interface, so the Spring Cloud Stream functional binder creates a binding to it. More specifically, BeanFactoryAwareFunctionRegistry.discoverDefaultDefinitionIfNecessary finds it as a bean of type Supplier.

We are using the Spring Cloud Stream Kafka binder, so Spring tries to publish each of these objects to a Kafka topic it creates. This causes an infinite recursion issue in the JSON serializer:

2019-12-04 15:36:54.323 ERROR 1 --- [ scheduling-1] o.s.i.h.LoggingHandler : org.springframework.messaging.MessagingException: Failed to invoke method; nested exception is org.springframework.messaging.converter.MessageConversionException: Could not write JSON: Infinite recursion (StackOverflowError) (through reference chain: org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity["idProperty"] -> org.springframework.data.mongodb.core.mapping.CachingMongoPersistentProperty["owner"] -> org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity["idProperty"] -> org.springframework.data.mongodb.core.mapping.CachingMongoPersistentProperty["owner"] -> org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity["idProperty"] -> org.springframework.data.mongodb.core.mapping.CachingMongoPersistentProperty["owner"] -> org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity["idProperty"] -> org.springframework.data.mongodb.core.mapping.CachingMongoPersistentProperty["owner"] -> org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity["idProperty"] -> org.springframework.data.mongodb.core.mapping.CachingMongoPersistentProperty["owner"] -> org.springframework.data.mongodb.core.mapping.BasicMongoPersistentEntity["idProperty"] -> org.springframework.data.mongodb.core.mapping.CachingMongoPersistentProperty["owner"] ...

Is there a way to exclude my bean from function binding? The project consuming this library isn't using Spring Cloud Function, but I'd prefer to leave that possibility open.

For reference, my bean is defined as:

@Bean
public PersistentEntities myPersistentEntities(List<MongoTemplate> mongoTemplates) {
    return new PersistentEntities(() -> {
        List<MappingContext<?, ?>> mappingContexts = mongoTemplates.stream()
                .map(t -> t.getConverter().getMappingContext())
                .collect(Collectors.toList());
        return mappingContexts.iterator();
    });
}

We just upgraded Spring Cloud from Greenwich to Hoxton, so the automatic functional bindings are new to us.


2 Answers

3 votes

Generally, you can keep spring-cloud-function out of the picture by explicitly excluding its auto-configuration:

@SpringBootApplication(exclude = ContextFunctionCatalogAutoConfiguration.class)
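For example, on the consuming application's main class (a minimal sketch; the class name MyApplication is just a placeholder, and the import assumes the auto-configuration class lives in org.springframework.cloud.function.context.config):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.function.context.config.ContextFunctionCatalogAutoConfiguration;

@SpringBootApplication(exclude = ContextFunctionCatalogAutoConfiguration.class)
public class MyApplication {

    public static void main(String[] args) {
        SpringApplication.run(MyApplication.class, args);
    }
}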

That said, please raise an issue at https://github.com/spring-cloud/spring-cloud-stream/issues. Variants of this have come up before, and I am starting to believe we need a better solution than the one described above.

Another workaround is to explicitly set the spring.cloud.function.definition=blah property, where blah is something that doesn't exist. It's ugly, but it does the trick, and it doesn't require recompilation since no annotations or additional attributes are involved.
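For example, in the consuming application's application.yml (blah is intentionally the name of a function that doesn't exist):

spring:
  cloud:
    function:
      definition: blah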

But as I said, please raise an issue, link to this post, and we'll address it for SR1, which should be out before the end of the year.

0 votes

So, I ran into something similar. The TL;DR fix was to explicitly define the functions that are available to Spring Cloud Stream, like this:

Consumer Bean name: inputConsumer

spring:
  cloud:
    stream:
      function:
        bindings:
          inputConsumer-in-0: DataInputBinding
        definition: inputConsumer
      bindings:
        DataInputBinding:
          binder: kinesis
          destination: whatever
          group: whatever

In my situation, my application had another Spring component that implements Supplier. Without explicitly defining the functions in the configuration, Spring Cloud Stream just adds every Function, Consumer, and Supplier bean to the FunctionCatalog and then expects all of them to be bound to a stream.

Apparently, if they aren't all bound, it just doesn't bind any of them and nothing works. :/
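For reference, a minimal sketch of what the inputConsumer bean behind the configuration above might look like (the payload type and handling logic are placeholders; only the bean name has to line up with the definition and the inputConsumer-in-0 binding):

import java.util.function.Consumer;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class StreamConfiguration {

    // The bean name "inputConsumer" must match spring.cloud.function.definition
    // and the "inputConsumer-in-0" binding name in the YAML above.
    @Bean
    public Consumer<String> inputConsumer() {
        return payload -> {
            // Handle the incoming message; String is a placeholder payload type.
            System.out.println("Received: " + payload);
        };
    }
}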