0
votes

I work on a task that requires:

  1. consuming data from JMS;
  2. processing it;
  3. loading it into a database.

As the documentation suggests:

  1. I start with <int-jms:message-driven-channel-adapter channel="CHANNEL1" ... /> to send new JMS messages to the CHANNEL1 channel;
  2. I apply a transformer that converts messages from the CHANNEL1 channel into a JobLaunchRequest for a job that inserts the data into the database, carrying the original JMS message's payload;
  3. The transformed messages go to the CHANNEL2 channel;
  4. <batch-int:job-launching-gateway request-channel="CHANNEL2"/> starts a new job execution whenever a message appears in that channel (sketched below).
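
For reference, a minimal sketch of that wiring; the connection factory, destination, and transformer bean names are placeholders of mine:

    <int-jms:message-driven-channel-adapter
            connection-factory="connectionFactory"
            destination-name="incoming.queue"
            channel="CHANNEL1"/>

    <int:channel id="CHANNEL1"/>
    <int:channel id="CHANNEL2"/>

    <!-- converts each JMS payload into a JobLaunchRequest for the insert job -->
    <int:transformer input-channel="CHANNEL1"
                     output-channel="CHANNEL2"
                     ref="jobLaunchRequestTransformer"
                     method="toJobLaunchRequest"/>

    <!-- launches one job execution per incoming message -->
    <batch-int:job-launching-gateway request-channel="CHANNEL2"
                                     reply-channel="nullChannel"
                                     job-launcher="jobLauncher"/>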

The problem is that a new database transaction is started for every single JMS message received.

The question: how should I handle such a flow? What is the common pattern for this?

UPDATE

I start a job for each message; one message contains one piece of data. If I resort to just using Spring Batch, I will have to manage some sort of poller (correct me if I am wrong), but I would prefer a message-driven approach like one of the following:

  1. Grace period: when a new message appears, I wait until 10 more messages have arrived, or I start processing everything received once 10 seconds have passed since the first message arrived (see the aggregator sketch after this list).
  2. I simply read everything the JMS queue contains once I am notified that a new message has arrived.
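
If I stay message-driven, option 1 seems to map onto a Spring Integration aggregator placed between the adapter and the transformer: release a group once it holds 10 messages, or 10 seconds after the group was started. A rough sketch, where the correlation key, channel name, size and timeout are assumptions of mine (group-timeout requires Spring Integration 4.0+):

    <!-- groups incoming messages and releases them as one batch -->
    <int:aggregator input-channel="CHANNEL1"
                    output-channel="BATCH_CHANNEL"
                    correlation-strategy-expression="'batch'"
                    release-strategy-expression="size() == 10"
                    group-timeout="10000"
                    send-partial-result-on-expiry="true"
                    expire-groups-upon-completion="true"/>

    <!-- the aggregated payload is a List of the original payloads,
         which a transformer can turn into a single JobLaunchRequest -->
    <int:transformer input-channel="BATCH_CHANNEL"
                     output-channel="CHANNEL2"
                     ref="jobLaunchRequestTransformer"
                     method="toJobLaunchRequest"/>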

Of course, I would like the solution to be transactional; the order of message processing does not matter.

1
If I understand correctly, you run a job for each message, is that correct? What is the payload of a message? Is one message one item to be inserted into the database, or can a message contain multiple items? This is important for the design of the solution, because you can probably have a single job that reads items from a JMS queue and inserts them into a database. – Mahmoud Ben Hassine
@MahmoudBenHassine, I updated the post, please check it. – neshkeev
"I start the job for each message. One message contains one piece of data.": this means you will have one job per item, which is not batch processing anymore and does not make sense to me as a use of Spring Batch. Since a message is an item, I recommend using a single job that reads messages from the queue and processes/writes them to the database. For transactions, you need a JTA transaction manager to coordinate transactions between the database and the queue (in case of rollback, the message goes back to the queue). – Mahmoud Ben Hassine
Yes, you got my point: I understand that running one job per message is not Spring Batch anymore, and I don't like it. I also use Spring Batch for other things, like retries. How do you suggest reading messages using a job? Is it going to be a poller-based solution? I am hoping for a message-driven one. – neshkeev
There is no hope for a message-driven reader. A chunk-oriented step is executed within a transaction. If the reader were a listener, this would mean we start a transaction, then wait for a message to arrive, process it, wait for the next message, process it, and so on until a chunk is formed, then write the data and commit the transaction. What happens if no message arrives for an entire day? Should the transaction be kept open for that long? Do you see the issue? That is why readers pull data, and the JmsItemReader is no different: it reads data from the queue until a timeout expires. – Mahmoud Ben Hassine
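
For reference, the pull-based setup described in that last comment boils down to a JmsItemReader backed by a JmsTemplate whose receive timeout bounds the wait. A minimal sketch; the bean names, destination, and timeout value are assumptions:

    <!-- the template's receive timeout bounds how long the reader waits for a message -->
    <bean id="batchJmsTemplate" class="org.springframework.jms.core.JmsTemplate">
        <property name="connectionFactory" ref="connectionFactory"/>
        <property name="defaultDestinationName" value="incoming.queue"/>
        <property name="receiveTimeout" value="1000"/>
    </bean>

    <!-- pulls messages from the queue; a timed-out receive returns null,
         which the chunk-oriented step treats as end of data -->
    <bean id="jmsItemReader" class="org.springframework.batch.item.jms.JmsItemReader">
        <property name="jmsTemplate" ref="batchJmsTemplate"/>
    </bean>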

1 Answer

1
votes

The BatchMessageListenerContainer can be used for your use case: it enables the batching of messages within a single transaction.

Note that this class is not part of the main framework (it is actually a test class), but you can use it if it fits your needs.
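
As a rough illustration only: since the class extends Spring's DefaultMessageListenerContainer, wiring it up should look much like a regular listener container. The fully qualified class name below is the one used in the Spring Batch test sources as far as I recall, so please double-check it against your version:

    <!-- consumes a batch of messages within a single transaction
         instead of one transaction per message -->
    <bean id="jmsContainer"
          class="org.springframework.batch.container.jms.BatchMessageListenerContainer">
        <property name="connectionFactory" ref="connectionFactory"/>
        <property name="destinationName" value="incoming.queue"/>
        <property name="sessionTransacted" value="true"/>
        <property name="messageListener" ref="batchingMessageListener"/>
    </bean>

If I remember correctly, the batch size is driven by the RepeatOperations the container is configured with (for example a RepeatTemplate with a SimpleCompletionPolicy), but verify that against the class's source.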

Hope this helps.