2
votes

I have a flat file containing different record types (header, record, and footer):

HR,...
RD,...
FR,...

ItemReader

    @Bean
    @StepScope
    public FlatFileItemReader reader(@Value("#{jobParameters['inputFileName']}") String inputFileName) {
        FlatFileItemReader reader = new FlatFileItemReader();
        reader.setResource(new FileSystemResource(inputFileName));
        reader.setLineMapper(patternLineMapper());
        return reader;
    }
    @Bean
    public LineMapper patternLineMapper() {
        PatternMatchingCompositeLineMapper patternLineMapper = new PatternMatchingCompositeLineMapper<>();
        Map<String, LineTokenizer> tokenizers = new HashMap<>();
        try {
            tokenizers.put("HR*", headerLineTokenizer());
            tokenizers.put("RD*", recordLineTokenizer());
            tokenizers.put("FR*", footerLineTokenizer());
        } catch (Exception e) {
            e.printStackTrace();
        }
        Map<String, FieldSetMapper> fieldSetMappers = new HashMap<>();
        fieldSetMappers.put("HR*", new HeaderFieldSetMapper());
        fieldSetMappers.put("RD*", new RecordFieldSetMapper());
        fieldSetMappers.put("FR*", new FooterFieldSetMapper());
        patternLineMapper.setTokenizers(tokenizers);
        patternLineMapper.setFieldSetMappers(fieldSetMappers);
        return patternLineMapper;
    }
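Under the hood, PatternMatchingCompositeLineMapper selects a tokenizer/mapper pair by matching each raw line against the registered patterns. Below is a minimal plain-Java illustration of that kind of wildcard dispatch. This is a simplified stand-in, not Spring Batch's actual PatternMatcher implementation; it only handles trailing-`*` patterns like the ones used above:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class PatternDispatchSketch {

    // Simplified stand-in for the composite line mapper's lookup:
    // a pattern like "HR*" matches any line starting with "HR".
    static String classify(String line, Map<String, String> patternMap) {
        for (Map.Entry<String, String> e : patternMap.entrySet()) {
            String pattern = e.getKey();
            if (pattern.endsWith("*")
                    && line.startsWith(pattern.substring(0, pattern.length() - 1))) {
                return e.getValue();
            }
        }
        throw new IllegalArgumentException("No matching pattern for line: " + line);
    }

    public static void main(String[] args) {
        Map<String, String> patternMap = new LinkedHashMap<>();
        patternMap.put("HR*", "header");
        patternMap.put("RD*", "record");
        patternMap.put("FR*", "footer");

        System.out.println(classify("HR,2019-01-01", patternMap)); // header
        System.out.println(classify("RD,foo,bar", patternMap));    // record
        System.out.println(classify("FR,3", patternMap));          // footer
    }
}
```

Each line is dispatched on its prefix, which is why the reader ends up producing a different domain object per record type.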

These work fine, and Spring Batch calls the appropriate tokenizer and mapper for each record type. The problem is the item processor: when I use the same approach there, I get a java.lang.ClassCastException because Spring Batch tries to cast the domain object returned from the reader to java.lang.String.

ItemProcessor

    @Bean
    @StepScope
    public ItemProcessor processor() {

        ClassifierCompositeItemProcessor processor = new ClassifierCompositeItemProcessor();
        PatternMatchingClassifier<ItemProcessor> classifier = new PatternMatchingClassifier<>();
        Map<String, ItemProcessor> patternMap = new HashMap<>();
        patternMap.put("HR*", new HeaderItemProcessor());
        patternMap.put("RD*", new RecordItemProcessor());
        patternMap.put("FR*", new FooterItemProcessor());
        classifier.setPatternMap(patternMap);
        processor.setClassifier(classifier);
        return processor;
    }

I also tried BackToBackPatternClassifier, but it turns out it has a bug: when I use generics like ItemWriter&lt;Object&gt;, I get a "Couldn't Open File" exception. The question is: how can I build an ItemProcessor that handles the different record types returned from the reader?


1 Answer

1
votes

Your issue is that the classifier you use in the ClassifierCompositeItemProcessor is keyed on a String pattern rather than on the item type. What should really happen is the following:

The reader returns a specific type of items based on the input pattern, something like:

  • HR* -> HRType
  • RD* -> RDType
  • FR* -> FRType

This is what you have basically done on the reader side. Now on the processing side, the processor will receive objects of type HRType, RDType and FRType. So the classifier should not be based on String as input type, but on the item type, something like:

    Map<Object, ItemProcessor> patternMap = new HashMap<>();
    patternMap.put(HRType.class, new HeaderItemProcessor());
    patternMap.put(RDType.class, new RecordItemProcessor());
    patternMap.put(FRType.class, new FooterItemProcessor());

This classifier uses Object type because your ItemReader returns a raw type. I would not recommend using raw types and Object type in the classifier. What you should do is:

  1. Create a base class for your items and a specific subclass for each record type
  2. Make the reader return items of type <? extends BaseClass>
  3. Use an org.springframework.classify.SubclassClassifier in your ClassifierCompositeItemProcessor
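Putting the three steps together, here is a sketch of the recommended setup. The type names (BaseRecord, HRType, RDType, FRType) and the assumption that each FieldSetMapper and processor works with these types are illustrative, not taken from the question, and the generics around setClassifier are awkward enough that an unchecked cast is used:

```java
import java.util.HashMap;
import java.util.Map;

import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.support.ClassifierCompositeItemProcessor;
import org.springframework.classify.Classifier;
import org.springframework.classify.SubclassClassifier;
import org.springframework.context.annotation.Bean;

// Illustrative base type; HRType, RDType and FRType extend it, and the
// FieldSetMappers in the reader are assumed to return these types.
// public abstract class BaseRecord { }
// public class HRType extends BaseRecord { /* header fields */ }
// public class RDType extends BaseRecord { /* detail fields */ }
// public class FRType extends BaseRecord { /* footer fields */ }

@Bean
public ItemProcessor<BaseRecord, BaseRecord> processor() {
    // Keyed on the item's class instead of a String pattern
    Map<Class<? extends BaseRecord>, ItemProcessor<?, ? extends BaseRecord>> typeMap = new HashMap<>();
    typeMap.put(HRType.class, new HeaderItemProcessor());   // assumed ItemProcessor<HRType, HRType>
    typeMap.put(RDType.class, new RecordItemProcessor());   // assumed ItemProcessor<RDType, RDType>
    typeMap.put(FRType.class, new FooterItemProcessor());   // assumed ItemProcessor<FRType, FRType>

    SubclassClassifier<BaseRecord, ItemProcessor<?, ? extends BaseRecord>> classifier =
            new SubclassClassifier<>();
    classifier.setTypeMap(typeMap);

    ClassifierCompositeItemProcessor<BaseRecord, BaseRecord> processor =
            new ClassifierCompositeItemProcessor<>();
    // Unchecked cast: setClassifier's declared generics are stricter than needed here
    processor.setClassifier((Classifier) classifier);
    return processor;
}
```

SubclassClassifier also resolves subclasses, so an item whose exact class is not in the map is still routed to the processor registered for its nearest superclass.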