I run Spring Cloud Data Flow with the local server. When I run a stream defined like this:

http --port=8787 | transform --expression=payload.getValue('name') |
jdbc --columns=name --table-name=http_table --password=******
--driver-class-name=org.mariadb.jdbc.Driver --username=root --url='jdbc:mysql://127.0.0.1:3306/target_data'

which uses the transform processor

transform --expression=payload.getValue('name')

it fails with this error:

2017-09-16 20:01:57,561 ERROR -kafka-listener-1 o.s.k.l.LoggingErrorHandler:37 - Error while processing: ConsumerRecord(topic = http_jdbc04.http, partition = 0, offset = 0, CreateTime = 1505563314419, checksum = 1043955671, serialized key size = -1, serialized value size = 102, key = null, value = [B@5b1b4ce3) org.springframework.integration.transformer.MessageTransformationException: Failed to transform Message; nested exception is org.springframework.messaging.MessageHandlingException: nested exception is org.springframework.expression.spel.SpelEvaluationException: EL1004E: Method call: Method getValue(java.lang.String) cannot be found on java.lang.String type, failedMessage=GenericMessage [payload={"name":"world"}, headers={kafka_offset=0, id=0392ca58-6644-91fb-9454-a41e83854955, kafka_receivedPartitionId=0, contentType=application/json;charset=UTF-8, kafka_receivedTopic=http_jdbc04.http, timestamp=1505563317552}]

Here is the documentation that I followed: spring-cloud-dataflow-docs

Versions:

1. Spring Cloud Data Flow Local Server 1.2.3.RELEASE

2. Java 1.8.0


1 Answer


I solved the problem.

This cannot work, because the payload reaching the transform processor is a plain JSON String, and java.lang.String has no getValue() method (which is exactly what the EL1004E error says):

transform --expression=payload.getValue('name')

Instead, use the #jsonPath(payload, '<json path expression>') SpEL function to extract the field from the JSON payload.
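
For example, with the {"name":"world"} payload from the question, the stream could be redefined roughly like this (a sketch only; the JSON path '$.name' is assumed from that payload, and the quoting of the expression may need adjusting for your shell or the Data Flow shell):

http --port=8787 | transform --expression=#jsonPath(payload,'$.name') | jdbc --columns=name --table-name=http_table --password=****** --driver-class-name=org.mariadb.jdbc.Driver --username=root --url='jdbc:mysql://127.0.0.1:3306/target_data'

Posting {"name":"world"} to port 8787 should then insert the value "world" into the name column of http_table.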